I obtained the matrix of the standard inner product on the polynomial space $\mathbb{P}_n$ as $$H_n=\begin{bmatrix}1&1/2&1/3&\cdots&1/(n+1)\\1/2&1/3&1/4&\cdots&1/(n+2)\\\vdots&\vdots&\vdots&\ddots&\vdots\\1/(n+1)&1/(n+2)&1/(n+3)&\cdots&1/(2n+1)\end{bmatrix}.$$ We know that the matrix representation of an inner product is invertible. But I want to show explicitly that this particular matrix is invertible. How can I do that?
-
What exactly do you mean by "show explicitly"? Would it be enough to show that (in general) the matrix associated with an inner product is necessarily invertible? – Ben Grossmann Nov 17 '21 at 21:08
-
without using the fact that the matrix representation of IP is invertible. – Madhan Kumar Nov 17 '21 at 21:11
-
Cauchy determinant ? – Start wearing purple Nov 17 '21 at 21:59
2 Answers
Define $\langle\cdot,\cdot\rangle$ on $\mathbb{P}_n$ as
$$ \langle p(x), q(x)\rangle = \int_0^1 p(x)q(x)dx. $$
It is easy to check that $\langle\cdot,\cdot\rangle$ is an inner product.
Notice that
$$ H_n=\begin{bmatrix} \langle 1,1\rangle&\langle 1,x\rangle&\langle 1,x^2\rangle&\cdots&\langle 1,x^n\rangle\\ \langle x,1\rangle&\langle x,x\rangle&\langle x,x^2\rangle&\cdots&\langle x,x^n\rangle\\ \vdots&\vdots&\vdots&\ddots&\vdots\\ \langle x^n,1\rangle&\langle x^n,x\rangle&\langle x^n,x^2\rangle&\cdots&\langle x^n,x^n\rangle\\ \end{bmatrix}$$
is the Gram matrix of $1,x,x^2,\dots,x^n$ with respect to $\langle\cdot,\cdot\rangle$. Now, the determinant of a Gram matrix is non-zero if and only if the vectors used in its construction are linearly independent. However, $1,x,x^2,\dots,x^n$ is a basis of $\mathbb{P}_n$ and hence linearly independent. Therefore, $\det H_n\ne 0$, and we conclude that $H_n$ is invertible.
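This argument can also be checked computationally. Below is a minimal Python sketch (the helper names `hilbert` and `det` are my own) that builds $H_n$ in exact rational arithmetic and verifies that its determinant is non-zero; exact fractions avoid the severe rounding issues this matrix is known for.

```python
from fractions import Fraction

def hilbert(n):
    """Gram matrix of 1, x, ..., x^n under <p, q> = integral of p*q over [0, 1]:
    entry (i, j) is the integral of x^(i+j), i.e. 1/(i + j + 1)."""
    return [[Fraction(1, i + j + 1) for j in range(n + 1)] for i in range(n + 1)]

def det(a):
    """Exact determinant by Gaussian elimination over the rationals."""
    a = [row[:] for row in a]
    n = len(a)
    d = Fraction(1)
    for k in range(n):
        # find a non-zero pivot in column k (a row swap flips the sign)
        piv = next((r for r in range(k, n) if a[r][k] != 0), None)
        if piv is None:
            return Fraction(0)  # singular matrix
        if piv != k:
            a[k], a[piv] = a[piv], a[k]
            d = -d
        d *= a[k][k]
        for r in range(k + 1, n):
            f = a[r][k] / a[k][k]
            for c in range(k, n):
                a[r][c] -= f * a[k][c]
    return d

for n in range(5):
    print(n, det(hilbert(n)))  # e.g. det(H_1) = 1/12, det(H_2) = 1/2160
```

The determinants are tiny but exactly non-zero, in line with the proof above.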
-
This particular Gram matrix has a name: it is the Hilbert matrix; it is well known in numerical analysis because its determinant is very close to $0$ even for moderate values of $n$, so it is a good test for analysing the stability of some algorithms. – Jean Marie Nov 17 '21 at 21:31
-
@JeanMarie Numerical analyst here -- good point, although what matters is not its determinant, but rather its condition number. – Federico Poloni Nov 18 '21 at 08:07
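The ill-conditioning mentioned in these comments is easy to observe; here is a short sketch using `scipy.linalg.hilbert`, which SciPy ships precisely as a test matrix. Both the determinant and the condition number degrade rapidly with the size.

```python
import numpy as np
from scipy.linalg import hilbert  # the m x m Hilbert matrix, a standard test matrix

for m in (3, 6, 9, 12):
    H = hilbert(m)
    print(f"m = {m:2d}   det(H) = {np.linalg.det(H):.3e}   cond(H) = {np.linalg.cond(H):.3e}")
```

By around $m = 12$ the condition number approaches the reciprocal of double-precision machine epsilon, so $H_n$ is numerically singular in floating point even though it is invertible in exact arithmetic.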
It suffices to show that the equation $H_nx = 0$ has the unique solution $x = 0$ (i.e. that its columns are linearly independent). So, suppose that $x = (x_1,\dots,x_{n}, x_{n+1})$ is such a solution. It follows that $x^T(H_n x) = 0$, which is to say that \begin{align} 0 & = x^TH_n x = \sum_{i,j = 1}^{n+1} H_n[i,j] x_i x_j \\ & = \sum_{i,j = 1}^{n+1} x_i x_j \int_0^1 t^{i-1} t^{j-1}\, dt \\ & = \int_0^1 \sum_{i,j = 1}^{n+1} x_ix_j \,t^{i-1}t^{j-1}\,dt = \int_0^1 (x_1 + x_2 t + \cdots + x_{n+1} t^{n})^2\,dt. \end{align} This integral can only be zero if $x_1 + x_2 t + \cdots + x_{n+1} t^{n}$ is the zero function over $[0,1]$ (in general, the integral of a continuous non-negative function over an interval is zero iff that function is identically zero). However, this only occurs if $x_1 = \cdots = x_{n+1} = 0$, which is to say that $x = 0$, which is what we wanted to show.
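The identity at the heart of this answer, $x^T H_n x = \int_0^1 p(t)^2\,dt$ for $p(t)=x_1+x_2t+\cdots+x_{n+1}t^n$, can be spot-checked numerically. The sketch below (variable names are mine) compares the quadratic form against an independent term-by-term integration of $p^2$.

```python
import numpy as np

n = 4  # degree; H_n is (n + 1) x (n + 1)
H = np.array([[1.0 / (i + j + 1) for j in range(n + 1)] for i in range(n + 1)])

rng = np.random.default_rng(0)
x = rng.standard_normal(n + 1)  # coefficients of p(t) = x_1 + x_2 t + ... + x_{n+1} t^n

quad = x @ H @ x  # the quadratic form x^T H_n x

# integrate p(t)^2 over [0, 1] independently: square the polynomial,
# then integrate term by term (t^k integrates to 1/(k + 1))
p_squared = np.polynomial.polynomial.polymul(x, x)
integral = sum(c / (k + 1) for k, c in enumerate(p_squared))

print(quad, integral)  # the two agree; both are strictly positive for x != 0
```

This also makes the positive-definiteness of $H_n$ concrete: the quadratic form is an integral of a square, so it is positive for every non-zero coefficient vector.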
-
+1 It's worth noting that this is essentially the proof that Grammian matrices are positive semi-definite (see Adam Zalcman's answer) applied to this particular matrix. – Theo Bendit Nov 17 '21 at 21:54
-
Shouldn't it be $x_1=...=x_n=x_{n+1}=0$? I mean, if only the first $n$ entries of $x$ are $0$ but the $n+1$ entry of $x$ is not necessarily $0$ then $x$ is not necessarily the vector $0$. Great answer btw! – user926356 Jun 18 '24 at 07:07
-
@user926356 that’s right, just have 2 typos in the last paragraph – Ben Grossmann Jun 18 '24 at 11:58