31

Let $l^2$ be the space of square-summable sequences with the inner product $\langle x,y\rangle=\sum_{i=1}^\infty x_iy_i$.
(a) Show that $l^2$ is a Hilbert space.

To show that it is a Hilbert space I need to show that the space is complete. For that I need to take a Cauchy sequence and show that it converges with respect to the norm. However, I find it confusing to work with a Cauchy sequence of sequences.
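
For concreteness, here is the kind of object I have in mind: a small numerical sketch (the helper names `truncation` and `l2_distance` are just for illustration) of a Cauchy sequence of sequences, namely the truncations of the square-summable sequence $(1/k)$. Of course, a completeness proof has to handle an arbitrary Cauchy sequence, not a specific one.

```python
# Intuition only (not part of any proof): a concrete Cauchy sequence of
# sequences in l^2.  Here x_n truncates the square-summable sequence
# x(k) = 1/k after n terms, so ||x_n - x_m||^2 = sum_{k=m+1}^{n} 1/k^2 -> 0.

import math

def truncation(n, K=10_000):
    """First K coordinates of x_n = (1, 1/2, ..., 1/n, 0, 0, ...)."""
    return [1.0 / k if k <= n else 0.0 for k in range(1, K + 1)]

def l2_distance(a, b):
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

for m, n in [(10, 20), (100, 200), (1000, 2000)]:
    print(f"m={m}, n={n}: ||x_n - x_m|| = {l2_distance(truncation(n), truncation(m)):.4f}")
```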

  • 1
    Some related posts: http://math.stackexchange.com/questions/147446/completeness-of-ell2-space, http://math.stackexchange.com/questions/711075/understanding-a-proof-for-why-ell2-is-complete (Probably more posts about this can be found.) – Martin Sleziak Apr 30 '15 at 17:22
  • Also showing the Parallelogram law holds might be a shortcut. – arridadiyaat Aug 14 '24 at 14:39
  • I don't know how to prove that linearity holds, so how can I prove that $\langle (z_n)+(v_n),(w_n)\rangle=\langle (z_n),(w_n)\rangle+\langle (v_n),(w_n)\rangle$? More specifically, I don't understand why we can split the infinite sum while proving this. How did you do this? – user33 Apr 05 '25 at 08:26

5 Answers

22

In this answer, I will write $x_n$ for a sequence in $l^2$ and $x_n(k)$ for the $k$-th term of that sequence.

The norm in the Hilbert space is given by $\|x\| = \sqrt{\langle x, x \rangle}$. We wish to show that if a sequence $\{ x_n \} \subset l^2$ is Cauchy, then it converges in $l^2$.

Suppose that $\{x_n\}$ is such a Cauchy sequence. Let $\{ e_k \}$ be the collection of sequences for which $e_k(i) = 1$ if $i=k$ and zero if $i\neq k$.

Then $\langle x_n, e_k \rangle = x_n(k)$. Notice that $$|x_n(k) - x_m(k)| = |\langle x_n - x_m, e_k \rangle| \le \|x_n-x_m\| \| e_k\| = \|x_n-x_m\|$$ for all $k$ (note that this bound is uniform in $k$). Therefore the sequence of real numbers given by $\{x_n(k)\}_{n\in \mathbb{N}}$ is Cauchy for each $k$, and thus converges. Call the limit of this sequence $\tilde x(k)$.
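
A quick numerical sanity check of this coordinatewise bound (illustration only; the vectors and names below are arbitrary stand-ins, not part of the argument):

```python
# Sanity check: a single coordinate difference is bounded by the l^2
# distance, |x_n(k) - x_m(k)| <= ||x_n - x_m||  (Cauchy-Schwarz with e_k).

import math
import random

K = 100
xn = [random.gauss(0.0, 1.0) / (k + 1.0) for k in range(K)]  # stand-in for x_n
xm = [random.gauss(0.0, 1.0) / (k + 1.0) for k in range(K)]  # stand-in for x_m

dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(xn, xm)))
assert all(abs(a - b) <= dist + 1e-12 for a, b in zip(xn, xm))
print("largest coordinate gap:", max(abs(a - b) for a, b in zip(xn, xm)), "<= l2 distance:", dist)
```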

Let $\tilde x = (\tilde x(k))_{k\in\mathbb{N}}$. We wish to show that $\tilde x \in l^2$.

Consider $$\sum_{k=1}^\infty |\tilde x(k)|^2=\sum_{k=1}^\infty |\lim_{n\to\infty} x_n(k)|^2=\lim_{n\to\infty} \sum_{k=1}^\infty |x_n(k)|^2=\lim_{n\to\infty}\|x_n\|^2.$$

The exchange of the limit with the infinite sum needs justification, and uniform convergence of $x_n(k)$ over $k$ is not by itself enough (see the comments). What does the job is that the Cauchy estimate is uniform over whole blocks of coordinates: given $\varepsilon>0$, choose $N$ with $\|x_n-x_m\|<\varepsilon$ for $n,m\ge N$; then $\left(\sum_{k=1}^K|x_n(k)-x_m(k)|^2\right)^{1/2}<\varepsilon$ for every $K$, and letting $m\to\infty$ in this finite sum gives $\left(\sum_{k=1}^K|x_n(k)-\tilde x(k)|^2\right)^{1/2}\le\varepsilon$ for all $K$ and all $n\ge N$. By the triangle inequality in $\mathbb{R}^K$, the partial sums $\left(\sum_{k=1}^K|\tilde x(k)|^2\right)^{1/2}$ and $\left(\sum_{k=1}^K|x_n(k)|^2\right)^{1/2}$ then differ by at most $\varepsilon$, uniformly in $K$, and the displayed equality follows by letting $K\to\infty$ and then $\varepsilon\to 0$. Finally, since $\{ x_n \}$ is Cauchy, the inequality $$\bigl|\,\|x_m\| - \|x_n\|\,\bigr| \le \| x_m - x_n\|$$ implies that $\|x_n\|$ is a Cauchy sequence of real numbers, and so $\|x_n\|$ converges. Thus $\tilde x$ is in $l^2$.
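
To see why mere coordinatewise (even uniform) convergence is not enough here, one can look numerically at the example raised in the comments below, $x_n(k)=1/\sqrt n$ for $k\le n$ and $0$ otherwise (the helper `example` and the cut-off `K` are mine, for illustration only):

```python
# Why the limit/sum exchange needs justification: x_n(k) = 1/sqrt(n) for
# k <= n, else 0.  Each coordinate tends to 0, yet ||x_n|| = 1 for every n.
# This family is NOT Cauchy in l^2, so it does not contradict the argument
# above, but it shows that coordinatewise limits alone control neither the
# norms nor the infinite sums.

import math

def example(n, K=10_000):
    return [1.0 / math.sqrt(n) if k <= n else 0.0 for k in range(1, K + 1)]

for n in [10, 100, 1000]:
    x = example(n)
    print(f"n={n}: first coordinate = {x[0]:.4f}, ||x_n|| = {math.sqrt(sum(v * v for v in x)):.4f}")

# Non-Cauchy check: ||x_{2m} - x_m|| stays bounded away from 0.
for m in [10, 100, 1000]:
    d = math.sqrt(sum((a - b) ** 2 for a, b in zip(example(2 * m), example(m))))
    print(f"m={m}: ||x_{{2m}} - x_m|| = {d:.4f}")
```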


Edit: Completing the proof as per the comments.

We have thus shown that $\tilde x$ is in $l^2$. It is the natural candidate for the limit of the Cauchy sequence, and it lies in our space. What remains is to show that $$\| x_n - \tilde x\| \to 0$$ as $n \to \infty$.

We will use a generalized form of the dominated convergence theorem for series. It states that if $a_{n,k} \to b_k$ as $n\to\infty$ for every $k$, $|a_{n,k}| \le d_{n,k}$, $d_{n,k} \to D_k$ for every $k$, and $\sum_{k} d_{n,k} \to \sum_{k} D_k < \infty$, then $\lim_{n \to \infty} \sum_{k=1}^\infty a_{n,k} = \sum_{k=1}^\infty b_k$. (Here $a_{n,k}, b_k, d_{n,k}, D_{k}$ are all real numbers.)
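
As a sanity check of the statement (not a proof), here is a toy instance of my own with $a_{n,k}=1/(k^2+k/n)$, $b_k=d_{n,k}=D_k=1/k^2$; the truncation level `K` below is an arbitrary illustration cut-off:

```python
# Numerical illustration of the series form of dominated convergence:
# a_{n,k} = 1/(k^2 + k/n) -> b_k = 1/k^2 for each k, dominated by
# d_{n,k} = D_k = 1/k^2, so sum_k a_{n,k} -> sum_k 1/k^2 = pi^2/6.

import math

K = 100_000  # truncation of the infinite sums, for illustration only

def a(n, k):
    return 1.0 / (k * k + k / n)

for n in [1, 10, 100, 1000]:
    s = sum(a(n, k) for k in range(1, K + 1))
    print(f"n={n}: sum over k of a_(n,k) ≈ {s:.6f}")

print("limit series sum of 1/k^2 ≈", sum(1.0 / (k * k) for k in range(1, K + 1)))
print("pi^2/6 =", math.pi ** 2 / 6)
```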

Write $$\| x_n - \tilde x\|^2 = \sum_{k=1}^\infty |x_n(k) - \tilde x(k)|^2.$$

We see that in this case $a_{n,k} = |x_{n}(k) - \tilde x(k)|^2$, $b_k = 0$, and we must find a $d_{n,k}$ that "dominates" $a_{n,k}$ to finish the proof.

Now note that $|x_n(k) - \tilde x(k)|^2 \le 2 |x_n(k)|^2 + 2 |\tilde x(k)|^2$ and $$\lim_{n \to \infty} \sum_{k=1}^\infty \bigl( 2 |x_n(k)|^2 + 2 |\tilde x(k)|^2\bigr) = \sum_{k=1}^\infty \bigl(2 |\tilde x(k)|^2 + 2 | \tilde x(k)|^2\bigr).$$ Recall that we demonstrated $\lim_{n \to \infty} \sum_{k=1}^\infty |x_n(k)|^2 = \sum_{k=1}^\infty |\tilde x(k)|^2$ in the first half. Thus the role of $d_{n,k}$ is played by $2|x_n(k)|^2 + 2|\tilde x(k)|^2$ and that of $D_k$ by $4|\tilde x(k)|^2$; note also that $d_{n,k} \to D_k$ for each $k$, since $x_n(k) \to \tilde x(k)$.

Thus, by the dominated convergence theorem, we may conclude that $$\sum_{k=1}^\infty |x_n(k)-\tilde x(k)|^2 \to 0$$ as $n \to \infty$, that is, $\|x_n - \tilde x\| \to 0$, which completes the proof.
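
As a closing numerical illustration (not part of the proof), one can reuse the truncation example $x_n(k)=1/k$ for $k\le n$, with limit $\tilde x(k)=1/k$; there the tail identity $\|x_n-\tilde x\|^2=\sum_{k>n}1/k^2$ lets us watch the convergence directly (the cut-off `K` is an arbitrary choice of mine):

```python
# Sanity check on a toy example: for x_n(k) = 1/k if k <= n else 0 and
# x~(k) = 1/k, the tail norm ||x_n - x~||^2 = sum_{k>n} 1/k^2 -> 0,
# matching the conclusion above.

import math

K = 200_000  # truncation of the infinite tail sum, for illustration only

for n in [10, 100, 1000, 10_000]:
    tail = sum(1.0 / (k * k) for k in range(n + 1, K + 1))
    print(f"n={n}: ||x_n - x~|| ≈ {math.sqrt(tail):.5f}")
```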

Joel
  • 16,574
  • When you took the limit out of the sum, you gave a justification. But isn't that always true (property of limits)? @Joel – user235727 Apr 30 '15 at 17:49
  • Also, in the proof you said "We wish to show that if a sequence $\{x_n\}\in l^2$". Isn't it that $x_n\in l^2$? Can $\{x_n\}$ be in $l^2$? – user235727 Apr 30 '15 at 18:02
  • The exchange of limits does not always hold. You need one of the limits to be uniform in order to exchange them. There is a theorem in baby Rudin to this effect. As for your second question: I meant to type $\{ x_n \} \subset l^2$. My mistake. – Joel Apr 30 '15 at 19:01
  • A basic example in Rudin's Principles of Mathematical Analysis shows that the exchange of limits takes some care. For example, let $s_{n,m} = m/(m+n)$ and compare $\lim_{n\to\infty} \lim_{m\to\infty} s_{n,m}$ with $\lim_{m\to\infty} \lim_{n\to\infty} s_{n,m}$. – Joel Apr 30 '15 at 19:08
  • What are the two limits that you have exchanged in the proof? @Joel – user235727 Apr 30 '15 at 19:13
  • 1
    Specifically, this is what happened: $$\sum_{k=1}^\infty |\tilde x(k)|^2 = \sum_{k=1}^\infty \lim_{n\to\infty} |x_n(k)|^2 = \lim_{N\to\infty} \lim_{n\to\infty} \sum_{k=1}^N |x_n(k)|^2= \lim_{n\to\infty} \lim_{N\to\infty}\sum_{k=1}^N |x_n(k)|^2$$

    $$=\lim_{n\to\infty} \sum_{k=1}^\infty |x_n(k)|^2 = \lim_{n\to\infty} \|x_n\|^2$$

    – Joel Apr 30 '15 at 19:21
  • 2
    You have shown that $\{x_n\}$ converges to $\tilde x$ pointwise but not in norm. $\tilde x$ is the only candidate to which $\{x_n\}$ can converge, but your proof lacks the part showing that it does. @Joel – Deniz Sargun Feb 06 '16 at 12:14
  • @denizsargun I'll take another look when I have the time. It's been a few months since I answered this question – Joel Feb 06 '16 at 14:16
  • @DenizSargun, sorry it took me so long to write these few lines. I'm sure that you knew how to resolve this yourself, since you spotted the error in the first place. – Joel Jan 16 '17 at 20:13
  • 1
    @Joel. No worries. I just wanted everyone to have a complete proof on this. – Deniz Sargun Jan 16 '17 at 20:27
  • @Joel, uniform convergence is not enough to guarantee that exchanging the limits won't affect the result: consider $x_n (k) = \frac{1}{\sqrt{n}}$ if $k\leq n$ and zero otherwise. However, since the limit of the norms of the $x_{n}$ exists and is finite, $\tilde{x}$ is square-summable. – user115624 Apr 02 '17 at 21:20
  • 1
    @user115624 What do you mean? Let $f_{n, k} = \frac{1}{\sqrt n} \chi_{[0, n]}(k)$. Then, $\lim_{k \rightarrow \infty} \lim_{n \rightarrow \infty} f_{n,k} = \lim_{n \rightarrow \infty} \lim_{k \rightarrow \infty} f_{n,k} = 0$ – James C Mar 08 '21 at 10:24
  • I know it's super late, but could you give some insight for $L^2(a, b)$ space also? @Joel I couldn't understand how to construct a Cauchy sequence from $L^2(a, b)$ ? – falamiw Aug 14 '22 at 20:06
  • Which result do you use when you claim that the exchange of limits is justified because of uniform convergence in $k$ of $\lim_n x_n(k)$? – Sha Vuklia Oct 31 '23 at 10:21
11

Let $(\mathbf{x_n})$ be a Cauchy sequence in $l^2$, where $\mathbf{x_n} = (x_1^{(n)},x_2^{(n)},\ldots)$; i.e., given $\epsilon >0$ there exists a natural number $N$ such that for all $m,n\geq N$ \begin{equation} \|\mathbf{x_n}-\mathbf{x_m}\| = \left(\sum_{j=1}^{\infty}|x_j^{(n)}-x_j^{(m)}|^2\right)^{\frac{1}{2}} <\epsilon. \end{equation} In particular, it follows that for every $j=1,2,\ldots$ we have \begin{align} |x_j^{(n)}-x_j^{(m)}| < \epsilon && (m,n\geq N). \end{align} That is, for each fixed $j$ the sequence $(x_j^{(1)},x_j^{(2)},\ldots)$ is a Cauchy sequence in the scalar field $\mathbb{R}$ or $\mathbb{C}$, and hence it converges, say $x_j^{(n)} \to x_j$ as $n \to \infty$. Using these limits, define $\mathbf{x} = (x_1,x_2,\ldots)$. Now, with this basic setting, try to show that $(\mathbf{x_n})$ converges to $\mathbf{x}$.
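
One standard way to carry this out, sketched (the details are worth checking): fix $\epsilon>0$ and take $N$ as above. For every finite $J$ and all $m,n\ge N$,
$$\sum_{j=1}^{J}\bigl|x_j^{(n)}-x_j^{(m)}\bigr|^2 \le \|\mathbf{x_n}-\mathbf{x_m}\|^2 < \epsilon^2 .$$
Letting $m\to\infty$ (a finite sum, so the limit passes through) gives $\sum_{j=1}^{J}\bigl|x_j^{(n)}-x_j\bigr|^2 \le \epsilon^2$ for all $n\ge N$, and then letting $J\to\infty$ gives $\|\mathbf{x_n}-\mathbf{x}\|\le\epsilon$ for all $n\ge N$. In particular $\mathbf{x}-\mathbf{x_N}\in l^2$, hence $\mathbf{x}=\mathbf{x_N}+(\mathbf{x}-\mathbf{x_N})\in l^2$, and $\mathbf{x_n}\to\mathbf{x}$ in $l^2$.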

Urban PENDU
  • 2,499
  • Dear Urban, please forgive me for this silly question. (I am not a mathematician by training.) Why do you say that if for each $j$ the sequence $x^{(n)}_j$ is Cauchy in $\mathbb C$, then it converges? I may have forgotten some basic theorem from undergraduate calculus... – Michael_1812 Jan 24 '25 at 19:31
9

A typical proof of the completeness of $\ell^2$ consists of two parts.

Reduction to series

Claim: Suppose $ X$ is a normed space in which every absolutely convergent series converges; that is, $ \sum_{n=1}^{\infty} y_n$ converges whenever $ y_n\in X$ are such that $ \sum_{n=1}^{\infty} \|y_n\|$ converges. Then the space $X$ is complete.

Proof. Take a Cauchy sequence $ \{x_n\}$ in $X$. For $ j=1,2,\dots$ find an integer $ n_j$ such that $ \|x_n-x_m\|<2^{-j}$ as long as $ n,m\ge n_j$. (This is possible because the sequence is Cauchy.) Also let $ n_0=1$ and consider the series $ \sum_{j=1}^{\infty} (x_{n_{j}}-x_{n_{j-1}})$. This series converges absolutely, by comparison with $\sum 2^{-j}$. Hence it converges. Its partial sums simplify (telescope) to $ x_{n_j}-x_1$. It follows that the subsequence $ \{x_{n_j}\}$ has a limit. It remains to apply a general theorem about metric spaces: if a Cauchy sequence has a convergent subsequence, then the entire sequence converges. $\quad \Box$
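
Just to make the index selection concrete (a toy illustration of mine, not part of the proof): for the truncations $x_n=(1,\tfrac12,\dots,\tfrac1n,0,\dots)$ of $(1/k)$ one has $\|x_n-x_m\|^2\le\sum_{k>\min(n,m)}1/k^2\le 1/\min(n,m)$, so $n_j=4^j+1$ works.

```python
# Toy illustration of the index selection: for the truncations
# x_n = (1, 1/2, ..., 1/n, 0, ...) of (1/k) we have
# ||x_n - x_m||^2 <= sum_{k > min(n,m)} 1/k^2 <= 1/min(n,m),
# so n_j = 4^j + 1 guarantees ||x_n - x_m|| < 2^{-j} whenever n, m >= n_j.

indices = [4 ** j + 1 for j in range(1, 7)]
print(indices)  # n_1, ..., n_6 = 5, 17, 65, 257, 1025, 4097
```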

Convergence of absolutely convergent series in $\ell^2$

Claim: Every absolutely convergent series in $\ell^2$ converges.

Proof. The elements of $ \ell^2$ are functions from $ \mathbb N$ to $ \mathbb C$, so let's write them as such: $ f_j: \mathbb N\to \mathbb C$. (This avoids confusion of indices.) Suppose the series $ \sum_{j=1}^{\infty} \|f_j\|$ converges. Then for any $ n$ the series $ \sum_{j=1}^{\infty} f_j(n)$ converges, by virtue of comparison $|f_j(n)| \le \|f_j\|$.

Let $ f(n) = \sum_{j=1}^{\infty} f_j(n)$. So far the convergence is only pointwise, so we are not done. We still have to show that the series converges in $ \ell^2$, that is, its tails have small $ \ell^2$ norm: $ \sum_{n=1}^\infty |\sum_{j=k}^{\infty} f_j(n)|^2 \to 0$ as $ k\to\infty$.

What we need now is a dominating function/sequence (sequences are just functions with domain $\mathbb{N}$), in order to apply the Dominated Convergence Theorem. Namely, we need a function $ g: \mathbb N\to [0,\infty)$ such that

$$ \sum_{n=1}^{\infty} g(n)^2<\infty \tag{1}$$ $$ \left|\sum_{j=k}^{\infty} f_j(n)\right| \le g(n) \quad \text{for all } \ k,n \tag{2} $$

Set $ g(n) = \sum_{j=1}^{\infty} |f_j(n)| $. Then (2) follows from the triangle inequality. Also, $ g$ is the increasing limit of functions $ g_k(n) = \sum_{j=1}^k |f_j(n)| $. For each $k$, using the triangle inequality in $ \ell^2$, we get $$ \sum_n g_k(n)^2 = \|g_k\|^2\le \left(\sum_{j=1}^k \|f_j\|\right)^2 \le S^2$$ where $S= \sum_{j=1}^\infty\|f_j\|$. Therefore, $ \sum_n g(n)^2\le S^2$ by the Monotone Convergence Theorem.

To summarize: we have shown the existence of a summable function $g^2$ that dominates the square of every tail of the series $\sum f_j$. This, together with the pointwise convergence of said series, yields its convergence in $\ell^2$. $\quad\Box$
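
A toy numerical check of the conclusion (my own example, illustration only): take $f_j(n)=1/(j^2 2^n)$, so that $\sum_j\|f_j\|<\infty$; the $\ell^2$ norm of the tail $\sum_{j\ge k}f_j$ should go to $0$ as $k\to\infty$. The cut-offs `N` and `J` below are arbitrary.

```python
# Numerical sketch: f_j(n) = 1/(j^2 * 2^n), so sum_j ||f_j|| < infinity.
# Check that the l^2 norm of the tail sum_{j >= k} f_j tends to 0 as k grows,
# which is the convergence in l^2 established above.

import math

N = 60          # coordinates n = 1..N (2^{-n} is negligible beyond this)
J = 100_000     # truncation of the sum over j, for illustration only

def tail_norm(k):
    coeff = sum(1.0 / (j * j) for j in range(k, J + 1))   # sum_{j >= k} 1/j^2
    return math.sqrt(sum((coeff * 2.0 ** (-n)) ** 2 for n in range(1, N + 1)))

for k in [1, 10, 100, 1000]:
    print(f"k={k}: l2 norm of the tail ≈ {tail_norm(k):.6f}")
```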

YuiTo Cheng
  • 3,841
2

Pervin = Pervin, W. J.: Foundations of General Topology, New York: Academic Press, 1964.
See the proof of Pervin, p.119, Theorem 7.1.2.
All you need to do is replace $H$ in the proof with $l^2$.
The proof is divided into two steps:
Step 1: ($\{x_n\}$ is Cauchy) $\Rightarrow$ ($x_n\to x$) [Pervin, p.119, l.$-$13--l.$-$1].
Step 2: $x\in l^2$ [Pervin, p.120, l.1--l.8].

1

Hint: Let $(x^n)$ be a Cauchy sequence. We can write $x^n=\sum_k x_k^n e_k$, where $(e_k)$ is the canonical basis. Show that $|x_k^n-x_k^m|\leq \|x^n-x^m\|$, so $(x_k^n)_n$ is a Cauchy sequence in $\mathbb{C}$; hence there exists $x_k$ such that $\lim_n x_k^n=x_k$ for every $k$. Define $x=\sum_k x_ke_k$ and prove that $\|x^n-x\|\to 0$.
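
A toy numerical check of that final step (my own example, illustration only): for $x^n$ with coordinates $x^n_k=(1+\tfrac1n)/2^k$, the coordinatewise limits are $x_k=1/2^k$ and $\|x^n-x\|=\tfrac1n\sqrt{\sum_k 4^{-k}}\to 0$.

```python
# Toy check: x^n_k = (1 + 1/n)/2^k converges coordinatewise to x_k = 1/2^k,
# and ||x^n - x|| = (1/n) * sqrt(sum_k 4^{-k}) -> 0.

import math

K = 60  # coordinates k = 1..K (the rest are negligible for this example)

def dist_to_limit(n):
    return math.sqrt(sum(((1 + 1 / n) / 2 ** k - 1 / 2 ** k) ** 2 for k in range(1, K + 1)))

for n in [1, 10, 100, 1000]:
    print(f"n={n}: ||x^n - x|| ≈ {dist_to_limit(n):.6f}")
```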

Patissot
  • 2,635