
How can I use the spectral theorem to prove this corollary ("spectral theorem for matrices")?

Let $M$ be a symmetric matrix of order $n$. Then $M$ is diagonalizable; more precisely, there exists an orthogonal matrix $P$ of order $n$ such that $M=P^{-1}DP$, where $D$ is a diagonal matrix.

2 Answers


The latest version of Proofs from THE BOOK (5th edition) has a nice proof of this theorem (in chapter 7), originally published by Herb Wilf.

In short, given a symmetric matrix $A$, he defines the functions $$f_A: Q \mapsto Q^\top A Q$$ and $$\text{Od}: A \mapsto \sum_{i\neq j}a_{ij}^2,$$ the sum of the squared off-diagonal entries.

He then proves that there exists an orthogonal matrix $Q_0$ such that $\text{Od}(f_A(Q_0)) = 0$, i.e. such that $Q_0^\top A Q_0$ is diagonal.

Suddenly our problem can be solved with analytic techniques! Since the orthogonal group is compact, the map $\text{Od}\circ f_A: Q \mapsto \text{Od}(Q^\top A Q)$ attains a minimum, and assuming that this minimum is greater than zero leads to a contradiction.
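Wilf's argument itself is non-constructive (it only uses compactness), but the same quantity $\text{Od}$ drives the classical Jacobi eigenvalue iteration, which kills off-diagonal mass one plane rotation at a time. Here is a minimal numerical sketch in Python/NumPy, assuming $\text{Od}$ is the sum of squared off-diagonal entries; the test matrix and function names are just for illustration, not part of the proof:

```python
import numpy as np

def od(M):
    """Sum of squared off-diagonal entries (the quantity Od above)."""
    return np.sum(M**2) - np.sum(np.diag(M)**2)

def jacobi_diagonalize(A, tol=1e-12, max_rotations=1000):
    """Drive Od(Q^T A Q) toward 0 with plane rotations; A must be symmetric."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    Q = np.eye(n)
    for _ in range(max_rotations):
        if od(A) < tol:
            break
        # pick the largest off-diagonal entry |a_pq|
        off = np.abs(A - np.diag(np.diag(A)))
        p, q = np.unravel_index(np.argmax(off), off.shape)
        # rotation angle that zeroes the (p, q) entry
        theta = 0.5 * np.arctan2(2 * A[p, q], A[q, q] - A[p, p])
        c, s = np.cos(theta), np.sin(theta)
        J = np.eye(n)
        J[p, p] = J[q, q] = c
        J[p, q], J[q, p] = s, -s
        Q = Q @ J
        A = J.T @ A @ J          # now A == f_A(Q) == Q.T @ A_original @ Q
    return Q, A                   # Q orthogonal, A numerically diagonal

A = np.array([[4.0, 1.0, 2.0],
              [1.0, 3.0, 0.5],
              [2.0, 0.5, 1.0]])
Q, D = jacobi_diagonalize(A)
print(np.round(Q.T @ A @ Q, 6))         # off-diagonal mass driven to ~0
print(np.allclose(Q.T @ Q, np.eye(3)))  # Q is orthogonal
```

Each rotation removes $2a_{pq}^2$ from $\text{Od}$, which is the same mechanism behind the contradiction in the analytic proof: if some off-diagonal entry of the minimizer were nonzero, one more rotation would decrease $\text{Od}$ further.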

I have made a prezi based on this proof: https://prezi.com/aexypzd25tmd/spectral-theorem-from-proofs-of-the-book/

This prezi was meant for a presentation, but I think it may be understandable on its own as well.

Kasper

You can prove it using Schur's unitary triangularization theorem, which states that for every square matrix $M$ there exists a unitary matrix $U$ such that $$U^*MU=T$$ with $T$ an upper triangular matrix whose diagonal elements are the eigenvalues of $M$.
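As an aside, this is exactly the decomposition that `scipy.linalg.schur` computes, so the statement can be checked numerically before working through the construction; a small sketch (the random test matrix is only illustrative):

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))        # a generic (non-symmetric) square matrix

T, U = schur(M, output='complex')      # M = U T U*, T upper triangular
print(np.allclose(U @ T @ U.conj().T, M))   # reconstruction holds
print(np.allclose(np.tril(T, -1), 0))       # T is upper triangular
print(np.diag(T))                           # the eigenvalues of M on the diagonal
```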

The unitary matrix $U$ is constructed iteratively as follows. Assume $\lambda_1,\ldots,\lambda_n$ are the eigenvalues of $M$. Let $v_1$ be a normalized eigenvector of $M$ corresponding to the eigenvalue $\lambda_1$. Then we can extend it to an orthonormal basis $v_1,z_2,\ldots,z_n$ (Gram-Schmidt orthonormalization). Now define $$V_1=\left[\matrix{v_1& z_2 & \cdots & z_n}\right].$$ Then $$V_1^*MV_1=\left[\matrix{\lambda_1 & *\\ 0 & M_1}\right]$$ with $M_1$ a square matrix of dimensions $(n-1)\times (n-1)$ having eigenvalues $\lambda_2,\ldots,\lambda_n$.
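A quick numerical check of this first step, for a real symmetric $M$ (the case of the corollary): complete a normalized eigenvector to an orthonormal basis with a QR factorization, which does the Gram-Schmidt work. The test matrix and variable names are just for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
M = M + M.T                               # real symmetric test matrix

lam, vecs = np.linalg.eigh(M)
lam1, v1 = lam[0], vecs[:, 0]             # one eigenpair, v1 already normalized

# Complete v1 to an orthonormal basis: put v1 first, append the standard
# basis, and let QR perform the Gram-Schmidt orthonormalization.
Q, _ = np.linalg.qr(np.column_stack([v1, np.eye(4)]))
V1 = Q                                    # first column of V1 is +-v1

A1 = V1.T @ M @ V1
print(np.round(A1, 6))   # first column is (lam1, 0, 0, 0)^T; the first row
                         # is zero too, since M is symmetric
print(lam1)
```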

Let $v_2\in\mathbb{C}^{n-1}$ be a normalized eigenvector of $M_1$ corresponding to the eigenvalue $\lambda_2$. Then we can similarly create an orthonormal basis $v_2,\ldots$ and use those vectors as the columns of a unitary matrix $U_2$ such that $$U^*_{2}M_1U_2=\left[\matrix{\lambda_2 & *\\ 0 & M_2}\right]$$ with $M_2$ a square matrix of dimensions $(n-2)\times (n-2)$ having eigenvalues $\lambda_3,\ldots,\lambda_n$. If we now define $$V_2=\left[\matrix{1 & 0\\ 0 & U_2}\right]$$ then $$V_2^*V_1^*MV_1V_2= \left[\matrix{\lambda_1 & * & *\\ 0 & \lambda_2 & * \\ 0 & 0 & M_2}\right].$$ Repeating in this way we obtain $$V_{n-1}^*\cdots V_2^*V_1^*MV_1V_2\cdots V_{n-1}= \left[\matrix{\lambda_1 & * & * & *\\ 0 & \lambda_2 & * & *\\ 0 & 0 & \ddots & *\\ 0& 0 & \cdots & \lambda_n}\right],$$ i.e. $U^*MU=T$ with the unitary matrix $U=V_1V_2\cdots V_{n-1}$.
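The whole deflation can be written as a short loop. This is only a numerical sketch of the construction above (not a production algorithm), accumulating $U=V_1V_2\cdots V_{n-1}$:

```python
import numpy as np

def schur_by_deflation(M):
    """Sketch of the deflation argument: build a unitary U with U* M U
    upper triangular, one step at a time."""
    A = np.array(M, dtype=complex)
    n = A.shape[0]
    U = np.eye(n, dtype=complex)
    for k in range(n - 1):
        Mk = A[k:, k:]                             # current trailing block
        w, vecs = np.linalg.eig(Mk)
        v = vecs[:, 0] / np.linalg.norm(vecs[:, 0])
        # complete v to an orthonormal basis of C^{n-k} (QR = Gram-Schmidt)
        Q, _ = np.linalg.qr(np.column_stack([v, np.eye(n - k)]))
        Vk = np.eye(n, dtype=complex)
        Vk[k:, k:] = Q                             # V_k acts only on the trailing block
        A = Vk.conj().T @ A @ Vk
        U = U @ Vk
    return U, A                                    # A = U* M U is upper triangular

M = np.random.default_rng(2).standard_normal((4, 4))
U, T = schur_by_deflation(M)
print(np.allclose(U.conj().T @ U, np.eye(4)))      # U is unitary
print(np.allclose(np.tril(T, -1), 0, atol=1e-10))  # T is upper triangular
print(np.allclose(U.conj().T @ M @ U, T))          # U* M U = T
```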

Then for a Hermitian $M$ (real symmetric if $M$ has real entries) the triangular factor is normal, since $$TT^*=U^*MUU^*M^*U=U^*MM^*U=U^*M^*MU=U^*M^*UU^*MU=T^*T,$$ and it is easy to prove that a normal upper triangular matrix must be diagonal. Finally, a real symmetric $M$ has real eigenvalues, so all the eigenvectors in the construction can be chosen real and $U$ is a real orthogonal matrix; writing $D=U^\top MU$ gives $M=UDU^{-1}$, which is the corollary with $P=U^{-1}=U^\top$.
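For a real symmetric $M$ this is also what happens numerically: the real Schur form returned by `scipy.linalg.schur` comes out diagonal (up to round-off) and the accompanying factor is a real orthogonal matrix. A small check, again with an illustrative random matrix:

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(3)
M = rng.standard_normal((4, 4))
M = M + M.T                                      # real symmetric

T, P = schur(M)                                  # real Schur form: M = P T P^T
print(np.allclose(np.triu(T, 1), 0, atol=1e-8))  # T is (numerically) diagonal
print(np.allclose(P.T @ P, np.eye(4)))           # P is real orthogonal
print(np.allclose(P @ T @ P.T, M))               # M = P D P^T = P D P^{-1}
```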

RTJ