
In linear algebra, we have the following well-known result.

Proposition. Every real symmetric matrix $A$ is congruent to a diagonal matrix with the real eigenvalues of $A$ on the diagonal. That is, $A=P^T \Lambda P$, where $P$ is orthogonal and $\Lambda:=\operatorname{diag}(\lambda_1,\dots,\lambda_n)$.
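As a quick numerical sanity check (not a proof), NumPy's `eigh` produces exactly the decomposition in the Proposition; the example matrix below is arbitrary:

```python
# Numerical illustration of the Proposition for an arbitrary symmetric matrix.
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])          # real symmetric

eigvals, Q = np.linalg.eigh(A)           # eigh returns A = Q @ diag(eigvals) @ Q.T
Lam = np.diag(eigvals)
P = Q.T                                  # so that A = P.T @ Lam @ P as in the Proposition

assert np.allclose(P.T @ Lam @ P, A)     # A = P^T Λ P
assert np.allclose(P @ P.T, np.eye(3))   # P is orthogonal
assert np.all(np.isreal(eigvals))        # eigenvalues are real
```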

Question: Is the spectral theorem absolutely indispensable when proving this result? Is it possible to bypass the spectral theorem?

It seems to me that matrix similarity relates different matrix representations ($A_{B_1}$ and $A_{B_2}$) of the same linear operator $A$ under different bases ($B_1$ and $B_2$): $A_{B_1}=P_{12}^{-1} A_{B_2} P_{12}$. In general, this has nothing to do with matrix congruence. The two concepts seem to coincide only when the operator involved is self-adjoint w.r.t. a fixed inner product $\langle \cdot, \cdot \rangle$ and the change of basis is between two orthonormal bases: such a change of basis induces a unitary (orthogonal) matrix $P$, which happens to satisfy $P^*=P^{-1}$ (or $P^T=P^{-1}$ in the real case). But if so, the spectral theorem seems unavoidable.
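The distinction can also be seen numerically: congruence by an invertible but non-orthogonal $P$ preserves symmetry and the signs of the eigenvalues (Sylvester's law of inertia), but not the eigenvalues themselves, whereas similarity preserves the eigenvalues. A small sketch with arbitrary example matrices:

```python
# Similarity vs. congruence for an arbitrary symmetric A and a
# non-orthogonal invertible P (both matrices chosen for illustration).
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, -1.0]])              # symmetric, indefinite

P = np.array([[1.0, 2.0],
              [0.0, 3.0]])               # invertible but NOT orthogonal

sim = np.linalg.inv(P) @ A @ P           # similarity transform
con = P.T @ A @ P                        # congruence transform (still symmetric)

# Similarity preserves the eigenvalues:
assert np.allclose(np.sort(np.linalg.eigvals(sim)), np.sort(np.linalg.eigvals(A)))
# Congruence generally changes the eigenvalues...
assert not np.allclose(np.linalg.eigvalsh(con), np.linalg.eigvalsh(A))
# ...but preserves their signs (Sylvester's law of inertia):
assert (np.sign(np.linalg.eigvalsh(con)) == np.sign(np.linalg.eigvalsh(A))).all()
```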

Is this understanding correct? That is, must we first obtain matrix similarity through the spectral theorem, and only then arrive at matrix congruence from the fact that the change of orthonormal basis happens to be unitary (orthogonal)?

Thanks in advance.

– user760
  • Your "Proposition" seems to be the [matrix form of] spectral theorem which is not the same thing as Sylvester's Law. In general congruence arguments only require $P\in GL_n(\mathbb R)$ and Sylvester's Law says something about the [number of positive and negative] entries of the diagonal matrix $A$ is congruent to. I've read this a few times and cannot tell what exactly your question is. – user8675309 Apr 25 '23 at 16:16
  • @user8675309 Maybe I should phrase it better. The question is: the (fin-dim) spectral theorem is about matrix similarity, but many results about real symmetric matrices talk about matrix congruence to the diagonal matrix of eigenvalues. This diagonal matrix is obtained by applying the matrix similarity provided by the spectral theorem; it is then further observed that this similarity is also a congruence, because the transform matrix happens to be unitary. So do we have to use the spectral theorem first, before getting to congruence? Can we do so directly, without matrix similarity? – user760 Apr 25 '23 at 16:32
  • @user8675309 Like, can we get to matrix congruence without going through the middle step of matrix similarity provided by spectral theorem? Also, you are right. I removed the "Sylvester" sentence – user760 Apr 25 '23 at 16:33
  • "But many results about real symmetric matrices talk about matrix congruence to the diagonal matrix of eigenvalues" the inclusion of those last two words makes this a strange statement. If you study bilinear forms (especially when $\text{char }F \neq 2$) for finite dim $V$ then congruence arguments for symmetric matrices will make perfect sense and you won't insist on bringing up eigenvalues. What the spectral theorem for real symmetric matrices tells you is that you can do a congruence transform [to a diagonal matrix] that is also a similarity transform, and this is special. – user8675309 Apr 25 '23 at 16:43
  • @user8675309 Thanks. My question is exactly whether, in this special case of real symmetric matrices, we can arrive at the conclusion of congruence to real diagonal matrices without bringing up eigenvalues or spectral theorem or matrix similarity at all. I suspect it's not possible. But I'm not sure. Sorry if my question wasn't clear to the point. – user760 Apr 25 '23 at 16:51
  • @user8675309 Perhaps I should ask it this way. Before proving Sylvester's law of inertia, we should have already proved that, for a bilinear form $\varphi: V\times V \to \mathbb{R}$, its associated real symmetric matrix $A$ under a basis $B$ is congruent to $\begin{pmatrix} I_p & & \\ & -I_q & \\ & & 0_m \end{pmatrix}$. Can we directly prove that without applying the spectral theorem? – user760 Apr 25 '23 at 17:06
  • To your most recent question: yes, absolutely when $\phi$ is a symmetric bilinear form. If you study bilinear forms (e.g. in Artin's Algebra) you will do this. What Sylvester's law then tells you is that $p$ and $q$ don't depend on choice of $B$. – user8675309 Apr 25 '23 at 17:51
  • @user8675309 I meant quadratic form, not bilinear form. Sorry, that was a typo. Could you give a sketch of how to show congruence without using the spectral theorem or any mention of eigenvalues/eigenvectors, as an answer below? Thanks – user760 Apr 25 '23 at 18:22
  • Take a look at case (i) in my answer here: https://math.stackexchange.com/questions/4370647/for-non-degenerate-symmetric-bilinear-form-there-is-a-basis-v-i-such-that/ ... I show congruence of a symmetric matrix to some diagonal matrix for any field not of characteristic 2... which makes it clear that it has nothing to do with spectral theorem which does not hold e.g. over positive characteristic. [Technical nit: to get the diagonal matrix to have $\pm 1$ on the diagonal requires square roots in your field.] – user8675309 Apr 25 '23 at 18:30
  • For completeness: the previous link is for invertible symmetric matrices. To deal with singular symmetric matrices, first do the congruence transform here https://math.stackexchange.com/questions/3718534/prove-the-existence-of-a-principal-submatrix-of-order-r-in-m-in-bbb-fn-time , then focus on that leading non-singular principal submatrix – user8675309 Apr 25 '23 at 18:39
  • @user8675309 Thanks. So first take care of the null space, then prove congruence using induction on the largest submatrix with full rank. That's neat. I guess the reason why I think of it in terms of the spectral theorem is that I see the matrix as merely the expression of a self-adjoint operator, rather than of a quadratic form. Both arguments are valid simultaneously. But your method still makes use of eigenvalues implicitly, because being invertible/non-invertible is equivalent to not having/having $0$ as an eigenvalue. – user760 Apr 25 '23 at 19:37
  • Perhaps you could go the complex-analysis route like here or here. Applying the residue theorem to the contour integral gives $A$ as a sum of residues of the resolvent. Proving that the poles of the resolvent are real will give the result above. – Yaroslav Bulatov Apr 26 '23 at 00:28
  • @YaroslavBulatov That's unexpected but brilliant! Although my original question is trying to trim off any analysis involved in the steps, I like this. Thanks a lot! – user760 Apr 26 '23 at 08:13
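The eigenvalue-free argument sketched in the comments (deal with the null space, then do simultaneous row and column operations) can be illustrated in code. The function name `congruence_diagonalize` is mine, not from any library; the argument works over any field of characteristic $\neq 2$, and floating point is used here purely for illustration:

```python
# Symmetric Gaussian elimination: diagonalize a symmetric matrix by congruence,
# with no reference to eigenvalues. Each step replaces A by E^T A E and
# accumulates P <- P E, so at the end P^T A P = D is diagonal.
import numpy as np

def congruence_diagonalize(A, tol=1e-12):
    """Return (P, D) with P invertible and P.T @ A_original @ P = D diagonal."""
    A = A.astype(float).copy()
    n = A.shape[0]
    P = np.eye(n)
    for k in range(n):
        if abs(A[k, k]) < tol:
            # Zero pivot: first try to swap in a later nonzero diagonal entry
            # (congruence by a permutation matrix)...
            j = next((j for j in range(k + 1, n) if abs(A[j, j]) >= tol), None)
            if j is not None:
                A[[k, j]] = A[[j, k]]; A[:, [k, j]] = A[:, [j, k]]
                P[:, [k, j]] = P[:, [j, k]]
            else:
                # ...otherwise use an off-diagonal entry: adding row/column j to
                # row/column k makes A[k,k] = 2*A[k,j] != 0 (the char != 2 trick).
                j = next((j for j in range(k + 1, n) if abs(A[k, j]) >= tol), None)
                if j is None:
                    continue          # row/column k is entirely zero already
                A[k] += A[j]; A[:, k] += A[:, j]
                P[:, k] += P[:, j]
        # Clear the rest of row and column k with matched row/column operations.
        for i in range(k + 1, n):
            c = A[i, k] / A[k, k]
            A[i] -= c * A[k]; A[:, i] -= c * A[:, k]
            P[:, i] -= c * P[:, k]
    return P, A

A = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 3.0],
              [2.0, 3.0, 0.0]])          # symmetric, with zero diagonal
P, D = congruence_diagonalize(A)
assert np.allclose(P.T @ A @ P, D)               # congruence achieved
assert np.allclose(D, np.diag(np.diag(D)))       # D is diagonal
```

Rescaling each basis vector by $1/\sqrt{|d_{kk}|}$ (when the field has the needed square roots) then normalizes the nonzero diagonal entries to $\pm 1$, giving the $\operatorname{diag}(I_p, -I_q, 0_m)$ form from the comments.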

0 Answers