6

Let $A \in \Bbb R^{n \times n}$ be a symmetric matrix and let $\lambda \in \Bbb R$ be an eigenvalue of $A$. Prove that the geometric multiplicity $g(\lambda)$ of $\lambda$ equals its algebraic multiplicity $a(\lambda)$.

We know that if $A$ is diagonalizable then $g(\lambda)=a(\lambda)$. So all we have to show is that $A$ is diagonalizable.

I found a proof by contradiction. Assuming $A$ is not diagonalizable, there is a vector $v$ with

$$(A- \lambda_i I)^2 v=0, \ (A- \lambda_i I) v \neq 0,$$

where $\lambda_i$ is some repeated eigenvalue. Then

$$0=v^{\dagger}(A-\lambda_i I)^2v=\left((A-\lambda_i I)v\right)^{\dagger}\left((A-\lambda_i I)v\right)=\|(A-\lambda_i I)v\|^2 \neq 0,$$

which is a contradiction (here $\dagger$ stands for conjugate transpose, and the second equality uses $A^{\dagger}=A$ together with $\lambda_i \in \Bbb R$).
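As a quick numerical sanity check (not a proof, just a NumPy sketch with an arbitrary symmetric matrix and a repeated eigenvalue), one can verify that $\dim \ker(A-\lambda_i I) = \dim \ker(A-\lambda_i I)^2$, so no such $v$ can exist:

```python
# Numerical sanity check (not a proof): for a symmetric matrix with a
# repeated eigenvalue, ker(A - lam*I) and ker((A - lam*I)^2) have the
# same dimension, so no v with (A - lam*I)^2 v = 0 but (A - lam*I) v != 0
# exists. The matrix, seed, and rank tolerance are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # random orthogonal matrix
A = Q @ np.diag([2.0, 2.0, 5.0]) @ Q.T             # symmetric, eigenvalue 2 repeated
lam = 2.0

M = A - lam * np.eye(3)
dim_ker = lambda X: X.shape[1] - np.linalg.matrix_rank(X, tol=1e-8)
print(dim_ker(M), dim_ker(M @ M))                  # prints: 2 2
```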

OK, but isn't there a better proof? I see it could be approached via the spectral theorem or Gram–Schmidt (as in Prove that real symmetric matrix is diagonalizable). A hint for how to do so would be appreciated.

JD_PM
  • As the eigenvalues are real, one need only consider real (generalised) eigenvectors, and so one needs transpose, rather than conjugate transpose. Anyway, this does give a method to prove that the geometric and algebraic multiplicities are the same. – Angina Seng Aug 31 '20 at 21:08
  • @AnginaSeng but I am looking for an alternative proof. – JD_PM Aug 31 '20 at 22:52
  • 4
    @JD_PM If you're looking for an alternative proof, then could you please explain what you feel is missing from the proof you have? It would be unpleasant to post a proof only to have you say "but I don't like this proof either" – Ben Grossmann Sep 01 '20 at 07:41
  • @BenGrossmann first, my apologies if my message sounded rude. It was not my intention at all. I should have used the word alternative instead of best in my original post. I do not dislike the proof above. It is just that I am curious to learn an alternative proof using either the spectral theorem or Gram Schmidt or another approach. – JD_PM Sep 01 '20 at 09:48
  • 1
    Should the last line be $v^{\dagger}(A-\lambda_i I)(A-\lambda_i I)v =\left((A-\lambda_i I) v\right)^\dagger \left((A-\lambda_i I) v\right) \neq 0$? – user760 Apr 23 '23 at 05:46

2 Answers

4

The proof with the spectral theorem is trivial: the spectral theorem tells you that every symmetric matrix is diagonalizable (more specifically, orthogonally diagonalizable). As you say in your proof, "all we have to show is that $A$ is diagonalizable", so this completes the proof.

The Gram Schmidt process does not seem relevant to this question at all.

Honestly, I prefer your proof. If you like, here is my attempt at making it look "cleaner":


We are given that $A$ is real and symmetric. For any $\lambda$, we note that the algebraic and geometric multiplicities disagree if and only if $\dim \ker (A - \lambda I) \neq \dim \ker (A - \lambda I)^2$. With that in mind, we note the following:

Claim: All eigenvalues of $A$ are real.

Proof of claim: If $\lambda$ is an eigenvalue of $A$ and $x$ an associated unit eigenvector, then we have $$ Ax = \lambda x \implies x^\dagger Ax = x^\dagger (\lambda x) = \lambda. $$ However, since $x^\dagger A x$ is a scalar and $A^\dagger = A$, $$ \bar \lambda = \overline{x^\dagger Ax} = (x^\dagger A x)^\dagger = x^{\dagger} A^\dagger (x^{\dagger})^{\dagger} = x^\dagger A x = \lambda. $$ That is, $\lambda = \bar \lambda$, which is to say that $\lambda$ is real. $\square$
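(As an aside, here is a quick numerical illustration of the claim in NumPy; the example matrix is arbitrary and this is of course not a substitute for the proof above.)

```python
# Aside: numerical illustration that a real symmetric matrix has real
# eigenvalues (arbitrary example matrix; not a substitute for the proof).
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])      # real symmetric
vals = np.linalg.eigvals(A)          # general eigensolver (no symmetry assumed)
print(np.allclose(vals.imag, 0.0))   # True: imaginary parts vanish (up to roundoff)
```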

With that in mind, it suffices to note that for any matrix $M$, we have $\ker M = \ker M^\dagger M$. Indeed, it is clear that $\ker M \subseteq \ker M^\dagger M$, and we have $$ x \in \ker M^\dagger M \implies M^\dagger Mx = 0 \implies x^\dagger M^\dagger M x = 0 \\\implies (Mx)^\dagger (Mx) = 0 \implies Mx = 0 \implies x \in \ker M. $$ Now, take $M = A - \lambda I$ for an eigenvalue $\lambda$ of $A$; since $A$ is symmetric and $\lambda$ is real, we have $M^\dagger = M$, hence $M^\dagger M = (A - \lambda I)^2$ and $$ \dim \ker(A - \lambda I)^2 = \dim \ker M^\dagger M = \dim \ker M, $$ which means that the algebraic and geometric multiplicities are indeed the same for each eigenvalue $\lambda$.
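(And a small numerical check of the kernel fact for a random real matrix $M$, where $M^\dagger = M^T$; again just a sketch, not part of the argument.)

```python
# Sketch: check that ker M and ker(M^T M) have the same dimension for a
# random rank-deficient real matrix M (the rank tolerance is a practical
# choice for matrix_rank, not canonical).
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 3)) @ rng.standard_normal((3, 4))  # 5x4, rank 3

dim_ker = lambda X: X.shape[1] - np.linalg.matrix_rank(X, tol=1e-8)
print(dim_ker(M), dim_ker(M.T @ M))   # prints: 1 1 (same null space dimension)
```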

Ben Grossmann
  • Appreciate your answer! Naive question: Could you please explain why $\ker M = \ker M^T M$? – JD_PM Sep 01 '20 at 10:48
  • 1
    @JD The sentence "Indeed,..." together with the series of implications is an answer to precisely this question. – Ben Grossmann Sep 01 '20 at 10:53
  • 1
    Why is $\dim \ker(A - \lambda I) \neq \dim \ker(A - \lambda I)^2 $ if and only if the two multiplicities disagree? – eggplant Mar 20 '22 at 05:41
  • @eggplant The answer to that question ultimately depends on your definitions of geometric/algebraic multiplicity, so perhaps you should ask a new question. That said, one quick answer is that an element of $\ker(A-\lambda I)^2$ that is not in $\ker(A-\lambda I)$ must be a generalized eigenvector. – Ben Grossmann Mar 20 '22 at 13:00
  • @Ben Grossmann Sorry for the late reply (as I didn't have enough reputation points to do so by then). Here, the algebraic and geometric multiplicity are taken to be the usual ones (I am not sure how the ambiguity can arise); namely, the geometric multiplicity refers to the dimension of an eigenspace, and the algebraic multiplicity refers to the number of times an eigenvalue repeats as a root of the characteristic polynomial. (I indeed attempted to ask this question with the links of this post provided. Yet for some reason it is downvoted and closed by the system, so here I am.) – eggplant Mar 27 '22 at 18:35
  • @eggplant Regarding the definitions, the point is that I'd like to use the fact that the geometric multiplicity is $\dim(\ker(A - \lambda I)^n)$, which means that starting from the characteristic polynomial based definition requires some extra work (unless you're willing to take this fact as a given) – Ben Grossmann Mar 28 '22 at 02:57
  • @eggplant So to clarify my earlier explanation, it always holds that $\ker(A - \lambda I) \subseteq \ker(A - \lambda I)^2$. When $\ker(A - \lambda I) \neq \ker(A - \lambda I)^2$, $A$ has generalized eigenvectors (as I stated), which means that $A$ is not diagonalizable. Conversely, if $\ker(A - \lambda I) = \ker(A - \lambda I)^2$, we can deduce (with a bit more work) that $\ker(A - \lambda I) = \ker(A - \lambda I)^n$, so that $\dim \ker(A - \lambda I) = \dim \ker(A - \lambda I)^n$, which implies that the geometric and algebraic multiplicities are equal. – Ben Grossmann Mar 28 '22 at 03:05
  • Incidentally, a "generalized eigenvector" can be defined in this context as any element of $\ker(A - \lambda I)^n$ not present in $\ker(A - \lambda I)$. – Ben Grossmann Mar 28 '22 at 03:06
0

Equivalently, we prove that there exists an orthogonal matrix $P$ (we assume here, for simplicity, that it is orthogonal, though this can be proved) such that $ AP = PD $, i.e. $ P^T AP = D $.

To show why $ AP = PD $ for a real symmetric matrix $ A $, we start by noting that $ A $ has real eigenvalues and orthogonal eigenvectors. Let $ P $ be the matrix whose columns are the eigenvectors of $ A $:

$$ P = [\mathbf{v}_1 \, \mathbf{v}_2 \, \ldots \, \mathbf{v}_n]. $$

Define the diagonal matrix $ D $ with the corresponding eigenvalues:

$$ D = \begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{bmatrix}. $$

Now, when we multiply $ A $ by $ P $:

$$ AP = A[\mathbf{v}_1 \, \mathbf{v}_2 \, \ldots \, \mathbf{v}_n] = [A \mathbf{v}_1 \, A \mathbf{v}_2 \, \ldots \, A \mathbf{v}_n]. $$

Since $ A \mathbf{v}_i = \lambda_i \mathbf{v}_i $ for each eigenvector, we have:

$$ AP = [\lambda_1 \mathbf{v}_1 \, \lambda_2 \mathbf{v}_2 \, \ldots \, \lambda_n \mathbf{v}_n]. $$

Now, consider $ PD $:

$$ PD = P \begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{bmatrix} = [\lambda_1 \mathbf{v}_1 \, \lambda_2 \mathbf{v}_2 \, \ldots \, \lambda_n \mathbf{v}_n]. $$

Since both $ AP $ and $ PD $ yield the same result, we conclude that $ AP = PD $. This leads to the diagonalization of $ A $:

$$ A = PD P^{-1}. $$

Since $ P $ is orthogonal (the eigenvectors are orthonormal), we can express this as:

$$ A = P D P^T. $$
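For completeness, here is a short NumPy sketch of the construction above (the example matrix is arbitrary; `np.linalg.eigh` returns the eigenvalues and an orthonormal matrix of eigenvectors for a symmetric input):

```python
# Sketch of the construction above: build P from orthonormal eigenvectors
# and D from the eigenvalues, then verify A P = P D and A = P D P^T.
# The example matrix is arbitrary.
import numpy as np

A = np.array([[4.0, 1.0, 2.0],
              [1.0, 3.0, 0.0],
              [2.0, 0.0, 5.0]])          # real symmetric example

w, P = np.linalg.eigh(A)                 # eigenvalues w, orthonormal eigenvectors as columns of P
D = np.diag(w)

print(np.allclose(A @ P, P @ D))         # True: A P = P D
print(np.allclose(A, P @ D @ P.T))       # True: A = P D P^T
```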

Mark