
Let $A$ be an $n\times n$ matrix that is real and symmetric: $A^T=A$.

We know that any such matrix is unitarily diagonalisable and has real eigenvalues. Is it always possible to find an eigenbasis for $A$ made of only real vectors?

If $A$ has unit rank, this is always the case: $A=\lambda \,vv^\dagger$ with $\lambda\in\mathbb R$ implies $v=e^{i\phi}v'$ with $\phi\in\mathbb R$ and $v'\in\mathbb R^n$. This is because $(vv^\dagger)_{ij}=v_i \bar v_j$ while $(vv^\dagger)_{ji}=\overline{(vv^\dagger)_{ij}}$; thus writing $v_i=|v_i|e^{i\phi_i}$, we have $v_i \bar v_j=|v_i v_j| e^{i(\phi_i-\phi_j)}$, and if $A$ is real then we must have $e^{i(\phi_i-\phi_j)}\in\mathbb R$, i.e. $\phi_i-\phi_j=\pi n_{ij}$ for some $n_{ij}\in\mathbb Z$, and therefore $v_j =|v_j|e^{i\pi n_{ij}}e^{i\phi_i}$ for all $i,j$. Since $e^{i\pi n_{ij}}=\pm1$, we conclude that $v$ is real, up to a global phase.

In higher dimensions, we can have real symmetric matrices with complex eigenvectors. A trivial case is the $2\times 2$ identity, which can be written as $I=P_{+i}+P_{-i}$ with $P_{\pm i}\equiv v_{\pm i}v_{\pm i}^\dagger$ and $\sqrt2 \, v_{\pm i}\equiv (1,\pm i)^T$. Still, the identity obviously also admits an eigenbasis of real vectors.
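The $2\times 2$ identity example above can be checked numerically; here is a minimal sketch in numpy (the variable names are of course just for illustration):

```python
import numpy as np

# The complex vectors v_± = (1, ±i)/sqrt(2) are orthonormal eigenvectors
# of the 2x2 identity, and their rank-one projectors sum to the identity.
v_plus = np.array([1, 1j]) / np.sqrt(2)
v_minus = np.array([1, -1j]) / np.sqrt(2)

P_plus = np.outer(v_plus, v_plus.conj())    # P_{+i} = v_{+i} v_{+i}^†
P_minus = np.outer(v_minus, v_minus.conj())  # P_{-i} = v_{-i} v_{-i}^†

assert np.allclose(P_plus + P_minus, np.eye(2))
```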

What about the general case? Is there an example of a real symmetric matrix for which there is no eigenbasis of real vectors?

glS

3 Answers


If $A$ is real and $\lambda$ is a real eigenvalue with geometric multiplicity $k$ over $\mathbb C$, then it also has geometric multiplicity $k$ over $\mathbb R$. This is because the geometric multiplicity is simply $n$ minus the rank of $A-\lambda I$, and the rank can be computed by, for example, Gaussian elimination, which proceeds exactly the same no matter what we take the ambient field to be.

Therefore we can find a real basis for the eigenspace of $\lambda$, comprising $k$ vectors. Concatenating all of those bases gives a real eigenbasis for $A$.
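As a numerical illustration of this (a sketch, with a made-up example matrix): `numpy.linalg.eigh` works entirely over the reals for a real symmetric input and returns a real orthonormal eigenbasis directly, even when an eigenvalue is degenerate.

```python
import numpy as np

# A real symmetric matrix with a degenerate eigenvalue:
# eigenvalues are 1 (multiplicity 2) and 3.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 1.0]])

vals, vecs = np.linalg.eigh(A)

assert np.allclose(vecs.imag, 0)              # the eigenvectors are real
assert np.allclose(A @ vecs, vecs * vals)     # columns are eigenvectors
assert np.allclose(vecs.T @ vecs, np.eye(3))  # and orthonormal
```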

Troposphere

The orthogonal projection $P_{\lambda}$ onto the eigenspace with eigenvalue $\lambda$ is the matrix limit $$ P_{\lambda}=\lim_{\substack{\mu\rightarrow\lambda\\ \mu\in\mathbb{R}}}(\mu-\lambda)(\mu I-A)^{-1}. $$ This is a real matrix because the matrices on the right are real, and $$ AP_{\lambda}=\lambda P_{\lambda}. $$ Every column of $P_{\lambda}$ is either $0$ or a nonzero real vector that is an eigenvector of $A$ with eigenvalue $\lambda$.
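The limit can be checked numerically; below is a minimal sketch with a made-up example matrix, taking $\mu$ close to $\lambda$ along the reals and comparing against the exact projector:

```python
import numpy as np

# Real symmetric example with eigenvalues 1 (multiplicity 2) and 3.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 1.0]])
lam = 1.0

# Approximate the projector via the resolvent limit, mu -> lam in R.
mu = lam + 1e-8
P_approx = (mu - lam) * np.linalg.inv(mu * np.eye(3) - A)

# Exact orthogonal projector onto the eigenvalue-1 eigenspace,
# spanned by (1,-1,0)/sqrt(2) and (0,0,1).
P_exact = np.array([[0.5, -0.5, 0.0],
                    [-0.5, 0.5, 0.0],
                    [0.0, 0.0, 1.0]])

assert np.allclose(P_approx, P_exact, atol=1e-6)
assert np.allclose(A @ P_approx, lam * P_approx, atol=1e-6)
```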

Disintegrating By Parts
  • thanks. I didn't know about this expression for the projections. Does it have a name, or could you point to a reference discussing it? – glS Aug 13 '21 at 09:45
  • @glS : If you extend $\lambda$ into the complex numbers, then it is part of the holomorphic functional calculus for operators. The limits can be approached through real numbers because the singularities of the resolvent operator $(\mu I-A)^{-1}$ are isolated in the case of a matrix, because they are the eigenvalues of $A$. Try out the machinery assuming that $A$ is a diagonal matrix and you'll see it that way, too. – Disintegrating By Parts Aug 13 '21 at 13:50

If it’s diagonalizable, then the geometric multiplicity of every eigenvalue equals its algebraic multiplicity, so every eigenvalue has a full set of eigenvectors. If I have a complex eigenvector for a real eigenvalue, then the real part and imaginary part of that eigenvector (at least one of which is nonzero) will be purely real eigenvectors for that eigenvalue.

But also, the fact that symmetric matrices are orthogonally diagonalizable (by a real orthogonal matrix) tells you that you can in fact pick an orthonormal basis of real eigenvectors: the eigenvectors are just the columns of the orthogonal matrix used to diagonalize.
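The real/imaginary-part argument above can be sketched numerically (the matrix and phase here are just illustrative): multiplying a real eigenvector by a phase yields a genuinely complex eigenvector, and its real and imaginary parts are again real eigenvectors for the same eigenvalue.

```python
import numpy as np

# Real symmetric matrix with eigenvalues 1 and 3;
# (1, -1)/sqrt(2) is a real eigenvector for eigenvalue 1.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# A complex eigenvector for eigenvalue 1, obtained via a phase.
v = np.exp(1j * 0.7) * np.array([1.0, -1.0]) / np.sqrt(2)
assert np.allclose(A @ v, 1.0 * v)

# Its real and imaginary parts are real eigenvectors for eigenvalue 1.
assert np.allclose(A @ v.real, 1.0 * v.real)
assert np.allclose(A @ v.imag, 1.0 * v.imag)
```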

  • "that symmetric matrices are orthogonally diagonalizable (by a real orthogonal matrix) tells you that you can pick an orthonormal basis of real eigenvectors", sure, but the question translates into whether one can find such a real orthogonal matrix. Your first argument proves it for non-degenerate eigenvalues. More generally, what's the argument? Given $m$ linearly independent complex eigenvectors for some $\lambda$, their real/imaginary components are also eigenvectors, but how do you make sure you can get $m$ linearly independent real vectors this way? – glS Aug 13 '21 at 09:43
  • Here’s a rigorous argument why a real symmetric matrix is orthogonally diagonalizable with a real basis of eigenvectors: https://www.math.wustl.edu/~freiwald/309orthogdiag.pdf Any high school or early-undergraduate linear algebra textbook should cover this material as well, it’s pretty standard. – Rivers McForge Aug 13 '21 at 16:11
  • thanks. Just for better accessibility, here's a very brief summary of (my understanding of the ideas underlying) the argument made in that pdf: (1) $A$ symmetric implies a real eigenvalue and thus a real eigenvector, $Av=\lambda v$; (2) $A$ symmetric implies $Av^\perp\subseteq v^\perp$, with $v^\perp$ the orthogonal complement of $v$; (3) $A$ acts on $v^\perp$ again as a symmetric matrix, and the argument can be iterated. – glS Aug 13 '21 at 16:41
  • Yep, that’s the gist. – Rivers McForge Aug 13 '21 at 16:43