
Given a matrix $A$, suppose we have found its eigenvalues and eigenvectors. If we square $A$ (or raise it to an arbitrary power $n$), do the eigenvalues become squared (or raised to the power $n$), while the eigenvectors remain the same?

Kenta S
  • 18,181

2 Answers

11

The eigenvectors carry over, with the eigenvalues squared. Indeed, if $\lambda$ is an eigenvalue of $A$ with eigenvector $v$, then $$A^2 v = A(Av) = A(\lambda v) = \lambda Av = \lambda^2 v,$$ so $v$ is an eigenvector of $A^2$ with eigenvalue $\lambda^2$.
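As a quick numerical sanity check, here is a small numpy sketch (the matrix $A$ below is an arbitrary choice) confirming that each eigenpair $(\lambda, v)$ of $A$ yields the eigenpair $(\lambda^2, v)$ of $A^2$:

```python
import numpy as np

# Arbitrary example matrix (any square matrix works here).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)

for lam, v in zip(eigvals, eigvecs.T):
    # A^2 v should equal lambda^2 v for every eigenpair of A.
    assert np.allclose(A @ A @ v, lam**2 * v)
```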

But here is another question: are all the eigenvalues of $A^2$ of the form $\lambda^2$ for $\lambda$ an eigenvalue of $A$? We know that the squares of the eigenvalues of $A$ are among the eigenvalues of $A^2$, but a priori nothing rules out additional ones.

It turns out there are no others; one way to see this uses the Jordan canonical form. Write $A = VJV^{-1}$, where $V$ is invertible and $J$ is the Jordan form of $A$: an upper triangular matrix with the eigenvalues of $A$ along its diagonal. Squaring gives $A^2 = VJ^2V^{-1}$, so $A^2$ is similar to $J^2$ and has the same eigenvalues. But $J^2$ is again upper triangular, and its diagonal entries, hence its eigenvalues, are exactly the squares $\lambda^2$ of the diagonal entries of $J$.

Hence, the eigenvalues of $A^2$ are exactly $\lambda^2$ for $\lambda$ an eigenvalue of $A$.
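A small sympy sketch makes the triangular structure visible (the $2\times 2$ Jordan block below is an arbitrary non-diagonalizable example):

```python
from sympy import Matrix

# A non-diagonalizable example: a single 2x2 Jordan block with eigenvalue 2.
A = Matrix([[2, 1],
            [0, 2]])

V, J = A.jordan_form()               # A = V * J * V^{-1}
J2 = J**2                            # upper triangular; diagonal entries are lambda^2
print([J2[i, i] for i in range(2)])  # [4, 4]
print((A**2).eigenvals())            # {4: 2} -- exactly the squares
```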

However, the same does not hold with eigenvectors!

Indeed, the standard counterexample is a right-angle rotation in two dimensions: $A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$, whose eigenvalues are $i$ and $-i$, with eigenvectors having imaginary entries. On the other hand, $A^2 = -I$, which has the single eigenvalue $-1$ (note that $i^2 = (-i)^2 = -1$!), and every nonzero vector is an eigenvector, in particular $(1,0)$ and $(0,1)$.

Hence, eigenvectors need not match.
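The rotation example is easy to verify numerically; a minimal numpy sketch:

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])           # rotation by a right angle

vals, vecs = np.linalg.eig(A)
print(vals)                           # [0.+1.j  0.-1.j], i.e. i and -i
print(vecs)                           # columns have imaginary entries

A2 = A @ A
print(np.allclose(A2, -np.eye(2)))    # True: A^2 = -I
print(A2 @ np.array([1.0, 0.0]))      # [-1.  0.] = -1 * (1, 0)
```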

However, if $A$ is symmetric, then by the spectral theorem for symmetric matrices we can write $A = VDV^{-1}$, where $V$ is orthogonal and its columns are eigenvectors of $A$; then $A^2 = VD^2 V^{-1}$ for the same $V$. Hence $A$ and $A^2$ share a common orthonormal basis of eigenvectors (though $A^2$ may acquire larger eigenspaces when two distinct eigenvalues of $A$ have the same square, as with $\pm\lambda$).
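A numpy sketch of the symmetric case (the matrix is an arbitrary symmetric example); `eigh` returns an orthonormal eigenbasis, and the same basis diagonalizes $A^2$:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])            # symmetric

w, V = np.linalg.eigh(A)              # A = V diag(w) V^T, columns of V orthonormal
# The same V diagonalizes A^2, with the squared eigenvalues on the diagonal:
print(np.allclose(V.T @ (A @ A) @ V, np.diag(w**2)))   # True
```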

Finally, two things:

  • In general, for any polynomial $f$, the eigenvalues of $f(A)$ are exactly the values $f(\lambda)$ for $\lambda$ an eigenvalue of $A$. Here we used $f(x) = x^2$. The eigenvectors need not match (see the sketch after these remarks).

  • Even more generally, this property extends far beyond matrices. In a unital Banach algebra over the complex numbers (the $n \times n$ matrices form one under the usual addition, multiplication, and matrix norm), the Riesz functional calculus yields the spectral mapping theorem: $f(\sigma(A)) = \sigma(f(A))$ for every element $A$ of the algebra and every $f$ holomorphic on a neighborhood of $\sigma(A)$. Here $\sigma(A) = \{\lambda : A - \lambda I \text{ is not invertible}\}$ is the spectrum, which always contains the set of eigenvalues and, for matrices, equals it. It is a highly desirable result.

Of course, if $f$ is a polynomial then it is entire, so the result applies in our (baby) case.
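Here is the promised sketch of the polynomial case, with an arbitrary matrix and an arbitrarily chosen polynomial $f(x) = x^2 - 3x + 1$:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])            # eigenvalues 2 and 3

f = lambda x: x**2 - 3*x + 1          # the polynomial, on scalars
fA = A @ A - 3*A + np.eye(2)          # the same polynomial, applied to A

print(sorted(np.linalg.eigvals(fA).real))    # [-1.0, 1.0]
print(sorted(f(np.linalg.eigvals(A)).real))  # [-1.0, 1.0]: f(2) = -1, f(3) = 1
```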

2

No. The linear transformation $\left[\begin{array}{rr} 0 & 1 \\ -1 & 0\end{array}\right]$, which sends $x$ to $-y$ and $y$ to $x$ in $\Bbb{R}^2$, has no real eigenvectors. But its square, $-I$, has every nonzero vector as an eigenvector.
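A minimal numpy check of this answer's claim (the test vector is an arbitrary choice):

```python
import numpy as np

B = np.array([[0.0, 1.0],
              [-1.0, 0.0]])

print(np.linalg.eigvals(B))       # purely imaginary (i and -i): no real
                                  # eigenvalues, hence no real eigenvectors
v = np.array([3.0, 7.0])          # any nonzero vector
print(B @ B @ v)                  # [-3. -7.] = -v, so v is an eigenvector of B^2
```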

mathreadler
  • 26,534
C Monsour
  • 8,476