Given a matrix $A$, we have found its eigenvalues and eigenvectors. If we square $A$ (or raise it to any power $n$), do the eigenvalues become squares (or $n$-th powers), while the eigenvectors remain the same?
-
see this – Chinnapparaj R Apr 27 '18 at 03:59
-
Think about the case when $A$ is the matrix which rotates $\mathbb{R}^2$ by $90$ degrees. But your intuition isn't far off, as the linked answer shows. – Elle Najt Apr 27 '18 at 04:00
-
So the eigenvectors do remain the same. wow. – user3238863 Apr 27 '18 at 04:01
-
You have to be cautious with the words "the same" -- $A$ may not have eigenvectors, but $A^2$ can. (See the example I gave you.) – Elle Najt Apr 27 '18 at 04:07
-
Let's say $A$ does have eigenvectors; would $A^2$ have the same ones? – user3238863 Apr 27 '18 at 04:09
-
The short answer is to remember that $A^2v = A(Av)$, and that $A(\alpha v)=\alpha (Av)$ for any scalar $\alpha$ and vector $v$. This is true regardless of the nature of $\alpha$ and $v$, and in particular it holds for eigenvalues and eigenvectors. – JMoravitz Apr 27 '18 at 04:18
-
@user3238863 Please see the answer below for details. – Sarvesh Ravichandran Iyer Apr 27 '18 at 04:32
2 Answers
The eigenvalues remain the same. Indeed, if $\lambda$ is an eigenvalue for $A$ with eigenvector $v$, then we have $$A^2 v = A(Av) = \lambda Av = \lambda^2 v $$ So $v$ is an eigenvector of $A^2$ with eigenvalue $\lambda^2$.
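This computation is easy to check numerically. A minimal sketch with numpy (the matrix below is an arbitrary choice for illustration):

```python
import numpy as np

# An arbitrary test matrix, chosen only for illustration
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)  # columns of eigvecs are eigenvectors
A2 = A @ A

# Each eigenpair (lambda, v) of A satisfies A^2 v = lambda^2 v
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A2 @ v, lam**2 * v)
```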
But here is another question: are all the eigenvalues of $A^2$ of the form $\lambda^2$, where $\lambda$ is an eigenvalue of $A$? We know that the squares of the eigenvalues of $A$ are eigenvalues of $A^2$, but nothing so far rules out $A^2$ having additional ones.
In fact, there are no extra eigenvalues; one can see this using the Jordan canonical form. Write $A = VJV^{-1}$, where $V$ is invertible and $J$ is a Jordan block matrix with the eigenvalues of $A$ along its diagonal. Squaring gives $A^2 = VJ^2V^{-1}$, so $A^2$ is similar to $J^2$. Since $J$ is upper triangular with the $\lambda$'s on its diagonal, $J^2$ is upper triangular with exactly the values $\lambda^2$ on its diagonal, and the eigenvalues of an upper triangular matrix are its diagonal entries.
Hence, the eigenvalues of $A^2$ are exactly $\lambda^2$ for $\lambda$ an eigenvalue of $A$.
However, the same does not hold with eigenvectors!
Indeed, the example given above is typical: the right-angle rotation matrix in $2$D is $\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$, whose eigenvalues are $i$ and $-i$, and whose eigenvectors have imaginary entries. On the other hand, $A^2 = -I$, which has the single eigenvalue $-1$ (note: $i^2 = (-i)^2 = -1$!), and every nonzero vector, in particular $(1,0)$ and $(0,1)$, is an eigenvector.
Hence, eigenvectors need not match.
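The rotation example can be checked numerically as well (a sketch assuming numpy):

```python
import numpy as np

# The 90-degree rotation of R^2 discussed above
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# Over the complex numbers, the eigenvalues of A are +-i
eigvals_A = np.linalg.eigvals(A)
assert np.allclose(np.sort_complex(eigvals_A), [-1j, 1j])

# A^2 = -I, whose only eigenvalue is -1; every nonzero real vector,
# e.g. (1, 0), is an eigenvector
A2 = A @ A
assert np.allclose(A2, -np.eye(2))
e1 = np.array([1.0, 0.0])
assert np.allclose(A2 @ e1, -e1)
```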
However, if $A$ is symmetric, then by the spectral theorem for symmetric matrices, $A$ and $A^2$ are diagonalized by the same orthonormal basis of eigenvectors: writing $A = VDV^{-1}$, where the columns of $V$ are eigenvectors of $A$, we get $A^2 = VD^2V^{-1}$ for the same $V$. So every eigenvector of $A$ is an eigenvector of $A^2$ (though $A^2$ may still acquire extra eigenvectors when two distinct eigenvalues of $A$, say $\lambda$ and $-\lambda$, have the same square).
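A sketch of the symmetric case with numpy's `eigh` (the matrix is again an arbitrary choice):

```python
import numpy as np

# An arbitrary symmetric test matrix
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns an orthonormal eigenbasis: A = V diag(D) V^T
D, V = np.linalg.eigh(A)
assert np.allclose(V @ np.diag(D) @ V.T, A)

# The same V diagonalizes A^2, with the squared eigenvalues on the diagonal
A2 = A @ A
assert np.allclose(V @ np.diag(D**2) @ V.T, A2)
```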
Finally, two things:
In general, for any polynomial $f$ we have that the eigenvalues of $f(A)$ are exactly $f(\lambda)$ for $\lambda$ an eigenvalue of $A$. Here we used $f(A) = A^2$. Eigenvectors need not match.
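The polynomial version of this statement can also be verified numerically. A minimal sketch, with an arbitrary choice of polynomial $f(x) = x^2 - 3x + 1$ and a triangular test matrix so the eigenvalues are visible on the diagonal:

```python
import numpy as np

# Upper triangular, so its eigenvalues are the diagonal entries 2 and 3
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# f(x) = x^2 - 3x + 1, applied to the matrix...
fA = A @ A - 3 * A + np.eye(2)

# ...and to the eigenvalues directly
eigvals_A = np.linalg.eigvals(A)
f_of_eigvals = eigvals_A**2 - 3 * eigvals_A + 1

# The eigenvalues of f(A) are exactly f(lambda)
eigvals_fA = np.linalg.eigvals(fA)
assert np.allclose(np.sort(eigvals_fA), np.sort(f_of_eigvals))
```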
Even more generally, this property extends well beyond matrices. On a unital Banach algebra over the complex numbers (the set of matrices forms one under the standard addition, multiplication, and matrix norm), the Riesz functional calculus is available, and the result $f(\sigma(A)) = \sigma(f(A))$, for any element $A$ of the Banach algebra and any $f$ holomorphic on $\sigma(A)$, is called the spectral mapping theorem. Here $\sigma(A) = \{\lambda : A - \lambda I \text{ is not invertible}\}$ is the spectrum of $A$; it always contains, and is sometimes exactly, the set of eigenvalues. It is a highly desirable result.
Of course, if $f$ is a polynomial then it is entire, so the result applies in our (baby) case.
-
If $A$ is diagonalizable, then $A$ and $A^{2}$ would also have the same eigenvectors, right? – epsilon Nov 20 '19 at 03:54
-
@epsilon I have to think, but first thought is yes. In fact, it IS the case, from the definition of diagonalizable. – Sarvesh Ravichandran Iyer Nov 20 '19 at 05:34
No. The linear transformation $\left[\begin{array}{rr} 0& 1 \\ -1& 0\end{array}\right]$, which sends $x$ to $-y$ and $y$ to $x$ in $\Bbb{R}^2$, has no real eigenvectors. But its square has every nonzero vector as an eigenvector.
-
Well, it does have imaginary eigenvalues, which on squaring give the required result. – Shiv Tavker Sep 28 '20 at 10:42