
In the MIT linear algebra online lecture, when doing SVD, Gilbert Strang said that the eigenvalues of $AB$ and $BA$ are the same. I was trying to prove this as follows:

Let $A$ be $m \times n$ matrix and $B$ be $n \times m$ matrix. Then $AB$ is $m \times m$ and $BA$ is $n \times n$.

Let $$ABx=\lambda x$$ Then $$BA(Bx)=\lambda(Bx)$$ and $\lambda$ is an eigenvalue of $BA$ as well, and vice versa. Q.E.D.

However, on second thought I think the above proof has a pitfall. Namely, if $x$ is in the null space of $B$, then $Bx=0$ is not an eigenvector, and the argument does not show that $BA$ has eigenvalue $\lambda$.

So my question is:

Is the statement that $AB$ and $BA$ have the same eigenvalues true for general $m \times n$ matrix $A$ and $n \times m$ matrix $B$? If yes, how to prove it?

If no, is it true for the special case when $B=A^\dagger$? And how to prove it?
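As a numerical sanity check of the claim (a sketch using NumPy; the matrix sizes and random seed are arbitrary choices, not from the lecture), one can compare the nonzero eigenvalues of $AB$ and $BA$ for a random rectangular pair:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 4, 2
A = rng.standard_normal((m, n))   # m x n
B = rng.standard_normal((n, m))   # n x m

def nonzero(ev, tol=1e-10):
    """Keep eigenvalues with magnitude above tol, sorted for comparison."""
    return np.sort_complex(ev[np.abs(ev) > tol])

ev_AB = np.linalg.eigvals(A @ B)  # m eigenvalues (AB is m x m)
ev_BA = np.linalg.eigvals(B @ A)  # n eigenvalues (BA is n x n)

# The nonzero eigenvalues agree; AB additionally has m - n extra zeros.
print(nonzero(ev_AB))
print(nonzero(ev_BA))
```

The same check with `B = A.conj().T` covers the case $B = A^\dagger$.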

user26857
velut luna

2 Answers


Your proof is correct for $\lambda\neq 0$: in that case $Bx \neq 0$, since $Bx = 0$ would force $\lambda x = ABx = 0$, contradicting $\lambda \neq 0$ and $x \neq 0$.

And this is also the general statement: all nonzero eigenvalues of $AB$ and $BA$ are the same. That it can fail for $\lambda=0$ is shown by $A=\begin{pmatrix}1\\0\end{pmatrix}$, $B=\begin{pmatrix}1&0\end{pmatrix}$: here $AB$ has eigenvalue $0$, but $BA=(1)$ does not.
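Checking this counterexample numerically (a NumPy sketch):

```python
import numpy as np

A = np.array([[1.0], [0.0]])   # 2 x 1
B = np.array([[1.0, 0.0]])     # 1 x 2

# AB = [[1, 0], [0, 0]] has eigenvalues 1 and 0,
# while BA = [[1]] has only the eigenvalue 1.
print(np.linalg.eigvals(A @ B))
print(np.linalg.eigvals(B @ A))
```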

For another proof, look at the characteristic polynomial. See the Wikipedia article "Characteristic polynomial", section "Characteristic polynomial of a product of two matrices".
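For reference, the identity behind that argument (a standard result, sketched here from the block-determinant computation) reads, for $A$ of size $m\times n$ and $B$ of size $n\times m$ with $m\ge n$:
$$\det(\lambda I_m - AB) = \lambda^{m-n}\det(\lambda I_n - BA).$$
It follows by evaluating
$$\det\begin{pmatrix}\lambda I_m & A\\ B & I_n\end{pmatrix}$$
via the two Schur complements: eliminating the bottom row gives $\det(\lambda I_m - AB)$, while eliminating the top row gives $\lambda^m \det(I_n - \tfrac{1}{\lambda}BA) = \lambda^{m-n}\det(\lambda I_n - BA)$.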


$$BA = BABB^{-1} = B\cdot AB\cdot B^{-1}$$
Let $M=B^{-1}$. Then
$$BA = M^{-1}\cdot AB\cdot M$$
So $AB$ and $BA$ are similar matrices. Hence they have the same eigenvalues.

Edit 1: As rightly pointed out by Hanul Jeon, this holds only if $B$ is invertible (in particular, $A$ and $B$ must be square).
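A quick numerical check of the similarity argument (a NumPy sketch, assuming square matrices with $B$ invertible; a random matrix is invertible with probability one):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))   # generically invertible

M = np.linalg.inv(B)
# BA = M^{-1} (AB) M  with  M = B^{-1}
lhs = B @ A
rhs = np.linalg.inv(M) @ (A @ B) @ M
print(np.allclose(lhs, rhs))      # similarity holds
```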