
I have seen many proofs, but they all use advanced techniques that I find impossible to follow. I'm looking for a proof that $AB$ and $BA$ have the same characteristic polynomial for any square matrices $A$ and $B$ over $\mathbb C$.

It's really easy when one of the matrices is invertible (if $A$ is invertible, then $BA = A^{-1}(AB)A$ is similar to $AB$), but hard to prove for singular matrices.

I found several solutions that I could not understand:

This solution says

> it is not too difficult to show that $AB$ and $BA$ have the same characteristic polynomial ... If the matrices are in $M_n(\mathbb C)$, you use the fact that $GL_n(\mathbb C)$ is **dense** in $M_n(\mathbb C)$ and the **continuity** of the function which maps a matrix to its characteristic polynomial. There are at least 5 other ways to proceed

I've bolded every term that I am not familiar with.

This solution I could not understand either (it takes a limit as $\lambda$ approaches zero, but I hardly understand how that resolves the issue).

I'm looking for a simpler solution using more basic linear algebra.

asaf92

2 Answers


Late remark. I think the cleanest solution is the following purely algebraic one, which has been posted here and elsewhere many times, e.g. in Bill Dubuque's proof of Sylvester's determinant identity.

The proof is really simple, and it works for square matrices over any commutative ring. First, suppose $x$ and the entries of $A$ and $B$ are (algebraically independent) indeterminates. Then $$ \det(xI-AB)\det(A)=\det(xA-ABA)=\det(A)\det(xI-BA). $$ Cancelling $\det(A)$ on both sides (it is cancellable because it is a non-zero polynomial in the indeterminate entries of $A$, i.e. a non-zero member of the integral domain $\mathbb Z[x,a_{ij},b_{ij}]$), we get $\det(xI-AB)=\det(xI-BA)$. Specialising the entries of $A$ and $B$ to concrete values in the commutative ring of interest, we obtain the final result.
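If you want to see the conclusion in action, here is a quick sanity check with sympy (assumed installed); the matrices here are arbitrary illustrative choices, with $A$ deliberately singular since that is the case the question calls hard:

```python
import sympy as sp

x = sp.symbols('x')
# A is rank 1, hence singular -- the "hard" case.
A = sp.Matrix([[1, 2], [2, 4]])
B = sp.Matrix([[0, 1], [3, 5]])

pAB = (A * B).charpoly(x).as_expr()
pBA = (B * A).charpoly(x).as_expr()
print(pAB)                          # x**2 - 28*x
print(sp.expand(pAB - pBA) == 0)    # True
```

Of course this only checks one instance; the cancellation argument above is what proves it for every commutative ring.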

Proofs like this are usually not taught at universities, probably because students learn basic linear algebra before they meet the formal definitions of "indeterminate" and "polynomial" in abstract algebra courses.

user1551

My preferred proof is as follows: it suffices to note that for any $\lambda \neq 0$, we have by Sylvester's determinant identity that $$ \det(\lambda I - AB) = \lambda^n\det\left(I - \frac 1{\lambda}AB\right) = \lambda^n\det\left(I - \frac 1{\lambda}BA\right) = \det(\lambda I - BA) $$ Thus, the two polynomials in $\lambda$ agree for all $\lambda \neq 0$. Since two polynomials that agree at infinitely many points must be identical, we may conclude that the polynomials are exactly the same.
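The identity $\det(\lambda I - AB) = \det(\lambda I - BA)$ can also be checked symbolically; here is a small sympy sketch (sympy assumed installed) where fully symbolic $2\times 2$ entries play the role of indeterminates:

```python
import sympy as sp

lam = sp.symbols('lambda')
# Symbolic entries, so the check covers *all* 2x2 matrices at once.
A = sp.Matrix(2, 2, sp.symbols('a0:4'))
B = sp.Matrix(2, 2, sp.symbols('b0:4'))

lhs = (lam * sp.eye(2) - A * B).det()
rhs = (lam * sp.eye(2) - B * A).det()
print(sp.expand(lhs - rhs) == 0)   # True
```

Because the entries are symbolic, the expanded difference vanishing is a genuine proof for the $2\times 2$ case, not just a numerical spot check.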


Here's the gist of the proof you quoted: for any $A,B$, there are sequences $(A_n)_{n \in \Bbb N}, (B_n)_{n \in \Bbb N}$ of invertible matrices such that $A_n \to A$ and $B_n \to B$ (the existence of such sequences is equivalent to density). By the continuity of the function that maps a matrix to its characteristic polynomial (and the continuity of matrix multiplication), we have $$ \det(\lambda I - AB) = \lim_{n \to \infty} \det(\lambda I - A_nB_n)\\ \det(\lambda I - BA) = \lim_{n \to \infty} \det(\lambda I - B_nA_n) $$ However, because the statement holds for invertible matrices ($B_nA_n = A_n^{-1}(A_nB_n)A_n$ is similar to $A_nB_n$), these two sequences are exactly the same, so they have the same limit. Hence $\det(\lambda I - AB) = \det(\lambda I - BA)$, which is what we wanted.

Ben Grossmann