How do I show that the trace and determinant of a square matrix equal the sum and the product of its eigenvalues, respectively? I just need some hints or references here. It is not a homework problem.
-
For me the easiest way would be to note that it holds for diagonalizable matrices and then use that these are dense. – Tobias Kildetoft Feb 15 '20 at 13:10
-
Thanks @TobiasKildetoft. What do you mean by dense? – manav Feb 15 '20 at 13:22
-
Do you know the Jordan normal form of a matrix? – Peter Melech Feb 15 '20 at 13:50
-
The question about the determinant has been asked here before, please look there first. – ViktorStein Feb 15 '20 at 13:52
4 Answers
If $M$ is diagonalizable, from
$$M=T\Lambda T^{-1}$$ you get $$|M|=|T||\Lambda||T^{-1}|=|T|\prod\lambda_k\frac1{|T|}=\prod\lambda_k.$$
More generally,
$$|M-\lambda I|=|T||\Lambda-\lambda I||T^{-1}|=|T|\prod(\lambda_k-\lambda)\frac1{|T|}=\prod(\lambda_k-\lambda)$$ is a polynomial in $\lambda$, and the coefficients of its two expansions must match. In particular, the coefficient of $\lambda^{n-1}$ is $(-1)^{n-1}$ times the sum of the eigenvalues by Vieta, and is also $(-1)^{n-1}$ times the trace of $M$ (in the expansion of $|M-\lambda I|$, the $\lambda^{n-1}$ terms come only from the product of the diagonal entries, each term pairing $n-1$ factors of $-\lambda$ with the constant part of the remaining diagonal entry).
By the same reasoning, you will find $n-2$ other invariants (besides the trivial leading coefficient).
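As a quick numerical sanity check (an added illustration, not a proof; it assumes NumPy is available):

```python
# Check numerically that trace = sum of eigenvalues and det = product of eigenvalues.
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))      # a generic real matrix (almost surely diagonalizable)
eig = np.linalg.eigvals(M)           # eigenvalues, possibly complex

print(np.isclose(np.trace(M), eig.sum()))        # True: trace equals the sum
print(np.isclose(np.linalg.det(M), eig.prod()))  # True: determinant equals the product
```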
Hint:
You can prove, by induction on $n$, that if the characteristic polynomial of a matrix $A$ is written as $$ \det(\lambda I-A)=\lambda^n+a_{n-1}\lambda^{n-1}+\cdots + a_1\lambda+a_0, $$ then $$a_{n-1}=-\operatorname{tr}(A) \qquad a_0=(-1)^n \det(A).$$
Then use Viète's formulas.
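For instance (a small worked case added for illustration), for a $2\times2$ matrix $A=\begin{pmatrix}a&b\\c&d\end{pmatrix}$ we get $$\det(\lambda I-A)=\begin{vmatrix}\lambda-a&-b\\-c&\lambda-d\end{vmatrix}=\lambda^2-(a+d)\lambda+(ad-bc)=\lambda^2-\operatorname{tr}(A)\,\lambda+\det(A),$$ so $a_1=-\operatorname{tr}(A)$ and $a_0=(-1)^2\det(A)$, and Viète's formulas give $\lambda_1+\lambda_2=\operatorname{tr}(A)$ and $\lambda_1\lambda_2=\det(A)$.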
I'll expand on @TobiasKildetoft's comment.
The result is trivial for diagonal matrices, and it extends to diagonalizable ones because any invertible $S$ satisfies $$\operatorname{tr}SAS^{-1}=\operatorname{tr}AS^{-1}S=\operatorname{tr}A$$ and $$\det SAS^{-1}=\det S\det A\det(S^{-1})=\det S\det A(\det S)^{-1}=\det A.$$ For $n\times n$ matrices, the difference between $\operatorname{tr}A$ and the sum of $A$'s eigenvalues is a polynomial function of the entries of $A$ (the eigenvalue sum is, up to sign, a coefficient of the characteristic polynomial, which is itself polynomial in the entries). The only way for this polynomial to vanish on all diagonalizable $A$ is for it to vanish identically; this step relies on the diagonalizable matrices being "dense", a property that is defined and proven here. The same argument applies to determinants.
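A small numerical illustration of the similarity invariance used above (an added sketch, assuming NumPy; not part of the original argument):

```python
# Check that trace and determinant are unchanged under similarity B = S A S^{-1}.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
S = rng.standard_normal((4, 4))      # a random matrix is almost surely invertible
B = S @ A @ np.linalg.inv(S)         # B is similar to A

print(np.isclose(np.trace(A), np.trace(B)))            # trace is similarity-invariant
print(np.isclose(np.linalg.det(A), np.linalg.det(B)))  # so is the determinant
```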
-
+1 Don't you mean $\operatorname{tr}(SAS^{-1})=\operatorname{tr}(ASS^{-1})=...$? – Peter Melech Feb 15 '20 at 15:24
-
@PeterMelech No, because $\operatorname{tr}UVW$ is equal to $\operatorname{tr}VWU$ but not necessarily $\operatorname{tr}VUW$. Of course, your comment was pointing out I'd accidentally left out the $W$, so thanks! Fixed. – J.G. Feb 15 '20 at 16:34
-
Every matrix $A\in\mathbb{C}^{n\times n}$ is similar to its Jordan normal form $J$, i.e. there exists a matrix $T\in \operatorname{Gl}_n(\mathbb{C})$ such that $$A=T^{-1}JT,$$ which implies $$\det(A)=\det(T)^{-1}\det(J)\det(T)=\det(J)$$ and $$\operatorname{tr}(A)=\operatorname{tr}(T^{-1}JT)=\operatorname{tr}(TT^{-1}J)=\operatorname{tr}(J),$$ where we used that $\det:\mathbb{C}^{n\times n}\rightarrow\mathbb{C}$ is a monoid homomorphism with respect to multiplication and that $\operatorname{tr}(AB)=\operatorname{tr}(BA)$ for all $A,B\in\mathbb{C}^{n\times n}$. For the Jordan matrix $J$ the claim is obvious: $J$ is upper triangular with the eigenvalues of $A$ (with multiplicity) on its diagonal, so $\det(J)$ is their product and $\operatorname{tr}(J)$ is their sum.
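A small symbolic check of this argument (an added sketch, assuming SymPy; the matrix below is just an illustrative example that is not diagonalizable):

```python
# Verify A = P J P^{-1} for the Jordan form J, and read trace/det off the diagonal of J.
import sympy as sp

A = sp.Matrix([[2, 1, 0],
               [0, 2, 0],
               [1, 0, 3]])            # eigenvalues 2, 2, 3; not diagonalizable

P, J = A.jordan_form()                # SymPy returns (P, J) with A = P*J*P**(-1)

print(sp.simplify(P * J * P.inv() - A) == sp.zeros(3, 3))  # similarity holds
print(A.trace() == J.trace())         # trace(A) = trace(J) = sum of the eigenvalues
print(A.det() == J.det())             # det(A)  = det(J)  = product of the eigenvalues
```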