
I recently did this exercise and I was hoping to get some feedback on my proof.

Given a real matrix $A$ such that $A^2=A$, show that $A$ is diagonalizable.

Proof: Assume that $A \in \mathbb{R}^{n\times n}$, and consider the linear map $V \longrightarrow V$, $x \mapsto Ax$, where $V$ is an arbitrary real vector space with $\dim V = n < \infty$.

$A$ is diagonalizable if there exists an invertible matrix $T$ such that $D = T^{-1}AT$ is diagonal, where the columns of $T$ are $n$ linearly independent eigenvectors of $A$ and $D=\operatorname{diag}(\lambda_1,\dots,\lambda_k)$, assuming $A$ has $k$ eigenvalues.

Note that if $\lambda$ is an eigenvalue of $A$ with eigenvector $x\neq 0$, then $Ax = \lambda x$, but also $Ax=A^2x=A(Ax)=A(\lambda x)=\lambda(Ax) = \lambda^2x$, and thus $(\lambda-\lambda^2)x=0$,

and so the only possible eigenvalues of $A$ are $\lambda_0=0$ and $\lambda_1=1$.

Call the corresponding eigenspaces $E_0 = \{x\in V \mid Ax=0\} = \ker(A)$ and $E_1=\{x\in V \mid Ax=x\}$.

Thus, we can express $V = E_0 \oplus E_1$, and it follows that $\dim V = n =\dim E_0 + \dim E_1$, so we know that we have a set of $n$ linearly independent eigenvectors.

Therefore $A$ is diagonalizable. $\blacksquare$

The last step is where I'm a bit unsure. I'm thinking that because we only have two eigenvalues, the space $V$ must be the direct sum of the two corresponding eigenspaces; am I right? Also, am I right when I say that because the sum of the dimensions of the two eigenspaces is $n$, we know for sure that we have $n$ linearly independent eigenvectors? I think that makes sense.

Cheers

  • That "Thus, we can express..." is far from clear from what you did above... Do you know that a matrix is diagonalizable (over some field containing all its eigenvalues, of course) iff its minimal polynomial is a product of distinct linear factors? Because this would solve your problem at once. – DonAntonio Apr 29 '18 at 17:51
  • Also, the map $\;\begin{pmatrix}0&1&0\\0&0&0\\0&0&1\end{pmatrix}\;$ has only two eigenvalues, yet it isn't diagonalizable... – DonAntonio Apr 29 '18 at 17:53
  • The flaw is saying that $n=\dim E_0+\dim E_1$. – chhro Apr 29 '18 at 17:55

3 Answers


Here's a concise proof: A matrix is diagonalizable if and only if its minimal polynomial has no repeated roots. Since $A^2-A=A(A-I)=0$, the minimal polynomial of $A$ divides $t(t-1)$ and therefore has no repeated roots.
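
As a quick sanity check, here is one concrete idempotent matrix (chosen just for illustration): $$A=\begin{pmatrix}1&1\\0&0\end{pmatrix},\qquad A^2=A.$$ Its minimal polynomial is $t(t-1)$, its eigenvectors $(1,0)^T$ (for $\lambda=1$) and $(1,-1)^T$ (for $\lambda=0$) form a basis of $\Bbb R^2$, and with $T=\begin{pmatrix}1&1\\0&-1\end{pmatrix}$ one checks that $T^{-1}AT=\operatorname{diag}(1,0)$.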

Exit path

At some point we have to use that the minimal polynomial of $A$ divides $\lambda^2-\lambda$, and that this polynomial splits into distinct linear factors.

The same argument taken word for word, but with $A^5-A^2=0$, could not work, e.g. because of Jordan blocks like $$ \begin{bmatrix} 0 &1\\0&0\end{bmatrix}\ , $$ for the eigenvalue zero, where the eigenspace is one-dimensional, and/or $$ \begin{bmatrix} 1 &1&0\\0&1&0\\ 0&0&1\end{bmatrix} \text{ or } \begin{bmatrix} 1 &1&0\\0&1&1\\ 0&0&1\end{bmatrix} \ , $$ for the eigenvalue one, where the eigenspace is respectively two- and one-dimensional.
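
A quick check of the first block shows why knowing the eigenvalues alone is not enough: $$N=\begin{bmatrix}0&1\\0&0\end{bmatrix},\qquad N^2=0\ \Rightarrow\ N^5=0=N^2,\qquad \ker N=\operatorname{span}\{e_1\}.$$ So $N$ satisfies the weaker relation $A^5=A^2$ and its only eigenvalue $0$ satisfies $\lambda^5=\lambda^2$, yet its single eigenspace is one-dimensional and $N$ is not diagonalizable.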

dan_fulea

The biggest part of the proof is missing - you've given no indication why $\Bbb R^n$ is the sum of the two eigenspaces.

Hint: Any $x$ can be written $$x=y+z,$$ where $y=Ax$ and $z=x-Ax$. Now show that $y$ and $z$ are both eigenvectors, by calculating $Ay$ and $Az$ and using $A^2=A$.
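
For completeness, the computation goes through as follows: $$Ay=A(Ax)=A^2x=Ax=y,\qquad Az=Ax-A^2x=Ax-Ax=0,$$ so $y\in E_1$ and $z\in E_0$ (allowing $y$ or $z$ to be zero), hence $\Bbb R^n=E_0+E_1$; and if $x\in E_0\cap E_1$ then $x=Ax=0$, so the sum is direct.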