
Let $V$ be a finite-dimensional vector space, and let $E_1,\dots,E_n$ be the eigenspaces associated with the eigenvalues $\lambda_1,\dots,\lambda_n$. I want to prove that $$E_1+\cdots+E_n=E_1\oplus\cdots\oplus E_n.$$

Let $v_1\in E_1,\dots,v_n\in E_n$ s.t. $v_1+\cdots +v_n=0$. I have to prove that $v_i=0$ for all $i$. I know that if $v_1\in E_1,\dots,v_n\in E_n$ are nonzero vectors, then they are free. So if I suppose that there is some $v_i\neq 0$ (WLOG $v_1\neq 0$), then $$v_1=-v_2-\cdots-v_n,$$ and thus there is at least one other vector (say $v_2$) that is nonzero. Therefore $v_1=-v_2$, which is a contradiction.

Question 1: Does my proof work? If not, what's wrong?

Question 2: I find my proof not elegant at all. Is there a more elegant proof?

Henri
  • Where did you use that the vectors $v_i$ are eigenvectors of different eigenvalues? What do you mean by "the vectors are free"? – xarles Aug 06 '18 at 14:11
  • @xarles: "free" means linearly independent. I used the fact that the $v_i$ are eigenvectors to get that $(v_1,\dots,v_n)$ is free. – Henri Aug 06 '18 at 14:13
  • But this is what you have to prove: that the $v_i$ are linearly independent. The direct sum property is automatic from this fact. – xarles Aug 06 '18 at 14:16
  • There is also some linear transformation in the background, because you can't have eigenvalues and eigenspaces sitting around when there is no linear operator to which they belong. Please clarify this. – Sarvesh Ravichandran Iyer Aug 06 '18 at 14:34
  • Where is your linear mapping? – xbh Aug 06 '18 at 14:39

2 Answers


The flaw in your proof is that you assumed, for the sake of contradiction, that the vectors $v_1, \dots, v_n$ are linearly dependent (that is, $v_1 + \dots + v_n = 0$ with some $v_i \neq 0$), and used that to derive the consequence that the vectors are linearly dependent ($v_1 = -v_2$). So you didn't really prove anything.

When reviewing your own proofs, you can ask yourself where you used each of the hypotheses given. For instance, as other commenters have pointed out, you didn't use at all the fact that the vectors $v_i$ are eigenvectors for some linear transformation. That's a red flag that you're skipping something important.

Like xarles says, the proof comes down to the essential fact that eigenvectors of a linear transformation corresponding to distinct eigenvalues are linearly independent. Can you show that?
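
For instance, here is one way to see the two-vector case (a sketch, with $T$ standing for the linear transformation): suppose $T v_i = \lambda_i v_i$ with $v_i \neq 0$ and $\lambda_1 \neq \lambda_2$, and that $a v_1 + b v_2 = 0$. Applying $T$ gives $a \lambda_1 v_1 + b \lambda_2 v_2 = 0$; subtracting $\lambda_1$ times the original relation leaves $$ b(\lambda_2 - \lambda_1) v_2 = 0, $$ so $b = 0$, and then $a v_1 = 0$ forces $a = 0$. The general case follows by induction on the number of vectors.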

  • I don't understand what you mean by "The flaw in your proof..." I made a proof by contradiction, i.e. suppose the list is linearly independent s.t. $v_1+\cdots+v_n=0$ and conclude that they are dependent... it's a proof by contradiction... – Henri Aug 06 '18 at 16:02
  • @Henri But linear independence is what you want to prove. If you assume the very thing you aim to prove, that is not a valid deduction. – xbh Aug 06 '18 at 16:16
  • @Henri vectors $v_1, \dots, v_n$ satisfying $v_1 + \dots + v_n = 0$ are not linearly independent. – Matthew Leingang Aug 06 '18 at 17:19

Q1: If you know that $(v_j)_1^n$ are independent, then the direct sum decomposition follows naturally: the expression of $0$ as a sum of vectors from the $E_j$ is then unique, so by definition the sum is a direct sum. If you want to prove the decomposition from square one, you might use my answer below as a reference.
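
(To spell out that step: if $u_1 + \cdots + u_n = w_1 + \cdots + w_n$ with $u_j, w_j \in E_j$, then $$ (u_1 - w_1) + \cdots + (u_n - w_n) = 0 $$ with $u_j - w_j \in E_j$, so uniqueness of the decomposition of $0$ forces $u_j = w_j$ for every $j$; hence every vector in the sum has a unique decomposition.)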

Q2: Here is a proof.

We assume that $\{\lambda_j\}_1^n$ are distinct eigenvalues of a linear operator $\mathcal T \in \mathcal L(V)$.

Proof. $\blacktriangleleft$ Suppose $v_j \in E_j$ satisfy $v_1 + v_2 + \cdots + v_n = 0$. By definition, $\mathcal T - \lambda_j \mathcal I$ is the zero mapping on $E_j$. Therefore, applying $\mathcal T - \lambda_1 \mathcal I$ to $\sum_1^n v_j = 0$ yields $$ (\lambda_2 - \lambda_1) v_2 + (\lambda_3 - \lambda_1) v_3 + \cdots + (\lambda_n - \lambda_1) v_n = 0. $$ Now apply $\mathcal T - \lambda_2 \mathcal I$ to this and obtain $$ \sum_3^n (\lambda_j - \lambda_2)(\lambda_j - \lambda_1) v_j = 0. $$ Continuing in this way, if we apply $$ (\mathcal T - \lambda_{n-1} \mathcal I)(\mathcal T - \lambda_{n-2} \mathcal I) \cdots (\mathcal T - \lambda_1 \mathcal I) $$ to $$ v_1 + v_2 + \cdots + v_n = 0, $$ then we obtain $$ \prod_{j=1}^{n-1} (\lambda_n - \lambda_j)\, v_n = 0. $$ Since the $\lambda_j$ are distinct, $v_n = 0$.

Similarly, applying $$ \prod_{j \neq k}(\mathcal T - \lambda_j \mathcal I) \quad [k = 1, 2, \ldots, n-1] $$ to $v_1 + \cdots + v_n = 0$ yields the analogous expression $$ \prod_{j \neq k} (\lambda_k - \lambda_j)\, v_k = 0, $$ hence $v_k = 0$.

In conclusion, $v_j = 0$ for all $j$, as desired. $\blacktriangleright$
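
As a minimal numerical sanity check of the product formula (my own illustration, assuming numpy; the diagonal operator and standard-basis eigenvectors are invented for the example, not part of the proof):

```python
import numpy as np

# Hypothetical operator for illustration: T is diagonal with distinct
# eigenvalues 1, 2, 3, so the standard basis vectors span E_1, E_2, E_3.
T = np.diag([1.0, 2.0, 3.0])
I = np.eye(3)
v1, v2, v3 = np.eye(3)   # one eigenvector from each eigenspace
v = v1 + v2 + v3

# (T - lambda_2 I)(T - lambda_1 I) should annihilate v1 and v2 and scale
# v3 by (3 - 2)(3 - 1), as in the displayed product formula with n = 3.
result = (T - 2 * I) @ (T - 1 * I) @ v
expected = (3 - 2) * (3 - 1) * v3
assert np.allclose(result, expected)   # passes: result == [0, 0, 2]
```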

xbh