
Assume that $\lambda_1$ and $\lambda_2$ are distinct and nonzero eigenvalues of $T: \Bbb{R}^2 \rightarrow \Bbb{R}^2$. To show that the corresponding eigenvectors $v_1$ and $v_2$ are LI (linearly independent) and that $T(v_1)$ and $T(v_2)$ are LI, I wrote a proof by contradiction. But I would like to know if there is another way to solve this problem.

Update

I) By contradiction, suppose that $v_2 = a v_1$, where $a$ is a real number; that is, $v_2$ and $v_1$ are LD (linearly dependent).

As $\lambda_1$ and $\lambda_2$ are eigenvalues of $T$ associated with the eigenvectors $v_1$ and $v_2$ respectively, we have:

$T(v_1) = \lambda_1 v_1$ and $T(v_2) = \lambda_2 v_2$.

As, by assumption, $v_1$ and $v_2$ are LD:

$T(v_2) = T(av_1)=aT(v_1)=\lambda_1 (a v_1) = \lambda_1 v_2$ and $T(v_2) = \lambda_2 v_2$

Hence:

$\lambda_1 v_2 = \lambda_2 v_2$.

Since $v_2\neq0$, we have $\lambda_1 = \lambda_2$.

This is a contradiction, as the eigenvalues are distinct, so $v_1$ and $v_2$ are LI.

II) The reasoning is analogous to part (I): by contradiction, if $T(v_1)$ and $T(v_2)$ are LD, then:

$T(v_2) = a T(v_1)$

$\lambda_2 v_2 = a \lambda_1 v_1$

$v_2 = (a \lambda_1 / \lambda_2) v_1$ (using that $\lambda_2 \neq 0$)

Setting $b = a \lambda_1 / \lambda_2$, we get $v_2 = b v_1$.

But from item (I) we know that $v_1$ and $v_2$ are LI, so no such $b$ can exist. We reach another contradiction, and therefore $T(v_1)$ and $T(v_2)$ are LI.
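As a quick numerical sanity check of both parts (a minimal sketch, not part of the proof; the $2\times 2$ matrix below is just an arbitrary example with distinct nonzero eigenvalues):

```python
# Sanity check of (I) and (II) for one illustrative matrix (assumed example).
import numpy as np

T = np.array([[2.0, 1.0],
              [0.0, 3.0]])  # eigenvalues 2 and 3: distinct and nonzero

eigvals, eigvecs = np.linalg.eig(T)
v1, v2 = eigvecs[:, 0], eigvecs[:, 1]

# (I) v1 and v2 are LI iff det[v1 v2] is nonzero.
print(np.linalg.det(np.column_stack([v1, v2])))

# (II) T(v1) and T(v2) are LI iff det[T v1, T v2] is nonzero.
print(np.linalg.det(np.column_stack([T @ v1, T @ v2])))
```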

user995243
  • Unfortunately, this result is not true as stated, e.g. if $T$ is the $0$ transformation. How did your proof go? – Theo Bendit Nov 19 '21 at 21:57
  • I forgot to inform some data that we must assume. – user995243 Nov 19 '21 at 22:01
  • Good to know, and thank you for clarifying. You should still include your proof, especially when asking for alternate proofs (if we just accidentally give you the same proof you had, then we have wasted our time). Generally speaking, the site's guidelines for asking questions state that you should be including your own thoughts and efforts into questions, and questions that do not include this tend to be down-voted or closed. – Theo Bendit Nov 20 '21 at 00:12
  • +1 for including your proof. – Theo Bendit Nov 20 '21 at 06:26

2 Answers


Your proof for part I is good, in that it's quick and relatively clean. The actual write-up could use a little touching up, but the thrust is good. The only alternate proof I'd suggest is a more general one, since this particular result holds for more than just two vectors. That is, one can show that if we have $m$ eigenvectors in $m$ distinct eigenspaces, then they are automatically linearly independent. This takes more time, so I would probably reach for your argument unless I needed the more general result.

Suppose that $v_1, \ldots, v_m$ are eigenvectors corresponding to distinct eigenvalues $\lambda_1, \ldots, \lambda_m$. We wish to show that $v_1, \ldots, v_m$ are linearly independent, which we can do by induction.

If $m = 1$, then we have one (non-zero) eigenvector $v_1$, so we are done.

Suppose that we know $v_1, \ldots, v_k$ is linearly independent for $1 \le k < m$. Then, the only way we can have $v_1, \ldots, v_{k+1}$ be linearly dependent is if $v_{k+1} \in \operatorname{span}\{v_1,\ldots, v_k\}$, i.e. $$v_{k+1} = a_1 v_1 + \ldots + a_k v_k$$ for some $a_1, \ldots, a_k$. Now, apply $T - \lambda_{k+1} I$ to both sides (note: it annihilates the left hand side). We get: \begin{align*} 0 &= T(a_1 v_1) - a_1 \lambda_{k+1} v_1 + \ldots + T(a_k v_k) - a_k \lambda_{k+1} v_k \\ &= a_1 \lambda_1 v_1 - a_1 \lambda_{k+1} v_1 + \ldots + a_k \lambda_k v_k - a_k \lambda_{k+1} v_k \\ &= a_1(\lambda_1 - \lambda_{k+1}) v_1 + \ldots + a_k (\lambda_k - \lambda_{k+1}) v_k. \end{align*} This is a linear combination of the linearly independent $v_1, \ldots, v_k$, so we must have $$a_1(\lambda_1 - \lambda_{k+1}) = \ldots = a_k(\lambda_k - \lambda_{k+1}) = 0.$$ But the eigenvalues are distinct, so we can divide through by $\lambda_i - \lambda_{k+1} \neq 0$, giving us $$a_1 = \ldots = a_k = 0,$$ which in turn implies that $v_{k+1} = 0$, contradicting the fact that $v_{k+1}$ is an eigenvector. So, $v_{k+1} \notin \operatorname{span}\{v_1, \ldots, v_k\}$, and $v_1, \ldots, v_{k+1}$ is also linearly independent.

As you can see, it's a lot more work! But it's worth it, if you care about this result in a more general setting.
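If it helps to see this more general statement in action, here is a minimal numerical sketch (the diagonalizable matrix below is an arbitrary example of my own, assuming four distinct eigenvalues):

```python
# Numerical illustration: eigenvectors for distinct eigenvalues come out LI.
import numpy as np

rng = np.random.default_rng(0)
P = rng.standard_normal((4, 4))        # a random (almost surely invertible) basis
D = np.diag([1.0, 2.0, 3.0, 4.0])      # four distinct eigenvalues
A = P @ D @ np.linalg.inv(P)           # A is diagonalizable with eigenvalues 1..4

_, V = np.linalg.eig(A)                # columns of V are eigenvectors of A
print(np.linalg.matrix_rank(V))        # expect 4: the eigenvectors are LI
```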


For part II, I would very simply conclude that $T(v_1)$ and $T(v_2)$ are also eigenvectors corresponding to $\lambda_1$ and $\lambda_2$ respectively (as they are just non-zero multiples of the original eigenvectors). So, the result from part I still applies, and $T(v_1), T(v_2)$ are linearly independent.
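A minimal numerical sketch of that observation (again with an arbitrary example matrix, assuming distinct nonzero eigenvalues):

```python
# Check that T(v1) is again an eigenvector for lambda_1 when lambda_1 != 0.
import numpy as np

T = np.array([[2.0, 1.0],
              [0.0, 3.0]])             # distinct nonzero eigenvalues 2 and 3

eigvals, eigvecs = np.linalg.eig(T)
lam1, v1 = eigvals[0], eigvecs[:, 0]

w1 = T @ v1                            # w1 = lambda_1 * v1, a nonzero multiple of v1
print(np.allclose(T @ w1, lam1 * w1))  # True: w1 is still an eigenvector for lambda_1
```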

Theo Bendit
  • The assumption that $\lambda_{i}\neq 0$ is important for the 2nd part of the question. Otherwise it trivially becomes linearly dependent. – Mr. Gandalf Sauron Nov 20 '21 at 08:31

In general, eigenvectors corresponding to distinct eigenvalues are linearly independent. I also assume for the second part that each $\lambda_{i}$ is non-zero; otherwise every set containing the null vector is linearly dependent. In other words, if $\lambda_{i}=0$ then $T(v_{i})=0v_{i}=0$.

Let $\lambda_{i}$, $1\leq i\leq n$, be distinct eigenvalues.

Then consider the relation

$$\sum_{i=1}^{n} c_{i}v_{i}=0,$$ where the $c_{i}$ are scalars and the $v_{i}$ are the corresponding eigenvectors.

Applying $T$ to the above relation, we get:

$$T\left(\sum_{i=1}^{n} c_{i}v_{i}\right)=\sum_{i=1}^{n}c_{i}T(v_{i})=\sum_{i=1}^{n}c_{i}\lambda_{i}v_{i}=0.$$

Now, applying $T$ again to $\sum_{i=1}^{n}c_{i}\lambda_{i}v_{i}$, we get

$$\sum_{i=1}^{n}c_{i}\lambda_{i}^{2}v_{i}=0.$$

We apply $T$ a total of $n-1$ times, which gives $n$ equations, and we then look for the solutions for the $c_{i}$'s.

If you look carefully, the matrix of this system of $n$ linear equations, say $A$, satisfies

$$A^{T}=\begin{bmatrix} 1&\lambda_1&\lambda_1^2&\cdots & \lambda_1^{n-1}\\ 1&\lambda_2&\lambda_2^2&\cdots & \lambda_2^{n-1} \\ \vdots&\vdots&\vdots&\ddots&\vdots\\ 1&\lambda_{n-1}&\lambda_{n-1}^2&\cdots&\lambda_{n-1}^{n-1}\\ 1 &\lambda_n&\lambda_n^2&\cdots&\lambda_n^{n-1} \end{bmatrix},$$ i.e. you are looking for the solution to

$$A\begin{bmatrix}c_1v_{1}\\c_2v_{2}\\\vdots\\c_nv_{n}\end{bmatrix}=\begin{bmatrix}0\\0\\\vdots\\0\end{bmatrix}$$

Now if you are familiar with the Vandermonde matrix, you will see that the determinant is nothing but $$\prod_{1\leq i<j\leq n}(\lambda_{j}-\lambda_{i}).$$ Since the $\lambda_{i}$ are distinct, the determinant is non-zero, so the system has a unique solution, namely $c_{i}v_{i}=0$ for all $i$; and since each $v_{i}\neq 0$, all the $c_{i}$ must be $0$. Hence linear independence is proved.
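As a minimal numerical sketch of this determinant identity (the particular $\lambda_{i}$ below are just an arbitrary distinct choice):

```python
# Compare the Vandermonde determinant with the product formula above.
import numpy as np
from itertools import combinations

lams = np.array([1.0, 2.5, -3.0, 4.0])  # distinct lambda_i (arbitrary example)
n = len(lams)

# Rows (1, lambda_i, lambda_i^2, ..., lambda_i^{n-1}), matching A^T above.
V = np.vander(lams, N=n, increasing=True)

det_direct = np.linalg.det(V)
det_formula = np.prod([lams[j] - lams[i] for i, j in combinations(range(n), 2)])
print(det_direct, det_formula)          # should agree up to floating-point error
```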

Now it is clear that for a linearly independent set $\{v_{1},v_{2},\ldots,v_{n}\}$ and nonzero scalars $\lambda_{i}$, the set of vectors $\{\lambda_{1}v_{1},\lambda_{2}v_{2},\ldots,\lambda_{n}v_{n}\}$ is also linearly independent. Otherwise we run into the same conundrum of having the equation

$$\sum_{i=1}^{n}c_{i}\lambda_{i}v_{i}=0.$$

Setting $d_{i}=c_{i}\lambda_{i}$, the equation $\sum_{i=1}^{n}d_{i}v_{i}=0$ has the only solution $d_{i}=0\,\forall\, i$, which implies $c_{i}=0$ for all $i$, as each $\lambda_{i}$ is non-zero. Hence the $T(v_{i})$'s are linearly independent. Note again that the assumption that the $\lambda_{i}$'s are non-zero is vital for the second part of the question.
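A minimal numerical sketch of this last step (arbitrary linearly independent columns and nonzero scalars of my own choosing):

```python
# Scaling LI columns by nonzero scalars leaves the rank (hence LI) unchanged.
import numpy as np

V = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])    # columns v1, v2, v3 are linearly independent
lams = np.array([2.0, -1.0, 0.5])  # nonzero lambda_i

# Element-wise multiplication by lams scales column j by lams[j].
print(np.linalg.matrix_rank(V), np.linalg.matrix_rank(V * lams))  # 3 and 3
```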

  • How do you turn these linear combinations of the $v_i$s into a system of equations? I considered this proof method briefly, but I could not figure out how to navigate this step without (circularly) assuming the $v_i$s were linearly independent. – Theo Bendit Nov 20 '21 at 22:58
  • I'll just write the 2-dimensional case in the comment. Consider $c_{1}v_{1}+c_{2}v_{2}=0$ and $c_{1}T(v_{1})+c_{2}T(v_{2})=0\implies c_{1}\lambda_{1}v_{1}+c_{2}\lambda_{2}v_{2}=0$. Then we are looking at $$\begin{pmatrix} 1 & 1 \\ \lambda_{1} & \lambda_{2} \end{pmatrix}\cdot \begin{pmatrix} c_{1}v_{1}\\ c_{2}v_{2}\end{pmatrix}=0.$$ Just repeat this to get the system of equations for $n$ variables. By Cramer's rule this has a unique solution, as the determinant is non-zero. This is the method which we also apply to the Wronskian; in that case we just take derivatives to generate $n$ equations. – Mr. Gandalf Sauron Nov 21 '21 at 08:23
  • I'm convinced, but I kinda hate putting vectors into vectors like that. But you're right: the $2 \times 2$ matrix is invertible (as with any Vandermonde matrix with distinct columns), so there will be a sequence of elementary row operations that turn $x + y = \lambda_1 x + \lambda_2 y = 0$ into $x = y = 0$. Those same elementary row operations can be applied to the system of vector equations $c_1 v_1 + c_2 v_2 = \lambda_1 c_1 v_1 + \lambda_2 c_2 v_2 = 0$, to obtain the conclusion $c_1 v_1 = c_2 v_2 = 0$. So, it works, despite the liberties taken in notation. – Theo Bendit Nov 22 '21 at 03:01
  • The unique solution follows from Cramer's rule. And we as high school students were introduced to Cramer's rule first and then elementary row operations, so just showing that the determinant of a system of equations is non-zero is the first thing that comes to mind rather than row operations. Obviously we are talking about square matrices. – Mr. Gandalf Sauron Nov 22 '21 at 16:32
  • I bring it back to row operations because Cramer's rule is about systems of linear equations of scalar variables. I try not to apply theorems like this when the premises do not hold. That's why I was looking for a more elementary reason why the conclusion should still hold. – Theo Bendit Nov 22 '21 at 20:25