
I had a question about Sheldon Axler's proof of Theorem 5.21 in his book, Linear Algebra Done Right.

I checked this previous post and had no trouble understanding the substitution step, where $T$ is substituted for $z$.

This picture is from an older edition of the textbook, but it's the same proof.

Picture of the proof

My question is:

In the second line, Axler appears to choose an arbitrary non-zero vector $ v \in V $ and to show that at least one factor $(T - \lambda_j I) $ satisfies $ (T - \lambda_j I) v = \vec{0} $.

In other words, the proof appears to argue that for every non-zero vector $ v \in V $, there is at least one $(T - \lambda_j I) $ that satisfies $ (T - \lambda_j I) v = \vec{0} $.

But this is clearly not true; there are plenty of counterexamples even on $ \mathbb{C}^2 $.

Say $ T(x,y) = ((3+5i)x, (1+2i)y) $, and choose $ v = (1+i, 1-i) $. Then $v$ is not an eigenvector of $T$, so no single factor $(T - \lambda_j I)$ sends it to $\vec{0}$.
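One way to see the gap is to check this example numerically. A quick NumPy sketch (my own illustration, representing $T$ as a diagonal matrix):

```python
import numpy as np

# The example above: T(x, y) = ((3+5i)x, (1+2i)y) on C^2.
T = np.array([[3 + 5j, 0], [0, 1 + 2j]])
v = np.array([1 + 1j, 1 - 1j])
I = np.eye(2)

S1 = T - (3 + 5j) * I  # T - lambda_1 I
S2 = T - (1 + 2j) * I  # T - lambda_2 I

# Neither factor alone annihilates v ...
print(np.allclose(S1 @ v, 0))       # False
print(np.allclose(S2 @ v, 0))       # False
# ... but the composition of both does:
print(np.allclose(S1 @ S2 @ v, 0))  # True
```

So the composed map kills $v$ even though no individual factor does, which is exactly the point the answers address.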

I know this proof is about the **existence** of a particular (eigenvalue, eigenvector) pair, and not a proof that an eigenvalue exists for every arbitrary non-zero vector.

But the second line of the proof appears to claim otherwise.

user9487
  • It's a bit subtle, but the existence of an eigenvalue/eigenvector pair only depends on one of the $T-\lambda_jI$ NOT being injective. The implied contradiction is: if they are all injective, then the composition is injective, but then we must have $v=0$. –  Mar 04 '23 at 15:03
  • It turns out this result is true even in infinite dimensions, if you replace "operator" with "compact operator". In finite dimensions, all linear operators are compact, so this really is a special case of the more general https://en.wikipedia.org/wiki/Spectral_theory_of_compact_operators . Just something to look forward to if you move up to functional analysis/infinite dimensions! – Alan Mar 04 '23 at 22:55
  • A new proof of this result will appear in the fourth edition of Linear Algebra Done Right. For this new proof, see page 142 in the new Chapter 5, which is freely available at https://linear.axler.net/. – Sheldon Axler Mar 05 '23 at 06:09
  • My first instinct was to show $(T-\lambda_j I)v=0$, because it is the most straightforward way to conclude that $T$ has an eigenvalue $\lambda_j$. But when I tried to prove it, I couldn't. Eventually I reached the desired conclusion, which is essentially egreg's answer. – user264745 Mar 05 '23 at 14:52

4 Answers


You cannot conclude that $v$ is an eigenvector and indeed the author doesn't state it.

You can recover an eigenvector, though. For simplicity, let $S_k=T-\lambda_kI$ and note that these operators commute with each other. Now consider $$ S_1v,\quad S_2S_1v,\quad \dots,\quad S_{m-1}\dotsm S_2S_1v,\quad S_m\dotsm S_2S_1v=0 $$ Let $r$ be the first index such that $S_r\dotsm S_2S_1v=0$. If $r=1$, you're done. Otherwise $w=S_{r-1}\dotsm S_1v\ne0$ but $S_rw=0$. Hence $w$ is an eigenvector relative to $\lambda_r$.
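This scan for the first index $r$ can be sketched in code. A minimal NumPy version (the helper name `recover_eigenvector` and the use of `np.allclose` as a zero test are my own choices, not from the book):

```python
import numpy as np

def recover_eigenvector(T, lambdas, v):
    """Apply S_1, S_2, ... in turn and stop at the first r with
    S_r (S_{r-1} ... S_1 v) = 0.  The vector fed into S_r is then an
    eigenvector for lambdas[r-1].  Assumes the full product of the
    factors annihilates v (as in the theorem) and that v != 0."""
    I = np.eye(T.shape[0])
    w = v
    for lam in lambdas:
        nxt = (T - lam * I) @ w
        if np.allclose(nxt, 0):
            return lam, w  # T w = lam w, with w != 0
        w = nxt
    raise ValueError("the product of the factors did not annihilate v")
```

With the question's example, `recover_eigenvector(T, [1 + 2j, 3 + 5j], v)` returns the eigenvalue $3+5i$ with eigenvector $(-1+5i, 0)$.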

With your example, $$ Tv=(-2+8i,3+i),\qquad T^2v=(-46+14i,1+7i) $$ and it turns out that $$ T^2v=(7-11i)v+(4+7i)Tv $$ so the polynomial is $$ x^2-(4+7i)x-(7-11i)=(x-(3+5i))(x-(1+2i)) $$ Now $$ (T-(1+2i)I)v=Tv-(1+2i)v=(-2+8i,3+i)-(-1+3i,3+i)=(-1+5i,0) $$ which is clearly an eigenvector relative to $3+5i$.
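These computations are easy to verify numerically; here is a NumPy check (my own sketch):

```python
import numpy as np

T = np.array([[3 + 5j, 0], [0, 1 + 2j]])  # T(x,y) = ((3+5i)x, (1+2i)y)
v = np.array([1 + 1j, 1 - 1j])

Tv, T2v = T @ v, T @ T @ v
assert np.allclose(Tv, [-2 + 8j, 3 + 1j])
assert np.allclose(T2v, [-46 + 14j, 1 + 7j])

# the linear dependence T^2 v = (7-11i)v + (4+7i)Tv
assert np.allclose(T2v, (7 - 11j) * v + (4 + 7j) * Tv)

# (T - (1+2i)I)v is an eigenvector relative to 3+5i
w = Tv - (1 + 2j) * v
assert np.allclose(w, [-1 + 5j, 0])
assert np.allclose(T @ w, (3 + 5j) * w)
```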

egreg
  • I see where my confusion came from--thanks for the detailed answer, and especially for the example. This helped me so much. – user9487 Mar 05 '23 at 04:00

The answer by BobaFett101 here answers this question. In short, $v$ need not be an eigenvector, since the "multiplication" in the line you reference is composition of linear maps; all you can conclude is that the composition of all of those maps sends $v$ to $0$, not that some particular map sends $v$ to $0$.


The proof doesn't argue that $(T-\lambda_j I)v=0$ for some $j$. Rather, it argues that in the composition $$(T-\lambda_1 I)\cdots (T-\lambda_m I)$$ there exists a $j$ such that $(T-\lambda _jI)$ is not injective. This $(T-\lambda_j I)$ need not fail injectivity at $v$ itself; it could fail at, say, $$(T-\lambda_nI)(T-\lambda_kI)v$$ for some $n$ and $k$, which is a different vector from $v$.

Seeker
  • I get it now--I wish Sheldon could have been more specific about that line about $T-\lambda_jI$ not being injective. That is, not with respect to $v$, but with respect to some composition like the one you mentioned. – user9487 Mar 05 '23 at 04:03

For example, consider the case of two matrices. If $A,\ B$ are two $n\times n$ matrices and $x$ is an $n\times 1$ vector, then $ABx=0\not\Rightarrow Ax=0$ or $Bx=0$. Interpreted in the language of linear maps: if $f:V\rightarrow V$ and $g: V\rightarrow V$, then $g(f(x_0))=0\not\Rightarrow f(x_0)=0$ or $g(x_0)=0$.
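A concrete instance with $2\times 2$ matrices (a toy example of my own):

```python
import numpy as np

# A kills the second coordinate, B kills the first.
A = np.array([[1., 0.], [0., 0.]])
B = np.array([[0., 0.], [0., 1.]])
x = np.array([1., 1.])

print(np.allclose(A @ B @ x, 0))  # True:  the composition sends x to 0
print(np.allclose(A @ x, 0))      # False: A alone does not
print(np.allclose(B @ x, 0))      # False: neither does B
```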

Asigan