Suppose we have a basis $B$ for an endomorphism $f$ that has eigenvalues $\lambda_{1},\dots,\lambda_{k}$.

Do these eigenvalues change or stay the same if we change to another basis $B'$?

5 Answers


Recall the definition:

Let $f$ be an endomorphism of a vector space $V$, then $\lambda$ is an eigenvalue of $f$ if there exists some non-zero $v \in V$ such that $f(v)=\lambda v$.

This does not involve a basis of the space at all. Thus it must be invariant under change of basis.

quid

No, the eigenvalues do not change: they are invariant under a change of basis. Only the coordinate representation of the eigenvectors in the new basis changes.

Indeed, suppose that

$$Ax=\lambda x$$

and consider the change of basis $x=My$. Then

$$Ax=\lambda x\implies AMy=\lambda My\implies M^{-1}AMy=\lambda y \implies By=\lambda y$$
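A quick numerical sanity check of this implication (a sketch with a hand-picked $2\times 2$ example; the helper functions and the particular matrices $A$ and $M$ are my own choices, not from the answer):

```python
# Sanity check: if A x = lambda x and x = M y, then (M^-1 A M) y = lambda y.
# Hand-picked 2x2 integer example so all arithmetic is exact.

def matmul(P, Q):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(P[i][k] * Q[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matvec(P, v):
    """Apply a 2x2 matrix to a length-2 vector."""
    return [sum(P[i][k] * v[k] for k in range(2)) for i in range(2)]

A = [[2, 1],
     [0, 3]]          # has eigenvalue 2 with eigenvector x = (1, 0)
M = [[1, 1],
     [0, 1]]          # change-of-basis matrix, x = M y
Minv = [[1, -1],
        [0, 1]]       # inverse of M

x = [1, 0]
lam = 2
assert matvec(A, x) == [lam * xi for xi in x]   # A x = lambda x

B = matmul(Minv, matmul(A, M))                  # B = M^-1 A M
y = matvec(Minv, x)                             # y = M^-1 x
assert matvec(B, y) == [lam * yi for yi in y]   # B y = lambda y, same lambda
```

The eigenvalue $\lambda=2$ survives the change of basis unchanged; only the coordinate vector of the eigenvector would change (here it happens to keep the same coordinates because $M^{-1}x = x$ for this particular $x$).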

ysmartin
  • The eigenvectors do not change. Their coordinate vectors in different bases might be different though. – Christoph May 25 '18 at 08:23
  • @Christoph Yes, of course, you are right; we need to make precise that their representation changes! – user May 25 '18 at 08:25
  • I think this answer misses the point. In order to consider whether or not eigenvalues are invariant under a change of basis, the eigenvalues would have to be defined in terms of a basis in the first place, which is not the case, as quid points out in his answer. But this answer creates an illusion that the question is valid. – Adayah May 26 '18 at 11:39

The whole point of eigenvalues and eigenvectors is to produce a set of axes adapted to your skewy transformation, so that on these axes the transformation becomes a pure scaling. If anything, this gives you a nice basis (one in which your matrix is diagonal, i.e. a scaling). Your eigenvalues are clearly the same in the eigenbasis as in any other basis (they appear along the diagonal), so the eigenvalues are the same in all bases.
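To make that concrete (a small sketch; the matrix $A$ and its eigenbasis below are a hand-worked example, not from the answer): conjugating by the matrix $P$ whose columns are eigenvectors of $A$ turns $A$ into a diagonal matrix whose entries are exactly the eigenvalues.

```python
from fractions import Fraction

def matmul(P, Q):
    """Multiply two 2x2 matrices of Fractions."""
    return [[sum(P[i][k] * Q[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inverse2(P):
    """Invert a 2x2 matrix exactly via the adjugate formula."""
    det = P[0][0] * P[1][1] - P[0][1] * P[1][0]
    return [[ P[1][1] / det, -P[0][1] / det],
            [-P[1][0] / det,  P[0][0] / det]]

F = Fraction
A = [[F(4), F(1)],
     [F(2), F(3)]]        # eigenvalues 5 and 2 (trace 7, determinant 10)
P = [[F(1), F(1)],
     [F(1), F(-2)]]       # columns: eigenvectors (1, 1) and (1, -2)

D = matmul(inverse2(P), matmul(A, P))   # change to the eigenbasis
assert D == [[F(5), F(0)],
             [F(0), F(2)]]              # diagonal, eigenvalues on the diagonal
```

In the eigenbasis the matrix is $\operatorname{diag}(5,2)$, so the eigenvalues can be read straight off the diagonal, as the answer says.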

  • This isn't really an answer to the question, but it's an important observation, so I'm upvoting it. – Ethan Bolker May 25 '18 at 13:57
  • How is it not, though? The answer is in "the eigenvalues are the same as in the diagonalised form, thus they must be the same as each other". – Abhimanyu Pallavi Sudhir May 26 '18 at 05:04
  • Your argument doesn't really explain why the eigenvalues are the same (other than the word "clearly"). It doesn't cover the cases in which the eigenvectors don't span and the matrix doesn't diagonalize. But it really is the best discussion of what matters. – Ethan Bolker May 26 '18 at 13:02

I want to rephrase OP's question a bit:

Let $\mathbf{B}$ and $\mathbf{C}$ be any two bases of the vector space $V$, and let $\tau\in \mathscr L(V,V)=\text{End}(V)$ be a linear endomorphism.

Is it true that $[\tau]_{\mathbf B}[v]_{\mathbf{B}}=\lambda [v]_{\mathbf{B}}$ for some nonzero vector $v$ implies $[\tau]_{\mathbf C}[v]_{\mathbf C}=\lambda [v]_{\mathbf C}$, with the same $\lambda$ and the same $v$?

The answer is yes: $$[\tau]_{\mathbf B}[v]_{\mathbf B}=\lambda [v]_{\mathbf B}\Rightarrow [\tau]_{\mathbf C}[v]_{\mathbf C}=\lambda [v]_{\mathbf C}$$

Recall the following change of basis formulas (for me, a handy reference is Steven Roman's Advanced Linear Algebra (3rd ed.); see page 64, Corollary 2.17, for (2) below):

(1) $[v]_{\mathbf C}= M_{\mathbf B,\mathbf C}[v]_{\mathbf B}$

(2) $[\tau]_{\mathbf C}= M_{\mathbf B,\mathbf C}[\tau]_{\mathbf B}M^{-1}_{\mathbf B,\mathbf C}$

Then the assertion directly results from the following computation:
$$
\begin{align*}
[\tau]_{\mathbf C}[v]_{\mathbf C}&=M_{\mathbf B,\mathbf C}[\tau]_{\mathbf B}M^{-1}_{\mathbf B,\mathbf C}M_{\mathbf B,\mathbf C}[v]_{\mathbf B}\\
&=M_{\mathbf B,\mathbf C}[\tau]_{\mathbf B}[v]_{\mathbf B}\\
&=M_{\mathbf B,\mathbf C}\lambda [v]_{\mathbf B}\\
&=\lambda M_{\mathbf B,\mathbf C} [v]_{\mathbf B}\\
&=\lambda [v]_{\mathbf C}
\end{align*}
$$

Anthony

As @Christoph says, the definition of an eigenvalue does not involve a basis. Given a vector space $V$ and a linear operator $f$, an eigenvector of $f$ is a vector $v$ such that there exists a scalar $\lambda$ with $f(v) = \lambda v$; $\lambda$ is then an eigenvalue.

A basis is a system for associating ordered tuples with vectors. You take a basis set of vectors, then express every other vector as a linear combination of those vectors. You can then represent the vector by the ordered tuple of those coefficients: $v = c^i b_i$. You can then write a matrix representing $f$ by taking $a_{ij}$ to be the coefficient of $b_i$ in $f(b_j)$. That is, you apply $f$ to $b_j$, then look at the $b_i$ component of the result. Doing this for all $i, j$ gives you $A$. So $A$ represents $f$, but is not quite the same thing as $f$. Changing the basis changes which matrix represents $f$, and it changes which tuples represent the eigenvectors, but it does not change which actual vectors are eigenvectors, and it does not change the eigenvalues.

Note that if you find an $A$ from one basis, and you want to use it to find out what $f$ does in terms of another basis, you have to first change the vector back to the original basis, then apply $A$, then go back to the new basis. That can be represented by

$$S^{-1}ASv$$

where $S$ is made up of the vectors of the new basis, expressed in the old. The matrix $S^{-1}AS$ is known as a conjugation of $A$ by $S$. If one matrix can be obtained from another through conjugation, then the matrices are called "similar matrices". Similar matrices have the same eigenvalues, as they can be considered to represent the same operator in different bases.
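A small sketch of this recipe in code (the linear map $f$ and the two bases are invented for illustration): build the matrix of $f$ in a basis by expressing each $f(b_j)$ in that basis, and observe that two different bases give different matrices that nevertheless share the same trace and determinant, hence (for $2\times 2$ matrices) the same characteristic polynomial and the same eigenvalues.

```python
from fractions import Fraction

def f(v):
    """An example linear map on R^2; its eigenvalues are 3 and 2."""
    return (3 * v[0] + v[1], 2 * v[1])

def coords(v, basis):
    """Coordinates of v in a 2D basis, via Cramer's rule."""
    (c1x, c1y), (c2x, c2y) = basis
    det = Fraction(c1x * c2y - c2x * c1y)
    a = (v[0] * c2y - c2x * v[1]) / det
    b = (c1x * v[1] - v[0] * c1y) / det
    return (a, b)

def matrix_of(g, basis):
    """Matrix of g in the given basis: column j holds the
    coordinates of g(b_j) expressed in the same basis."""
    cols = [coords(g(b), basis) for b in basis]
    return [[cols[0][0], cols[1][0]],
            [cols[0][1], cols[1][1]]]

standard = [(1, 0), (0, 1)]
other = [(1, 1), (0, 1)]

A = matrix_of(f, standard)   # the matrix of f in the standard basis
B = matrix_of(f, other)      # a different matrix...

trace = lambda m: m[0][0] + m[1][1]
det = lambda m: m[0][0] * m[1][1] - m[0][1] * m[1][0]

assert A != B                      # different representations of f,
assert trace(A) == trace(B) == 5   # ...but the same trace
assert det(A) == det(B) == 6       # ...and the same determinant
```

For $2\times 2$ matrices the characteristic polynomial is $\lambda^2 - (\operatorname{tr}A)\lambda + \det A$, so equal trace and determinant means equal eigenvalues; more generally, similar matrices share the whole characteristic polynomial in any dimension.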

Acccumulation