
So what can we say about the relationship between the eigenvalues and eigenvectors of a square invertible matrix $A$ and those of its inverse $A^{-1}$?

We know that $A$ is invertible iff all its eigenvalues are nonzero, and in that case $Ax=\lambda x$ iff $A^{-1}x = \frac{1}{\lambda}x$.
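As a quick sanity check (purely illustrative; the $2 \times 2$ matrix below is arbitrary), the relationship is easy to confirm numerically with NumPy:

```python
import numpy as np

# Arbitrary invertible matrix, purely for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
A_inv = np.linalg.inv(A)

# Eigenvalues and eigenvectors of A and of A^{-1}.
eigvals_A, eigvecs_A = np.linalg.eig(A)
eigvals_Ainv, _ = np.linalg.eig(A_inv)

print(np.sort(eigvals_A))            # approximately [1.382, 3.618]
print(np.sort(1.0 / eigvals_Ainv))   # same values: the eigenvalues are reciprocals

# Every eigenvector x of A with eigenvalue lambda satisfies A^{-1} x = (1/lambda) x.
for lam, x in zip(eigvals_A, eigvecs_A.T):
    assert np.allclose(A_inv @ x, x / lam)
```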

But is this all? Does anyone see anything more? (Guessing you do...)

– onimoni
    What else is there to know? The statement you have given is:

    For an invertible square matrix $A$, a nonzero vector $x$ is an eigenvector of $A$ with nonzero eigenvalue $\lambda$ if and only if $x$ is an eigenvector of $A^{-1}$ with nonzero eigenvalue $\lambda^{-1}$.

    So if you have all the information about the eigenvalues and eigenvectors of $A$, you already have all of it for $A^{-1}$.

    – k.stm Sep 25 '13 at 09:08

1 Answer


One conclusion you can draw is that every eigenvector of $A$ is also an eigenvector of $A^{-1}$, and vice versa. As you noted, the corresponding eigenvalues (for the same eigenvector) are reciprocals of one another.
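For completeness, the computation behind this is a one-liner: if $Ax = \lambda x$ with $x \neq 0$, then $\lambda \neq 0$ (otherwise $x$ would lie in the null space of the invertible $A$), and multiplying both sides by $A^{-1}$ gives
$$A^{-1}Ax = \lambda A^{-1}x \quad\Longrightarrow\quad x = \lambda A^{-1}x \quad\Longrightarrow\quad A^{-1}x = \frac{1}{\lambda}x.$$
The converse follows by applying the same argument to $A^{-1}$, since $(A^{-1})^{-1} = A$.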