So what can we say about the relationship between eigenvalues and eigenvectors of square invertible $A$ and its inverse $A^{-1}$?
We know that $A$ is invertible iff all its eigenvalues are nonzero; multiplying $Ax=\lambda x$ on the left by $A^{-1}$ and dividing by $\lambda$ gives $A^{-1}x = \frac{1}{\lambda}x$, and the argument reverses, so the two statements are equivalent.
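The equivalence is easy to check numerically. A minimal sketch with NumPy (the matrix below is just an arbitrary invertible example):

```python
import numpy as np

# An arbitrary invertible (symmetric, for real eigenvalues) example matrix.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)  # columns of eigvecs are eigenvectors
A_inv = np.linalg.inv(A)

for lam, x in zip(eigvals, eigvecs.T):
    # If A x = lam * x, then A^{-1} x = (1/lam) * x.
    assert np.allclose(A_inv @ x, x / lam)
```

Each eigenvector of $A$ passes the check unchanged as an eigenvector of $A^{-1}$, with eigenvalue $1/\lambda$.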
But is this all? Does anyone see anything more? (Guessing you do...)
For an invertible square matrix $A$, a nonzero vector $x$ is an eigenvector of $A$ with eigenvalue $\lambda$ (necessarily nonzero) if and only if $x$ is an eigenvector of $A^{-1}$ with eigenvalue $\lambda^{-1}$.
So once you have all the eigenvalue and eigenvector information for $A$, you already have it for $A^{-1}$.
– k.stm Sep 25 '13 at 09:08