
How do I determine the basis transformation and all the explicit eigenvalues $\lambda$ for the matrix (let's call it $A$) below?

$$A=\begin{pmatrix}1&a&0\\ a&1&0\\ 0&0&b\end{pmatrix}$$

I know that we first have to find the characteristic polynomial from $$\det\begin{pmatrix}1-\lambda&a&0\\ a&1-\lambda&0\\ 0&0&b-\lambda\end{pmatrix}$$

and then calculate the eigenvalues and eigenvectors, but I'm currently stuck at determining the eigenvalues. My result for the characteristic polynomial is $-\lambda^3+\lambda^2\left(2+b\right)+\lambda(-2b-1+a^2)-b(a^2-1)$. A cubic should have three roots, so three possible values of $\lambda$, but when I try to factorize my characteristic polynomial I get $\lambda\bigl[-(\lambda-b)(\lambda-1)+\lambda-b+a^2-1\bigr]$. So are the roots $(0,b,1)$? But then $b=\dots$? And how do I turn the eigenvectors into a basis transformation? Do I just do it like a normal basis transformation in a real vector space?

  • Note that $b$ is a root of the polynomial that you have. This is no accident: from the matrix it is clear that $(0, 0, 1)$ is an eigenvector with eigenvalue $b$. After you factor out $(\lambda-b)$, you are left with a degree-two polynomial which you can easily find the roots of. – caffeinemachine May 26 '20 at 23:26
  • This matrix is block diagonal. Its eigenvalues are the eigenvalues of the blocks. In this case one eigenvalue is $b$ (from the $1\times 1$ block) and the other two are the solutions of the characteristic equation of the $2 \times 2$ block,$$\det\begin{pmatrix}1 - \lambda & a \\ a & 1-\lambda\end{pmatrix} = 0.$$ –  May 26 '20 at 23:28
  • You should show your work, especially if you’re ending up with a nonsensical result. You might just be making a simple algebraic error, but we can’t tell you where the error is without seeing what it is you’ve done. – amd May 26 '20 at 23:44
  • All of the eigenvalues and associated eigenvectors of this matrix can be found by inspection. As has already been noted by @caffeinemachine, $(0,0,1)$ is an eigenvector with eigenvalue $b$. If you add the first two columns, you get $(a+1,a+1,0)$, so $(1,1,0)$ is another eigenvector with eigenvalue $a+1$. The third eigenvalue comes “for free” from the trace, and its eigenvector is orthogonal to the other two. – amd May 26 '20 at 23:48
  • @caffeinemachine How did you observe that the roots are $(0,0,1)$? My final factorization from the above characteristic polynomial is $\lambda\bigl[-(\lambda-b)(\lambda-1)+\lambda-b+a^2-1\bigr]$. – Florence Wong May 27 '20 at 04:36
  • @amd Okay, I've edited it. – Florence Wong May 27 '20 at 04:43
  • The roots aren’t $0$ and $1$. $(0,0,1)$ is an eigenvector of this matrix. – amd May 27 '20 at 04:45
  • Does it make sense for $0$ to always be an eigenvalue of this matrix? It has full rank for most values of $a$ and $b$. How did you end up with a factor of $\lambda$ for your polynomial when its constant term is nonzero? – amd May 27 '20 at 04:49
  • @FlorenceWong Look at amd's answer. I cannot explain it better. – caffeinemachine May 27 '20 at 05:26

1 Answer


You’ve computed the characteristic polynomial correctly, but appear to be getting lost trying to factor it. The constant term is $b(a^2-1)$, which is nonzero for most values of $a$ and $b$, so $\lambda$ can’t be a factor. This goes hand in hand with the rank of the matrix: it’s obviously full-rank unless $a=\pm1$ or $b=0$, so zero can’t always be an eigenvalue.
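
As a quick sanity check of that claim in plain Python (the sample values of $a$ and $b$ are arbitrary, chosen only for illustration): the constant term of the characteristic polynomial equals $p(0)=\det A=b(1-a^2)$, which vanishes only when $b=0$ or $a=\pm1$.

```python
# Determinant of a 3x3 matrix by cofactor expansion along the first row.
def det3(M):
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
            - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
            + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

# det A = b(1 - a^2): zero exactly when b = 0 or a = ±1.
for a, b in [(2.0, 5.0), (1.0, 5.0), (2.0, 0.0), (-1.0, 3.0)]:
    A = [[1, a, 0], [a, 1, 0], [0, 0, b]]
    assert det3(A) == b * (1 - a**2)
```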

You can make much less work for yourself by expanding the determinant along the last row or column. Two of the entries are zero, so your characteristic polynomial will already be partially factored. To wit, we get $(b-\lambda)((1-\lambda)^2-a^2)$. The second factor is a difference of two squares, which I assume that you know how to factor, so we get $(b-\lambda)(1-\lambda+a)(1-\lambda-a)$ with hardly any effort. The eigenvalues of the matrix therefore are $b$, $1+a$ and $1-a$. Compute corresponding eigenvectors in the usual way, taking care with the variables $a$ and $b$ in the matrix if you’re using Gaussian elimination.
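
As a further sanity check (plain Python; the sample values of $a$, $b$ and $\lambda$ are arbitrary), the expanded cubic from the question agrees with this factored form, and $b$, $1+a$ and $1-a$ are indeed its roots:

```python
# Expanded characteristic polynomial, as computed in the question.
def p_expanded(l, a, b):
    return -l**3 + l**2 * (2 + b) + l * (a**2 - 2*b - 1) - b * (a**2 - 1)

# Factored form obtained by expanding the determinant along the last row.
def p_factored(l, a, b):
    return (b - l) * (1 + a - l) * (1 - a - l)

a, b = 2.0, 5.0
for l in [-1.0, 0.0, 0.5, 3.0, 7.25]:
    assert p_expanded(l, a, b) == p_factored(l, a, b)

# The eigenvalues are exactly the roots of the polynomial.
for root in (b, 1 + a, 1 - a):
    assert p_expanded(root, a, b) == 0
```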

As I noted in a comment, eigenvectors and eigenvalues of this matrix can be found pretty much by inspection. Recall that the columns of the matrix are the images of the standard basis. The last column is a multiple of $(0,0,1)^T$, so that gives you one eigenvector with eigenvalue $b$.

Next, recall that when you multiply a vector by this matrix, the result is a linear combination of its columns. Observe that the sum of the first two columns is $(1+a,1+a,0)^T$. Summing the first two columns is equivalent to multiplying the matrix by $(1,1,0)^T$, so you have another eigenvector, with eigenvalue $1+a$.

You can always get the last eigenvalue “for free,” since the trace is equal to the sum of the eigenvalues. Here, this gives us $(1+1+b)-b-(1+a) = 1-a$ for the last eigenvalue. The matrix is symmetric, so eigenvectors that have different eigenvalues are orthogonal. Assuming that the three eigenvalues are distinct, that means we can take the cross product of the two other eigenvectors found so far to get a third linearly-independent eigenvector. Alternatively, observe that subtracting the second column from the first results in a multiple of $(1,-1,0)^T$, so that’s another eigenvector.
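
The three by-inspection eigenpairs can be verified directly (a plain-Python sketch; the sample values of $a$ and $b$ are arbitrary):

```python
# Multiply a 3x3 matrix by a vector.
def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

a, b = 2.0, 5.0  # arbitrary sample parameters
A = [[1, a, 0],
     [a, 1, 0],
     [0, 0, b]]

# (eigenvector, eigenvalue) pairs read off the matrix by inspection:
pairs = [([0, 0, 1], b),       # last column is b times e3
         ([1, 1, 0], 1 + a),   # sum of the first two columns
         ([1, -1, 0], 1 - a)]  # difference of the first two columns

for v, lam in pairs:
    assert matvec(A, v) == [lam * x for x in v]
```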

A curious thing about this family of matrices is that even though the eigenvalues depend on $a$ and $b$, and in some cases you’ll even end up with repeated eigenvalues, these matrices have a common orthogonal eigenbasis that’s independent of these parameters.
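
That last observation can also be checked directly: with the fixed eigenvectors above as the columns of $P$, the product $P^{-1}AP$ comes out diagonal for every choice of the parameters (a plain-Python sketch with a few arbitrary sample values, including a repeated-eigenvalue case):

```python
# Multiply two 3x3 matrices.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Fixed change-of-basis matrix: the parameter-independent eigenvectors
# as columns, with its inverse written out by hand.
P = [[1, 1, 0], [1, -1, 0], [0, 0, 1]]
Pinv = [[0.5, 0.5, 0], [0.5, -0.5, 0], [0, 0, 1]]

def check(a, b):
    A = [[1, a, 0], [a, 1, 0], [0, 0, b]]
    D = matmul(Pinv, matmul(A, P))
    # The same basis diagonalizes A regardless of a and b.
    assert D == [[1 + a, 0, 0], [0, 1 - a, 0], [0, 0, b]]

# (0.0, 1.0) gives a triple repeated eigenvalue; it still works.
for a, b in [(0.0, 1.0), (2.0, 5.0), (-3.0, 0.0)]:
    check(a, b)
```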

amd