
Suppose $A$ is a $2\times 2$ matrix with the complex eigenvalues $\lambda = \alpha \pm i \beta$.

I need to show that $A = V \begin{pmatrix} \alpha & \beta \\ - \beta & \alpha \end{pmatrix}V^{-1}$, where $V$ is an invertible matrix with columns $v_{1}$ and $v_{2}$. Then, I need to show that $v_{1}+iv_{2}$ is an eigenvector of $A$ corresponding to $\lambda = \alpha + i\beta$.

First of all, this is for a differential equations course, and it has been a very long time since I've done any serious linear algebra. I think the problem is asking me to show that $A$ is similar to a matrix whose entries are the real and imaginary parts (with signs $\pm$) of its eigenvalues, but I'm not sure how to do that.

Secondly, how do I show that $v_{1}+iv_{2}$ is an eigenvector for the eigenvalue $\alpha + i\beta$ without explicitly knowing what $v_{1}$ and $v_{2}$ are? (Or will I know? I'm very confused.)

I am in a bit over my head with this problem and could really use some guidance. I thank you for your time and patience.

4 Answers


Your result holds only if $(*)$: $\beta\not= 0$. We assume $(*)$ and that $A\in M_2(\mathbb{R})$ with $\operatorname{spectrum}(A)=\{\alpha \pm i\beta\}$. Since its eigenvalues are distinct, $A$ is diagonalizable over $\mathbb{C}$, so it is similar over $\mathbb{C}$ to $\operatorname{diag}(\lambda,\overline{\lambda})$; the same holds for $B=\begin{pmatrix}\alpha&\beta\\-\beta&\alpha\end{pmatrix}$, since $\operatorname{spectrum}(B)=\operatorname{spectrum}(A)$, and therefore $A$ and $B$ are similar over $\mathbb{C}$. Since $(1)$ two real matrices that are similar over $\mathbb{C}$ are also similar over $\mathbb{R}$, there is an invertible $V\in M_2(\mathbb{R})$ s.t. $A=VBV^{-1}$.

EDIT. The proof of $(1)$ is standard. Let $A,B\in M_n(\mathbb{R})$ and $P=P_1+iP_2\in M_n(\mathbb{C})$ (with $P_1,P_2$ real) be s.t. $A=PBP^{-1}$. Then $AP=PB$, and comparing real and imaginary parts (recall that $A$ and $B$ are real) gives $AP_1=P_1B$ and $AP_2=P_2B$; hence, for every $t\in\mathbb{C}$, $A(P_1+tP_2)=(P_1+tP_2)B$. Consider the function $f:t\in \mathbb{C}\rightarrow \det(P_1+tP_2)$; $f$ is a polynomial that is not identically $0$, because $f(i)=\det(P)\not= 0$. A nonzero polynomial has only finitely many roots, so there is $t\in\mathbb{R}$ s.t. $f(t)\not= 0$, and we are done.
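For a concrete illustration of this argument, here is a minimal numerical sketch (the example matrix $A$ and the sampling grid for $t$ are made-up assumptions, not part of the answer):

```python
import numpy as np

# Made-up example: a real 2x2 matrix with eigenvalues 1 ± 2i.
alpha, beta = 1.0, 2.0
A = np.array([[1.0, -4.0],
              [1.0,  1.0]])        # characteristic polynomial (x-1)^2 + 4
B = np.array([[alpha, beta],
              [-beta, alpha]])

# Build a *complex* similarity A = P B P^{-1} by diagonalizing both matrices
# and matching the order of their (identical) eigenvalues.
wa, Va = np.linalg.eig(A)
wb, Vb = np.linalg.eig(B)
order = [int(np.argmin(np.abs(wb - w))) for w in wa]
P = Va @ np.linalg.inv(Vb[:, order])
assert np.allclose(A, P @ B @ np.linalg.inv(P))

# Split P = P1 + i P2 and look for a real t with det(P1 + t P2) != 0;
# the resulting *real* matrix V then satisfies A = V B V^{-1}.
P1, P2 = P.real, P.imag
for t in np.linspace(-3, 3, 61):
    V = P1 + t * P2
    if abs(np.linalg.det(V)) > 1e-8:
        break
assert np.allclose(A, V @ B @ np.linalg.inv(V))
```

The scan over real $t$ mirrors the polynomial argument: $f(t)=\det(P_1+tP_2)$ has only finitely many roots, so almost every real $t$ yields an invertible real $V$.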

  • is there any way to show this without using this result? Or perhaps you could point me to somewhere where I could find said result in its entirety? –  Apr 16 '18 at 12:44
  • If you are OK with the fact that $A$ and $B$ can be diagonalized, then it is obvious that they are similar to the same diagonal matrix, since they share the same eigenvalues. Since similarity is an equivalence relation, it is transitive; thus $A$ and $B$ are similar, which is what you were asked to show. – Bill O'Haran Apr 16 '18 at 13:01
  • @BillO'Haran, by your reasoning $A=VBV^{-1}$ where $V$ is complex, because the eigenvalues are not in $\mathbb{R}$; in fact, we can choose a real matrix $V$. Cf. my edit. –  Apr 16 '18 at 13:06
  • @loupblanc which edit are you referring to? –  Apr 16 '18 at 13:12
  • @BillO'Haran I thought matrices with complex eigenvalues could not be diagonalized. –  Apr 16 '18 at 13:13
  • @BillO'Haran , just now. –  Apr 16 '18 at 13:15
  • @loupblanc possibly a stupid question, but how do you know that there even exists a $P=P_{1}+i P_{2} \in M_{n}(\mathbb{C})$ s.t. $A=PBP^{-1}$? –  Apr 16 '18 at 13:19
  • @loupblanc, I agree, but I fail to see the need to work over $\mathbb{R}$ (even $v_1 + i v_2$ being an eigenvector does not prevent $V$ from being complex). – Bill O'Haran Apr 16 '18 at 13:24
  • @ALannister, as long as $\beta \neq 0$ the two eigenvalues of $A$ are distinct; hence the characteristic polynomial has only simple roots, and $A$ can thus be diagonalized. – Bill O'Haran Apr 16 '18 at 13:24
  • @BillO'Haran can you answer the last follow-up question that I directed at the White Wolf (aka the King in the North)? –  Apr 16 '18 at 13:36
  • @ALannister, sure: $P$ satisfies the definition of similarity between $A$ and $B$, and since $P$ is complex, it can be separated into its real ($P_1$) and imaginary ($P_2$) parts like any complex matrix. Does that answer your question? – Bill O'Haran Apr 16 '18 at 13:42

As loup blanc noticed, your assertions hold only if $\beta \neq 0$. Then you are able to prove $A = V \begin{pmatrix} \alpha & \beta \\ - \beta & \alpha \end{pmatrix}V^{-1}$ by using the spectrum of $A$.

For your second question, let $V=\begin{pmatrix} v_{11} & v_{12} \\ v_{21} & v_{22} \end{pmatrix}$ and look at $AV = V \begin{pmatrix} \alpha & \beta \\ - \beta & \alpha \end{pmatrix}$ column by column since $V= (v_1, v_2)$: $$ Av_1=\begin{pmatrix} \alpha v_{11} - \beta v_{12} \\ \alpha v_{21} - \beta v_{22} \end{pmatrix} $$

$$ Av_2=\begin{pmatrix} \alpha v_{12} + \beta v_{11} \\ \alpha v_{22} + \beta v_{21} \end{pmatrix} $$

Then $A(v_1 + i v_2) = Av_1 + i\,Av_2 = (\alpha v_1 - \beta v_2) + i(\beta v_1 + \alpha v_2) = (\alpha + i\beta)v_1 + i(\alpha + i\beta)v_2 = \lambda (v_1+iv_2)$, so $v_1 + iv_2$ is an eigenvector for $\lambda = \alpha + i\beta$.
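A quick numerical check of this computation (a minimal sketch with made-up values for $\alpha$, $\beta$ and $V$):

```python
import numpy as np

# Made-up data: pick alpha, beta and any invertible V, and set A = V B V^{-1}.
alpha, beta = 1.0, 2.0
B = np.array([[alpha, beta],
              [-beta, alpha]])
V = np.array([[2.0, 1.0],
              [1.0, 1.0]])          # invertible: det = 1
A = V @ B @ np.linalg.inv(V)

# v1 + i v2 is an eigenvector of A for the eigenvalue alpha + i beta.
v1, v2 = V[:, 0], V[:, 1]
w = v1 + 1j * v2
assert np.allclose(A @ w, (alpha + 1j * beta) * w)
```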

Bill O'Haran
  • if $\beta=0$, then it degenerates into the case where we have a real, repeated eigenvalue. What would we do in that case? –  Apr 16 '18 at 20:21
  • Well, to be able to diagonalize $A$, you would have to prove that $\text{Ker}(A-\lambda I)$ has dimension 2. Otherwise, $A$ is triangularizable (as is any complex matrix) but not diagonalizable. – Bill O'Haran Apr 16 '18 at 20:24
  • is that something we can do if we don't know explicitly what $A$ is? –  Apr 16 '18 at 20:31
  • I guess the easiest way would be to check that the rank of $A-\lambda I$ is 0. But other than that, I do not think it is something one can do without further knowledge of $A$. Maybe someone more knowledgeable than me will find some way. – Bill O'Haran Apr 16 '18 at 20:44
  • can you check if the rank of $A-\lambda I$ is $0$ without knowing what $A$ is in this case? –  Apr 16 '18 at 20:46
  • The only matrix of rank $0$ is the matrix with all entries set to $0$, so $\text{rank}(A-\lambda I)=0$ means $A = \lambda I$. In other words, when $\beta = 0$, $A$ is diagonalizable exactly when $A$ is already the scalar matrix $\lambda I$; any other matrix with a repeated eigenvalue is not diagonalizable. – Bill O'Haran Apr 16 '18 at 20:51

Assume that $\beta\ne 0$; otherwise we're dealing with a repeated real eigenvalue, which requires a different analysis. A $2\times2$ matrix with eigenvalues $\alpha\pm i\beta$ has characteristic equation $(\lambda-\alpha)^2+\beta^2=0$. Let $B=A-\alpha I$. By the Cayley-Hamilton theorem, $(A-\alpha I)^2+\beta^2I=0$, therefore $B^2=-\beta^2I$.

Let $\mathbf v_1$ be any nonzero real vector and define $\mathbf v_2 = -\frac1\beta B\mathbf v_1$. Then $$B\mathbf v_2 = -\frac1\beta B^2\mathbf v_1 = \beta\mathbf v_1$$ and by definition $$B\mathbf v_1=-\beta\mathbf v_2.$$

Let $V = \begin{bmatrix}\mathbf v_1&\mathbf v_2\end{bmatrix}$. Then $$BV\begin{bmatrix}1\\0\end{bmatrix} = B\mathbf v_1 = -\beta\mathbf v_2 = -\beta V\begin{bmatrix}0\\1\end{bmatrix}$$ and $$BV\begin{bmatrix}0\\1\end{bmatrix} = B\mathbf v_2 = \beta\mathbf v_1 = \beta V\begin{bmatrix}1\\0\end{bmatrix},$$ therefore $$BV = V\begin{bmatrix}0&\beta\\ -\beta&0\end{bmatrix}.$$

Finally, $$A = \alpha I+B = \alpha I+V\begin{bmatrix}0&\beta\\-\beta&0\end{bmatrix}V^{-1} = V\begin{bmatrix}\alpha&\beta\\-\beta&\alpha\end{bmatrix}V^{-1}.$$

Proving the second part is a matter of multiplying out $A(\mathbf v_1+i\mathbf v_2) = (\alpha I+B)(\mathbf v_1+i\mathbf v_2)$ for this choice of $\mathbf v_1$ and $\mathbf v_2$ and doing a bit of straightforward algebra.
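Here is a minimal numerical sketch of the whole construction (the example matrix $A$ and the choice of $\mathbf v_1$ are made-up assumptions; any real $2\times2$ matrix with non-real eigenvalues and any nonzero real vector should do):

```python
import numpy as np

# Made-up example matrix with eigenvalues alpha ± i beta = 1 ± 2i.
A = np.array([[1.0, -4.0],
              [1.0,  1.0]])
alpha, beta = 1.0, 2.0

B = A - alpha * np.eye(2)
assert np.allclose(B @ B, -beta**2 * np.eye(2))    # Cayley-Hamilton: B^2 = -beta^2 I

v1 = np.array([1.0, 0.0])          # any nonzero real vector
v2 = -(B @ v1) / beta
V = np.column_stack([v1, v2])

R = np.array([[alpha, beta],
              [-beta, alpha]])
assert np.allclose(A, V @ R @ np.linalg.inv(V))    # first part: A = V R V^{-1}

w = v1 + 1j * v2                   # second part: eigenvector for alpha + i beta
assert np.allclose(A @ w, (alpha + 1j * beta) * w)
```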

amd
  • might be a silly thing to ask, but we know that $V$ is invertible because? –  Apr 17 '18 at 15:54
  • @ALannister It’s a fair question. $\mathbf v_1$ and $\mathbf v_2$ are linearly independent because $A$, and so also $B$, has no real eigenvalues. – amd Apr 17 '18 at 16:14

You can use the following identity, in which $A$ and $B$ stand for $\alpha$ and $\beta$ and the two outer factors are inverses of each other: $$ \begin{pmatrix} 1 & i \\ 1 & -i \end{pmatrix} \begin{pmatrix} A & -B \\ B & A \end{pmatrix} \begin{pmatrix} \frac12 & \frac12 \\-\frac i2& \frac i2\end{pmatrix} = \begin{pmatrix} A+Bi &\\ &A-Bi \end{pmatrix} $$
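A quick numerical check of this identity (a minimal sketch with made-up values $A=\alpha=1$, $B=\beta=2$):

```python
import numpy as np

# The two outer factors and the middle rotation-scaling matrix from the identity.
alpha, beta = 1.0, 2.0
P = np.array([[1, 1j],
              [1, -1j]])
Q = np.array([[0.5, 0.5],
              [-0.5j, 0.5j]])
M = np.array([[alpha, -beta],
              [beta, alpha]])

assert np.allclose(P @ Q, np.eye(2))   # the outer factors are mutually inverse
assert np.allclose(P @ M @ Q, np.diag([alpha + 1j * beta, alpha - 1j * beta]))
```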

percusse