
I have the matrix $$A =\begin{pmatrix} 0 & 1 \\ - 1 & 0 \end{pmatrix}$$

and I have to find $e^A$. I've found two complex-conjugate eigenvalues $\lambda_{1,2} = \pm i$, so subtracting $\lambda_1 = i$ from the matrix's diagonal I got:

$$A_1 = \begin{pmatrix} -i & 1 \\-1 & i \end{pmatrix}$$

and therefore, to find an eigenvector, I have to solve the system:

$$A_1 \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} -i & 1 \\-1 & i \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}$$

so the first eigenvector is $h_1 = \begin{pmatrix}1 \\ i\end{pmatrix}$ and the second is $h_2 = \begin{pmatrix}1 \\ -i\end{pmatrix}$, giving the general solution $$x(t) = C_1e^{it}\begin{pmatrix} 1 \\i\end{pmatrix} + C_2e^{-it}\begin{pmatrix} 1 \\-i\end{pmatrix}$$

I know that now I have to solve two Cauchy problems for the standard basis of $\mathbb{R}^2$, with vectors $v_1 = (1, 0)$ and $v_2 = (0,1)$, but I do not know how to handle the complex numbers.
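For reference, the Cauchy problem for $v_1$ can be set up in SymPy (a sketch using the eigenvectors found above; solving for $C_1$, $C_2$ shows the imaginary parts cancel):

```python
from sympy import Matrix, I, symbols, solve, exp, cos, sin, simplify

# Impose x(0) = v1 = (1, 0) on x(t) = C1*exp(i t)*h1 + C2*exp(-i t)*h2
C1, C2 = symbols('C1 C2')
t = symbols('t', real=True)
h1 = Matrix([1, I])
h2 = Matrix([1, -I])

eqs = list(C1 * h1 + C2 * h2 - Matrix([1, 0]))  # initial condition x(0) = (1, 0)
sol = solve(eqs, [C1, C2])                      # both constants come out as 1/2

x = sol[C1] * exp(I * t) * h1 + sol[C2] * exp(-I * t) * h2
# The imaginary parts cancel: x(t) = (cos t, -sin t),
# which is exactly the first column of e^{tA}.
```

The same procedure with $x(0) = v_2$ gives the second column, so the complex constants never survive into the final answer.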

M.Mass

  • I recommend taking the more general $$B =\begin{pmatrix} 0 & x \\ -x & 0 \end{pmatrix}$$ and carefully writing out the four entries in $$ I + B + \frac{B^2}{2} + \frac{B^3}{6} + \frac{B^4}{24} + \frac{B^5}{120} $$ to see if the entries remind you of anything. – Will Jagy Dec 05 '17 at 18:15
  • To add to Will Jagy's hint, $A^2=-I$. – Batominovski Dec 05 '17 at 18:16
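Will Jagy's hint is easy to experiment with in SymPy (a sketch; the truncation at $B^5/5!$ follows the comment):

```python
from sympy import Matrix, Symbol, eye, factorial

x = Symbol('x')
B = Matrix([[0, x], [-x, 0]])

# Partial sum I + B + B^2/2! + ... + B^5/5!
S = eye(2)
term = eye(2)
for k in range(1, 6):
    term = term * B
    S = S + term / factorial(k)

# The diagonal entries are the Taylor partial sums of cos(x);
# the off-diagonal entries are those of sin(x) and -sin(x).
print(S)
```

The printed entries, $1 - x^2/2 + x^4/24$ and $x - x^3/6 + x^5/120$, are the leading terms of $\cos x$ and $\sin x$.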

5 Answers


If the matrix has real coefficients, then its complex eigenvalues come in conjugate pairs; that is, if $\lambda$ is an eigenvalue of $A$, then so is $\bar\lambda$. For a $2\times 2$ matrix this means that there exists some change of basis $S$ such that $SAS^{-1}=\left[\begin{smallmatrix}\alpha&-\beta\\\beta&\alpha\end{smallmatrix}\right]$, where $\lambda=\alpha+i\beta$.

The matrices $B=\left[\begin{smallmatrix}\alpha&0\\0&\alpha\end{smallmatrix}\right]$ and $C=\left[\begin{smallmatrix}0&-\beta\\\beta&0\end{smallmatrix}\right]$ commute, so, from the definition of the exponential map, it can be checked that

$$e^A=S^{-1}e^{SAS^{-1}}S=S^{-1}e^{B+C}S=S^{-1}e^{\alpha I}\cdot e^CS=e^\alpha S^{-1}\begin{bmatrix}\cos\beta&-\sin\beta\\\sin\beta&\cos\beta\end{bmatrix}S$$


This result can be extended to find what is called the extended Jordan normal form of any real-valued matrix, which makes the computation of its exponential easy.


ADDITION: in the simple case $A=\left[\begin{smallmatrix}0&1\\-1&0\end{smallmatrix}\right]$ we can use the definition of the exponential map directly, noticing that $A^2=-I$, $A^3=-A$, $A^4=I$ and $A^5=A$. Thus

$$\begin{align}e^A:&=\sum_{k=0}^\infty\frac{A^k}{k!}\\&=\sum_{k=0}^\infty (-1)^k\frac{I}{(2k)!}+\sum_{k=0}^\infty(-1)^k\frac{A}{(2k+1)!}\\&=\begin{bmatrix}\sum_{k=0}^\infty(-1)^k\frac1{(2k)!}&0\\0&\sum_{k=0}^\infty(-1)^k\frac1{(2k)!}\end{bmatrix}+\begin{bmatrix}0&\sum_{k=0}^\infty(-1)^k\frac1{(2k+1)!}\\\sum_{k=0}^\infty(-1)^k\frac{-1}{(2k+1)!}&0\end{bmatrix}\\&=\begin{bmatrix}\cos 1&0\\0&\cos 1\end{bmatrix}+\begin{bmatrix}0&\sin 1\\\sin(-1)&0\end{bmatrix}\\&=\begin{bmatrix}\cos 1&\sin 1\\-\sin 1&\cos 1\end{bmatrix}\end{align}$$
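This closed form can be double-checked with SymPy, whose `Matrix.exp` computes the matrix exponential (a quick numerical sanity check):

```python
from sympy import Matrix, cos, sin

A = Matrix([[0, 1], [-1, 0]])
expA = A.exp()  # SymPy's matrix exponential
expected = Matrix([[cos(1), sin(1)], [-sin(1), cos(1)]])

# The entries agree to floating-point precision
assert (expA - expected).norm().evalf() < 1e-12
```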

  • Performing a complete Jordan decomposition is unnecessary in the general case. If the eigenvalues are $\alpha\pm i\beta$, then the traceless matrix $B=A-\alpha I$ satisfies $B^2=-\beta^2I$, so decomposing $A$ as $\alpha I+B$ allows a simple direct computation of $e^{tA}$. – amd Dec 05 '17 at 21:54
  • @amd but $\alpha I+B$ is an extended Jordan matrix... –  Dec 05 '17 at 22:02
  • $\tiny{\begin{bmatrix}0&-2\\1&2\end{bmatrix}}$, for instance, is not an extended Jordan matrix, nor is the corresponding $B=\tiny{\begin{bmatrix}-1&-2\\1&1\end{bmatrix}}$. My point is that this decomposition allows a straightforward calculation of the exponential of such a matrix without having to first find the change-of-basis matrix $S$. In particular, a similar computation to the one you have above gives $\exp(tB)=\cos(\beta t)\,I+[(\sin \beta t)/\beta]\,B$ and $\exp(tA)=\exp(\alpha t)\exp(tB)$. – amd Dec 05 '17 at 22:34
  • @amd oh, I see. Thank you for the explanation, I was not getting the point of your previous comment. –  Dec 05 '17 at 22:59
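amd's decomposition can be verified on the example matrix from the comment (a sketch in SymPy, checking at $t = 1$; here $\alpha = \beta = 1$):

```python
from sympy import Matrix, eye, exp, cos, sin

# amd's example: eigenvalues 1 +/- i, so alpha = beta = 1 and B = A - I
A = Matrix([[0, -2], [1, 2]])
B = A - eye(2)
assert B**2 == -eye(2)  # B^2 = -beta^2 I with beta = 1

t = 1  # checking at a single time; any value works the same way
lhs = (t * A).exp()
rhs = exp(t) * (cos(t) * eye(2) + sin(t) * B)
assert (lhs - rhs).norm().evalf() < 1e-9
```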

Let$$T=\begin{pmatrix}1&1\\i&-i\end{pmatrix};$$its columns are the eigenvectors that you found. Then$$T^{-1}.A.T=\begin{pmatrix}i&0\\0&-i\end{pmatrix}.$$Therefore,$$T^{-1}.A^n.T=\begin{pmatrix}i&0\\0&-i\end{pmatrix}^n=\begin{pmatrix}i^n&0\\0&(-i)^n\end{pmatrix}$$and so$$T^{-1}.e^A.T=\sum_{n=0}^\infty\frac1{n!}\begin{pmatrix}i^n&0\\0&(-i)^n\end{pmatrix}=\begin{pmatrix}e^i&0\\0&e^{-i}\end{pmatrix}.$$Therefore,\begin{align}e^A&=T.\begin{pmatrix}e^i&0\\0&e^{-i}\end{pmatrix}.T^{-1}\\&=\begin{pmatrix}\frac{e^i+e^{-i}}2&\frac{e^i-e^{-i}}{2i}\\-\frac{e^i-e^{-i}}{2i}&\frac{e^i+e^{-i}}2\end{pmatrix}\\&=\begin{pmatrix}\cos 1&\sin 1\\-\sin1&\cos1\end{pmatrix}.\end{align}
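The same diagonalization goes through in SymPy (a sketch, using the matrix $T$ defined above):

```python
from sympy import Matrix, I, diag, exp, cos, sin

A = Matrix([[0, 1], [-1, 0]])
T = Matrix([[1, 1], [I, -I]])

# T^{-1} A T is diagonal, with the eigenvalues on the diagonal
assert T.inv() * A * T == diag(I, -I)

# e^A = T diag(e^i, e^{-i}) T^{-1}
expA = T * diag(exp(I), exp(-I)) * T.inv()
expected = Matrix([[cos(1), sin(1)], [-sin(1), cos(1)]])
assert (expA - expected).norm().evalf() < 1e-12
```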


Answer

Let $a, b \in \mathbb{R}$ and $$ \begin{align} I &= \begin{pmatrix} 1 & 0 \\ 0 & 1 \\ \end{pmatrix} \\ J &= \begin{pmatrix} 0 & 1 \\ -1 & 0 \\ \end{pmatrix} \end{align} $$

Then the following formula is true,

$$ \exp(aI + bJ) = \exp(a)[I \cos(b) + J \sin(b)]$$

In the special case where $a = 0$ and $b = 1$,

$$ \exp(J) = \begin{pmatrix} \cos(1) & \sin(1) \\ -\sin(1) & \cos(1) \\ \end{pmatrix} $$


Proof

We will need the following lemmas to go through the chain of equalities; they will not be proved or explained here.

Let $n \in \mathbb{N}$ and let $A$ and $B$ be two matrices.

  1. The matrices $aI$ and $bJ$ commute
  2. $\exp(A + B) = \exp(A)\exp(B)$ if $AB = BA$
  3. $J^{2n} = (-1)^n I$
  4. $J^{2n+1} = (-1)^n J$

Therefore,

$$ \begin{align} \exp(aI + bJ) &= \exp(aI)\exp(bJ) \\ &= \exp(a)[I + bJ + (bJ)^2/2! + (bJ)^3/3! + \cdots] \\ &= \exp(a)[I(1 - b^2/2! + b^4/4! - \cdots) + J(b - b^3/3! + b^5/5! - \cdots)] \\ &= \exp(a)[I\cos(b) + J \sin(b)] \end{align} $$
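The formula can be spot-checked with SymPy (a sketch; the values $1/2$ and $13/10$ are arbitrary choices of $a$ and $b$):

```python
from sympy import Matrix, Rational, eye, exp, cos, sin

a, b = Rational(1, 2), Rational(13, 10)  # arbitrary test values
J = Matrix([[0, 1], [-1, 0]])

lhs = (a * eye(2) + b * J).exp()
rhs = exp(a) * (eye(2) * cos(b) + J * sin(b))
assert (lhs - rhs).norm().evalf() < 1e-9
```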


Note that ${\bf Q} := \frac1i{\bf A}$ is involutory. Using Euler's formula,

$$ \exp(t {\bf A}) = \exp(i t {\bf Q}) = \cos (t {\bf Q}) + i \sin (t {\bf Q}) = \cos (t) \, {\bf I}_2 + i \sin (t) \, {\bf Q} = \color{blue}{\cos(t) \, {\bf I}_2 + \sin (t) \, {\bf A}} $$

Verifying using SymPy:

>>> from sympy import * 
>>> t = Symbol('t', real=True)
>>> A = Matrix([[ 0, 1],
                [-1, 0]])
>>> exp(t * A)
Matrix([[ cos(t), sin(t)],
        [-sin(t), cos(t)]])

Related: Calculating matrix exponential


Like here: For any $a \in \mathbb{R}$ evaluate $ \lim\limits_{n \to \infty}\left(\begin{smallmatrix} 1&\frac{a}{n}\\\frac{-a}{n}&1\end{smallmatrix}\right)^{n}$, employ the identification $$ 1 \equiv \left(\begin{matrix} 1& 0\\0&1 \end{matrix}\right)\quad\text{and}\quad i \equiv \left(\begin{matrix} 0& 1\\-1&0 \end{matrix}\right)=A.$$

Then we get $$\color{red}{A^n = i^n \quad\text{and}\quad e^A =\sum_{n=0}^{\infty} \frac{i^n}{n!} = e^i = \cos 1+i \sin 1 = \begin{pmatrix}\cos 1&\sin 1\\-\sin1&\cos1\end{pmatrix}.}$$
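The identification above preserves multiplication, which a few lines of plain Python can illustrate (a sketch; `to_matrix` and `matmul` are hypothetical helper names):

```python
def to_matrix(z):
    # Represent p + q*i as p*I + q*A = [[p, q], [-q, p]]
    p, q = z.real, z.imag
    return [[p, q], [-q, p]]

def matmul(X, Y):
    # Plain 2x2 matrix product
    return [[sum(X[r][k] * Y[k][c] for k in range(2)) for c in range(2)]
            for r in range(2)]

# i maps to A itself, and products of complex numbers map to
# products of matrices, so A^n really does correspond to i^n.
z, w = 2 + 3j, -1 + 0.5j
assert to_matrix(1j) == [[0, 1], [-1, 0]]
assert to_matrix(z * w) == matmul(to_matrix(z), to_matrix(w))
```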

Guy Fsone