7

I am currently trying to solve the following problem from the 2020 Tokyo entrance exam (math department):

第一問

正方行列$A,B$を$$A=\begin{pmatrix}1&\sqrt2&0\\\sqrt2&1&\sqrt2\\0&\sqrt2&1\end{pmatrix}\quad B=\begin{pmatrix}0&-2/3&1/3\\2/3&0&-2/3\\-1/3&2/3&0\end{pmatrix}$$とする。また、行列$I$は単位行列とする。実正方行列$X$に対して、$\exp(X)$を$$\exp(X)=\sum_{k=0}^\infty\dfrac1{k!}X^k=I+X+\dfrac1{2!}X^2+\dfrac1{3!}X^3+\cdots$$と定義するとき、以下の問いに答えよ。
...
(4) $\alpha$を実数とするとき、$\exp(\alpha B)$が次式のように表せることを示せ。$$\exp(\alpha B)=I+(\sin\alpha)B+(1-\cos\alpha)B^2$$ただし、ケーリー・ハミルトンの定理を用いてもよい。
...

In English:

Question 1

Let $A,B$ be the square matrices$$A=\begin{pmatrix}1&\sqrt2&0\\\sqrt2&1&\sqrt2\\0&\sqrt2&1\end{pmatrix}\quad B=\begin{pmatrix}0&-2/3&1/3\\2/3&0&-2/3\\-1/3&2/3&0\end{pmatrix}$$and let $I$ be the identity matrix. For a real square matrix $X$, define $\exp(X)$ as$$\exp(X)=\sum_{k=0}^\infty\dfrac1{k!}X^k=I+X+\dfrac1{2!}X^2+\dfrac1{3!}X^3+\cdots$$Using this definition, answer the following questions.
...
(4) Show that $\exp(\alpha B)$ can be expressed by the following equation, where $\alpha$ is a real number:$$\exp(\alpha B)=I+(\sin\alpha)B+(1-\cos\alpha)B^2$$You may use the Cayley-Hamilton theorem.
...

My thinking (for $\alpha=1$):

Aside from noting that the equation trivially holds when $\alpha=0$, I know that $B$ has characteristic polynomial$$-\lambda^3-\lambda$$with eigenvalues$$\begin{align}\lambda_1=&\;i\\\lambda_2=&\;-i\\\lambda_3=&\;0\end{align}$$These have corresponding eigenvectors$$\begin{align}v_1=&\;\begin{pmatrix}(-4-3i)/5\\(-2+6i)/5\\1\end{pmatrix}\\v_2=&\;\begin{pmatrix}(-4+3i)/5\\(-2-6i)/5\\1\end{pmatrix}\\v_3=&\;\begin{pmatrix}1\\1/2\\1\end{pmatrix}\end{align}$$

Now, writing $B=SJS^{-1}$, we have$$\begin{align}S=&\;\begin{pmatrix}1&\frac{-4+3i}5&\frac{-4-3i}5\\\frac12&\frac{-2-6i}5&\frac{-2+6i}5\\1&1&1\end{pmatrix}\\J=&\;\begin{pmatrix}0&0&0\\0&-i&0\\0&0&i\end{pmatrix}\\S^{-1}=&\;\begin{pmatrix}\frac49&\frac29&\frac49\\-\frac29-\frac i6&-\frac19+\frac i3&\frac5{18}\\-\frac29+\frac i6&-\frac19-\frac i3&\frac5{18}\end{pmatrix}\end{align}$$Taking $e^B=e^{SJS^{-1}}$ and using the identity$$e^{YXY^{-1}}=Ye^XY^{-1},$$which holds whenever $Y$ is invertible, we find$$\begin{align}e^B=&\;e^{SJS^{-1}}\\=&\;Se^JS^{-1}\\=&\;S\begin{pmatrix}e^0&0&0\\0&e^{-i}&0\\0&0&e^i\end{pmatrix}S^{-1}\\=&\;S\begin{pmatrix}1&0&0\\0&\cos1-i\sin1&0\\0&0&\cos1+i\sin1\end{pmatrix}S^{-1}\end{align}$$This simplifies to the matrix$$\exp(B)=S\exp(J)S^{-1}\\=\begin{pmatrix}1-\frac59(1-\cos1)&\frac23(-\sin1+\frac13(1-\cos1))&\frac13(\sin1+\frac43(1-\cos1))\\\frac23(\sin1+\frac13(1-\cos1))&1-\frac89(1-\cos1)&\frac23(\frac13(1-\cos1)-\sin1)\\\frac13(\frac43(1-\cos1)-\sin1)&\frac23(\sin1+\frac13(1-\cos1))&1-\frac59(1-\cos1)\end{pmatrix}$$which we can see is equal to $I+(\sin1)B+(1-\cos1)B^2$ (I verified this by computing both sides by hand). So that verifies the case $\alpha=1$.
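For reference, here is a quick numerical sanity check of the $\alpha=1$ computation (a minimal sketch using numpy/scipy; the variable names are my own):

```python
# Numerical check that exp(B) = I + sin(1) B + (1 - cos(1)) B^2.
import numpy as np
from scipy.linalg import expm

B = np.array([[ 0.0, -2/3,  1/3],
              [ 2/3,  0.0, -2/3],
              [-1/3,  2/3,  0.0]])

lhs = expm(B)  # matrix exponential computed by scipy
rhs = np.eye(3) + np.sin(1) * B + (1 - np.cos(1)) * (B @ B)
print(np.allclose(lhs, rhs))  # True
```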


However, my question is: how can I verify that this is the case for all $\alpha\in\mathbb R$? Also, how could I apply the Cayley-Hamilton theorem here?

CrSb0001
  • 2,820
  • Is there anything we're supposed to use in the 3 parts you omitted? (Also, wow, uni entrance test requiring Cayley-Hamilton) – Benjamin Wang Dec 03 '24 at 21:36
  • @BenjaminWang No, those are previous questions unrelated to (4). And actually I also omitted (5) for the same reason. – CrSb0001 Dec 03 '24 at 21:38
  • 1
    Hint: use the characteristic polynomial and Cayley-Hamilton to write a relation involving B, then use this to evaluate the power series – mikefallopian Dec 03 '24 at 21:41
  • Pretty much exactly the same works for any $\alpha.$ Scalars commute with everything. $\exp(J)$ becomes $\exp(\alpha J).$ – Thomas Andrews Dec 03 '24 at 21:45
  • https://en.wikipedia.org/wiki/Rodrigues%27_rotation_formula – Will Jagy Dec 03 '24 at 21:54
  • @WillJagy Oh, I get it. It's just what this question is asking with $\mathbf R=\exp(B)$ and $\theta=\alpha$, right? – CrSb0001 Dec 03 '24 at 21:56
  • right. Given a square matrix $M$ and a function $f(x)$ given by a power series that converges everywhere, the recurrence coming from the characteristic polynomial can be used to write $f(M)$ in terms of $I, M, M^2,\dots,M^{n-1}$ – Will Jagy Dec 03 '24 at 22:01

2 Answers

4

For your question, all you really need is the identity $B^3=-B$ ($B$ is skew-symmetric, and the identity follows by direct computation). Doing some simple matrix multiplications we find that $$ B^2=\begin{pmatrix}0&-\frac{2}{3}&\frac{1}{3}\\\frac{2}{3}&0&-\frac{2}{3}\\-\frac{1}{3}&\frac{2}{3}&0\end{pmatrix}\begin{pmatrix}0&-\frac{2}{3}&\frac{1}{3}\\\frac{2}{3}&0&-\frac{2}{3}\\-\frac{1}{3}&\frac{2}{3}&0\end{pmatrix} = \begin{pmatrix}- \frac{5}{9} & \frac{2}{9} & \frac{4}{9}\\\frac{2}{9} & - \frac{8}{9} & \frac{2}{9}\\\frac{4}{9} & \frac{2}{9} & - \frac{5}{9}\end{pmatrix}$$ and $$ B^3=\begin{pmatrix}- \frac{5}{9} & \frac{2}{9} & \frac{4}{9}\\\frac{2}{9} & - \frac{8}{9} & \frac{2}{9}\\\frac{4}{9} & \frac{2}{9} & - \frac{5}{9}\end{pmatrix}\begin{pmatrix}0&-\frac{2}{3}&\frac{1}{3}\\\frac{2}{3}&0&-\frac{2}{3}\\-\frac{1}{3}&\frac{2}{3}&0\end{pmatrix}= \begin{pmatrix}0 & \frac{2}{3} & - \frac{1}{3}\\- \frac{2}{3} & 0 & \frac{2}{3}\\\frac{1}{3} & - \frac{2}{3} & 0\end{pmatrix}=-B. $$ We therefore have $$ B^4=-B^2, \; B^5=-B^3=B, \; B^6=B^2, \; B^7=-B, \; \text{etc.} $$

So, substituting these powers into the defining series for $\exp(\alpha B)$, $$ \exp(\alpha B)=I+\alpha B +\frac{1}{2!} \alpha^2 B^2 - \frac{1}{3!} \alpha^3 B - \frac{1}{4!} \alpha^4 B^2 + \frac{1}{5!} \alpha^5 B+ \ldots. $$ Separate the $B$ and $B^2$ terms (the regrouping is justified since both grouped series converge absolutely) to write $$ \exp(\alpha B)= I+\left(\alpha - \frac{1}{3!} \alpha^3 + \frac{1}{5!} \alpha^5 -\ldots \right)B +\left( \frac{1}{2!} \alpha^2- \frac{1}{4!} \alpha^4 +\ldots\right)B^2. $$ Use the Maclaurin series \begin{align} \cos\alpha & = 1 - \frac{\alpha^2}{2!} + \frac{\alpha^4}{4!} - \ldots \\ \sin\alpha & = \alpha -\frac{\alpha^3}{3!} + \frac{\alpha^5}{5!} - \ldots \end{align} to get $$ \exp(\alpha B)= I + (\sin\alpha )B + (1-\cos\alpha)B^2. $$
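If you want to double-check the key identity symbolically, here is a minimal sympy sketch (symbol names are illustrative); it recovers the characteristic polynomial and confirms $B^3=-B$ exactly, which is where the Cayley-Hamilton theorem enters:

```python
# Symbolic check of the identity B^3 = -B that the series rearrangement uses.
import sympy as sp

B = sp.Matrix([[0, sp.Rational(-2, 3), sp.Rational(1, 3)],
               [sp.Rational(2, 3), 0, sp.Rational(-2, 3)],
               [sp.Rational(-1, 3), sp.Rational(2, 3), 0]])

lam = sp.symbols('lambda')
print(B.charpoly(lam).as_expr())  # lambda**3 + lambda (monic convention)
print(B**3 == -B)                 # True, exactly as Cayley-Hamilton predicts
```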

Ted Black
  • 1,639
2

Here's an alternative proof. It uses the fact that $B^3 = -B$, which is proven in the other answer, so I won't repeat the proof here.

Define the matrix-valued functions $$ f(\alpha) = \exp(\alpha B) $$ and $$ g(\alpha) = I + (\sin\alpha)B + (1-\cos\alpha)B^2. $$ Clearly, $f(0) = I$. Furthermore, $f'(\alpha) = B\exp(\alpha B) = B f(\alpha)$. We also have $g(0) = I$. The derivative of $g(\alpha)$ is: \begin{align} g'(\alpha) &= (\cos\alpha)B + (\sin\alpha)B^2\\ &= B\left[(\sin\alpha)B + (\cos\alpha)I\right]\\ &= B\left[g(\alpha) - (1-\cos\alpha)(I + B^2)\right]\\ &= B g(\alpha) - (1-\cos\alpha)(B + B^3)\\ &= B g(\alpha) \end{align} (In the last step above, we have used the fact that $B + B^3 = 0$.) Thus $f(\alpha)$ and $g(\alpha)$ both satisfy the linear ODE $h'(\alpha) = Bh(\alpha)$ with the same initial condition $h(0) = I$; since the solution of this initial value problem is unique, $f = g$.
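As a quick symbolic confirmation of the derivative computation above, a minimal sympy sketch (symbol names are mine) checks that $g' = Bg$ and $g(0) = I$:

```python
# Symbolic check that g(alpha) = I + sin(a) B + (1 - cos(a)) B^2
# satisfies g' = B g with g(0) = I.
import sympy as sp

a = sp.symbols('alpha', real=True)
B = sp.Matrix([[0, sp.Rational(-2, 3), sp.Rational(1, 3)],
               [sp.Rational(2, 3), 0, sp.Rational(-2, 3)],
               [sp.Rational(-1, 3), sp.Rational(2, 3), 0]])

g = sp.eye(3) + sp.sin(a) * B + (1 - sp.cos(a)) * B**2
residual = (g.diff(a) - B * g).applyfunc(sp.expand)
print(residual == sp.zeros(3, 3))  # True: g' = B g
print(g.subs(a, 0) == sp.eye(3))   # True: g(0) = I
```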

John Barber
  • 4,576
  • Since OP showed the characteristic polynomial of $B$ was $x^3+x$ and the problem says you can use Cayley-Hamilton, you don't even need to rely on the other solution to show that $B^3=-B$, you can simply use OP's work and apply CH. – Aaron Dec 04 '24 at 05:34