I am currently trying to solve the following problem from the 2020 Tokyo entrance exam (math department):
第一問
正方行列$A,B$を次のように定める。$$A=\begin{pmatrix}1&\sqrt2&0\\\sqrt2&1&\sqrt2\\0&\sqrt2&1\end{pmatrix}\quad B=\begin{pmatrix}0&-2/3&1/3\\2/3&0&-2/3\\-1/3&2/3&0\end{pmatrix}$$また、行列$I$は単位行列とする。実正方行列$X$に対して、$\exp(X)$を$$\exp(X)=\sum_{k=0}^\infty\dfrac1{k!}X^k=I+X+\dfrac1{2!}X^2+\dfrac1{3!}X^3+\cdots$$と定義するとき、以下の問いに答えよ。
...
(4) $\alpha$を実数とするとき、$\exp(\alpha B)$が次式のように表せることを示せ。$$\exp(\alpha B)=I+(\sin\alpha)B+(1-\cos\alpha)B^2$$ただし、ケーリー・ハミルトンの定理を用いてもよい。
...
In English:
Question 1
Let $A,B$ be the square matrices$$A=\begin{pmatrix}1&\sqrt2&0\\\sqrt2&1&\sqrt2\\0&\sqrt2&1\end{pmatrix}\quad B=\begin{pmatrix}0&-2/3&1/3\\2/3&0&-2/3\\-1/3&2/3&0\end{pmatrix}$$and let $I$ denote the identity matrix. For a real square matrix $X$, define $\exp(X)$ by$$\exp(X)=\sum_{k=0}^\infty\dfrac1{k!}X^k=I+X+\dfrac1{2!}X^2+\dfrac1{3!}X^3+\cdots$$Using this definition, answer the following questions.
...
(4) Let $\alpha$ be a real number. Show that $\exp(\alpha B)$ can be expressed as$$\exp(\alpha B)=I+(\sin\alpha)B+(1-\cos\alpha)B^2$$You may use the Cayley-Hamilton theorem.
...
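Before giving my attempt, here is a quick numerical sanity check of the series definition above. It is only a sketch (the helper name `exp_series` is mine, not part of the problem): truncating the series after a modest number of terms already agrees with `scipy.linalg.expm` to machine precision.

```python
import numpy as np
from scipy.linalg import expm

def exp_series(X, terms=30):
    """Partial sum I + X + X^2/2! + ... + X^(terms-1)/(terms-1)!."""
    result = np.eye(X.shape[0])
    term = np.eye(X.shape[0])
    for k in range(1, terms):
        term = term @ X / k          # now term == X^k / k!
        result = result + term
    return result

B = np.array([[0, -2/3, 1/3],
              [2/3, 0, -2/3],
              [-1/3, 2/3, 0]])

print(np.allclose(exp_series(B), expm(B)))   # True
```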
My thinking (for $\alpha=1$):
Aside from noting that the equation trivially holds when $\alpha=0$, I know that $B$ has characteristic polynomial$$\det(B-\lambda I)=-\lambda^3-\lambda$$with eigenvalues$$\begin{align}\lambda_1=&\;i\\\lambda_2=&\;-i\\\lambda_3=&\;0\end{align}$$and corresponding eigenvectors$$\begin{align}v_1=&\;\begin{pmatrix}(-4-3i)/5\\(-2+6i)/5\\1\end{pmatrix}\\v_2=&\;\begin{pmatrix}(-4+3i)/5\\(-2-6i)/5\\1\end{pmatrix}\\v_3=&\;\begin{pmatrix}1\\1/2\\1\end{pmatrix}\end{align}$$

Writing $B=SJS^{-1}$, we have$$\begin{align}S=&\;\begin{pmatrix}1&\frac{-4+3i}5&\frac{-4-3i}5\\\frac12&\frac{-2-6i}5&\frac{-2+6i}5\\1&1&1\end{pmatrix}\\J=&\;\begin{pmatrix}0&0&0\\0&-i&0\\0&0&i\end{pmatrix}\\S^{-1}=&\;\begin{pmatrix}\frac49&\frac29&\frac49\\-\frac29-\frac i6&-\frac19+\frac i3&\frac5{18}\\-\frac29+\frac i6&-\frac19-\frac i3&\frac5{18}\end{pmatrix}\end{align}$$Taking $e^B=e^{SJS^{-1}}$ and using the identity$$e^{YXY^{-1}}=Ye^XY^{-1}\quad\text{(valid whenever $Y$ is invertible)}$$we find$$\begin{align}e^B=&\;e^{SJS^{-1}}\\=&\;Se^JS^{-1}\\=&\;S\begin{pmatrix}e^0&0&0\\0&e^{-i}&0\\0&0&e^i\end{pmatrix}S^{-1}\\=&\;S\begin{pmatrix}1&0&0\\0&\cos1-i\sin1&0\\0&0&\cos1+i\sin1\end{pmatrix}S^{-1}\end{align}$$

This simplifies to$$\exp(B)=S\exp(J)S^{-1}=\begin{pmatrix}1-\frac59(1-\cos1)&\frac23(-\sin1+\frac13(1-\cos1))&\frac13(\sin1+\frac43(1-\cos1))\\\frac23(\sin1+\frac13(1-\cos1))&1-\frac89(1-\cos1)&\frac23(\frac13(1-\cos1)-\sin1)\\\frac13(\frac43(1-\cos1)-\sin1)&\frac23(\sin1+\frac13(1-\cos1))&1-\frac59(1-\cos1)\end{pmatrix}$$which is exactly $I+(\sin1)B+(1-\cos1)B^2$ (I checked this entry by entry by hand). So that verifies the case $\alpha=1$.
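As a cross-check of the computation above, here is a short numerical sketch (using numpy's eigendecomposition rather than my hand calculation; the variable names are mine): the eigenvalues come out as $0,\pm i$, and $S\,e^J\,S^{-1}$ matches $I+(\sin1)B+(1-\cos1)B^2$.

```python
import numpy as np

B = np.array([[0, -2/3, 1/3],
              [2/3, 0, -2/3],
              [-1/3, 2/3, 0]])

eigvals, S = np.linalg.eig(B)        # columns of S are eigenvectors
print(np.round(eigvals, 10))         # approximately 0, i, -i (ordering may differ)

expJ = np.diag(np.exp(eigvals))      # exponential of the diagonal matrix J
expB = (S @ expJ @ np.linalg.inv(S)).real

rhs = np.eye(3) + np.sin(1) * B + (1 - np.cos(1)) * (B @ B)
print(np.allclose(expB, rhs))        # True
```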
My question is: how can I prove that this holds for all $\alpha\in\mathbb R$? And how could the Cayley-Hamilton theorem be applied here?
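For what it's worth, the identity does seem to hold numerically for every $\alpha$ I have tried. The following is only a spot check of a few values (using `scipy.linalg.expm` as the reference), not a proof:

```python
import numpy as np
from scipy.linalg import expm

B = np.array([[0, -2/3, 1/3],
              [2/3, 0, -2/3],
              [-1/3, 2/3, 0]])
I = np.eye(3)

for alpha in [-2.0, -0.5, 0.0, 1.0, np.pi, 7.3]:
    lhs = expm(alpha * B)
    rhs = I + np.sin(alpha) * B + (1 - np.cos(alpha)) * (B @ B)
    print(alpha, np.allclose(lhs, rhs))   # True for each alpha tested
```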