I was trying to remember how to show that any invertible matrix has a (possibly complex) logarithm. What I came up with seemed kind of cool, so I thought I'd post my answer here.
1 Answer
It suffices to show that each $\lambda$-Jordan block has a logarithm if $\lambda \neq 0$.
First note that the exponential of a Jordan block is $$ \left[\begin{matrix} \lambda & 1 & 0 & \dotsb & 0 \\ & \ddots & \ddots & \ddots & \vdots \\ & & \ddots & \ddots & 0 \\ & & & \ddots & 1 \\ & & & & \lambda\\ \end{matrix}\right] \overset{\exp}{\mapsto} \left[\begin{matrix} e^{\lambda} & e^{\lambda} & \frac{e^{\lambda}}{2!} & \dotsb & \frac{e^{\lambda}}{(k-1)!} \\ & e^{\lambda} & \ddots & \ddots & \vdots \\ & & \ddots & e^{\lambda} & \frac{e^{\lambda}}{2!} \\ & & & e^{\lambda} & e^{\lambda} \\[7pt] & & & & e^{\lambda} \\ \end{matrix}\right]. $$ So, reversing the process, given a Jordan block $$J=\left[\begin{matrix} a & 1 & \dotsb & 0 \\ & a & \ddots & \vdots \\ & & \ddots & 1 \\ & & & a \end{matrix}\right] $$ with $a\neq 0$, we want to show that $J$ is similar to $$ \left[\begin{matrix} a\quad & a & \frac{a}{2!} & \dotsb & \frac{a}{(k-1)!} \\ & a & \ddots & \ddots & \vdots \\ & & \ddots & a & \frac{a}{2!} \\ & & & a & a \\[7pt] & & & & a \\ \end{matrix}\right], $$ since the latter is the exponential of the $\lambda$-Jordan block for any $\lambda$ with $e^{\lambda}=a$, and since $\log(UMU^{-1})=U\log(M)U^{-1}$. Now, since the scalar matrix $aI$ has the same form with respect to any basis, we can set it aside and just ask whether we can conjugate $$\left[\begin{matrix} 0 & 1 & \dotsb & 0 \\ & 0 & \ddots & \vdots \\ & & \ddots & 1 \\ & & & 0 \end{matrix}\right] \tag{$M_1$} $$ into $$ \left[\begin{matrix} 0\quad & a & \frac{a}{2!} & \dotsb & \frac{a}{(k-1)!} \\ & 0 & \ddots & \ddots & \vdots \\ & & \ddots & a & \frac{a}{2!} \\ & & & 0 & a \\[7pt] & & & & 0 \\ \end{matrix}\right]. \tag{$M_2$} $$ First of all, we can see algebraically that these two matrices should be similar, since $M_2= aN + \dfrac{a}{2!}N^2 + \dots + \dfrac{a}{(k-1)!}N^{k-1}$, where $N$ is the elementary nilpotent matrix of size $k$ (here $N=M_1$, incidentally). Taking powers of $M_2$ shows that $M_2^{k-1}\neq 0$ and $M_2^k=0$. Therefore $M_1$ and $M_2$ have the same minimal polynomial, $x^k$.
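The closed form for the exponential of a Jordan block can be checked numerically; here is a minimal sketch using `scipy.linalg.expm` (the block size $k=5$ and eigenvalue $\lambda=0.7$ are arbitrary illustrative choices, not from the original answer):

```python
import numpy as np
from scipy.linalg import expm
from math import factorial

k = 5
lam = 0.7  # arbitrary eigenvalue for the Jordan block

# Jordan block J_k(lam): lam on the diagonal, 1 on the superdiagonal
J = lam * np.eye(k) + np.diag(np.ones(k - 1), 1)

# Claimed closed form: entry (i, j) of exp(J) is e^lam / (j - i)! for j >= i
E = np.zeros((k, k))
for i in range(k):
    for j in range(i, k):
        E[i, j] = np.exp(lam) / factorial(j - i)

assert np.allclose(expm(J), E)
```

This works because $\exp(\lambda I + N) = e^{\lambda}\exp(N)$ and $\exp(N) = \sum_{m=0}^{k-1} N^m/m!$ puts $1/m!$ on the $m$-th superdiagonal.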
I claim they also have the same characteristic polynomial, namely $x^k$: each matrix is $k \times k$, and the characteristic polynomial is a degree-$k$ polynomial divisible by the minimal polynomial $x^k$, so it must equal $x^k$. The only possibility for the invariant factors of $M_1$ and $M_2$ is therefore $1, 1, \dotsc, 1, x^k$, and they are similar.
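The similarity can also be exhibited concretely: since $M_2$ is nilpotent of index exactly $k$, any cyclic vector $v$ (one with $M_2^{k-1}v \neq 0$) gives a basis $M_2^{k-1}v, \dotsc, M_2 v, v$ in which $M_2$ acts as the shift $M_1$. A sketch (the choices $k=5$, $a=2$, and $v=e_k$ are illustrative assumptions):

```python
import numpy as np
from math import factorial

k, a = 5, 2.0  # block size and a nonzero scalar, chosen arbitrarily

# M1: the elementary nilpotent block N (ones on the superdiagonal)
M1 = np.diag(np.ones(k - 1), 1)

# M2 = a*N + (a/2!)*N^2 + ... + (a/(k-1)!)*N^(k-1)
M2 = sum(a / factorial(m) * np.linalg.matrix_power(M1, m) for m in range(1, k))

# v = e_k is cyclic: M2^(k-1) v = a^(k-1) e_1 != 0.
# Columns of P are M2^(k-1) v, ..., M2 v, v, so M2 P = P M1.
v = np.zeros(k)
v[-1] = 1.0
P = np.column_stack([np.linalg.matrix_power(M2, k - j) @ v for j in range(1, k + 1)])

assert abs(np.linalg.det(P)) > 1e-12               # the columns form a basis
assert np.allclose(P @ M1 @ np.linalg.inv(P), M2)  # M2 = P M1 P^{-1}
```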
That said, I thought I would tackle the more general problem: Suppose we want to conjugate $$\left[\begin{matrix} 0 & 1 & \dotsb & 0 \\ & 0 & \ddots & \vdots \\ & & \ddots & 1 \\ & & & 0 \end{matrix}\right] \tag{$M_3$}$$ into $$\left[\begin{matrix} 0 & a_{12} & a_{13} & \dotsb & a_{1k} \\ & 0 & a_{23} & \dotsb & a_{2k} \\ & & \ddots & \ddots & \vdots \\ & & & \ddots & a_{k-1,k}\\ & & & & 0 \end{matrix}\right]\tag{$M_4$}$$
Given that $\mathcal{B}=\{v_1, \dotsc, v_k\}$ is an ordered basis such that $M_3 = [T]_{\mathcal{B}\mathcal{B}}$, conjugating $M_3$ into $M_4$ is equivalent to finding an ordered basis $\mathcal{C} = \{w_1, \dotsc, w_k\}$ such that $[T]_{\mathcal{C}\mathcal{C}}=M_4$.
Let $v_1, \dotsc, v_k$ be the basis $\mathcal{B}$ from above, with respect to which $T$ is represented by $M_3$. Define $w_1, \dotsc, w_k$ recursively as follows: let $w_1=v_1$. Now suppose $$w_{i-1}=\sum_{1\leq j\leq i-1}c_jv_{j}.$$ Then let $$w_i = \sum_{1\leq j\leq i-1}a_{n-j}c_jv_{j+1}.$$
We need to make sure the $w_i$ are linearly independent. I claim they are iff $M_3$ is similar to $M_4$ iff $M_4^{k-1}\neq 0$. One condition that guarantees this is: $$\text{All entries on the superdiagonal of $M_4$ are nonzero.}$$ Perhaps someone can come up with a weaker condition.
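The sufficiency of the superdiagonal condition can be checked numerically: the $(1,k)$ entry of $M_4^{k-1}$ is exactly the product of the superdiagonal entries (the only increasing path of length $k-1$ from index $1$ to index $k$ steps up by $1$ each time), so it cannot vanish. A sketch with a random strictly upper triangular matrix (the size $k=6$ and the random seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
k = 6

# Random strictly upper triangular M4, then force the superdiagonal nonzero
M4 = np.triu(rng.standard_normal((k, k)), 1)
M4[np.arange(k - 1), np.arange(1, k)] = rng.uniform(1.0, 2.0, k - 1)

# (1, k) entry of M4^(k-1) equals the product of the superdiagonal entries
Mk1 = np.linalg.matrix_power(M4, k - 1)
assert np.isclose(Mk1[0, -1], np.diag(M4, 1).prod())

assert not np.allclose(Mk1, 0)                       # nilpotency index is k ...
assert np.allclose(np.linalg.matrix_power(M4, k), 0) # ... so M4 is one Jordan block
```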
- What about the $1\times 1$ invertible matrix $[-1]$? What is the logarithm? – Owen Sizemore Apr 07 '14 at 02:10
- @OwenSizemore $\pi i$, for instance. – Eric Auld Apr 07 '14 at 02:12
- Ok, sure. But then it seems you must be taking a branch of the log at some point, and if you want to do this for any matrix in a consistent way, you run into analyticity problems when expressing the log. Then the usual argument that $\log(UMU^{-1})=U\log(M)U^{-1}$ doesn't seem to work for all matrices. Maybe this is only a minor difficulty, though... – Owen Sizemore Apr 07 '14 at 02:17
- @OwenSizemore Why should the branch need to be consistent? I can choose $\pi i$ for one Jordan block and $3 \pi i$ for another Jordan block if I like. (Or perhaps I'm misunderstanding your question.) I don't understand what you mean about analyticity. – Eric Auld Apr 07 '14 at 02:18
- Yes, you're probably right. Either way, I just realized that since $0$ is not in the spectrum (which is finite), you can choose the cut line for your branch to miss all of the points, and use that as your branch. – Owen Sizemore Apr 07 '14 at 02:22
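As a numerical aside on the $[-1]$ example discussed above (not part of the original thread): `scipy.linalg.logm` returns one particular complex logarithm, here $\pi i$, and any other branch exponentiates back just as well.

```python
import numpy as np
from scipy.linalg import logm, expm

# logm returns a (complex) logarithm of [-1]; the value is pi*i,
# but pi*i + 2*pi*i*n is also a logarithm for every integer n.
A = np.array([[-1.0]])
L = logm(A)
assert np.allclose(L, [[np.pi * 1j]])
assert np.allclose(expm(L), A)

# A different branch choice also works:
assert np.allclose(expm(np.array([[3 * np.pi * 1j]])), A)
```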