
I read this problem in Kostrikin's algebra problem book: let $A$ be a normal operator on a finite-dimensional complex inner product space and let $B$ be an operator satisfying $AB=BA$; show that $AB^{*}=B^{*}A$. Here is my solution:

By the complex spectral theorem, the normal operator $A$ can be diagonalised with respect to an orthonormal basis consisting of eigenvectors of $A$, so \begin{equation*} \mathcal{M}(A)= \begin{bmatrix} \lambda_{1} & & 0\\ & \ddots &\\ 0 & & \lambda_{n} \end{bmatrix}. \end{equation*} Writing $B=(b_{ij})$ in this basis, $AB=BA$ gives \begin{equation*} (AB)_{ij}=\sum_{k=1}^{n}a_{ik}b_{kj}=\lambda_{i}b_{ij}, \end{equation*} \begin{equation*} (BA)_{ij}=\sum_{k=1}^{n}b_{ik}a_{kj}=\lambda_{j}b_{ij}. \end{equation*} This yields $\lambda_{i}b_{ij}=\lambda_{j}b_{ij}$, which implies $\lambda_{i}=\lambda_{j}$. We conclude that $A$ is a scalar multiple of the identity matrix. Since the identity matrix commutes with every matrix, we conclude that $AB^{*}=B^{*}A$. Q.E.D.

Is my idea correct?

  • This is also true for infinite-dimensional inner product spaces. No, it is not true that $A$ is a scalar multiple of the identity. I guess your proof fails because $b_{ij}$ could be $0$ for some $i,j$. – GEdgar Jul 24 '21 at 23:06
  • The "we let $B$ be a matrix with respect to a basis" and all the follows is not clear. The matrix $B$ is fixed and is already given to you. You can't choose it. – levap Jul 24 '21 at 23:31
  • @GEdgar Would you like to offer a counterexample to show that $A$ can be a matrix other than a scalar multiple of the identity? I feel my proof is somehow problematic, and I think a counterexample would help most. Thank you. – Mr.Infinity Jul 24 '21 at 23:34
  • @levap OK, I will change my wording, deleting "we let $B$ be a matrix with respect to a basis". – Mr.Infinity Jul 24 '21 at 23:36
  • A counterexample is that a diagonal matrix $B$ will always commute with $A$ if $A$ is diagonal. – Daniel Schepler Jul 24 '21 at 23:50
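For concreteness, here is a quick numerical check of the counterexample from the last comment (a minimal sketch using numpy; the diagonal entries are arbitrary):

```python
import numpy as np

# A diagonal matrix is normal, but need not be a scalar multiple of I.
A = np.diag([1.0, 2.0])
B = np.diag([3.0, 4.0])

print(np.allclose(A @ B, B @ A))            # True: AB = BA
print(np.allclose(A, A[0, 0] * np.eye(2)))  # False: A is not a scalar multiple of I
```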

1 Answer


Your argument cannot be right, since along the way it proves that every normal matrix is a scalar multiple of the identity, which is false. There is also a problem of point of view here: $A$ and $B$ are given to you.

The existence of an orthonormal basis of eigenvectors for $A$ can be conveniently expressed as saying that $A=V^*DV$, with $D$ diagonal and $V$ unitary. So we have that $$ V^*DVB=BV^*DV. $$ Multiplying on the left by $V$ and on the right by $V^*$, we get $$ D(VBV^*)=(VBV^*)D. $$ This is the equality you had, with $B$ replaced by $C=VBV^*$.
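As a numerical sanity check of this step (a sketch: the unitary $V$ comes from a QR factorization, and $B$ is taken to be a polynomial in $A$ simply to guarantee $AB=BA$; all entries are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# A unitary V from the QR factorization of a random complex matrix.
V, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
D = np.diag([1.0 + 0j, 2.0, 3.0])
A = V.conj().T @ D @ V   # A = V*DV is normal

B = A @ A + 3 * A        # a polynomial in A, so AB = BA

C = V @ B @ V.conj().T
print(np.allclose(D @ C, C @ D))  # True: D(VBV*) = (VBV*)D
```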

Now your equality is $$ \lambda_iC_{ij}=\lambda_jC_{ij}. $$ You don't get to choose the $C_{ij}$, so what you can say is that if $\lambda_i\ne\lambda_j$, then $C_{ij}=0$. From this you can deduce that if $\alpha_1,\ldots,\alpha_r$ are the distinct eigenvalues of $A$, with multiplicities $n_1,\ldots,n_r$, then $D$ and $C$ have block form (ordering the basis so that equal eigenvalues appear consecutively) $$ D=\begin{bmatrix}\alpha_1 I_{n_1}\\ & \alpha_2 I_{n_2}\\ &&\ddots\\ &&& \alpha_r I_{n_r} \end{bmatrix},\qquad C=\begin{bmatrix} C_1\\ & C_2\\ && \ddots\\ &&&C_r\end{bmatrix}, $$ where $C_k$ is an $n_k\times n_k$ matrix. As $$ C^*=\begin{bmatrix} C_1^*\\ & C_2^*\\ && \ddots\\ &&&C_r^*\end{bmatrix}, $$ we have $$ DC^*=\begin{bmatrix} \alpha_1C_1^*\\ & \alpha_2C_2^*\\ && \ddots\\ &&&\alpha_rC_r^*\end{bmatrix}=\begin{bmatrix} C_1^*\alpha_1\\ & C_2^*\alpha_2\\ && \ddots\\ &&&C_r^*\alpha_r\end{bmatrix}=C^*D. $$ Then, as $C^*=VB^*V^*$, $$ AB^*=V^*DVB^*=V^*DC^*V=V^*C^*DV=B^*V^*DV=B^*A. $$
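A direct numerical check of the whole argument (a sketch: $D$ has the eigenvalue $1$ with multiplicity $2$ and the eigenvalue $2$ with multiplicity $1$, $C$ is block diagonal to match, and $B=V^{*}CV$ then commutes with $A=V^{*}DV$ by construction; the sizes and entries are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Eigenvalue 1 with multiplicity 2, eigenvalue 2 with multiplicity 1.
D = np.diag([1.0 + 0j, 1.0, 2.0])

# C is block diagonal, with block sizes matching the multiplicities in D.
C = np.zeros((3, 3), dtype=complex)
C[:2, :2] = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
C[2, 2] = rng.standard_normal() + 1j * rng.standard_normal()

V, _ = np.linalg.qr(rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3)))
A = V.conj().T @ D @ V   # normal, but not a scalar multiple of I
B = V.conj().T @ C @ V   # commutes with A by construction

print(np.allclose(A @ B, B @ A))                    # True: AB = BA
print(np.allclose(A @ B.conj().T, B.conj().T @ A))  # True: AB* = B*A
```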

Martin Argerami