
Let $M$ be the algebra consisting of all $3 \times 3$ complex matrices (with the usual operations of matrix addition, scalar multiplication, and matrix multiplication). For $A$ in $M$, let $\Lambda_A : M \to M$ be the complex linear transformation on the complex vector space $M$ defined by $X \mapsto AX - XA$. What are the possible values of the rank of $\Lambda_A$, as $A$ varies over all of $M$?

Here is my attempt at an argument:

The rank of the linear transformation $\Lambda_A$ is equal to the dimension of its image. By rank–nullity, this equals $9 - \dim\ker(\Lambda_A)$, and the kernel can be computed as the null space of the matrix $[\Lambda_A]$, where $[\Lambda_A]$ is the matrix of $\Lambda_A$ with respect to the standard basis of $M$.

Let $A$ be a $3×3$ complex matrix. Then $[\Lambda_A]$ is a $9×9$ complex matrix, whose rows and columns are indexed by the entries of a $3×3$ matrix $X$; each column of $[\Lambda_A]$ records the entries of $AX - XA$ as $X$ ranges over the standard basis matrices. In other words, if

$$A = \pmatrix{a_{11} &a_{12}& a_{13}\\ a_{21}& a_{22}& a_{23}\\ a_{31}& a_{32} &a_{33}}$$

Now, how should I proceed? Is this approach correct? Please guide me from here.

Sonu
  • Consider what occurs in the case that $u$ is an eigenvector of $A$, $v$ is an eigenvector of $A^T$, and $X = uv^T$. Deduce that the eigenvalues of $\Lambda_A$ are given by $\lambda_i - \lambda_j$ for eigenvalues $\lambda_1,\lambda_2,\lambda_3$ of $A$ and all pairs $1 \leq i,j \leq 3$. From there, it's easy to deduce the rank of $\Lambda_A$ in the case that $A$ is diagonalizable, since it follows that $\Lambda_A$ is itself diagonalizable. – Ben Grossmann Apr 03 '23 at 05:33
  • Note also that $\ker(\Lambda_A) = \{X : AX = XA\}$. The dimension of this subspace can be systematically determined based on $A$. However, for the case that $A$ is $3 \times 3$, we can more easily handle things by considering the 3 possible Jordan forms $$ \pmatrix{\lambda_1 & 1\\ &\lambda_1&\\ &&\lambda_2}, \quad \pmatrix{\lambda & 1\\ &\lambda&1\\ &&\lambda}, \quad \pmatrix{\lambda & 1\\ &\lambda\\ &&\lambda}, $$ where $\lambda_1 \neq \lambda_2$. For the first two cases, the dimension of the kernel is 3. – Ben Grossmann Apr 03 '23 at 05:38
  • For the third case, it can be shown that the dimension of the kernel will be 5. – Ben Grossmann Apr 03 '23 at 05:48
  • In particular, the kernel for the third case consists of matrices of the form $$ M = \pmatrix{ a&b&c\\ 0&a&0\\ 0&d&e } $$ – Ben Grossmann Apr 03 '23 at 05:56
  • @BenGrossmann: That looks like an answer to me. – joriki Apr 03 '23 at 06:00
  • @joriki It wasn't going to be, but it became one. I'll try to make it one (officially) when I find the time – Ben Grossmann Apr 03 '23 at 06:01

1 Answer


In short: the possible ranks are $0, 4, 6$.

The easiest approach to this problem, I believe, is to characterize $\ker(\Lambda_A)$ and compute $\operatorname{rank}(\Lambda_A) = 3^2 - \dim \ker (\Lambda_A)$ accordingly. Notably, we have $$ \ker(\Lambda_A) = \{X \in \Bbb C^{3 \times 3}: AX = XA\}, $$ and the dimension of this set can be systematically determined using the properties of $A$. That said, for our case where $A$ is "small", it is easier to simply consider all of the possibilities directly and exhaustively.

First of all, note that for similar matrices $A,B$, the maps $\Lambda_A$ and $\Lambda_B$ will have the same rank. Indeed, if $A = SBS^{-1}$, then we find that $\Lambda_A = \phi_S \circ \Lambda_B \circ \phi_S^{-1}$, where $$ \phi_S(X) = SXS^{-1}. $$ Thus, $\Lambda_A$ will be similar to $\Lambda_B$ and therefore have the same rank. Hence, it suffices to consider all possible $3 \times 3$ Jordan form matrices $A$. Moreover, we have $\Lambda_{A - t I} = \Lambda_A$ for all $t \in \Bbb C$, so it suffices to consider cases where $A$ has $0$ among its eigenvalues (which will be useful for handling the non-diagonalizable cases).
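Both invariances are easy to verify numerically. The sketch below (a numpy check, not part of the original argument) builds the matrix of $\Lambda_A$ via the Kronecker-product formula $I \otimes A - A^T \otimes I$ discussed at the end of this answer, and confirms that similarity preserves the rank and that shifting $A$ by $tI$ leaves the map unchanged:

```python
import numpy as np

def commutator_matrix(A):
    """Matrix of X -> AX - XA acting on vec(X) (column-stacking convention)."""
    n = A.shape[0]
    I = np.eye(n)
    return np.kron(I, A) - np.kron(A.T, I)

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
S = rng.standard_normal((3, 3))       # generically invertible
B = S @ A @ np.linalg.inv(S)          # B is similar to A

# Similar matrices give commutator maps of equal rank ...
r1 = np.linalg.matrix_rank(commutator_matrix(A))
r2 = np.linalg.matrix_rank(commutator_matrix(B))
assert r1 == r2

# ... and shifting by a multiple of the identity leaves the map unchanged.
assert np.allclose(commutator_matrix(A - 2.5 * np.eye(3)), commutator_matrix(A))
```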

For the diagonalizable cases: in the case that $A$ is diagonal with distinct diagonal entries, it is easy to deduce that $AX = XA$ if and only if $X$ is a diagonal matrix. Conclude that the rank of $\Lambda_A$ in this case is $9 - 3 = 6$. On the other hand, if $$ A = \pmatrix{\lambda_1\\ & \lambda_1 \\ & & \lambda_2}, \quad \lambda_1 \neq \lambda_2, \quad \text{then} \quad AX = XA \iff X = \pmatrix{x_{11} & x_{12}\\ x_{21} & x_{22}\\ && x_{33}}, $$ where the unwritten entries are zeros. Conclude that the rank of $\Lambda_A$ in this case is $9 - 5 = 4$. Finally, if $A$ is a multiple of the identity matrix, deduce that all matrices $X$ are in the kernel of $\Lambda_A$, so that the rank of $\Lambda_A$ is $0$.
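These three diagonalizable cases can be cross-checked computationally. The snippet below (a numpy sketch, not part of the original derivation) computes the rank of $\Lambda_A$ from the $9 \times 9$ matrix $I \otimes A - A^T \otimes I$:

```python
import numpy as np

def commutator_matrix(A):
    """Matrix of X -> AX - XA under column-stacking vectorization."""
    n = A.shape[0]
    I = np.eye(n)
    return np.kron(I, A) - np.kron(A.T, I)

cases = [
    np.diag([1.0, 2.0, 3.0]),  # distinct eigenvalues: kernel = diagonals, rank 9 - 3 = 6
    np.diag([1.0, 1.0, 2.0]),  # repeated eigenvalue: kernel dimension 5, rank 9 - 5 = 4
    2.0 * np.eye(3),           # scalar matrix: everything commutes, rank 0
]
ranks = [np.linalg.matrix_rank(commutator_matrix(A)) for A in cases]
assert ranks == [6, 4, 0]
```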

For the non-diagonalizable cases, it is convenient to consider the case where the eigenvalue associated with a Jordan block (of size at least 2) is 0. For the possible Jordan forms $$ \pmatrix{0 & 1\\&0&\\ &&\lambda}, \quad \pmatrix{0 & 1\\&0&1\\ &&0}, \quad \pmatrix{0 & 1\\&0\\ &&0}, $$ where $\lambda$ is non-zero, we find that the dimension of the kernel is $3,3,5$ respectively. The corresponding ranks of $\Lambda_A$ are $6,6,4$. For the first two cases (as well as for the previously addressed case where $A$ has 3 distinct eigenvalues), we can deduce that $\dim \ker(\Lambda_A) = 3$ using the fact that $A$ is non-derogatory. For the third case above, one can verify that a matrix $X$ satisfies $AX = XA$ if and only if it is of the form $$ X = \pmatrix{ a&b&c\\ 0&a&0\\ 0&d&e }, \quad a,b,c,d,e \in \Bbb C. $$
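The non-diagonalizable cases can be confirmed the same way. This sketch (again numpy, assuming the Kronecker-product formula for $[\Lambda_A]$ given below, and not part of the original argument) checks that the kernel dimensions are $3, 3, 5$, i.e. that the ranks are $6, 6, 4$:

```python
import numpy as np

def commutator_matrix(A):
    """Matrix of X -> AX - XA under column-stacking vectorization."""
    n = A.shape[0]
    I = np.eye(n)
    return np.kron(I, A) - np.kron(A.T, I)

J1 = np.array([[0., 1, 0], [0, 0, 0], [0, 0, 5]])  # 2x2 block at 0, plus distinct eigenvalue
J2 = np.array([[0., 1, 0], [0, 0, 1], [0, 0, 0]])  # single 3x3 Jordan block at 0
J3 = np.array([[0., 1, 0], [0, 0, 0], [0, 0, 0]])  # 2x2 block at 0, plus repeated eigenvalue 0

jordan_ranks = [np.linalg.matrix_rank(commutator_matrix(J)) for J in (J1, J2, J3)]
assert jordan_ranks == [6, 6, 4]
```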


Incidentally, if you were interested in calculating the matrix of $\Lambda_A$ relative to a standard basis, one could conveniently do so using the properties of vectorization. In particular, we find that $$ \operatorname{vec}(AX - XA) = (I \otimes A - A^T \otimes I)\operatorname{vec}(X), $$ where $\otimes$ denotes a Kronecker product. It follows that $I \otimes A - A^T \otimes I$ is the matrix of $\Lambda_A$ relative to the basis $$ \left\{\left[\begin{matrix}1 & 0 & 0\\0 & 0 & 0\\0 & 0 & 0\end{matrix}\right], \left[\begin{matrix}0 & 0 & 0\\1 & 0 & 0\\0 & 0 & 0\end{matrix}\right], \left[\begin{matrix}0 & 0 & 0\\0 & 0 & 0\\1 & 0 & 0\end{matrix}\right]\right., \\ \left[\begin{matrix}0 & 1 & 0\\0 & 0 & 0\\0 & 0 & 0\end{matrix}\right], \left[\begin{matrix}0 & 0 & 0\\0 & 1 & 0\\0 & 0 & 0\end{matrix}\right], \left[\begin{matrix}0 & 0 & 0\\0 & 0 & 0\\0 & 1 & 0\end{matrix}\right], \\ \left.\left[\begin{matrix}0 & 0 & 1\\0 & 0 & 0\\0 & 0 & 0\end{matrix}\right], \left[\begin{matrix}0 & 0 & 0\\0 & 0 & 1\\0 & 0 & 0\end{matrix}\right], \left[\begin{matrix}0 & 0 & 0\\0 & 0 & 0\\0 & 0 & 1\end{matrix}\right] \right\}. $$
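A minimal numerical check of this vectorization identity (an illustration, not part of the original answer): numpy flattens in row-major order by default, so column-stacking vec — the convention matching the basis ordering above — corresponds to `flatten('F')`.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
X = rng.standard_normal((3, 3))

# Matrix of Lambda_A relative to the column-stacked standard basis
L = np.kron(np.eye(3), A) - np.kron(A.T, np.eye(3))

# vec() is column-stacking, i.e. flattening in Fortran (column-major) order
lhs = (A @ X - X @ A).flatten('F')
rhs = L @ X.flatten('F')
assert np.allclose(lhs, rhs)
```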

Ben Grossmann