
The matrix $A$ has $n$ distinct non-zero eigenvalues $\lambda_1, \dots, \lambda_n$. Find the eigenvalues and eigenvectors of the linear operator $$ L : X \mapsto AX^{T}A. $$


I tried to use the diagonalization $A = P\Lambda P^{-1}$, where $\Lambda$ is the diagonal matrix with $\lambda_1, \dots, \lambda_n$ on its diagonal, and substituted $Y = (PXP^{-1})^{T}$ into the eigenvalue equation $AY^{T}A = \lambda Y$. This led me to $C\Lambda X\Lambda C^{-1} = \lambda X^{T}$, where $C = P^{T}P$, and I got stuck there.
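Written out: with $Y = (PXP^{-1})^{T}$, so that $Y^{T} = PXP^{-1}$, the equation $AY^{T}A = \lambda Y$ becomes
$$ P\Lambda P^{-1}\,(PXP^{-1})\,P\Lambda P^{-1} = P\Lambda X\Lambda P^{-1} = \lambda\,(PXP^{-1})^{T} = \lambda P^{-T}X^{T}P^{T}, $$
and multiplying by $P^{T}$ on the left and $P^{-T}$ on the right gives $C\Lambda X\Lambda C^{-1} = \lambda X^{T}$ with $C = P^{T}P$.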

Edit: $X, A \in \mathbb C^{n \times n}$.

  • What makes you think that the information above is enough to get a closed form for the eigenvalues of this operator? Does $X$ or $A$ have any other properties? At the moment this seems similar to asking what the eigenvalues of $AB$ are, given that you know the eigenvalues of $A$. – Gregory Feb 03 '22 at 14:01
  • @Gregory It seems to me that $L$ is a linear map from the vector space of $n\times n$ matrices to itself and the question is about the eigenvalues and eigenvectors of this linear map, i.e., we are trying to solve $AX^\top A = \lambda X$. – Gary Feb 03 '22 at 14:21
  • @Gregory Matrices $A, X \in \mathbb C^{n\times n}$ – chereshnya Feb 03 '22 at 14:51
  • I suspect that the eigenvalues of $A$ are not enough to determine the eigenvalues of this linear operator. – Ben Grossmann Feb 03 '22 at 19:17
  • Since $A$ is invertible, so is $L$. So, by taking determinants on both sides of $AX^TA=\lambda X$, we at least know that $\lambda$ is an $n$-th root of $\det(A)^2$ if $X$ is invertible. – user1551 Feb 03 '22 at 19:58
  • @Paravozik: The linear map in the OP is in essence a linear map from $\mathbb{C}^{n^2}$ to itself. $A$, as a square matrix of dimension $n$, has at most $n$ distinct eigenvalues, whereas the matrix representing the operator in the OP (writing $n\times n$ matrices as $n^2$-vectors) has potentially $n^2$ distinct eigenvalues. Thus, as Ben Grossmann said, it would be surprising if the eigenvalues of $A$ in general provided all the information about the eigenvalues of $L$. – Mittens Feb 03 '22 at 20:14
  • Is the sup-T meaning to take $X$-transpose? Or to take $A$-transpose? Please clarify? – paul garrett Feb 03 '22 at 22:37
  • @paulgarrett $X^T$ means to take $X$-transpose – chereshnya Feb 04 '22 at 17:03
  • Ok, thanks, ... sometimes people do write the sup-T on the left... :) – paul garrett Feb 04 '22 at 17:23

1 Answer


In general, the eigenvalues of $A$ are insufficient to determine the eigenvalues of $L$. As an example, consider the cospectral matrices $$ A = \left(\begin{array}{cc} 1 & 0\\ 0 & 2 \end{array}\right), \quad B = \left(\begin{array}{cc} 1&1\\0&2 \end{array}\right). $$ We find that the eigenvalues of $X \mapsto AX^TA$ are $\pm 2, 1, 4$, whereas the eigenvalues of $X \mapsto BX^TB$ are $\pm 2, 3 \pm \sqrt{5}$. The corresponding eigenvectors of $X \mapsto BX^TB$ are $$ -2: \pmatrix{1&-3\\3&1}\qquad 2: \pmatrix{-1&-1\\-1&1}\\ 3 - \sqrt{5}: \pmatrix{3\sqrt{5} + 7 & -\sqrt{5}-1\\ -2\sqrt{5}-4&2}\qquad 3 + \sqrt{5}: \pmatrix{7-3\sqrt{5} & \sqrt{5}-1\\ 2\sqrt{5}-4& 2}. $$ It is interesting that the eigenvectors associated with $3 \pm \sqrt{5}$ are non-invertible.
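These numbers are easy to double-check numerically: represent $L$ as an $n^2 \times n^2$ matrix by applying it to each standard basis matrix $E_{ij}$ and storing $\operatorname{vec}(L(E_{ij}))$ as a column, then take that matrix's eigenvalues. A minimal NumPy sketch (the helper name `operator_matrix` is just for illustration):

```python
import numpy as np

def operator_matrix(A):
    """Matrix of L : X -> A X^T A, acting on column-major vec(X)."""
    n = A.shape[0]
    M = np.zeros((n * n, n * n), dtype=A.dtype)
    for j in range(n):          # column index of the basis matrix E_ij
        for i in range(n):      # row index of the basis matrix E_ij
            E = np.zeros((n, n), dtype=A.dtype)
            E[i, j] = 1
            # vec(L(E_ij)) is the column of M corresponding to vec(E_ij)
            M[:, j * n + i] = (A @ E.T @ A).flatten(order="F")
    return M

A = np.array([[1.0, 0.0], [0.0, 2.0]])
B = np.array([[1.0, 1.0], [0.0, 2.0]])

print(np.linalg.eigvals(operator_matrix(A)))  # 1, 4, 2, -2 (in some order)
print(np.linalg.eigvals(operator_matrix(B)))  # -2, 2, 3 - sqrt(5), 3 + sqrt(5)
```

Equivalently, $\operatorname{vec}(AX^{T}A) = (A^{T} \otimes A)\,K\,\operatorname{vec}(X)$, where $K$ is the commutation matrix sending $\operatorname{vec}(X)$ to $\operatorname{vec}(X^{T})$, so the same matrix can also be built from a Kronecker product.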

If $A$ is real and symmetric, then the eigenvalues of $L$ can be deduced from those of $A$. In particular, $L$ has eigenvalues $\lambda_j^2$ with associated eigenvectors $xx^T$ (where $Ax = \lambda_j x$) and $\pm \lambda_j\lambda_k$ ($j \neq k$) with associated eigenvectors $xy^T \pm yx^T$ (where $Ax = \lambda_j x$ and $Ay = \lambda_k y$).
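Indeed, using only $A^{T} = A$ (so that $x^{T}A = (Ax)^{T}$), we have
$$ L(xx^{T}) = A(xx^{T})^{T}A = (Ax)(Ax)^{T} = \lambda_j^2\, xx^{T}, $$
$$ L(xy^{T} \pm yx^{T}) = A(yx^{T} \pm xy^{T})A = (Ay)(Ax)^{T} \pm (Ax)(Ay)^{T} = \lambda_j\lambda_k(yx^{T} \pm xy^{T}) = \pm\lambda_j\lambda_k(xy^{T} \pm yx^{T}). $$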

Note 1: From experiment, it seems that $\pm \det(A)$ is an eigenvalue in the case that $A$ has size $2 \times 2$; I'm not sure why this would be the case.

Note 2: By the result described in this post, a sum of eigenspaces that contains no invertible elements can have dimension at most $n^2 - n$. On the other hand, by user1551's comment, if $X$ is an invertible eigenvector of $L$ with eigenvalue $\lambda$, then it must be that $\lambda^{n} = \det(A)^2$. I suspect that these ideas can be combined in an interesting way.
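Spelling out that comment: if $AX^{T}A = \lambda X$ with $X$ invertible, then taking determinants gives
$$ \det(A)\det(X^{T})\det(A) = \lambda^{n}\det(X) \quad\Longrightarrow\quad \det(A)^2 = \lambda^{n}, $$
since $\det(X^{T}) = \det(X) \neq 0$. This matches the example above: the eigenvalues $3 \pm \sqrt{5}$ of $X \mapsto BX^TB$ do not satisfy $\lambda^2 = \det(B)^2 = 4$, and their eigenvectors are exactly the singular ones.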

Ben Grossmann