7

Is there a theorem for finding the eigenvalues of an anti-circulant matrix from the circulant matrix with the same first row? I found that, for any anti-circulant matrix, the eigenvalues (denoted $\mu$) can be written as \begin{equation} \mu = \pm\lvert\lambda_j\rvert \end{equation} where $\lambda_j$ is an eigenvalue of the $1$-circulant matrix with the same first row. This seems plausible, since any anti-circulant matrix is symmetric and therefore has real eigenvalues.
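For what it's worth, a quick numerical check seems consistent with this (a rough numpy sketch; the size $n$, the first row, and the circulant/anti-circulant shift conventions are arbitrary illustrative choices):

```python
import numpy as np

# Sketch: compare eigenvalue moduli of a circulant and the anti-circulant
# matrix with the same (real) first row.  n and c are arbitrary choices.
rng = np.random.default_rng(0)
n = 6
c = rng.standard_normal(n)                          # real first row

C = np.array([np.roll(c, i) for i in range(n)])     # circulant: row i = first row shifted right by i
A = np.array([np.roll(c, -i) for i in range(n)])    # anti-circulant: row i = first row shifted left by i

mu = np.linalg.eigvalsh(A)                          # real, since A is symmetric
lam = np.linalg.eigvals(C)

print(np.sort(np.abs(mu)))                          # moduli of the anti-circulant eigenvalues
print(np.sort(np.abs(lam)))                         # moduli of the circulant eigenvalues -- they agree
```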

Can anyone point me to a reference that contains this proof, or comment if you think the claim is not correct?

Udara
  • 339

2 Answers

6

Edit. Presumably the matrix is real, otherwise the claim is false and it is easy to generate a counterexample by computer.

Let $A$ be a circulant matrix and $B$ the anticirculant matrix such that $A$ and $B$ have identical first rows. Then $B=PA$ for some permutation matrix $P$ (more specifically, $P=1\oplus J_{n-1}$, where $J_{n-1}$ is the reversal matrix obtained by flipping $I_{n-1}$ from left to right). Hence $A$ and $B$ have identical singular values. Moreover, circulant matrices and real anticirculant matrices (which are real symmetric) are normal, so they can be unitarily diagonalized and their singular values are the moduli of their eigenvalues. Therefore the eigenvalues of $A$ and $B$ have identical moduli. Finally, as real anticirculant matrices are real symmetric, $B$ has real eigenvalues. Hence the assertion.
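For concreteness, the relation $B=PA$ and the equality of singular values can be checked numerically (a minimal numpy sketch; the size $n$, the first row, and the shift conventions used to build $A$ and $B$ are illustrative assumptions):

```python
import numpy as np

# Sketch: verify B = P A with P = 1 (+) J_{n-1}, hence identical singular values.
# n and the first row c are arbitrary choices.
rng = np.random.default_rng(1)
n = 7
c = rng.standard_normal(n)

A = np.array([np.roll(c, i) for i in range(n)])     # circulant
B = np.array([np.roll(c, -i) for i in range(n)])    # anticirculant with the same first row

P = np.zeros((n, n))
P[0, 0] = 1.0
P[1:, 1:] = np.fliplr(np.eye(n - 1))                # 1 (+) J_{n-1}

print(np.allclose(B, P @ A))                        # True: B = P A
print(np.allclose(np.linalg.svd(A, compute_uv=False),
                  np.linalg.svd(B, compute_uv=False)))   # True: identical singular values
```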

user1551
  • 149,263
  • Thanks for your explanation. Is there any reference (book, research paper) in which I can look up this proof? – Udara Sep 07 '11 at 06:08
  • I don't know about such references, sorry. – user1551 Sep 07 '11 at 07:31
  • +1. Two points. 1) $A$ and $B$ have identical eigenvectors, since they are both normal matrices whose eigenvectors and singular vectors coincide. Correct? 2) This claim regarding both the eigenvalues and eigenvectors is valid for any permutation matrix $P$, beyond this particular one generating the anticirculant matrix. That means the same conclusion can be drawn for any matrix whose rows are an arbitrary permutation of the rows of the circulant matrix. Do you agree? – Hans Oct 23 '19 at 21:50
  • @Hans No. Every eigenvector of a normal matrix is a singular vector, but the converse is not true, because a repeated singular value can give rise to a pair of eigenvalues of opposite signs. As such, while a circulant matrix and its anti-circulant counterpart share the same singular vectors, they can possess different eigenvectors. – user1551 Oct 24 '19 at 00:36
  • You are right. What you said was caused by the degeneracy of a singular space, right? The direct sum of the eigenspaces of the eigenvalues with the same moduli of $A$ and $B$ should be the same, though. Right? – Hans Oct 24 '19 at 02:06
  • @Hans Yes, as shown in Var's answer. – user1551 Oct 24 '19 at 02:08
5

Let $C$ be the circulant matrix whose first row is $(c_0,c_1,\ldots,c_{n-1})$ and let $P(x)=\sum_{k=0}^{n-1}c_kx^k$. For each $n$-th root of unity $\omega$, let $e_{\omega}=(1, \omega, \ldots, \omega^{n-1})^\top$. Then $Ce_{\omega}=P(\omega)e_{\omega}$, and hence the eigenvalues of $C$ are the values $P(\omega)$ as $\omega$ runs over the $n$-th roots of unity.

For the anti-circulant matrix $A$ with the same first row, we have $Ae_{\omega}=P(\omega)e_{\bar\omega}$. Therefore, for each $\omega\ne\pm1$, the restriction of $A$ to the invariant subspace with ordered basis $\{e_{\omega}, e_{\bar\omega}\}$ has the matrix representation $$A\begin{bmatrix}e_{\omega}&e_{\bar\omega}\end{bmatrix}= \begin{bmatrix}e_{\omega}&e_{\bar\omega}\end{bmatrix}\begin{bmatrix}0&P(\omega)\\P(\bar\omega)&0\end{bmatrix}, $$ so that $$A^2\begin{bmatrix}e_{\omega}&e_{\bar\omega}\end{bmatrix}= \begin{bmatrix}e_{\omega}&e_{\bar\omega}\end{bmatrix}\begin{bmatrix}P(\omega)P(\bar\omega)&0\\0&P(\omega)P(\bar\omega)\end{bmatrix}. $$ Thus the eigenvalues of $A$ are $P(1)$, the pair $\pm \sqrt{P(\omega)P(\bar\omega)}$ for each conjugate pair $\{\omega,\bar\omega\}$ with $\omega\neq\pm1$, and, when $n$ is even, also $P(-1)$.
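The relation $Ae_{\omega}=P(\omega)e_{\bar\omega}$ is also easy to verify numerically (a small numpy sketch; the size $n$, the first row, and the anti-circulant construction are illustrative assumptions):

```python
import numpy as np

# Sketch: check A e_w = P(w) e_wbar for the anti-circulant A with first row c.
n = 5
c = np.array([1.0, 2.0, 3.0, 4.0, 5.0])             # arbitrary first row
A = np.array([np.roll(c, -i) for i in range(n)])    # anti-circulant

def P(x):
    return sum(ck * x**k for k, ck in enumerate(c))

for w in np.exp(2j * np.pi * np.arange(n) / n):     # n-th roots of unity
    e_w = w ** np.arange(n)
    e_wbar = np.conj(w) ** np.arange(n)
    assert np.allclose(A @ e_w, P(w) * e_wbar)
print("A e_w = P(w) e_wbar holds for every n-th root of unity w")
```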

When $A$ is real, so are the coefficients of $P$, and hence $P(\bar\omega)=\overline{P(\omega)}$. Therefore $\pm\sqrt{P(\omega)P(\bar\omega)}=\pm|P(\omega)|$ for each $\omega\ne\pm1$, and $P(1)$ and $P(-1)$ are real as well.
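Putting it together, the predicted spectrum $\{P(1),\ P(-1)\ (n\text{ even}),\ \pm|P(\omega)|\}$ can be compared against a direct eigenvalue computation (again a numpy sketch under the same illustrative assumptions about $n$, the first row, and the anti-circulant construction):

```python
import numpy as np

# Sketch: for real c, the eigenvalues of the anti-circulant A should be
# P(1), P(-1) (n even), and +/- |P(w)| for each conjugate pair {w, wbar}, w != +/-1.
rng = np.random.default_rng(2)
n = 6
c = rng.standard_normal(n)
A = np.array([np.roll(c, -i) for i in range(n)])    # anti-circulant with first row c

def P(x):
    return sum(ck * x**k for k, ck in enumerate(c))

predicted = [P(1.0)]
if n % 2 == 0:
    predicted.append(P(-1.0))
for w in np.exp(2j * np.pi * np.arange(n) / n):
    if w.imag > 1e-12:                              # one representative per conjugate pair
        predicted += [abs(P(w)), -abs(P(w))]

print(np.allclose(np.sort(predicted), np.linalg.eigvalsh(A)))   # True
```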

Hans
  • 10,484
Var
  • 51
  • Very nice observation. +1 – user1551 May 02 '17 at 16:02
  • @user1551: Are you sure you set the indices of $(c_0,c_{n-1},\ldots,c_2,c_1)$ right? It is not consistent with the definition of $P(x)$ and $Ce_{\omega}=P(\omega)e_{\omega}$. Should it be instead $(c_0,c_1,c_2,\dots,c_{n-1})$? – Hans Oct 26 '19 at 01:37
  • @Hans Yes, you are right. I switched the arrangement of the coefficients to the usual one without verifying its correctness. It's fixed now (along with two other mistakes). – user1551 Oct 26 '19 at 05:42
  • @user1551: Cool. But do you not like the explicit expression of $A$ operating on $[e_\omega, e_{\bar\omega}]$ and that of $A^2$? I think this explicit presentation of these operations makes the derivation much easier to comprehend. – Hans Oct 26 '19 at 08:57
  • @Hans That may be helpful, but isn't it easier to solve the characteristic equation for that 2-by-2 matrix directly? Anyway, if you wish, please feel free to add your previous explanations back. – user1551 Oct 26 '19 at 16:01
  • @user1551: I do not quite understand your question. The expression I added is meant to make the intermediate steps explicit and ease reading comprehension when solving the characteristic equation of the $2\times 2$ block matrix, since it shows where $\pm\sqrt{P(\omega)P(\bar\omega)}$ comes from. – Hans Oct 26 '19 at 18:08