7

Let $A\in M_n(\mathbb{R})$ such that $A^2=-I_n$ and $AB=BA$ for some $B\in M_n(\mathbb{R})$. Prove that $\det(B)\geq0$.

All the information I could extract from the relation $A^2=-I_n$ is as follows:

$(a)$ $A$ is not diagonalizable.

$(b)$ $\det(A)=1$.

$(c)$ $n$ must be even.

Now it is not clear to me how to conclude that $\det(B)$ is nonnegative using these three facts along with $AB=BA$. Any help is appreciated.

user26857
  • 53,190
am_11235...
  • 2,206
  • I'm not sure why you think $A$ is non-diagonalizable. For instance, $\left( \begin{array}{cc} i & 0 \\ 0 & -i \end{array} \right)$ satisfies $A^2=-I_2$ and is diagonalizable. – march Sep 27 '21 at 15:51
  • 2
    I think they meant real diagonalizable. – Breaking Bioinformatics Sep 27 '21 at 15:52
  • @march , did you notice that $A\in M_n(\mathbb{R})$ ? – am_11235... Sep 27 '21 at 16:01
  • One idea: Let $E$ be an eigenspace corresponding to a negative eigenvalue of $B$. Then $A$ acts on $E$. The minimal polynomial of the restriction of $A$ to $E$ must divide $X^2+1$. Hence, $A$ has no real eigenvalue on $E$. In particular, $\dim E$ is even. However, I'm not sure if $\dim E$ is the algebraic multiplicity of the eigenvalue. – Brauer Suzuki Sep 27 '21 at 16:33
  • @am_11235... I did not read that carefully! – march Sep 27 '21 at 16:49
  • This question is from the math contest SEEMOUS 2021. – user26857 Sep 28 '21 at 14:42

5 Answers

9

Proof Outline: Using the fact that $A^2 = -I_n$, conclude that $n$ must be even and that there exists some invertible matrix $P \in M_n(\Bbb R)$ such that $$ P^{-1}AP = J := \pmatrix{0 & -I_k\\ I_k & 0}, $$ where $k = n/2$. With that, we can conclude that $\det(A) = 1$.

Now without loss of generality, we can assume that $A = J$ (note that $A$ commutes with $B$ iff $P^{-1}AP$ commutes with $P^{-1}BP$). Partition $B$ into four $k \times k$ blocks: $$ B = \pmatrix{B_{11} & B_{12} \\ B_{21} & B_{22}}. $$ From the fact that $AB = BA$ (that is, $JB = BJ$), conclude that we have $B_{11} = B_{22}$ and $B_{12} = -B_{21}$. That is, we have $$ B = \pmatrix{F & -G\\ G & F} $$ for some matrices $F,G \in M_k(\Bbb R)$. Now, find a matrix $Q \in M_n(\Bbb C)$ such that $$ Q^{-1}BQ = \pmatrix{F + i G & 0\\0 & F - i G}. $$ Conclude that $$ \begin{align} \det(B) &= \det(F + i G) \det(F - i G) = \det(F + i G) \det(\overline{F + i G}) \\ &= \det (F + i G) \overline{\det(F + i G)} = |\det(F + i G)|^2 \geq 0. \end{align} $$
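For what it's worth, the block structure and the final determinant identity are easy to sanity-check numerically. Here is a minimal NumPy sketch (the matrices $F$, $G$ are random and purely illustrative):

```python
import numpy as np

k = 3
F, G = np.random.randn(k, k), np.random.randn(k, k)

# J = [[0, -I_k], [I_k, 0]] and B = [[F, -G], [G, F]] as in the outline above
J = np.block([[np.zeros((k, k)), -np.eye(k)],
              [np.eye(k),         np.zeros((k, k))]])
B = np.block([[F, -G],
              [G,  F]])

assert np.allclose(J @ J, -np.eye(2 * k))   # J^2 = -I_n
assert np.allclose(J @ B, B @ J)            # the block form commutes with J
# det(B) = det(F + iG) det(F - iG) = |det(F + iG)|^2 >= 0
assert np.allclose(np.linalg.det(B), abs(np.linalg.det(F + 1j * G)) ** 2)
print(np.linalg.det(B))                     # nonnegative
```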

Ben Grossmann
  • 234,171
  • 12
  • 184
  • 355
  • I can't get why $A$ is similar to $\begin{bmatrix}0 & -I_k\\I_k & 0\end{bmatrix}$. Can you please explain? –  Dec 30 '23 at 18:28
  • @IlkayBurak The explanation is too long for a comment, consider posting a new question. In brief: one option is to note that the matrices are similar over $\Bbb C$ (since they are normal matrices with the same eigenvalues) and then use the fact that real matrices are similar over $\Bbb C$ iff they are similar over $\Bbb R$. The other option is to find the matrix of $A$ relative to the basis $v_1,\dots, v_k, Av_1,\dots Av_k$ for suitably selected $v_1,\dots , v_k$. – Ben Grossmann Dec 31 '23 at 02:03
2

There is nothing to prove when $\det\big(B\big)=0$, so we consider the case $B\in GL_n\big(\mathbb R\big)$.

$A':= \left[\begin{matrix}0 & -1\\1 & 0\end{matrix}\right]$

$A \in GL_n(\mathbb R)$ has eigenvalues (in the extension field $\mathbb C$) $\lambda \in \big\{i,-i\big\}$, which must come in conjugate pairs, hence $n=2m$. $A$ is similar to its Rational Canonical Form: for some $S \in GL_n(\mathbb R)$,
$S^{-1}AS = \left[\begin{matrix}A' & \mathbf 0&\cdots&\mathbf 0\\\mathbf 0 & A'&\cdots &\mathbf 0\\ \vdots&\vdots &\ddots &\vdots \\ \mathbf 0&\mathbf 0 &\mathbf 0 &A'\end{matrix}\right]$

which is permutation similar to the symplectic matrix $J$
$J=\left[\begin{matrix}\mathbf 0 & I_m\\-I_m & \mathbf 0\end{matrix}\right]= (SP)^{-1}A(SP)=W^{-1}AW $

Let $Z:= W^{-1}BW$.
Conjugation preserves commutativity, so
$ZJ= JZ\implies Z^TJ= JZ^T$
Justification: transpose both sides and use $J^T=-J$, then negate (or apply Fuglede's Theorem, since $J$ is normal).

$\implies J\big(Z^TZ\big) = \big(JZ^T\big)Z = \big(Z^TJ\big)Z= Z^T\big(JZ\big)= Z^T\big(ZJ\big)=\big(Z^TZ\big) J$
which implies, when working over $\mathbb C$, that $J$ and $\big(Z^TZ\big)$ are simultaneously diagonalizable, which implies $J$ also commutes with the square root $(Z^TZ)^\frac{1}{2}$. (Alternatively, staying in $\mathbb R$, the Spectral Theorem for real symmetric matrices, together with e.g. interpolation via a Vandermonde matrix, tells us $(Z^TZ)^\frac{1}{2}$ can be written as a polynomial in $Z^TZ$, hence it commutes with $J$.)

Applying the Polar Decomposition (with $Q$ real orthogonal, since $Z$ is real and invertible), we have
$Z=Q\big(Z^TZ\big)^\frac{1}{2}$
$JQ\big(Z^TZ\big)^\frac{1}{2}=JZ=ZJ=Q\big(Z^TZ\big)^\frac{1}{2}J=QJ\big(Z^TZ\big)^\frac{1}{2}\implies JQ=QJ$

finish 1: via symplectic group:
via left multiplication by $Q^T$
$\implies Q^T J Q =J$
Thus $Q\in Sp_{2m}\big(\mathbb R\big)=Sp_{n}\big(\mathbb R\big)$,
i.e. $Q$ is in the symplectic group (which is path connected) so $\det\big(Q\big) =1$ and
$\det\Big(B\Big)=\det\Big(W^{-1}BW\Big) = \det\Big(Z\Big)= \det\Big(Q\big(Z^TZ\big)^\frac{1}{2}\Big) = 1 \cdot \det\Big(\big(Z^TZ\big)^\frac{1}{2}\Big)\geq 0$
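If it helps, the polar-decomposition/symplectic argument can also be sanity-checked numerically. A minimal sketch (assuming SciPy for the matrix square root; for illustration $Z$ is built directly in a form that commutes with $J$, rather than obtained as $W^{-1}BW$):

```python
import numpy as np
from scipy.linalg import sqrtm

m = 3
F, G = np.random.randn(m, m), np.random.randn(m, m)
J = np.block([[np.zeros((m, m)), np.eye(m)],
              [-np.eye(m),       np.zeros((m, m))]])
Z = np.block([[F, -G],
              [G,  F]])                      # a generic real matrix commuting with J

P = sqrtm(Z.T @ Z).real                      # the positive factor (Z^T Z)^(1/2)
Q = Z @ np.linalg.inv(P)                     # the orthogonal polar factor, Z = Q P

assert np.allclose(J @ Z, Z @ J)             # Z commutes with J
assert np.allclose(J @ Q, Q @ J)             # hence JQ = QJ, as derived above
assert np.allclose(Q.T @ J @ Q, J)           # Q is symplectic
print(np.linalg.det(Q))                      # ~ 1.0
print(np.linalg.det(Z), np.linalg.det(P))    # det(Z) = det(P) >= 0
```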

finish 2: J-invariance:
Suppose for contradiction that $\det\big(Q\big) = -1$. Since $Q$ is real orthogonal, this means $Q$ has an odd number of eigenvalues equal to $-1$, so $r:=\dim \ker \big(Q+I\big)$ is odd.

$J\big(Q+I\big)= \big(Q+I\big)J$, so $\ker \big(Q+I\big)$ is a $J$-invariant subspace of odd dimension. Let $\mathbf B$ and $\mathbf B'$ be two different bases for $\ker \big(Q+I\big)$. $\mathbf B$ is created the typical way, by collecting $r$ linearly independent vectors from $\ker \big(Q+I\big)$ -- these coordinate vectors necessarily have all real components. Now, working over $\mathbb C$, we create $\mathbf B'$, also a basis for $\ker \big(Q+I\big)$, this time using eigenvectors of $J$ (see e.g. "For a real symmetric matrix $A$, are the subspaces given by the span of eigenvectors the only $A$-invariant subspaces?").

So $J\mathbf B = \mathbf B M$ and $J\mathbf B' = \mathbf B' M'$, for $M,M' \in GL_{r}\big(\mathbb C\big)$. Then $M$ and $M'$ are similar so $\text{trace}\big(M\big)=\text{trace}\big(M'\big)$.

$M$ is real (because $J$ and $\mathbf B$ are) so $\text{trace}\big(M\big)\in \mathbb R$. But $M'$ is a diagonal matrix with all diagonal entries equal to $\pm i$, so $\text{trace}\big(M'\big)\in i\cdot\mathbb R \implies \text{trace}\big(M\big)=\text{trace}\big(M'\big)\in \mathbb R \cap i\cdot\mathbb R =\big\{0\big\}$, which gives the linear system
$\left[\begin{matrix}i & -i\\ 1 &1 \end{matrix}\right]\left[\begin{matrix}x_1\\ x_2 \end{matrix}\right]=\left[\begin{matrix}0\\ r \end{matrix}\right]\implies \left[\begin{matrix}x_1\\ x_2 \end{matrix}\right]=\left[\begin{matrix}\frac{r}{2}\\ \frac{r}{2} \end{matrix}\right]$
which is a contradiction since the algebraic multiplicities ($x_i$) of the eigenvalues must be natural numbers. [Alternatively, note that $M$ has real components but is similar to $M'$, whose eigenvalues are all non-real and cannot come in conjugate pairs since $r$ is odd.]

Thus $\det\big(Q\big)=1$ and once again $\det\Big(B\Big)=\det\Big(W^{-1}BW\Big) = \det\Big(Z\Big)= \det\Big(Q\big(Z^TZ\big)^\frac{1}{2}\Big) = 1 \cdot \det\Big(\big(Z^TZ\big)^\frac{1}{2}\Big)\geq 0$

user8675309
  • 12,193
  • For a lighter weight proof that does not involve Rational Canonical Form, one can just observe that $A$ is similar to a real orthogonal matrix $U$ using the argument here: https://math.stackexchange.com/questions/3792992/a-is-real-matrix-and-for-some-k-geq-2-ak-is-similar-to-an-orthogonal-matr/ so $S^{-1}AS=U$ where $U^TU= I$ and $U^2=S^{-1}(-I)S=-I\implies U=-U^T$ so $U$ is skew symmetric. Then apply this https://math.stackexchange.com/questions/4556807/orthogonal-skew-symmetric-matrices-are-orthogonally-conjugate/ to conclude $U$ is similar to symplectic matrix $J$ – user8675309 Dec 30 '23 at 18:34
2

Since $A^2=-Id$, its eigenvalues are $\pm i$. So $A$ does not have a real eigenvector.

Assume, by contradiction, that $\det(B)<0$. Then $B$ has a negative real eigenvalue $\lambda_1$. (Since the non-real eigenvalues of $B$ come in conjugate pairs, if the real eigenvalues of $B$ were all nonnegative then $\det(B)$ would be nonnegative too.)

Let $v\in \mathbb{R}^n$ be such that $Bv=\lambda_1 v$. So $ABv=\lambda_1 Av$. Thus, $BAv=\lambda_1 Av$.

Since $v,Av$ are linearly independent ($A$ does not have a real eigenvector) and $A$ leaves $\operatorname{span}\{v,Av\}$ invariant, there is an invertible real matrix $P_1$ such that

$P_1BP_1^{-1}=\begin{pmatrix}\lambda_1 Id_{2\times 2} & C_{2 \times n-2} \\ 0_{n-2\times 2} & (B_1)_{n-2\times n-2} \end{pmatrix}$ and $P_1AP_1^{-1}=\begin{pmatrix}A_2 & E_{2 \times n-2} \\ 0_{n-2\times 2} & (A_1)_{n-2\times n-2} \end{pmatrix}$.

These matrices still commute. So $B_1A_1=A_1B_1$. Of course $A_1^2=-Id_{n-2\times n-2}$.

In addition, $0>\det(B)=\lambda_1^2\det(B_1)$. So $\det(B_1)<0$.

We can repeat this argument $m=n/2$ times to obtain

$P_mBP_m^{-1}= \begin{pmatrix}\lambda_1 Id_{2\times 2}& C'_{2\times 2} &\ldots & C''_{2\times 2}\\ 0_{2\times 2}& \lambda_2 Id &\ldots & D''_{2\times 2}\\ \vdots & \vdots &\ddots & \vdots\\ 0_{2\times 2}& 0_{2\times 2} &\ldots & \lambda_m Id \\ \end{pmatrix}$.

Now, $\det(B)=\prod_{i=1}^m\lambda_i^2\geq 0$. Absurd!
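A small numerical illustration of the first step (the pair $A$, $B$ below is my own construction, chosen so that $B$ has the negative eigenvalue $-2$):

```python
import numpy as np

m = 3
# Force C = F + iG to have the negative real eigenvalue -2 (the other eigenvalues are arbitrary).
V = np.random.randn(m, m) + 1j * np.random.randn(m, m)
C = V @ np.diag([-2.0, 1 + 1j, 0.5 - 2j]) @ np.linalg.inv(V)
F, G = C.real, C.imag

A = np.block([[np.zeros((m, m)), -np.eye(m)],
              [np.eye(m),         np.zeros((m, m))]])   # A^2 = -I
B = np.block([[F, -G],
              [G,  F]])                                  # AB = BA

# If (F + iG)u = -2u, then v = (Re u, Im u) is a real eigenvector of B for lambda_1 = -2.
u = V[:, 0]
v = np.concatenate([u.real, u.imag])

assert np.allclose(B @ v, -2 * v)
Av = A @ v
assert np.allclose(B @ Av, -2 * Av)    # Av is again an eigenvector of B for lambda_1
assert np.linalg.matrix_rank(np.column_stack([v, Av])) == 2   # v, Av are independent
```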

Daniel
  • 6,057
  • 1
    Nice. Some minor caveats that do not affect the correctness of your proof: (1) $P_1AP_1^{-1}$ may be block upper triangular rather than block diagonal; (2) $B$ can be similar to $(\lambda_1I_2\oplus\cdots\lambda_kI_2)\oplus B'$ where $B'$ has not any real eigenvalue. – user1551 Sep 28 '21 at 04:58
  • Wait. $P_1BP_1^{-1}$ should also be block upper triangular rather than block diagonal. Consider $B=\pmatrix{-I_2&-I_2\\ 0&-I_2}$ and $A=\pmatrix{A_2&0\\ 0&A_2}$ for instance. – user1551 Sep 28 '21 at 10:54
  • @user1551 Thanks for catching these mistakes. I am not sure about your second assertion in your first comment. If we assume that det(B)<0 then det(B')<0. So B' must have a negative eigenvalue. Right? – Daniel Sep 28 '21 at 13:31
1

Inspired by Daniel’s answer: let $\lambda$ be any negative real eigenvalue of $B$ and $m$ be its algebraic multiplicity. Since $A$ and $B$ commute, $V=\ker\big((B-\lambda I)^m\big)$ is an invariant subspace of $A$. The restriction of $A$ to $V$ still satisfies $A^2=-I$, so it has no real eigenvalue; as its non-real eigenvalues come in conjugate pairs, $m=\dim V$ must be even. Since this is true for every negative eigenvalue $\lambda$, the total number of negative eigenvalues of $B$ (counting multiplicity) is even, too. Therefore $\det(B)\ge0$.

user1551
  • 149,263
0

An extremely short analytic answer is as follows:

Let $A_k :=\frac{1}{k} A$.
If $\lambda_j\in \mathbb C$ is an eigenvalue of $B$, then the matrix $B+A_k$ has eigenvalue $\lambda_j + \frac{\alpha_j}{k}$ for some $\alpha_j \in \big\{-i,i\big\}$ (simultaneous triangularization). Each eigenvalue $\lambda_j + \frac{\alpha_j}{k}$ is real for at most one $k \in \mathbb N$. Since there are finitely many eigenvalues, every $\lambda_j + \frac{\alpha_j}{k}$ lies in $\mathbb C\setminus\mathbb R$ for all $k\geq K$ (large enough); hence, since $B+A_k$ is real, its eigenvalues pair up into non-real conjugate pairs when $k\geq K$
$\implies 0\leq \det\big(B+A_k\big)$ for $k\geq K$ $\implies 0 \leq \lim_{k\to \infty}\det\big(B+A_k\big)=\det\big(B\big)$
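A quick numerical illustration of the limit, using an illustrative commuting pair ($A$ in the standard block form with $A^2=-I$, and $B$ a random matrix commuting with it):

```python
import numpy as np

m = 3
F, G = np.random.randn(m, m), np.random.randn(m, m)
A = np.block([[np.zeros((m, m)), -np.eye(m)],
              [np.eye(m),         np.zeros((m, m))]])   # A^2 = -I
B = np.block([[F, -G],
              [G,  F]])                                  # AB = BA

for k in (10, 100, 1000, 10000):
    print(k, np.linalg.det(B + A / k))    # nonnegative for large k, tending to det(B)
print("det(B) =", np.linalg.det(B))
```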

user8675309
  • 12,193