
Hello, I need help with this problem:

Let $V = (V,b)$ be a finite-dimensional vector space equipped with a symmetric, positive definite bilinear form $b$, and let $\{e_1,…,e_n\}$ be an orthonormal basis for the subspace $\ker((P_A)^t)$ ($P_A$ is defined below).

For a matrix $A \in \mathrm{O}(V)$, define $P_A:=\frac{A-JAJ}{2}$, where $J$ is a complex structure (a matrix such that $J^2=-1$ and $J^t=J^{-1}=-J$), and let $\mathrm{O}_*(V)$ be the subset of $\mathrm{O}(V)$ consisting of those $A$ for which $P_A$ is invertible.

Let $n=\dim \ker(P_A)$. For every $j \in \{1,…,n\}$ we define the reflection $r_j$ by $r_j(e_j)=-Je_j$, $r_j(Je_j)=-e_j$, and $r_j(v)=v$ for all $v \in V$ such that $b(v,e_j)=b(v,Je_j)=0$. Finally, let $$R:=r_1r_2\dots r_n \in \mathrm{O}(V).$$
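
(Concretely, in the basis $\{e_j, Je_j\}$ of the plane it moves, each $r_j$ is the matrix $\begin{pmatrix} 0 & -1 \\ -1 & 0 \end{pmatrix}$, an involution with determinant $-1$.)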

I need to prove that $$RA \in \mathrm{SO}_*(V),$$ where, analogously to $\mathrm{O}_*(V)$, $\mathrm{SO}_*(V)$ is the subset of $\mathrm{SO}(V)$ consisting of those $B$ for which $P_B:=\frac{B-JBJ}{2}$ is invertible.

I already proved that $RA \in \mathrm{SO}(V)$; the only thing I haven’t been able to figure out is how to prove that $\frac{1}{2}(RA-JRAJ)$ is invertible, since $n$ can be even or odd.

Also, $P_{r_j}$ is not invertible since $\det(r_j)=-1$.

What is a good, efficient approach to deal with the product of reflections $$R=r_1r_2\cdots r_n?$$

In summary: I’m trying to prove that $RA$ (where $R$ is the product of reflections $r_1r_2\dots r_n$) is an orthogonal matrix with $\det(RA)=+1$, and that the matrix $\frac{1}{2}(RA-J(RA)J)$ is invertible.

Any help will be greatly appreciated. Thanks :).


UPDATE: Two things:

  1. I made a typo: $\{e_1,…,e_n\}$ is an orthonormal basis for the subspace $\ker((P_A)^t)$, not for $\ker(P_A)$.

  2. In the main reference I'm using, the author establishes the following:

These operators (each reflection $r_j$, with $j \in \{ 1,\dots, n \}$) have as $P_{r_j}$ the identity restricted to the subspace orthogonal to $e_j$. Let $R:=r_1\dots r_n$; then the operator $R$ is in block form, its lower right corner being the identity on $(\ker((P_A)^t))^{\perp}$. And so $RA \in \mathrm{SO}_*(V)$.

And that's it. I think the author's argument has many gaps, or things that I'm not getting :(. He says he's following this article, but I have read it multiple times and I don't see anything like what I'm trying to prove (or at least with this notation).


1 Answer


EDIT: There were errors in a previous version. I think they are amended now, and that this answer shows invertibility of $P_{RA}$ (but not $RA \in SO(V)$).


Start by noting some properties of $P$ and $Q$. Denote by $\iota$ the linear transformation on the space of matrices given by conjugation with $J$, that is $$\iota: X \mapsto JXJ^{-1} = - JXJ$$ Since this is an involution, any matrix $X$ decomposes into its linear/anti-linear parts $X = P_X + Q_X$, where $$\begin{aligned}P_X &= (X +\iota X)/2\\ Q_X &= (X - \iota X)/2\end{aligned}$$ These are also the projections onto the $\pm 1$ eigenspaces of $\iota$, and the elements of the positive (resp. negative) eigenspace are exactly those matrices which commute (resp. anticommute) with $J$. In particular, for any $X$ the kernel and image of $P_X$ and of $Q_X$ are $J$-invariant subspaces of $V$.
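
For instance, $\iota$ is indeed an involution because $J^2 = -1$: $$\iota^2 X = J(JXJ^{-1})J^{-1} = J^2X(J^2)^{-1} = (-I)X(-I) = X$$ and applying $\iota$ to the two parts gives $\iota P_X = (\iota X + X)/2 = P_X$ and $\iota Q_X = (\iota X - X)/2 = -Q_X$, i.e. $JP_X = P_XJ$ and $JQ_X = -Q_XJ$.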

These satisfy a composition rule:

$$ \begin{aligned} P_{XY} &= P_XP_Y + Q_XQ_Y\\ Q_{XY} &= P_XQ_Y + Q_XP_Y \end{aligned} $$
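
This follows by expanding $XY = (P_X + Q_X)(P_Y + Q_Y)$ and sorting the four terms by their behaviour under $\iota$, which is multiplicative: $$\iota(XY) = JXYJ^{-1} = (JXJ^{-1})(JYJ^{-1}) = (\iota X)(\iota Y)$$ Hence $P_XP_Y$ and $Q_XQ_Y$ lie in the $+1$ eigenspace, while $P_XQ_Y$ and $Q_XP_Y$ lie in the $-1$ eigenspace.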

Since we assume $J$ is orthogonal, $\iota$ respects the transpose operator: $(\iota X)^t = (J^{-1})^tX^tJ^t = JX^tJ^{-1} = \iota(X^t)$. Hence so do $P$ and $Q$, i.e. $P_{X^t} = (P_X)^t$ and $Q_{X^t} = (Q_X)^t$.

In the case of an orthogonal matrix $A$, we obtain:

$$ \begin{aligned} P_A^tP_A + Q_A^tQ_A &= P_AP_A^t + Q_AQ_A^t = I\\ Q_A^tP_A + P_A^tQ_A &= P_AQ_A^t + Q_AP_A^t = 0 \end{aligned} $$

These equations are crucial: the result would not hold for arbitrary $A \in GL(V)$ without orthogonality.
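
Indeed, they are just the composition rule applied to $A^tA = AA^t = I$, using $P_{A^t} = P_A^t$, $Q_{A^t} = Q_A^t$ and $P_I = I$, $Q_I = 0$: $$\begin{aligned} P_A^tP_A + Q_A^tQ_A &= P_{A^tA} = P_I = I\\ Q_A^tP_A + P_A^tQ_A &= Q_{A^tA} = Q_I = 0 \end{aligned}$$ and similarly for $AA^t = I$.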


Now we review the definition of $R$.

Proposition. Suppose $J$ is orthogonal such that $J^2 = -1$, and $W$ is a $J$-invariant subspace with projection $\pi: V \to W^\perp$. Then

  • There exists an orthonormal basis of $W$ of the form $\{e_i, Je_i\}$
  • The operator $S \in O(W)$ swapping each $e_i \longleftrightarrow Je_i$ anticommutes with $J$
  • The operator $R \in O(V)$ extending $S$ by the identity on $W^\perp$ has $P_R = \pi$ and $Q_R = S(1-\pi)$ (verified after the note below)

Note: The Gram-Schmidt process requires orthogonality of $J$. (Namely, $x \perp Jx$ for all $x$ is equivalent to the antisymmetry of the form $(x, Jy)$, i.e. to $J^t = -J = J^{-1}$. The inductive part of the process, that if $x$ is orthogonal to $\{v_i, Jv_i\}$ then so is $Jx$, also follows easily from orthogonality.)
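
For the last two bullets: on the basis vectors, $JSe_i = J(Je_i) = -e_i$ while $SJe_i = S(Je_i) = e_i$ (and similarly on $Je_i$), so $JS = -SJ$ on $W$. Writing $R = \pi + S(1-\pi)$, the summand $\pi$ commutes with $J$ (both $W$ and $W^\perp$ are $J$-invariant) while $S(1-\pi)$ anticommutes with it, so $$P_R = (R + \iota R)/2 = \pi \qquad Q_R = (R - \iota R)/2 = S(1-\pi)$$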

Now let $W = \text{ker}\, P_A^t = (\text{im}\, P_A)^\perp$ and let $\pi: V \to \text{im}\, P_A$ be the projection. By definition $(1-\pi)P_A = 0$, so $Q_RP_A = 0$. The linear and antilinear parts of $RA$ are then:

$$ \begin{aligned} P_{RA} &= \pi P_A + S(1-\pi)Q_A\\ Q_{RA} &= \pi Q_A \end{aligned} $$
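
Both lines are the composition rule with $P_R = \pi$ and $Q_R = S(1-\pi)$: $$P_{RA} = P_RP_A + Q_RQ_A = \pi P_A + S(1-\pi)Q_A \qquad Q_{RA} = P_RQ_A + Q_RP_A = \pi Q_A + S\,\underbrace{(1-\pi)P_A}_{=\,0}$$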

In the special case when $\pi Q_A = 0$ (i.e. the images of $P_A$ and $Q_A$ are orthogonal), we get $Q_{RA} = 0$ and hence $P_{RA} = RA$, which is in particular invertible; and for this we haven't even used orthogonality of $A$.


Finally we address the main problem. We must consider both decompositions $V = C\oplus C^\perp = K^\perp \oplus K$ where $C = \text{im}\, P_A$ and $K = \text{ker}\, P_A$ with projections $\pi_C,\pi_C^\perp: V\to C,C^\perp$ and inclusions $\theta_K, \theta_K^\perp: K,K^\perp \to V$. Then we can decompose matrices into 4 terms arranged in a matrix like so: $$ \text{End}(V) = \text{Hom}(K^\perp\oplus K, C\oplus C^\perp) = \begin{pmatrix} \text{Hom}(K^\perp, C)&\text{Hom}(K, C)\\ \text{Hom}(K^\perp, C^\perp)&\text{Hom}(K, C^\perp) \end{pmatrix} $$

This is done so that the decomposition of $A = P_A + Q_A$ looks as follows: $$ A = \begin{pmatrix} \pi_CP_A\theta_K^\perp&0\\ 0&0 \end{pmatrix} + \begin{pmatrix} \pi_C Q_A\theta_K^\perp&\pi_C Q_A\theta_K\\ \pi_C^\perp Q_A\theta_K^\perp&\pi_C^\perp Q_A\theta_K \end{pmatrix} $$ Since $R$ is defined with respect to the decomposition $V = C\oplus C^\perp$ it composes easily in this notation to give $$ RA = \begin{pmatrix} \pi_CP_A\theta_K^\perp&0\\ S\pi_C^\perp Q_A\theta_K^\perp&S\pi_C^\perp Q_A\theta_K \end{pmatrix} + \begin{pmatrix} \pi_C Q_A\theta_K^\perp&\pi_C Q_A\theta_K\\ 0&0 \end{pmatrix} $$
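
Note that the two summands are exactly $P_{RA} = \pi P_A + S(1-\pi)Q_A$ and $Q_{RA} = \pi Q_A$ written in block form. In particular $P_{RA}$ is block lower-triangular with diagonal blocks $\pi_CP_A\theta_K^\perp$ and $S\pi_C^\perp Q_A\theta_K$, and since $S$ is invertible on $C^\perp$, invertibility of $P_{RA}$ reduces to these two diagonal blocks being full rank.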

By definition $\pi_C P_A\theta_K^\perp$ is full rank, so to show $P_{RA}$ is invertible, our problem is precisely to see that $\pi_C^\perp Q_A\theta_K$ is full-rank as well. For this we must make use of orthogonality. From the equation $P_A^tQ_A = - Q_A^tP_A$ we see that $$Q_A(\text{ker}\, P_A) \leq \text{ker}\, P^t_A = (\text{im}\, P_A)^\perp$$ or in other words $\pi_C Q_A\theta_K = 0$. Comparing with the above decomposition of $A$, it now follows from invertibility of $A$ that $\pi_C^\perp Q_A\theta_K$ is full rank, as desired.
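
Spelled out: for $v \in \text{ker}\, P_A$ the relation gives $$P_A^t(Q_Av) = -Q_A^t(P_Av) = 0$$ i.e. $Q_Av \in \text{ker}\, P_A^t$. For the last step, since $P_A\theta_K = 0$ and $\pi_CQ_A\theta_K = 0$, the restriction of $A$ to $K$ is exactly $\pi_C^\perp Q_A\theta_K: K \to C^\perp$; it is injective because $A$ is, and $\dim K = \dim \text{ker}\, P_A^t = \dim C^\perp$ because $P_A$ and $P_A^t$ have equal rank, so it is full rank.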

Ben
  • 7,321