2

I want to prove the following theorem.

For an arbitrary self-adjoint matrix $A \in M_{n,n}(\mathbb{C})$, there exist self-adjoint matrices $B, C$ whose eigenvalues are non-negative, such that $BC=0$ and $A=B-C$.


I know that any $A\in M_{n,n}(\mathbb{C})$ can be written as $A=B+iC$ with $B,C$ self-adjoint matrices, but at the moment I have no idea how to prove the theorem above.

phy_math
  • 6,700
  • Do you know that every self-adjoint matrix is unitarily diagonalisable and its eigenvalues are real numbers? – user1551 Jan 24 '22 at 09:24
  • @user1551, yes, I know the properties of Hermitian (self-adjoint) matrices. – phy_math Jan 24 '22 at 09:27
  • 1
    Then it suffices to consider the case where $A,B,C$ are real diagonal matrices. To begin with, if $a$ is a nonnegative real number, can you find two real numbers $b,c\ge0$ such that $bc=0$ and $a=b-c$? – user1551 Jan 24 '22 at 09:28
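The scalar case behind this hint can be checked directly: for a real number $a$, take $b = \max(a,0)$ and $c = \max(-a,0)$. A minimal sketch in plain Python (the helper name `split` is illustrative, not from the discussion):

```python
# Scalar version of the decomposition: b = max(a, 0), c = max(-a, 0)
# gives b, c >= 0 with b*c = 0 and a = b - c.
def split(a):
    return max(a, 0.0), max(-a, 0.0)

for a in (3.0, -2.5, 0.0):
    b, c = split(a)
    assert b >= 0 and c >= 0 and b * c == 0 and b - c == a
```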

2 Answers

2

Since $A$ is self-adjoint, there exist a unitary $P$ and a diagonal $D$ with real entries such that $A=PDP^\ast$. Now do the job on $D$: write $D=D_++D_-$, where $D_+$ is diagonal with the non-negative entries of $D$ and $D_-$ is diagonal with the non-positive entries, in the most natural manner. Set $B=PD_+P^\ast$ and $C=-PD_-P^\ast$. Then $B$ and $C$ satisfy the required properties.
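This construction can be sanity-checked numerically. A sketch using NumPy (the random self-adjoint test matrix is an assumption for illustration, not part of the answer):

```python
import numpy as np

# Build a random self-adjoint (Hermitian) test matrix A.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (X + X.conj().T) / 2

# Diagonalize A = P D P*, then split D into its non-negative
# and non-positive diagonal parts, as in the answer.
w, P = np.linalg.eigh(A)                 # real eigenvalues, unitary P
D_plus = np.diag(np.maximum(w, 0))       # non-negative entries of D
D_minus = np.diag(np.minimum(w, 0))      # non-positive entries of D
B = P @ D_plus @ P.conj().T
C = -(P @ D_minus @ P.conj().T)

assert np.allclose(A, B - C)             # A = B - C
assert np.allclose(B @ C, 0)             # BC = 0, since D_+ D_- = 0
assert np.linalg.eigvalsh(B).min() >= -1e-10   # B non-negative
assert np.linalg.eigvalsh(C).min() >= -1e-10   # C non-negative
```

$BC = -PD_+D_-P^\ast$ vanishes because $D_+$ and $D_-$ are diagonal with disjoint supports.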

2

I tried to find an answer that does not (at least in appearance) use eigenvalues. Here it goes.

The following only assumes that every non-negative self-adjoint matrix $M$ has a unique non-negative square root, and that this square root commutes with every matrix that commutes with $M$.

Let $S$ be the unique non-negative square root of $A^2$. Let $B := \frac{A+S}{2}$ and $C := \frac{S-A}{2}$. The matrices $B$ and $C$ are self-adjoint as linear combinations with real coefficients of self-adjoint matrices.

Since $A$ commutes with $A^2$, it also commutes with $S$, so $BC = \frac{1}{4}(A+S)(S-A) = \frac{1}{4}\left(S^2 - A^2\right) = 0$.

Let us prove that $B$ is non-negative, that is, for every $\phi \in \mathcal{H}$, $\langle \phi, B\phi\rangle \geq 0$.

Notice that $B^2 = \frac{1}{4}(A^2 + 2AS + S^2) = \frac{1}{4}(2AS + 2S^2) = \frac{1}{2}(AS+S^2) = BS$; notice also that $B$ commutes with $S$, so that $B^2S$ is a non-negative operator (the product of two commuting non-negative self-adjoint operators is non-negative).

If $\phi \in \ker B$, then $\langle \phi,B\phi\rangle = 0$. If $\phi \in \operatorname{im} B$, then there is some $\psi$ such that $\phi = B\psi$, and therefore $\langle \phi,B\phi\rangle = \langle B\psi,B^2\psi\rangle = \langle B\psi,BS\psi\rangle = \langle \psi,B^2S\psi\rangle \geq 0$.

Moreover, it is classical that $\ker B = (\operatorname{im} B)^\perp$, so let $\phi$ be any vector; decompose it as $\phi = \psi_1 +\psi_2$ with $\psi_1 \in \ker B$ and $\psi_2 \in \operatorname{im} B$. Then $\langle \phi,B\phi\rangle = \langle \psi_1 + \psi_2, B\psi_2\rangle = \langle \psi_1, B \psi_2\rangle + \langle \psi_2,B\psi_2\rangle = \langle \psi_2,B\psi_2\rangle \geq 0$, since $B\psi_2 \in \operatorname{im} B$ is orthogonal to $\psi_1$.

The same argument works for $C$.
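This square-root construction can also be checked numerically. A NumPy sketch (here $S = (A^2)^{1/2}$ is computed via an eigendecomposition purely for convenience, which the proof itself deliberately avoids; the test matrix is an assumption):

```python
import numpy as np

# Random self-adjoint test matrix A.
rng = np.random.default_rng(1)
X = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
A = (X + X.conj().T) / 2

# S = unique non-negative square root of A^2,
# computed here from the eigendecomposition of A^2.
w, P = np.linalg.eigh(A @ A)             # A^2 is non-negative
S = P @ np.diag(np.sqrt(np.maximum(w, 0))) @ P.conj().T

# B = (A + S)/2, C = (S - A)/2, as in the answer.
B = (A + S) / 2
C = (S - A) / 2

assert np.allclose(A, B - C)                     # A = B - C
assert np.allclose(B @ C, 0, atol=1e-10)         # BC = 0
assert np.linalg.eigvalsh(B).min() >= -1e-10     # B non-negative
assert np.linalg.eigvalsh(C).min() >= -1e-10     # C non-negative
```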

Plop
  • 2,964