
A similar question that I posted is here [1].

Let matrices $X$ and $Y$ be positive semidefinite (PSD).

(1) $X \succeq Y$ implies $X^{1/2} \succeq Y^{1/2}$.

(2) $X \succeq Y$ does not imply $X^{2} \succeq Y^{2}$.

I tried to prove these two statements via the Schur complement but could not succeed, and an attempt at proof by contradiction did not work either. Could anyone give some hints (maybe the two questions have the same form, so one hint is enough)? Thanks in advance!

[1]: How to prove positive semidefiniteness of two matrices through Schur Complement? .....
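A quick numerical sanity check of claims (1) and (2), as a minimal numpy sketch (the matrices $A$, $B$ below, playing the roles of $Y$ and $X$, are an illustrative choice and do not come from the original post):

```python
# Numerical sanity check of the two claims; A, B are illustrative choices.
import numpy as np

def psd_sqrt(M):
    """Square root of a symmetric PSD matrix via its eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.sqrt(np.clip(w, 0, None))) @ V.T

def is_psd(M, tol=1e-10):
    """True if the symmetric matrix M has no eigenvalue below -tol."""
    return np.min(np.linalg.eigvalsh(M)) >= -tol

A = np.array([[1.0, 1.0],
              [1.0, 1.0]])
B = np.array([[2.0, 1.0],
              [1.0, 1.0]])

print(is_psd(A), is_psd(B), is_psd(B - A))    # True True True, i.e. 0 <= A <= B
print(is_psd(psd_sqrt(B) - psd_sqrt(A)))      # True: B^(1/2) >= A^(1/2), claim (1)
print(is_psd(B @ B - A @ A))                  # False: B^2 >= A^2 fails, claim (2)
print(np.linalg.eigvalsh(B @ B - A @ A))      # one eigenvalue is negative
```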

1 Answer


Proposition 1. Let $r\in(0,1)$. i) If $0<A<B$, then $A^{r}<B^{r}$. ii) If $0\leq A\leq B$, then $A^{r}\leq B^{r}$. (Here $A<B$ means that $B-A$ is positive definite, and $A\leq B$ means that $B-A$ is positive semidefinite.)

Here is a hilarious (but correct) proof. For i). It is not difficult to prove that

(*) $0<A<B$ implies that $B^{-1}< A^{-1}$. On the other hand, from

$(Eq)\;\int_0^{+\infty}(\dfrac{x^{r+1}}{1+x^2}-\dfrac{x^r}{t+x})dx=\dfrac{\pi}{\sin(r\pi)}(t^r-\cos(r\pi/2))$, we deduce (replace $t$ with $A$)

$A^r=\cos(r\pi/2)I+\dfrac{\sin(r\pi)}{\pi}\int_0^{+\infty}(\dfrac{x^{r+1}}{1+x^2}I-(A+xI)^{-1}x^r)dx$.

Thus $B^r-A^r=\dfrac{\sin(r\pi)}{\pi}\int_0^{+\infty}\big((A+xI)^{-1}-(B+xI)^{-1}\big)x^r\,dx$ is $>0$ according to $(*)$.

For ii). Proceed by continuity.
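For readers who want to sanity-check the scalar identity $(Eq)$ itself, here is a minimal spot check with scipy (the particular values of $r$ and $t$ are illustrative choices, not from the answer):

```python
# Spot check of (Eq) for one scalar value of r and t (illustrative choices).
import numpy as np
from scipy.integrate import quad

r, t = 0.37, 2.5

def integrand(x):
    # x^(r+1)/(1+x^2) - x^r/(t+x); the two divergent pieces cancel as x -> infinity
    return x**(r + 1) / (1 + x**2) - x**r / (t + x)

lhs, _ = quad(integrand, 0, np.inf, limit=500)
rhs = np.pi / np.sin(r * np.pi) * (t**r - np.cos(r * np.pi / 2))
print(lhs, rhs)   # the two numbers agree closely
```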

EDIT 1. Answer to @stander Qiu. If you know an equality between analytic functions, such as $(Eq)$, then clearly the equality remains valid when you replace $t$ with a diagonal matrix: $f(diag(\lambda_i))=\phi(diag(\lambda_i))$, and even with a diagonalizable matrix: $f(PDP^{-1})=P\phi(D)P^{-1}$. Finally, a general theorem about "matrix functions" says that the equality remains valid for any matrix; in other words, it suffices to prove the required equality for diagonalizable matrices.

cf. [1]: Higham, *Functions of Matrices: Theory and Computation*.
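A minimal numpy sketch of this substitution principle for $f(t)=t^r$, comparing the eigenvalue construction $P\,f(D)\,P^{-1}$ with scipy's `fractional_matrix_power` (the matrix and exponent are illustrative choices):

```python
# Illustrate f(P D P^{-1}) = P f(D) P^{-1} for f(t) = t^r on a symmetric matrix.
import numpy as np
from scipy.linalg import fractional_matrix_power

rng = np.random.default_rng(0)
r = 0.37

M = rng.standard_normal((4, 4))
A = M @ M.T + np.eye(4)            # a random symmetric positive definite matrix

w, V = np.linalg.eigh(A)           # A = V diag(w) V^T
Ar_eig = V @ np.diag(w**r) @ V.T   # apply t -> t^r to the eigenvalues
Ar_ref = fractional_matrix_power(A, r)

print(np.allclose(Ar_eig, Ar_ref)) # True
```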

Here is a second proof of the required result that does not use integration theory.

Proposition 2. i) If (*) $A>0,0\leq B<A$, then $B^{1/2}<A^{1/2}$. ii) If $0\leq B\leq A$, then $B^{1/2}\leq A^{1/2}$.

Proof. For i). $(*)\implies A^{-1/2}BA^{-1/2}<I\implies \rho\big((B^{1/2}A^{-1/2})^T(B^{1/2}A^{-1/2})\big)<1$ (the matrix inside $\rho$ is exactly $A^{-1/2}BA^{-1/2}$) $\implies \rho(B^{1/2}A^{-1/2})<1$

$\implies \rho(A^{-1/4}B^{1/2}A^{-1/4})<1$ (this matrix is similar to $B^{1/2}A^{-1/2}$, hence has the same spectrum) $\implies A^{-1/4}B^{1/2}A^{-1/4}<I \implies B^{1/2}<A^{1/2}$.

For ii). Proceed by continuity.
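Here is a minimal numpy sketch that follows the chain of implications in the proof of i) on randomly generated matrices with $0\leq B<A$ (the matrices are illustrative, not from the answer):

```python
# Follow the steps of Proposition 2 i) numerically on random PSD matrices.
import numpy as np

def psd_sqrt(M):
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.sqrt(np.clip(w, 0, None))) @ V.T

rng = np.random.default_rng(1)
n = 4
MB = rng.standard_normal((n, n))
B = MB @ MB.T                          # B >= 0
MC = rng.standard_normal((n, n))
A = B + MC @ MC.T + 0.1 * np.eye(n)    # A > B and A > 0

A_m12 = np.linalg.inv(psd_sqrt(A))     # A^(-1/2)
C = A_m12 @ B @ A_m12                  # A^(-1/2) B A^(-1/2)
print(np.max(np.linalg.eigvalsh(C)) < 1)            # True: A^(-1/2) B A^(-1/2) < I

T = psd_sqrt(B) @ A_m12                # B^(1/2) A^(-1/2)
print(np.max(np.abs(np.linalg.eigvals(T))) < 1)     # True: spectral radius < 1

print(np.min(np.linalg.eigvalsh(psd_sqrt(A) - psd_sqrt(B))) > 0)  # True: B^(1/2) < A^(1/2)
```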

EDIT 2. Answer to @stander Qiu. I don't know any name for $(Eq)$. Perhaps you are thinking of the Cauchy integral formula (cf. [1] p. 8, and Frobenius and Poincaré for the application to matrices):

$f(A)=\dfrac{1}{2\pi i}\int_{\Gamma}f(z)(zI-A)^{-1}dz$, where $A\in M_n(\mathbb{C})$ and $f$ is analytic on and inside a closed contour $\Gamma$ that encloses the spectrum of $A$.

For example, consider the matrix sign function, defined by $sign(z)=z/(z^2)^{1/2}$ when $Re(z)\not= 0$. Then $sign(A)=A(A^2)^{-1/2}$, and applying the Cauchy formula to $z^{-1/2}$ we obtain

$sign(A)=\dfrac{2}{\pi}A\int_0^{\infty}(t^2I+A^2)^{-1}dt$. Note that we can also prove this equality using the $\arctan$ function. From this, we can derive another integral formula for $A^r$:

$A^r=\dfrac{\sin(\pi r)}{\pi r}A\int_0^{\infty}(t^{1/r}I+A)^{-1}dt$.
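A minimal scalar spot check of the last two integral formulas with scipy (the values of $a$ and $r$ are illustrative choices):

```python
# Scalar spot checks of the sign(A) and A^r integral formulas above.
import numpy as np
from scipy.integrate import quad

a, r = 2.5, 0.37

# sign(a) = (2/pi) * a * int_0^inf (t^2 + a^2)^(-1) dt
sign_a = 2 / np.pi * a * quad(lambda t: 1 / (t**2 + a**2), 0, np.inf)[0]
print(sign_a, np.sign(a))   # both are 1.0

# a^r = sin(pi r)/(pi r) * a * int_0^inf (t^(1/r) + a)^(-1) dt
a_r = np.sin(np.pi * r) / (np.pi * r) * a * quad(lambda t: 1 / (t**(1 / r) + a), 0, np.inf)[0]
print(a_r, a**r)            # both are approximately 1.40
```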

On the other hand, from the equality $\log(x)=\int_0^1(x-1)(t(x-1)+1)^{-1}dt$,

Richter obtains directly $\log(A)=\int_0^1(A-I)(t(A-I)+I)^{-1}dt$.
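A minimal numerical sketch of Richter's formula, comparing a simple midpoint-rule quadrature over $[0,1]$ with `scipy.linalg.logm` (the matrix $A$ is an illustrative choice):

```python
# Check log(A) = int_0^1 (A - I)(t(A - I) + I)^(-1) dt by midpoint quadrature.
import numpy as np
from scipy.linalg import logm

rng = np.random.default_rng(2)
M = rng.standard_normal((3, 3))
A = M @ M.T + np.eye(3)      # symmetric positive definite, so log(A) is defined

def integrand(t):
    return (A - np.eye(3)) @ np.linalg.inv(t * (A - np.eye(3)) + np.eye(3))

# composite midpoint rule with 2000 subintervals on [0, 1]
ts = (np.arange(2000) + 0.5) / 2000
approx = sum(integrand(t) for t in ts) / 2000

print(np.max(np.abs(approx - logm(A))))   # small (midpoint-rule error)
```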

  • Amazing. I have two questions: 1. What is the theorem behind that integral? 2. Could you prove the inequality via matrices? I do not mean that your method is wrong, but it is impossible for me to think in this way. lol – stander Qiu Oct 25 '17 at 08:06
  • Thanks for your answer. A very beautiful answer! I know what a matrix function means and how the equality is preserved, but I want to know: what is the name of $(Eq)$? – stander Qiu Oct 25 '17 at 23:56
  • Could you tell me the answer in this room? https://chat.stackexchange.com/rooms/67697/about-the-name-of-that-analytic-function @loup blanc – stander Qiu Oct 26 '17 at 11:46