
Let $A$ and $B$ be two positive semidefinite $n \times n$ matrices. Does the following matrix quadratic equation have a solution?

$$X^TBX=A$$

When $B$ is positive definite, the general solution is

$$X=B^{-1/2}QA^{1/2}$$

where $Q$ is an orthogonal matrix.
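As a quick numerical sanity check of this positive definite case, here is a minimal NumPy sketch (not part of the original post; the random test matrices, the random orthogonal $Q$, and the helper `psd_sqrt` are illustrative assumptions):

```python
import numpy as np

def psd_sqrt(S):
    """Symmetric square root of a real symmetric PSD matrix via eigendecomposition."""
    w, U = np.linalg.eigh(S)
    return U @ np.diag(np.sqrt(np.clip(w, 0, None))) @ U.T

rng = np.random.default_rng(0)
n = 4

# Illustrative random positive definite B and positive semidefinite A.
M = rng.standard_normal((n, n)); B = M @ M.T + n * np.eye(n)
N = rng.standard_normal((n, n)); A = N @ N.T

Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # a random orthogonal Q
X = np.linalg.inv(psd_sqrt(B)) @ Q @ psd_sqrt(A)   # X = B^{-1/2} Q A^{1/2}

print(np.allclose(X.T @ B @ X, A))                 # True (up to rounding)
```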

nizar

1 Answer


We assume that the matrices are real. Note that $A^{1/2},B^{1/2}$ are well defined.

Let $Y=B^{1/2}X$. Then $Y^TY=A$ and $Y=QA^{1/2}$ where $Q$ is arbitrary in $O(n)$. Finally, we consider the linear equation $(*)$ $B^{1/2}X=QA^{1/2}$.

We use, in the sequel, the Moore-Penrose inverse $(\cdot)^{+}$; cf.

https://en.wikipedia.org/wiki/Moore%E2%80%93Penrose_inverse

EDIT. $(*)$ has a solution iff

$(**)$ $B^{1/2}{B^{1/2}}^+QA^{1/2}=QA^{1/2}$.

Note that the condition $(**)$ depends on $Q$ and, consequently, determines the admissible matrices $Q$. When $(**)$ is satisfied for a fixed $Q$ (which implies in particular that $rank(B)\geq rank(A)$, as user1551 wrote), then the general solution of $(*)$ (for this choice of $Q$) is

$X={B^{1/2}}^+QA^{1/2}+(I-{B^{1/2}}^+B^{1/2})W$ where $W$ is an arbitrary $n\times n$ matrix.
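To illustrate $(**)$ and this general solution numerically, here is a minimal NumPy sketch under illustrative assumptions: diagonal $A,B$ chosen so that $(**)$ holds with $Q=I$, and `np.linalg.pinv` standing in for $(\cdot)^{+}$.

```python
import numpy as np

def psd_sqrt(S):
    """Symmetric square root of a real symmetric PSD matrix via eigendecomposition."""
    w, U = np.linalg.eigh(S)
    return U @ np.diag(np.sqrt(np.clip(w, 0, None))) @ U.T

n = 4
B = np.diag([3.0, 2.0, 1.0, 0.0])   # singular, rank 3
A = np.diag([4.0, 1.0, 0.0, 0.0])   # rank 2, range(A) inside range(B)
Q = np.eye(n)                        # with these choices, (**) holds for Q = I

Bh  = psd_sqrt(B)                    # B^{1/2}
Bhp = np.linalg.pinv(Bh)             # (B^{1/2})^+
Ah  = psd_sqrt(A)                    # A^{1/2}

# Solvability condition (**): B^{1/2} (B^{1/2})^+ Q A^{1/2} = Q A^{1/2}
print(np.allclose(Bh @ Bhp @ Q @ Ah, Q @ Ah))       # True

# General solution X = (B^{1/2})^+ Q A^{1/2} + (I - (B^{1/2})^+ B^{1/2}) W
W = np.random.default_rng(1).standard_normal((n, n))
X = Bhp @ Q @ Ah + (np.eye(n) - Bhp @ Bh) @ W
print(np.allclose(X.T @ B @ X, A))                  # True: X indeed satisfies X^T B X = A
```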

Note that if $B=Udiag(\lambda_1,\cdots,\lambda_r,0_{n-r})U^T$ where $\lambda_i>0, r=rank(B)$ and $U\in O(n)$, then ${B^{1/2}}^+=Udiag(1/\sqrt{\lambda_1},\cdots,1/\sqrt{\lambda_r},0_{n-r})U^T$. In particular, $B^{1/2},{B^{1/2}}^+$ commute.
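A small sketch of this note (the random orthogonal $U$ and the eigenvalues are arbitrary illustrative choices), confirming that the explicit formula matches the numerical Moore-Penrose inverse and that the two matrices commute:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative B = U diag(lambda_1, ..., lambda_r, 0) U^T with n = 4, r = 3.
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))    # random orthogonal U
lam = np.array([5.0, 2.0, 1.0, 0.0])
B = U @ np.diag(lam) @ U.T

Bh = U @ np.diag(np.sqrt(lam)) @ U.T                # B^{1/2}
inv_sqrt = np.array([1.0 / np.sqrt(l) if l > 0 else 0.0 for l in lam])
Bhp = U @ np.diag(inv_sqrt) @ U.T                   # claimed formula for (B^{1/2})^+

print(np.allclose(Bhp, np.linalg.pinv(Bh)))         # matches the Moore-Penrose inverse
print(np.allclose(Bh @ Bhp, Bhp @ Bh))              # B^{1/2} and (B^{1/2})^+ commute
```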

As a consequence of the previous note, it is easy to see that

$\textbf{Remark 1}$. The above solutions satisfy $X^TBX=A$.

$\textbf{Remark 2}$. $(**)$ is equivalent to $Q(im(A))\subset im(B)$.
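A minimal numerical illustration of Remark 2 (the diagonal matrices and the permutation $Q$ below are hypothetical examples, not from the original answer): when $range(A)\not\subset range(B)$, condition $(**)$ fails for $Q=I$, but it can still hold for a suitable $Q$ that moves $im(A)$ into $im(B)$.

```python
import numpy as np

def psd_sqrt(S):
    w, U = np.linalg.eigh(S)
    return U @ np.diag(np.sqrt(np.clip(w, 0, None))) @ U.T

B = np.diag([3.0, 2.0, 1.0, 0.0])   # im(B) = span(e1, e2, e3)
A = np.diag([0.0, 0.0, 4.0, 1.0])   # im(A) = span(e3, e4), not contained in im(B)

Bh, Ah = psd_sqrt(B), psd_sqrt(A)
Bhp = np.linalg.pinv(Bh)

def condition_star_star(Q):
    """Condition (**) for a given orthogonal Q."""
    return np.allclose(Bh @ Bhp @ Q @ Ah, Q @ Ah)

print(condition_star_star(np.eye(4)))   # False: Q = I leaves im(A) outside im(B)

# The permutation sending e3 -> e1 and e4 -> e2 moves Q(im(A)) into im(B).
Q = np.zeros((4, 4))
Q[[0, 1, 2, 3], [2, 3, 0, 1]] = 1.0     # an orthogonal (permutation) matrix
print(condition_star_star(Q))            # True: this Q is admissible
```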

  • Guys, may I ask you the following? I think it is in large part related to the question above. Given $X^{T}BX+Y^{T}CY=A$, where $A$ is symmetric and positive definite (in my particular case it is a covariance matrix), $C$ and $B$ are diagonal and positive definite (in my particular case they are the matrices of eigenvalues of $A$, where the first $k$ eigenvalues are stored on the diagonal of $B$ and the remaining ones on the diagonal of $C$), and the unknowns are $X$ and $Y$ (in my case I can put $X$ equal to the eigenvectors corresponding to the eigenvalues in $B$), is the solution for $X$ and $Y$ unique? – Fr1 Aug 17 '19 at 12:02
  • Thanks for your answer, but I think the condition $B^{1/2}{B^{1/2}}^+QA^{1/2}=QA^{1/2}$ is sufficient, not necessary. For example, take $A=CBC^T$; it is clear that $X=C$ is a solution, but I couldn't see why $Range(A)\subset Range(B)$. Please, can you clarify this point? – nizar Aug 17 '19 at 14:29
  • The condition is necessary and sufficient. You confuse the definitions; $rank(A)=dim(range(A))=dim(im(A))$; in other words, $rank(A)$ is an integer and $range(A)$ is a subspace. Moreover, $rank(A)\leq rank(B)$ does not imply that $range(A)\subset range(B)$. –  Aug 17 '19 at 15:08
  • @Fr1 , if $B,C$ are symmetric $>0$ and $A$ symmetric $\geq 0$, then there are infinitely many solutions. For example, $t\in [0,1]\mapsto (X_t,Y_t)$ where ${X_t}^TBX_t=tA,{Y_t}^TCY_t=(1-t)A$, these solutions being explicitly given by the formula in the OP's question (see the sketch after these comments). Of course, there are many other solutions. –  Aug 17 '19 at 15:17
  • I'm not confusing the rank and the image of $A$. In your proof you take $Q$ as an arbitrary orthogonal matrix, so to simplify my reasoning I'll take $Q=I_n$. The condition $B^{1/2}{B^{1/2}}^+A^{1/2}=A^{1/2}$ is equivalent to $range(A^{1/2})\subset range(B^{1/2})$, which is also equivalent to $range(A)\subset range(B)$. If this condition is necessary, then for every matrix that can be written as $A=CBC^T$ we need to have $range(CBC^T)\subset range(B)$, and a more general statement would be that for every matrix $C$ and semidefinite matrix $B$ we have $range(CBC^T)\subset range(B)$, which does not seem to always be correct. – nizar Aug 17 '19 at 16:41
  • You did not understand, but it's partly my fault because I have not detailed. The condition depends on the choice of $Q$. In particular, if $range(A)$ is not included in $range(B)$, then you cannot choose $Q=I$. I will rewrite a part of the answer to make the details clearer. –  Aug 17 '19 at 16:58
  • @loupblanc yes, sorry, you're right; I forgot to mention that I am working with normalized eigenvectors, so if we add the constraint that the columns of $X$ and $Y$ must all sum to 1, the solution is unique, right? – Fr1 Aug 17 '19 at 17:22
  • @loupblanc anyway, I will open a separate question so that you can paste your answer there and be remunerated for your help with an "answered", so just post your reply there. I will tag you below the question so that you will be able to spot it very easily without any search. – Fr1 Aug 17 '19 at 17:23
  • @Fr1 , Ok thanks. –  Aug 17 '19 at 18:16
  • @loupblanc you can find the question here – Fr1 Aug 17 '19 at 18:49
  • @loupblanc thanks so much for your explanation, now I get it, and such a matrix always exists if $rk(A) \leq rk(B)$. – nizar Aug 17 '19 at 19:34
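A minimal numerical sketch of the one-parameter family $(X_t,Y_t)$ mentioned in the comments, using the formula from the question with $Q=I$ (the random symmetric positive definite test matrices and the helper functions are illustrative assumptions):

```python
import numpy as np

def psd_sqrt(S):
    """Symmetric square root of a real symmetric PSD matrix via eigendecomposition."""
    w, U = np.linalg.eigh(S)
    return U @ np.diag(np.sqrt(np.clip(w, 0, None))) @ U.T

rng = np.random.default_rng(3)
n = 4

def rand_spd(n):
    """Illustrative random symmetric positive definite matrix."""
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A, B, C = rand_spd(n), rand_spd(n), rand_spd(n)

for t in (0.0, 0.3, 1.0):
    # X_t^T B X_t = t A and Y_t^T C Y_t = (1 - t) A, via the formula in the question.
    Xt = np.linalg.inv(psd_sqrt(B)) @ psd_sqrt(t * A)
    Yt = np.linalg.inv(psd_sqrt(C)) @ psd_sqrt((1 - t) * A)
    print(t, np.allclose(Xt.T @ B @ Xt + Yt.T @ C @ Yt, A))   # True for every t
```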