0

Let $A$ be an $m\times n$ matrix with coefficients in $\mathbb R$. Supposedly the fact $(A^+ A)^\ast = A^+ A$ is true; however, when I compute it:

\begin{align} (A^+ A)^\ast &= A^\ast (A^+)^\ast \newline&= A^\ast ((A^\ast A)^{-1}A^\ast)^\ast \newline &= A^\ast A(A^\ast A)^{-1}, \end{align}

which is clearly not equal to $A^+ A$.

jem do
  • This is true even if $A$ is not of full rank. – Mittens Apr 12 '23 at 06:03
  • Clearly not equal? Well, I'd say it's clearly the identity matrix, so... (of course when the matrix inverse makes sense, which you are assuming here). – Jean-Claude Arbaut Apr 12 '23 at 07:12
  • It appears that you define $A^+$ as $(A^\ast A)^{-1}A^\ast$ (note that the inverse makes sense only when $A$ has full column rank). In this case $A^+ A=(A^\ast A)^{-1}A^\ast A=I$. Therefore $(A^+ A)^\ast=I^\ast=I=A^+ A$. – user1551 Apr 12 '23 at 09:38

2 Answers

2

What's your definition of the pseudoinverse? Classically, in the more general case of an arbitrary complex matrix, $A^+$ is by definition the unique matrix satisfying all four of the following conditions:

  • $AA^+A = A$
  • $A^+AA^+ = A^+$
  • $(A^+A)^* = A^+A$
  • $(AA^+)^* = AA^+$

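Not part of the original answer, but the four conditions above are easy to check numerically. Below is a minimal pure-Python sketch for a hand-picked rank-one matrix $A$, whose pseudoinverse is known in closed form to be $A^\ast/4$ (the helper names `matmul`, `ct`, `close` are mine, not from any library):

```python
# Sanity check of the four Moore-Penrose conditions for a hand-picked
# rank-one matrix A = [[1,1],[1,1]], whose pseudoinverse is A^T / 4.

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def ct(X):
    """Conjugate transpose; plain transpose here since entries are real."""
    return [list(row) for row in zip(*X)]

def close(X, Y, tol=1e-12):
    """Entrywise comparison up to a small tolerance."""
    return all(abs(a - b) < tol
               for ra, rb in zip(X, Y) for a, b in zip(ra, rb))

A      = [[1.0, 1.0], [1.0, 1.0]]       # rank one
A_plus = [[0.25, 0.25], [0.25, 0.25]]   # = A^T / 4

print(close(matmul(matmul(A, A_plus), A), A))            # A A+ A  = A
print(close(matmul(matmul(A_plus, A), A_plus), A_plus))  # A+ A A+ = A+
print(close(ct(matmul(A_plus, A)), matmul(A_plus, A)))   # (A+ A)* = A+ A
print(close(ct(matmul(A, A_plus)), matmul(A, A_plus)))   # (A A+)* = A A+
```

All four checks print `True`, and conditions 3 and 4 hold by fiat in the definition, which is the point of the answer.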
So there is nothing to show. In the case where $A$ has linearly independent columns, we get the more compact formula that you used: $A^+ = (A^*A)^{-1}A^*$.

In this case the fact is trivial, since $$A^+A = (A^*A)^{-1}A^*A = I,$$ and the identity matrix is certainly Hermitian.

Linear independence of the columns is a necessary and sufficient condition for the invertibility of $A^*A$. A similar formula can be derived in the case of linearly independent rows, as then $AA^*$ is non-singular and $$A^+ = A^*(AA^*)^{-1}.$$

In this case, using that $(AA^*)^{-1}$ is Hermitian, $$(A^+A)^* = \left(A^*(AA^*)^{-1}A\right)^* = A^*\left((AA^*)^{-1}\right)^*A = A^*(AA^*)^{-1}A = A^+A$$
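As a numerical sketch of both compact formulas (the example matrices and helper names below are hand-picked for illustration; a closed-form $2\times 2$ inverse suffices here):

```python
# Check both compact formulas on hand-picked real matrices:
#   full column rank: A+ = (A^T A)^{-1} A^T, so A+ A = I
#   full row rank:    B+ = B^T (B B^T)^{-1}, so B B+ = I and (B+ B)^T = B+ B

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def ct(X):
    """Conjugate transpose; plain transpose here since entries are real."""
    return [list(row) for row in zip(*X)]

def inv2(M):
    """Closed-form inverse of a 2x2 matrix."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def close(X, Y, tol=1e-12):
    return all(abs(a - b) < tol
               for ra, rb in zip(X, Y) for a, b in zip(ra, rb))

I2 = [[1.0, 0.0], [0.0, 1.0]]

A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]        # linearly independent columns
A_plus = matmul(inv2(matmul(ct(A), A)), ct(A))  # (A^T A)^{-1} A^T
print(close(matmul(A_plus, A), I2))             # A+ A = I

B = ct(A)                                       # linearly independent rows
B_plus = matmul(ct(B), inv2(matmul(B, ct(B))))  # B^T (B B^T)^{-1}
print(close(matmul(B, B_plus), I2))             # B B+ = I
BpB = matmul(B_plus, B)
print(close(ct(BpB), BpB))                      # (B+ B)^T = B+ B
```

All three checks print `True`: in the full-column-rank case $A^+A$ is the identity, and in the full-row-rank case $B^+B$ is a (Hermitian) orthogonal projection rather than the identity.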

0

The singular value decomposition simplifies many problems related to the pseudoinverse. Let $A = U\Sigma V^*\in\mathbb{C}^{m\times n}$ be the singular value decomposition of a matrix $A$, where the first $r={\rm rank}\,A$ diagonal entries of $\Sigma$ are the nonzero singular values $\sigma_1,\dots,\sigma_r$. Then its pseudoinverse can be written as $A^+ = V\Sigma^+U^*$, where the first $r$ diagonal entries of $\Sigma^+$ are $1/\sigma_1,\dots,1/\sigma_r$. Thus we can write $A^+A = V\Sigma^+\Sigma V^*$ and easily check that $\Sigma^+\Sigma = (\Sigma^+\Sigma)^*\in\mathbb{C}^{n\times n}$, since $\Sigma^+\Sigma$ is real and diagonal. Therefore, $(A^+A)^* = (V\Sigma^+\Sigma V^*)^* = V\Sigma^+\Sigma V^* = A^+A$.

Indeed, this is part of the proof that the pseudoinverse can be written using the singular value decomposition. The other three defining properties (see Alexander's answer) can be checked similarly.
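To complement this answer, one can build a tiny SVD by hand (rotation matrices as $U$ and $V$ and a hand-picked rank-one $\Sigma$; everything below is an illustrative sketch, not a general SVD routine) and confirm that $A^+ = V\Sigma^+U^*$ makes $A^+A$ Hermitian:

```python
import math

# Build a 2x2 SVD by hand: U = R(0.3) and V = R(1.1) are rotation (hence
# orthogonal) matrices, and Sigma = diag(3, 0) has rank 1. The angles and
# singular values are hand-picked for illustration.

def R(t):
    """2x2 rotation matrix, orthogonal for any angle t."""
    return [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def ct(X):
    """Conjugate transpose; plain transpose here since entries are real."""
    return [list(row) for row in zip(*X)]

def close(X, Y, tol=1e-12):
    return all(abs(a - b) < tol
               for ra, rb in zip(X, Y) for a, b in zip(ra, rb))

U, V       = R(0.3), R(1.1)
Sigma      = [[3.0, 0.0], [0.0, 0.0]]
Sigma_plus = [[1 / 3, 0.0], [0.0, 0.0]]  # invert only the nonzero singular values

A      = matmul(matmul(U, Sigma), ct(V))       # A  = U Sigma V^*
A_plus = matmul(matmul(V, Sigma_plus), ct(U))  # A+ = V Sigma+ U^*

ApA = matmul(A_plus, A)
print(close(ct(ApA), ApA))                     # (A+ A)^* = A+ A
print(close(matmul(matmul(A, A_plus), A), A))  # A A+ A = A
```

Both checks print `True`: here $A^+A = V\,\Sigma^+\Sigma\,V^* = V\,{\rm diag}(1,0)\,V^*$, which is symmetric regardless of the angles chosen.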