
Suppose $x_0$ is a solution of $Ax=b$, where $b\neq0$. How to prove that $x_0 = Gb$, where $G$ is a generalized inverse matrix of A?

This is Lemma 9.3 of Linear Algebra and Matrix. Here is the proof, but I just cannot follow it.

The last line says $x_0$ has the form $u + GA(x_0 - u)$, but I do not understand why $u + GA(x_0 - u) = G^*b$ for some generalized inverse $G^*$ of $A$.

Lemma 9.3. Let $\boldsymbol{A}\boldsymbol{x}=\boldsymbol{b}$ be a consistent linear system, where $\boldsymbol{A}$ is $m \times n$ and at least one of the following two conditions holds: (a) $\rho(\boldsymbol{A})=n$ (i.e., $\boldsymbol{A}$ is of full column rank), or (b) $\boldsymbol{b} \neq \boldsymbol{0}$. Then $\boldsymbol{x}_{0}$ is a solution to $\boldsymbol{A}\boldsymbol{x}=\boldsymbol{b}$ if and only if $\boldsymbol{x}_{0}=\boldsymbol{G}\boldsymbol{b}$ for some generalized inverse $\boldsymbol{G}$ of $\boldsymbol{A}$.

Proof. If $\boldsymbol{x}_{0}=\boldsymbol{G}\boldsymbol{b}$ for some generalized inverse $\boldsymbol{G}$ of $\boldsymbol{A}$, then Definition 9.2 itself ensures that $\boldsymbol{x}_{0}$ is a solution.
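A quick numerical sanity check of that easy direction, using NumPy's Moore–Penrose pseudoinverse as one particular generalized inverse (the matrix $A$ and the vector used to build a consistent $b$ are my own arbitrary choices, not from the book):

```python
import numpy as np

# A rank-deficient 3x4 matrix and a consistent right-hand side b = A @ z.
A = np.array([[1., 2., 3., 4.],
              [2., 4., 6., 8.],
              [1., 0., 1., 0.]])
z = np.array([1., -1., 2., 0.5])
b = A @ z  # consistent by construction, and b != 0

# The Moore-Penrose pseudoinverse is one generalized inverse: A @ G @ A == A.
G = np.linalg.pinv(A)
assert np.allclose(A @ G @ A, A)

# As in the lemma's easy direction, x0 = G b solves A x = b.
x0 = G @ b
assert np.allclose(A @ x0, b)
```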

Suppose (a) holds. Then $A$ has full column rank, which means that $A x=b$ has a unique solution (Theorem 9.4 ). This solution must be of the form $G b,$ where $G$ is a generalized inverse of $A,$ because $G b$ is a solution.

Now suppose that (b) holds, that is, $\boldsymbol{b} \neq \mathbf{0}$. If $\boldsymbol{x}_{0}$ is a solution of $\boldsymbol{A}\boldsymbol{x}=\boldsymbol{b}$, then $\boldsymbol{x}_{0}=\boldsymbol{G} \boldsymbol{b}+\left(\boldsymbol{I}_{n}-\boldsymbol{G} \boldsymbol{A}\right) \boldsymbol{u}$ for some vector $\boldsymbol{u} \in \Re^{n}$ (note that $\boldsymbol{G}\boldsymbol{A}$ is $n \times n$, so the identity matrix here is $\boldsymbol{I}_n$). Since $\boldsymbol{b}=\boldsymbol{A} \boldsymbol{x}_{0}$, we have $$\boldsymbol{x}_{0}=\boldsymbol{G} \boldsymbol{A}\boldsymbol{x}_{0}+\left(\boldsymbol{I}_{n}-\boldsymbol{G}\boldsymbol{A}\right) \boldsymbol{u}=\boldsymbol{u}+\boldsymbol{G}\boldsymbol{A}\left(\boldsymbol{x}_{0}-\boldsymbol{u}\right).$$
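The representation used in that step can at least be checked numerically: for any generalized inverse $G$ and any $u$, the vector $x = Gb + (I_n - GA)u$ solves $Ax=b$, and substituting $b = Ax$ recovers the identity $x = u + GA(x-u)$ from the proof's last line. A sketch with NumPy, where $A$, $b$, and the choice of pseudoinverse as generalized inverse are my own illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[1., 2., 3., 4.],
              [2., 4., 6., 8.],
              [1., 0., 1., 0.]])      # rank 2, so solutions are not unique
b = A @ np.array([1., -1., 2., 0.5])  # consistent by construction, b != 0

G = np.linalg.pinv(A)                 # one generalized inverse (A @ G @ A == A)
n = A.shape[1]

for _ in range(5):
    u = rng.standard_normal(n)
    x = G @ b + (np.eye(n) - G @ A) @ u   # the general-solution formula
    assert np.allclose(A @ x, b)          # x really solves A x = b
    # Substituting b = A x yields the proof's last line:
    assert np.allclose(x, u + G @ A @ (x - u))
```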

Thanks!

Matcha Latte
  • Not sure what the authors are hinting at, but one can easily prove that $x_0=Gb$ for some $G$ by mapping basis vectors appropriately. Let $A$ be $m\times n$ of rank $r$, and let $\{Ax_0,\ldots,Ax_{r-1}\}$ be a basis of the column space of $A$. Let $b_i=Ax_i$ (so that $b_0=b$) and extend $\{b_0,\ldots,b_{r-1}\}$ to a full basis $\{b_0,\ldots,b_{m-1}\}$ of $\mathbb R^m$. Define $Gb_i$ to be $x_i$ if $i<r$ and $0$ otherwise. Then $AGA=A$ and $Gb=x_0$. – user1551 Jun 08 '20 at 22:57
  • @user1551 Thank you. How to prove the existence of $G$ so that $Gb_i = x_i$ for any $i<r$? – mxdxzxyjzx Jun 09 '20 at 00:58
  • We have already defined it to be so. – user1551 Jun 09 '20 at 07:31
  • @user1551 Yes. But why is it well defined? – mxdxzxyjzx Jun 09 '20 at 14:24
  • Why isn't it well defined? You pick a basis of the domain, map the basis vectors to some specific points, and extend the mapping by linearity. This is a usual way of defining linear maps. I don't see any ambiguity here. – user1551 Jun 09 '20 at 14:50
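user1551's construction can be carried out concretely. The sketch below is my own NumPy translation of the comment (not the book's proof): it builds $G$ by sending each basis vector $b_i = Ax_i$ of the column space back to $x_i$ and the remaining basis vectors of $\mathbb R^m$ to $0$, then extends by linearity via $G = X B^{-1}$, and finally checks $AGA=A$ and $Gb=x_0$. The specific matrix and vectors are arbitrary choices for illustration.

```python
import numpy as np

# A 3x4 matrix of rank 2, a chosen solution x0, and b = A @ x0 != 0.
A = np.array([[1., 2., 3., 4.],
              [2., 4., 6., 8.],
              [1., 0., 1., 0.]])
x0 = np.array([1., -1., 2., 0.5])
b = A @ x0                      # b = [7, 14, 3], nonzero
m, n = A.shape                  # m = 3, n = 4; rank r = 2

# Step 1: pick x_0, ..., x_{r-1} so that {A x_i} is a basis of col(A),
# with x_0 the given solution.  Here the first standard basis vector
# works as x1, since A @ x1 is not a multiple of b.
x1 = np.array([1., 0., 0., 0.])
b0, b1 = b, A @ x1

# Step 2: extend {b0, b1} to a basis of R^m with one more vector.
b2 = np.array([1., 0., 0.])
B = np.column_stack([b0, b1, b2])
assert np.linalg.matrix_rank(B) == m   # really a basis of R^m

# Step 3: define G on that basis (G b_i = x_i for i < r, else 0)
# and extend by linearity, i.e. G = X @ B^{-1}.
X = np.column_stack([x0, x1, np.zeros(n)])
G = X @ np.linalg.inv(B)

# G is a generalized inverse of A, and it maps b to the given solution.
assert np.allclose(A @ G @ A, A)
assert np.allclose(G @ b, x0)
```

The map is well defined for exactly the reason user1551 states: once the images of a basis of $\mathbb R^m$ are fixed, linear extension determines $G$ uniquely, which is what $G = XB^{-1}$ computes.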
