
The left pseudoinverse of $A$, namely $(A^TA)^{-1}A^T$, solves the problem $\min_x \|b - Ax\|^2$; i.e., $x = (A^TA)^{-1}A^Tb$ is the solution to this problem.
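As a quick numerical sanity check, here is a minimal NumPy sketch (the random tall matrix and seed are purely illustrative) confirming that the left-pseudoinverse formula matches a library least-squares solve:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 3))   # tall matrix with full column rank (illustrative)
b = rng.standard_normal(8)

x_pinv = np.linalg.inv(A.T @ A) @ A.T @ b        # left pseudoinverse formula
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)  # library least-squares solver

print(np.allclose(x_pinv, x_lstsq))              # True
```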

And there is a well-known property that if we use the precision matrix $\Omega^{-1}$ as the weights,

$\hat{x} = \left( A^{\mathrm{T}} \Omega^{-1} A \right)^{-1} A^{\mathrm{T}} \Omega^{-1} b$

is the best linear unbiased estimator for $x$. Reference: https://en.wikipedia.org/wiki/Generalized_least_squares
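A small sketch of that GLS estimator, assuming an arbitrary symmetric positive definite covariance $\Omega$ built just for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 3))
b = rng.standard_normal(8)
M = rng.standard_normal((8, 8))
Omega = M @ M.T + 8 * np.eye(8)   # illustrative SPD covariance matrix

Omega_inv = np.linalg.inv(Omega)
# GLS estimator: (A^T Omega^{-1} A)^{-1} A^T Omega^{-1} b
x_gls = np.linalg.solve(A.T @ Omega_inv @ A, A.T @ Omega_inv @ b)
print(x_gls)
```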

The right pseudoinverse $A^T(AA^T)^{-1}$ solves the problem $\min_x \|x\|^2$ subject to $Ax = b$; i.e., the solution is $x = A^T(AA^T)^{-1}b$.
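And a matching sketch for the minimum-norm solution, using a wide, full-row-rank $A$ (dimensions chosen arbitrarily for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 8))   # wide matrix with full row rank (illustrative)
b = rng.standard_normal(3)

x_min_norm = A.T @ np.linalg.inv(A @ A.T) @ b   # right pseudoinverse formula
x_pinv = np.linalg.pinv(A) @ b                  # Moore-Penrose pseudoinverse

print(np.allclose(A @ x_min_norm, b))           # constraint Ax = b holds
print(np.allclose(x_min_norm, x_pinv))          # matches the pseudoinverse solution
```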

Question: for the right pseudoinverse, is there a property analogous to the GLS and left-pseudoinverse result above, where adding a weight of some sort would give a best linear unbiased estimator?

zvi

1 Answer


What you show is the Weighted Linear Least Squares problem.
With some preprocessing of the matrices you can reformulate it as an OLS problem.

Given that, you may use either formulation (or the SVD-based one, which generalizes both). A whitening sketch of that preprocessing step is shown below.
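Here is a minimal sketch of that preprocessing, assuming we whiten with a Cholesky factor of $\Omega^{-1}$ and then solve ordinary least squares on the transformed system (matrices are illustrative; $\Omega$ is any SPD covariance):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((8, 3))
b = rng.standard_normal(8)
M = rng.standard_normal((8, 8))
Omega = M @ M.T + 8 * np.eye(8)   # illustrative SPD covariance matrix

L = np.linalg.cholesky(np.linalg.inv(Omega))   # Omega^{-1} = L @ L.T
A_w, b_w = L.T @ A, L.T @ b                    # whitened (OLS) problem

x_ols, *_ = np.linalg.lstsq(A_w, b_w, rcond=None)
x_gls = np.linalg.solve(A.T @ np.linalg.inv(Omega) @ A,
                        A.T @ np.linalg.inv(Omega) @ b)
print(np.allclose(x_ols, x_gls))               # True: same estimator
```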

Royi