
When calculating the weighted least squares solution, taking the derivative and setting it to zero yields the normal equations:

$X^\top WX\beta=X^\top Wy$

where $X_{n\times m}$ is the data matrix, with $n\geq m$ and $X$ of full column rank, and
$W$ is the weight matrix, a diagonal matrix with non-zero diagonal entries.

I want to know how we can prove that $X^\top WX$ is non-singular, in which case we can derive the weighted least squares solution $\beta=(X^\top WX)^{-1}X^\top Wy$.
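To make the normal equations concrete, here is a minimal NumPy sketch (the data, weights, and sizes are made up for illustration). It solves $(X^\top WX)\beta = X^\top Wy$ directly, and checks against the equivalent ordinary least squares problem on $\sqrt{W}$-scaled data:

```python
import numpy as np

# Hypothetical small example: n = 6 observations, m = 3 predictors.
rng = np.random.default_rng(0)
n, m = 6, 3
X = rng.standard_normal((n, m))        # full column rank with probability 1
y = rng.standard_normal(n)
w = rng.uniform(0.5, 2.0, size=n)      # positive weights, e.g. 1/sigma_i^2
W = np.diag(w)

# Weighted normal equations: (X^T W X) beta = X^T W y.
A = X.T @ W @ X
b = X.T @ W @ y
beta = np.linalg.solve(A, b)           # solve the system; avoid forming the inverse

# Equivalent formulation: ordinary least squares on sqrt(W)-scaled data.
sw = np.sqrt(w)
beta_ols, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
print(np.allclose(beta, beta_ols))
```

In practice one uses `np.linalg.solve` (or the scaled `lstsq` form) rather than computing $(X^\top WX)^{-1}$ explicitly, which is slower and less numerically stable.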

Leblanc

1 Answer


In weighted least squares, the diagonal $n\times n$ matrix $W$ has entries $1/\sigma_i^2$, where the error variances $\sigma_i^2$ are strictly positive for every $i$. This makes $W$ positive definite.

Since $X$ has full column rank, $Xz\neq 0$ for every non-zero vector $z\in\mathbb R^m$. As a result, the quadratic form $z^\top(X^\top WX)z=(Xz)^\top W (Xz)$ is strictly positive for every non-zero $z$. Hence $X^\top WX$ is positive definite, and in particular nonsingular.
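The argument above can be checked numerically. In this sketch (sizes and weights chosen arbitrarily), the quadratic form is positive for a random non-zero $z$, all eigenvalues of $X^\top WX$ are positive, and a Cholesky factorization succeeds, which happens exactly when the matrix is positive definite:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 8, 4
X = rng.standard_normal((n, m))              # full column rank with probability 1
W = np.diag(rng.uniform(0.1, 5.0, size=n))   # positive diagonal => W positive definite

A = X.T @ W @ X

# Quadratic form z^T A z = (Xz)^T W (Xz) > 0 for non-zero z:
z = rng.standard_normal(m)
print(z @ A @ z > 0)

# Positive definiteness: all eigenvalues positive, Cholesky succeeds.
L = np.linalg.cholesky(A)                    # raises LinAlgError if not pos. def.
print(np.all(np.linalg.eigvalsh(A) > 0))
```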

StubbornAtom
  • 17,932