Questions tagged [least-squares]

Questions about (linear or nonlinear) least-squares, an estimation method used in statistics, signal processing and elsewhere.

1900 questions
33
votes
3 answers

Difference between least squares and minimum norm solution

Consider a linear system of equations $Ax = b$. If the system is overdetermined, the least squares (approximate) solution minimizes $||b - Ax||^2$. Some sources also mention $||b - Ax||$. If the system is underdetermined one can calculate…
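
A minimal NumPy sketch of the two situations (random data, purely illustrative): for a tall $A$, `np.linalg.lstsq` returns the least-squares solution; for a wide $A$, the pseudoinverse picks the exact solution of smallest norm. Note that minimizing $||b-Ax||$ and $||b-Ax||^2$ give the same minimizer, since squaring is monotone on nonnegative numbers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Overdetermined system (more equations than unknowns): minimize ||b - Ax||.
A_over = rng.standard_normal((6, 3))
b_over = rng.standard_normal(6)
x_ls, *_ = np.linalg.lstsq(A_over, b_over, rcond=None)   # least-squares solution

# Underdetermined system (fewer equations than unknowns): among the infinitely
# many exact solutions, the pseudoinverse picks the one with the smallest ||x||.
A_under = rng.standard_normal((3, 6))
b_under = rng.standard_normal(3)
x_min_norm = np.linalg.pinv(A_under) @ b_under

print(np.linalg.norm(b_over - A_over @ x_ls))          # residual of the LS solution
print(np.linalg.norm(b_under - A_under @ x_min_norm))  # ~0: an exact solution
print(np.linalg.norm(x_min_norm))                      # smallest norm among exact solutions
```
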
32
votes
4 answers

Why does SVD provide the least squares and least norm solution to $ A x = b $?

I am studying the Singular Value Decomposition and its properties. It is widely used to solve equations of the form $Ax=b$. I have seen the following: When we have the equation system $Ax=b$, we calculate the SVD of $A$ as $A=U\Sigma V^T$.…
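
A hedged sketch of that recipe in NumPy (random data, purely illustrative): form the thin SVD, invert only the nonzero singular values, and compare against `np.linalg.lstsq`.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))
b = rng.standard_normal(5)

# Thin SVD: A = U @ diag(s) @ Vt, with U (5x3), s (3,), Vt (3x3).
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# x = V @ Sigma^+ @ U^T @ b, where Sigma^+ inverts the nonzero singular values.
s_inv = np.where(s > 1e-12, 1.0 / s, 0.0)
x_svd = Vt.T @ (s_inv * (U.T @ b))

# Should agree with NumPy's least-squares solver.
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_svd, x_ref))
```
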
29
votes
4 answers

How does the SVD solve the least squares problem?

How do I prove that the least-squares solution for $$\text{minimize} \quad \|Ax-b\|_2$$ is $A^{+} b$, where $A^{+}$ is the pseudoinverse of $A$?
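
One standard argument (sketched here, not taken from the question) uses the full SVD $A = U\Sigma V^T$ and the fact that orthogonal matrices preserve the 2-norm:
$$\|Ax - b\|_2 = \|U^T(Ax - b)\|_2 = \|\Sigma V^T x - U^T b\|_2 = \|\Sigma y - U^T b\|_2, \qquad y := V^T x.$$
For indices with $\sigma_i > 0$ the corresponding residual component is driven to zero by $y_i = (U^T b)_i/\sigma_i$, while the remaining components do not depend on $y$ at all. Setting the free components of $y$ to zero gives $x = V\Sigma^{+}U^T b = A^{+}b$, which is a minimizer, and in fact the minimizer of smallest norm.
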
29
votes
5 answers

Gradient of squared Frobenius norm of a matrix

In linear regression, the loss function is expressed as $$ W \mapsto \frac1N \left\| X W - Y \right\|_{\text{F}}^2 $$ where the matrices $X$ and $Y$ are given. Taking the gradient yields $$ W \mapsto \frac 2N \, X^T( X W - Y ) $$ Why is this so?
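
A quick numerical check of that gradient formula (a sketch with made-up shapes, not from the post): compare the closed form $\frac{2}{N}X^T(XW-Y)$ against central finite differences.

```python
import numpy as np

rng = np.random.default_rng(2)
N, d, k = 50, 4, 3
X = rng.standard_normal((N, d))
Y = rng.standard_normal((N, k))
W = rng.standard_normal((d, k))

loss = lambda W: np.sum((X @ W - Y) ** 2) / N
grad_closed = (2.0 / N) * X.T @ (X @ W - Y)   # claimed gradient

# Central finite differences, entry by entry.
eps = 1e-6
grad_fd = np.zeros_like(W)
for i in range(d):
    for j in range(k):
        E = np.zeros_like(W)
        E[i, j] = eps
        grad_fd[i, j] = (loss(W + E) - loss(W - E)) / (2 * eps)

print(np.max(np.abs(grad_closed - grad_fd)))  # tiny: the two gradients agree
```
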
17
votes
4 answers

Matrix Calculus in Least-Square method

In the proof of the matrix solution of the Least Square Method, I see some matrix calculus that I have no clue about. Can anyone explain it to me or recommend a good link to study this sort of matrix calculus? In the Least-Square method, we want to find such a…
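
For what it's worth, the usual computation (sketched here) expands the objective and applies two standard identities, $\nabla_x (x^T M x) = (M + M^T)x$ and $\nabla_x(c^T x) = c$:
$$ f(x) = \|Ax - b\|_2^2 = x^T A^T A x - 2\, b^T A x + b^T b, \qquad \nabla f(x) = 2 A^T A x - 2 A^T b, $$
so setting the gradient to zero gives the normal equations $A^T A x = A^T b$.
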
17
votes
1 answer

Orthogonal Projection of $ z $ onto the Affine set $ \left\{ x \mid A x = b \right\} $

Suppose $A$ is fat (more columns than rows) and has full row rank. The projection of $z$ onto the affine set $\{x\mid Ax = b\}$ is $$P(z) = z - A^T(AA^T)^{-1}(Az-b)$$ How can I show this? Note: $A^T(AA^T)^{-1}$ is the pseudo-inverse of $A$…
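
A numerical sanity check of the claimed formula (random data, just a sketch): $P(z)$ should be feasible, and the correction $z - P(z)$ should be orthogonal to every direction that stays inside the affine set.

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 3, 6                       # A is "fat": more columns than rows
A = rng.standard_normal((m, n))   # full row rank with probability 1
b = rng.standard_normal(m)
z = rng.standard_normal(n)

P_z = z - A.T @ np.linalg.solve(A @ A.T, A @ z - b)

print(np.allclose(A @ P_z, b))          # P(z) is feasible: A P(z) = b
# Optimality: z - P(z) lies in the row space of A, hence it is orthogonal
# to every direction d with A d = 0, i.e. every direction along the set.
d = np.linalg.svd(A)[2][-1]             # a vector in the null space of A
print(np.allclose(A @ d, 0), np.isclose(d @ (z - P_z), 0))
```
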
15
votes
2 answers

Least-squares solution to system of equations of $4 \times 4$ matrices with $2$ unknown matrices

This question is in the context of a robotics problem. The goal is to track a robot using both its onboard odometry system and a VR system (HTC Vive Pro), with a VR controller mounted to the robot. What is known is the transformation between…
14
votes
1 answer

Understanding L1 and L2 norms

I am not a mathematics student but somehow have to know about L1 and L2 norms. I am looking for some appropriate sources to learn these things, how they work, and what their differences are. I am asking this question since there is a lot of stuff…
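
Not a reference, but a minimal numeric illustration of the two norms in NumPy:

```python
import numpy as np

x = np.array([3.0, -4.0, 0.0])

l1 = np.sum(np.abs(x))          # L1 norm: sum of absolute values -> 7.0
l2 = np.sqrt(np.sum(x ** 2))    # L2 norm: Euclidean length       -> 5.0

print(l1, l2)
print(np.linalg.norm(x, 1), np.linalg.norm(x, 2))  # same values via NumPy
```
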
14
votes
2 answers

Why is the least-squares cost function for linear regression convex?

I was looking at Andrew Ng's machine learning course and for linear regression he defined a hypothesis function to be $h(x) = \theta_0 + \theta_1x_1 + \dots + \theta_nx_n$, where $x$ is a vector of values, so the goal of linear regression is to find…
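
One way to see it (sketched here in matrix form, not taken from the course): stack the feature vectors as rows of a design matrix $X$ and inspect the Hessian of the cost,
$$ J(\theta) = \frac{1}{2m}\|X\theta - y\|_2^2, \qquad \nabla^2 J(\theta) = \frac{1}{m}X^T X, $$
and for any vector $v$, $v^T X^T X v = \|Xv\|_2^2 \ge 0$, so the Hessian is positive semidefinite everywhere and $J$ is convex.
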
14
votes
3 answers

Solve least-squares minimization from overdetermined system with orthonormal constraint

I would like to find the rectangular matrix $X \in \mathbb{R}^{n \times k}$ that solves the following minimization problem: $$ \mathop{\text{minimize }}_{X \in \mathbb{R}^{n \times k}} \left\| A X - B \right\|_F^2 \quad \text{ subject to } X^T X =…
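
The general problem does not have a simple closed form, but in the special case where $A$ itself has orthonormal columns ($A^TA = I$) it reduces to the classical orthogonal Procrustes problem, whose solution is $X = UV^T$ with $A^TB = U\Sigma V^T$. A sketch of that special case only (random data); the general case is typically handled iteratively.

```python
import numpy as np

rng = np.random.default_rng(4)
m, n, k = 8, 5, 3

# Special case: A has orthonormal columns (A^T A = I), so the objective
# reduces to maximizing trace(X^T A^T B) over X with X^T X = I.
A = np.linalg.qr(rng.standard_normal((m, n)))[0]
B = rng.standard_normal((m, k))

U, _, Vt = np.linalg.svd(A.T @ B, full_matrices=False)  # thin SVD of A^T B (n x k)
X = U @ Vt                                              # classical Procrustes solution

print(np.allclose(X.T @ X, np.eye(k)))   # the orthonormality constraint holds
```
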
13
votes
2 answers

Does gradient descent converge to a minimum-norm solution in least-squares problems?

Consider running gradient descent (GD) on the following optimization problem: $$\arg\min_{\mathbf x \in \mathbb R^n} \| A\mathbf x-\mathbf b \|_2^2$$ where $\mathbf b$ lies in the column space of $A$, and the columns of $A$ are not linearly…
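
A small experiment consistent with the question's setup (made-up data, a sketch rather than a proof): starting from the origin, every gradient step lies in the row space of $A$, so GD can only converge to the minimum-norm solution $A^{+}\mathbf b$.

```python
import numpy as np

rng = np.random.default_rng(5)
# Rank-deficient A (the last two columns repeat earlier ones), so the minimizer is not unique.
C = rng.standard_normal((10, 3))
A = np.hstack([C, C[:, :2]])
b = A @ rng.standard_normal(5)               # b lies in the column space of A

x = np.zeros(5)                              # initialize at the origin
lr = 1.0 / (2 * np.linalg.norm(A, 2) ** 2)   # safe step size for this quadratic
for _ in range(50000):
    x -= lr * 2 * A.T @ (A @ x - b)          # gradient of ||Ax - b||^2

# Starting from 0, every update is in the row space of A, so the limit
# is the least-squares solution of smallest norm, i.e. A^+ b.
print(np.linalg.norm(x - np.linalg.pinv(A) @ b))   # ~0
```
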
13
votes
1 answer

How do you solve linear least-squares modulo $2 \pi$?

I have an overdetermined system of $m$ equations ($i = 1, 2, \dots, m$) $$ \sum_{j=1}^n A_{ij} \, x_j = y_i \pmod{2\pi} $$ where the $x$ coefficients are unknown, and $m > n$. This is, essentially, the linear least squares problem but on…
13
votes
4 answers

Prove that the system $A^T A x = A^T b$ always has a solution

Prove that the system $$A^T A x = A^T b$$ always has a solution. The matrices and vectors are all real. The matrix $A$ is $m \times n$. I think it makes sense intuitively but I can't prove it formally.
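
One short argument (a sketch, using only rank counting):
$$ A^T A x = 0 \;\Longrightarrow\; x^T A^T A x = \|Ax\|_2^2 = 0 \;\Longrightarrow\; Ax = 0, $$
so $\operatorname{null}(A^TA) = \operatorname{null}(A)$ and hence $\operatorname{rank}(A^TA) = \operatorname{rank}(A) = \operatorname{rank}(A^T)$. Since $\operatorname{range}(A^TA) \subseteq \operatorname{range}(A^T)$ and the two subspaces have the same dimension, they are equal; in particular $A^Tb \in \operatorname{range}(A^T) = \operatorname{range}(A^TA)$, so the system is consistent.
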
13
votes
2 answers

The SVD Solution to Linear Least Squares / Linear System of Equations

I'm a little confused about the various explanations for using Singular Value Decomposition (SVD) to solve the Linear Least Squares (LLS) problem. I understand that LLS attempts to fit $Ax=b$ by minimizing $\|A\hat{x}-b\|$, then calculating the vector…
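
A sketch of the SVD route in NumPy that also covers the rank-deficient case (made-up data): reciprocals of the numerically zero singular values are set to zero rather than inverted, which is how the pseudoinverse handles them.

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((8, 4))
A[:, 3] = A[:, 0] + A[:, 1]        # make A rank-deficient on purpose
b = rng.standard_normal(8)

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Invert only the singular values that are numerically nonzero.
tol = max(A.shape) * np.finfo(float).eps * s[0]
s_inv = np.where(s > tol, 1.0 / s, 0.0)
x = Vt.T @ (s_inv * (U.T @ b))

print(np.allclose(x, np.linalg.pinv(A) @ b))                  # matches A^+ b
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))   # and matches lstsq
```
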
12
votes
2 answers

simple example of recursive least squares (RLS)

I'm vaguely familiar with recursive least squares algorithms; all the information about them I can find is in the general form with vector parameters and measurements. Can someone point me towards a very simple example with numerical data, e.g. $y =…
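
Not from the post, but here is a minimal sketch for a two-parameter model $y = a x + b$ with made-up numbers; with a forgetting factor of 1 and a large initial covariance, the recursive estimate should approach the batch least-squares fit.

```python
import numpy as np

# Illustrative data roughly following y = 2*x + 1 plus a little noise.
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
ys = np.array([1.1, 2.9, 5.2, 7.0, 8.8, 11.1])

theta = np.zeros(2)            # running estimate of (slope, intercept)
P = 1e6 * np.eye(2)            # large initial covariance: "no prior knowledge"

for x, y in zip(xs, ys):
    phi = np.array([x, 1.0])               # regressor for y ~ slope*x + intercept
    k = P @ phi / (1.0 + phi @ P @ phi)    # gain vector
    theta = theta + k * (y - phi @ theta)  # correct the estimate by the prediction error
    P = P - np.outer(k, phi @ P)           # update the (scaled) covariance

print(theta)                               # close to (2, 1)
print(np.linalg.lstsq(np.column_stack([xs, np.ones_like(xs)]), ys, rcond=None)[0])
```
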