
Follow-up to this post: Does gradient descent converge to a minimum-norm solution in least-squares problems?, but with all quantities being matrices.

Suppose we are given a $p \times n$ matrix $\mathbf{X}$ and a $q \times n$ matrix $\mathbf{Y}$. We would like to find a $q \times p$ matrix $\mathbf{C}$ such that the loss function

$$\| \mathbf{Y} - \mathbf{C} \mathbf{X}\|_F^2 $$

is minimized. How can we prove that gradient descent used to estimate $\mathbf{C}$ (multivariate regression) converges to a solution?
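For concreteness, here is what I believe the relevant gradient and fixed-step update are (please correct me if I have this wrong): writing $f(\mathbf{C}) = \| \mathbf{Y} - \mathbf{C} \mathbf{X}\|_F^2$,

$$\nabla f(\mathbf{C}) = 2(\mathbf{C}\mathbf{X} - \mathbf{Y})\mathbf{X}^\top, \qquad \mathbf{C}_{k+1} = \mathbf{C}_k - \eta\, \nabla f(\mathbf{C}_k),$$

and, if I am not mistaken, the gradient is Lipschitz with constant $L = 2\lambda_{\max}(\mathbf{X}\mathbf{X}^\top)$, so the question is whether a fixed step size $0 < \eta < 2/L$ is enough to guarantee convergence.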

@Rodrigo de Azevedo provided a nice example in response to the previously mentioned question.

Thanks.

Rich
    Do you mean gradient descent with a fixed step size? Broadly you need to know something about the Lipschitz constant of the gradient. One can always pick a step size rule such as the Armijo step size and then it will always converge (numerics notwithstanding). – copper.hat Feb 25 '20 at 05:33
  • Yes, fixed step size. I'm trying to prove it will converge – Rich Feb 25 '20 at 11:34
  • Related: https://math.stackexchange.com/q/3558597/339790 – Rodrigo de Azevedo Feb 29 '20 at 21:33
  • If the question is one of a family of questions on the same problem, please consider linking to the other questions. – Rodrigo de Azevedo Feb 29 '20 at 21:34
  • You know the gradient, right? Write the gradient descent update and advance till you get stuck. Then do ask specific questions about the obstacles encountered. Please number such questions. – Rodrigo de Azevedo Mar 03 '20 at 21:03
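Following the suggestions in the comments above (pick a fixed step size from the Lipschitz constant, then iterate the update), here is a minimal numerical sketch. The dimensions, random data, and NumPy setup are assumptions purely for illustration, not part of the original problem:

```python
import numpy as np

rng = np.random.default_rng(0)
p, q, n = 5, 3, 50                       # assumed sizes, purely for the demo
X = rng.standard_normal((p, n))          # p x n data matrix
Y = rng.standard_normal((q, n))          # q x n target matrix

# Gradient of f(C) = ||Y - C X||_F^2 is 2 (C X - Y) X^T, which is Lipschitz
# with constant L = 2 * lambda_max(X X^T); a fixed step eta < 2 / L suffices.
L = 2 * np.linalg.eigvalsh(X @ X.T).max()
eta = 1.0 / L                            # safely below 2 / L

C = np.zeros((q, p))                     # start from the zero matrix
for _ in range(5000):
    grad = 2 * (C @ X - Y) @ X.T         # q x p, same shape as C
    C -= eta * grad

# Compare with the closed-form least-squares solution Y X^+ (pseudoinverse);
# with full row rank X this minimizer is unique.
C_star = Y @ np.linalg.pinv(X)
print(np.linalg.norm(C - C_star))        # should be near machine precision
```

With full-row-rank $\mathbf{X}$ the minimizer $\mathbf{Y}\mathbf{X}^{+}$ is unique, so the printed residual should shrink toward machine precision; this is only an empirical illustration, not a proof.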

0 Answers