I have the following multidimensional scaling (MDS) minimization problem in vectors $v_1, v_2, \dots, v_n \in \mathbb R^2$
$$\min_{v_1, v_2, \dots, v_n} \sum_{i,j} \left( \|v_i - v_j\| - d_{i,j} \right)^2$$
which I wish to solve numerically using gradient descent. I know all the values of $d_{i,j}$.
I am confused because the input to gradient descent is a single vector, but here the unknowns are $n$ vectors. I've come across scant information about gradient matrices and matrix calculus, but I don't understand enough to see how this is analogous to vector calculus and hence to gradient descent.
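For concreteness, here is my attempt (in Python/NumPy, which may well be the wrong way to think about it) at flattening the $n$ vectors into one parameter vector $x \in \mathbb R^{2n}$ so the problem looks like ordinary gradient descent on a single vector:

```python
import numpy as np

def stress(x, d):
    """MDS stress: sum over all (i, j) of (||v_i - v_j|| - d_ij)^2."""
    v = x.reshape(-1, 2)                      # unflatten to n points in R^2
    diff = v[:, None, :] - v[None, :, :]      # pairwise differences, shape (n, n, 2)
    dist = np.linalg.norm(diff, axis=-1)      # pairwise distances, shape (n, n)
    return np.sum((dist - d) ** 2)

def stress_grad(x, d):
    """Gradient of the stress with respect to the flattened vector x."""
    v = x.reshape(-1, 2)
    diff = v[:, None, :] - v[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    np.fill_diagonal(dist, 1.0)               # avoid 0/0 on the i == j terms
    coef = 2 * (dist - d) / dist              # shape (n, n)
    np.fill_diagonal(coef, 0.0)               # i == j terms contribute nothing
    # each ordered pair (i, j) appears twice in the double sum (as (i, j) and
    # (j, i)), which for symmetric d doubles the contribution to v_i
    grad = 2 * np.sum(coef[:, :, None] * diff, axis=1)
    return grad.ravel()

# toy data: distances generated from a known 2-D configuration
rng = np.random.default_rng(0)
n = 5
true_v = rng.standard_normal((n, 2))
d = np.linalg.norm(true_v[:, None] - true_v[None, :], axis=-1)

x = rng.standard_normal(2 * n)                # random initial configuration
initial = stress(x, d)
for _ in range(2000):                         # plain gradient descent
    x -= 0.01 * stress_grad(x, d)
final = stress(x, d)
```

Is this flattening the right mental model, i.e. is minimizing over $n$ vectors in $\mathbb R^2$ the same as minimizing over one vector in $\mathbb R^{2n}$?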
- Is solving this minimization problem via gradient descent possible and why?
- How else could I solve this minimization problem numerically for $v_1, v_2, \dots, v_n$?