I'm looking for the optimal solution of the following problem.
Let $x$ and $b$ be two vectors of real numbers with the same dimension.
Let $\alpha$ be a scalar value.
We are looking for the optimal $\alpha$, i.e. the one that minimizes the squared Euclidean distance between $x$ and $\alpha b$ (assuming $b \neq 0$):
$$ D = ||x - \alpha b||^2 $$
$$ D = x^T x - 2 \alpha x^T b + \alpha^2 b^T b $$
I took the derivative with respect to $\alpha$, set it to zero, and did the algebra:
$$\begin{aligned} \frac{dD}{d\alpha} &= 2 \alpha b^T b - 2 x^T b = 0 \\ x^T b &= \alpha b^T b \\ \alpha &= \frac{x^T b}{\|b\|^2} \end{aligned}$$
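For what it's worth, a quick numerical sanity check (a sketch in NumPy with randomly generated vectors; the comparison against `np.linalg.lstsq` is just one way to cross-check) agrees with this closed form:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(5)
b = rng.standard_normal(5)

# Closed-form solution derived above: alpha = (x . b) / ||b||^2
alpha = (x @ b) / (b @ b)

# Cross-check: solve the same 1-parameter least-squares problem
# min_alpha ||x - alpha * b||^2 via np.linalg.lstsq
alpha_lstsq = np.linalg.lstsq(b.reshape(-1, 1), x, rcond=None)[0][0]

print(np.isclose(alpha, alpha_lstsq))
```

Both approaches give the same value of $\alpha$ (up to floating-point precision).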
Did I manage to get it right, or did I make a mistake somewhere? I am no mathematician and I'm not quite sure I know all the rules and caveats of vector algebra.