Suppose we are given two $n\times n$ matrices $A$ and $B$. I am interested in finding the matrix closest to $B$ that can be obtained by multiplying $A$ on the left and right by orthogonal matrices. To be precise, the problem is
$$\begin{align} \min_{U,V}\ & \|UAV^T-B\|_F \\ \text{s.t.}\ & U^TU = I \\ & V^TV = I, \end{align}$$ where $\|\cdot\|_F$ is the Frobenius norm.
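For reference, here is a minimal sketch (in Python/NumPy) of one way to search for the optimal $U,V$ numerically: alternate between updating $U$ and $V$, each of which is an orthogonal Procrustes problem with a closed-form SVD solution. The function names are mine, and this is only a heuristic local search, so it may need several random restarts:

```python
import numpy as np

def polar_factor(X):
    """Orthogonal Q maximizing tr(Q^T X): take the SVD X = W S Z^T, Q = W Z^T."""
    W, _, Zt = np.linalg.svd(X)
    return W @ Zt

def alternating_min(A, B, iters=500, seed=0):
    """Heuristic local minimization of ||U A V^T - B||_F by alternating
    closed-form orthogonal-Procrustes updates in U and V."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    V, _ = np.linalg.qr(rng.standard_normal((n, n)))  # random orthogonal start
    for _ in range(iters):
        # For fixed V, min_U ||U(A V^T) - B||_F is attained at U = polar(B V A^T).
        U = polar_factor(B @ V @ A.T)
        # For fixed U, min_V ||(U A)V^T - B||_F is attained at V = polar(B^T U A).
        V = polar_factor(B.T @ U @ A)
    return U, V, np.linalg.norm(U @ A @ V.T - B, "fro")
```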
Without loss of generality*, we can restrict our attention to diagonal matrices with nonnegative diagonal entries $a_1,a_2,\ldots,a_n$ and $b_1,b_2,\ldots,b_n$. My hypothesis is that in this case the optimal $UAV^T$ is still diagonal, with diagonal entries $a_{\pi_1},\ldots,a_{\pi_n}$, where $\pi$ is the permutation minimizing $\sum_i (a_{\pi_i} - b_i)^2$. In other words, $U=V=P$, where $P$ is the permutation matrix corresponding to $\pi$. This appears to be true in numerical tests, but I don't know how to prove it. Is there an elegant proof?
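To illustrate the kind of numerical test I mean, the following snippet (reusing the `alternating_min` sketch above) brute-forces the best permutation and compares it against the numerically optimized objective; in my runs the two values agree to numerical precision:

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
n = 4
a = rng.uniform(0, 1, n)
b = rng.uniform(0, 1, n)
A, B = np.diag(a), np.diag(b)

# Brute-force the permutation minimizing sum_i (a_{pi_i} - b_i)^2.
best = min(itertools.permutations(range(n)),
           key=lambda p: np.sum((a[list(p)] - b) ** 2))
perm_obj = np.sqrt(np.sum((a[list(best)] - b) ** 2))

# Numerical optimum over orthogonal U, V (restarts guard against local minima).
num_obj = min(alternating_min(A, B, seed=s)[2] for s in range(20))
print(perm_obj, num_obj)  # these agree to numerical precision in my tests
```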
*For arbitrary $A$ and $B$, take their singular value decompositions $A=U_A\Sigma_AV_A^T$ and $B=U_B\Sigma_BV_B^T$. Since the Frobenius norm is invariant under left and right multiplication by orthogonal matrices, $$\begin{align} \|UAV^T-B\|_F &= \|UU_A\Sigma_AV_A^TV^T-U_B\Sigma_BV_B^T\|_F \\ &= \|U'\Sigma_AV'^T-\Sigma_B\|_F, \end{align}$$ where $U'=U_B^TUU_A$ and $V'=V_B^TVV_A$ are orthogonal. So we can work with $\Sigma_A$ and $\Sigma_B$ instead.
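This reduction is easy to sanity-check numerically; the following snippet (using NumPy's convention that `np.linalg.svd` returns $U$, the singular values, and $V^T$) confirms that the two norms coincide for random inputs:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
U, _ = np.linalg.qr(rng.standard_normal((n, n)))  # arbitrary orthogonal U, V
V, _ = np.linalg.qr(rng.standard_normal((n, n)))

UA, sA, VAt = np.linalg.svd(A)   # A = UA @ diag(sA) @ VAt
UB, sB, VBt = np.linalg.svd(B)

Up = UB.T @ U @ UA               # U' = U_B^T U U_A
Vp = VBt @ V @ VAt.T             # V' = V_B^T V V_A

lhs = np.linalg.norm(U @ A @ V.T - B, "fro")
rhs = np.linalg.norm(Up @ np.diag(sA) @ Vp.T - np.diag(sB), "fro")
print(np.isclose(lhs, rhs))     # True
```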