
Take $A\in \mathbb{R}^{k\times k}$. Suppose that its eigenvalues are equal to its singular values. Then show that $A$ is symmetric and positive semi-definite. I've found sources stating this, but I haven't managed to find a proof anywhere. I've tried a bunch of things with little success, and I've been hammering at this problem for days now without figuring out how to prove it. Maybe I misinterpreted the claim and there are valid counterexamples.

The reverse direction is a well-known fact with an easy proof, and I've seen it stated many times: if $A$ is symmetric positive semi-definite, then its singular values are its eigenvalues. Such an $A$ has an orthogonal diagonalization with non-negative eigenvalues, from which it's easy to construct a valid SVD. I want to prove the other direction so that I have an equivalence.
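This easy direction can at least be sanity-checked numerically; the following is just an illustrative sketch (the matrix construction and variable names are my own, not from any source), building a symmetric PSD matrix and comparing spectra:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a symmetric positive semi-definite matrix: B B^T is always PSD.
B = rng.standard_normal((4, 4))
A = B @ B.T

# For symmetric PSD A, the sorted eigenvalues and singular values agree.
eigvals = np.sort(np.linalg.eigvalsh(A))
svals = np.sort(np.linalg.svd(A, compute_uv=False))

assert np.allclose(eigvals, svals)
```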

An interesting yet failed attempt

Let's assume $A$ is diagonalizable. Then we have two factorizations:

  • $A = UDV^t$ (an SVD)
  • $A = BDB^{-1}$ (an eigendecomposition, with the same diagonal $D$ by hypothesis, up to ordering)

We want to prove that we can write $A = QDQ^t$ with $Q$ an orthogonal matrix.

Take $x$ to be a normalized eigenvector for the largest eigenvalue $\lambda_1$. Since $U$ is orthogonal, $$\lambda_1^2 = ||Ax||_2^2 = ||UDV^tx||_2^2 = ||DV^tx||_2^2 $$

We can write $x$ in the basis given by the columns of $V$, that is, $x = \sum_i \alpha_i v_i$. But, as $V$ is an orthogonal matrix and $x$ is normalized, we also have $\sum_i \alpha_i^2 = 1$.

Combining these properties $$ \lambda_1^2 = ||DV^tx||_2^2 = \sum_i \alpha_i^2 \lambda_i^2 $$

So we conclude that $\alpha_i = 0$ whenever $\lambda_i \neq \lambda_1$. Thus the eigenspace of the largest eigenvalue $\lambda_1$ is contained in the span of the columns of $V$ whose singular values match $\lambda_1$ in the SVD. And then I got stuck. A similar argument would prove the same thing for the smallest eigenvalue, but that's not enough, and I don't think this is a good direction anyway.
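The partial conclusion above can at least be observed numerically on a matrix that satisfies the hypothesis (a symmetric PSD one, which is exactly what the theorem characterizes). This is only an illustration, not a proof, and all names here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# A symmetric PSD matrix, so its eigenvalues equal its singular values.
B = rng.standard_normal((5, 5))
A = B @ B.T

U, s, Vt = np.linalg.svd(A)      # singular values s sorted descending
w, P = np.linalg.eigh(A)         # eigenvalues w sorted ascending
x = P[:, -1]                     # unit eigenvector of the largest eigenvalue

alpha = Vt @ x                   # coordinates of x in the basis of columns of V
lam1 = w[-1]

# alpha_i vanishes wherever sigma_i != lambda_1, and sum alpha_i^2 = 1.
mask = ~np.isclose(s, lam1)
assert np.allclose(alpha[mask], 0)
assert np.isclose(np.sum(alpha**2), 1)
```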

1 Answer


$$\sum_{j=1}^k\sigma_j^2=\text{trace}\big(A^T A\big)^\frac{1}{2}\cdot\text{trace}\big( AA^T\big)^\frac{1}{2}\geq \big\vert\text{trace}\big(A^2\big)\big\vert=\Big\vert\sum_{j=1}^k\lambda_j^2\Big\vert=\sum_{j=1}^k\lambda_j^2,$$
where the inequality is Cauchy–Schwarz for the trace inner product, and the last equality uses the hypothesis $\lambda_j=\sigma_j\geq 0$. Since the two ends of the chain agree, the inequality is met with equality, which forces linear dependence: $A=\eta \cdot A^T$ for some scalar $\eta$,
and then $\eta =1$ (multiply each side by $A$ and take the trace).
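The identity $\text{trace}\big(A^2\big)=\sum_j\lambda_j^2$ used here holds for every square matrix, with no diagonalizability assumption (see the comments below on Schur triangularization). A quick numerical check, purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# For ANY real square A, trace(A^2) = sum of lambda_j^2; complex
# eigenvalues come in conjugate pairs, so the sum is real.
A = rng.standard_normal((5, 5))
lam = np.linalg.eigvals(A)       # complex in general

sum_sq = np.sum(lam ** 2)
assert np.isclose(sum_sq.imag, 0)
assert np.isclose(np.trace(A @ A), sum_sq.real)
```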

So $A=A^T$ and $\lambda_j=\sigma_j\geq 0$. The real spectral theorem then gives $A=QDQ^T$ for some $Q\in O_k(\mathbb R)$ with $D\succeq \mathbf 0$, and $A\succeq \mathbf 0$ follows since $A$ and $D$ are congruent.
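An end-to-end numerical illustration of the chain of (in)equalities (a sketch with names of my choosing): for a generic non-symmetric $A$ the Cauchy–Schwarz step is strict, while a symmetric matrix attains equality:

```python
import numpy as np

rng = np.random.default_rng(3)

# Generic (non-symmetric) real matrix.
A = rng.standard_normal((6, 6))
sum_sigma_sq = np.sum(np.linalg.svd(A, compute_uv=False) ** 2)

assert np.isclose(sum_sigma_sq, np.trace(A.T @ A))  # sum sigma^2 = ||A||_F^2
assert sum_sigma_sq > abs(np.trace(A @ A))          # Cauchy-Schwarz, strict here

# Symmetrizing gives equality, matching the A = A^T conclusion.
S = (A + A.T) / 2
assert np.isclose(np.trace(S.T @ S), abs(np.trace(S @ S)))
```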

user8675309
  • Thanks for the answer, but I think I still don't get parts of the proof. Would it be possible for you to expand your answer?
    • $\text{trace}(A^2) = \sum \lambda_i^2$ — wouldn't this assume that $A$ is diagonalizable? (might be wrong here)

    • Could you expand a bit on how you derive that inequality from Cauchy–Schwarz?

    • Why does equality imply $A = \eta A^t$?

    – SVDieseas Dec 31 '23 at 17:17
  • (i) Over $\mathbb C$ every matrix is upper-triangularizable (look up Schur triangularization), which gives $\text{trace}(A^2) =\sum \lambda_j^2$ once you practice multiplying an upper triangular $R$ by itself. (ii) See https://math.stackexchange.com/questions/476802/how-do-you-prove-that-trbt-a-is-a-inner-product and, using the notation of the link, set $B:= A^T$; Cauchy–Schwarz follows immediately once you have an inner product, and $\text{trace}\big(A^T A\big) = \text{trace}\big( AA^T\big)$ by the cyclic property of the trace. (iii) Linear dependence (being a scalar multiple) is the equality condition for Cauchy–Schwarz. – user8675309 Dec 31 '23 at 17:30
  • Love it, short and to the point. Need to study the Schur decomposition a bit more :p – SVDieseas Dec 31 '23 at 18:16
  • $\eta$ could have been $-1$; how do you exclude that? – Exodd Oct 04 '24 at 17:23
  • @Exodd Multiplying each side by $A$ and taking the trace, combined with the problem statement, yields $0\leq \sum_{j=1}^k \sigma_j^2 = \text{trace}\big(A^T A\big)= \text{trace}\big(A^2\big) = \text{trace}\big(-1\cdot A^TA\big)\leq 0\implies A =\mathbf 0$, which still implies $A=A^T$, albeit trivially. My take was that the OP wasn't concerned about the case of all singular values being $0$. – user8675309 Oct 04 '24 at 20:55
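The $\eta=-1$ branch discussed in the last two comments can also be sanity-checked numerically: for a skew-symmetric $A$ we have $\text{trace}(A^TA)=-\text{trace}(A^2)$, so the two traces can only agree when $A=\mathbf 0$. A hedged sketch (names are my own):

```python
import numpy as np

rng = np.random.default_rng(4)

# Skew-symmetric A: A^T = -A, hence trace(A^T A) = -trace(A^2).
B = rng.standard_normal((4, 4))
A = B - B.T

assert np.allclose(A.T, -A)
assert np.isclose(np.trace(A.T @ A), -np.trace(A @ A))
# A nonzero skew-symmetric A therefore cannot satisfy trace(A^T A) = trace(A^2).
assert np.trace(A.T @ A) > 0
```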