19

In Adam Koranyi's article "Around the finite dimensional spectral theorem", Theorem 1 states that there exist unique orthogonal decompositions.

What is meant here by unique?

We know that the polar decomposition and the SVD are equivalent, and that the polar decomposition is unique only when the operator is invertible; it follows that the SVD is not unique in general.

What is the difference between these uniquenesses?

glS
  • 7,963
Mambo
  • 645

3 Answers

21

For distinct singular values, SVD is unique up to permutations of columns of the $U,V$ matrices. Usually one asks for the singular values to appear in decreasing order on the main diagonal so that uniqueness is up to permutations of singular vectors with the same singular values.

When singular values are repeated, you have additional freedom of rotating their subspace by an orthogonal matrix $O$, e.g. $U[:,i:j]O$ and $V[:,i:j]O$ for the subset of columns $i:j$ which correspond to the same singular value.
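A small numerical sketch of this rotation freedom (the matrix and rotation angle here are arbitrary choices for illustration): for a block of columns sharing the singular value $\sigma$, applying the same orthogonal $O$ to that block of $U$ and of $V$ leaves $A$ unchanged, since $O\,(\sigma I)\,O^T = \sigma I$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build A = U @ S @ V.T with a repeated singular value: sigma = (2, 2, 1).
U, _ = np.linalg.qr(rng.standard_normal((4, 3)))
V, _ = np.linalg.qr(rng.standard_normal((3, 3)))
S = np.diag([2.0, 2.0, 1.0])
A = U @ S @ V.T

# Rotate the repeated-singular-value block (columns 0:2) by a 2x2 rotation O.
theta = 0.7
O = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
U2, V2 = U.copy(), V.copy()
U2[:, 0:2] = U[:, 0:2] @ O
V2[:, 0:2] = V[:, 0:2] @ O

# U2 and V2 still have orthonormal columns, yet reconstruct the same A.
assert np.allclose(U2 @ S @ V2.T, A)
assert np.allclose(U2.T @ U2, np.eye(3))
```

The same check fails if the block rotated in $U$ differs from the block rotated in $V$, or if the rotated columns do not all share one singular value.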

Alex R.
  • 33,289
  • 1
    Can someone please explain more abstractly? – Mambo Jun 28 '13 at 12:52
  • 1
    This is not quite right: if you have repeated singular values you can make arbitrary unitary base changes - not just permutations - on the corresponding singular vector spaces. – Max Dec 10 '20 at 17:21
  • 1
    @Max: You're right, I've added that clarification. – Alex R. Dec 10 '20 at 21:12
  • 2
    As Ryan Howe's answer points out, this accepted answer is still not correct as currently written (as of 01/2024) even in the case of distinct singular values. For example, let $A = U\Sigma V^*$ and let $U' = e^{i\theta}U$ and $V' = e^{i\theta}V$ where $\theta$ is not an integer multiple of $2\pi$. Then $U'$ and $V'$ are unitary and $A = U\Sigma V^* = U'\Sigma V'^*$ with $U \ne U'$ and $V \ne V'$. This is just a simple example. Of course the applied magnitude-one scaling does not need to be uniform across columns. – Mark Yasuda Jan 27 '24 at 17:35
3

The singular values are unique; however, the columns of $U$ and $V$ are only unique up to a complex sign (because of how Gram-Schmidt works, that is all you are guaranteed).

Ryan Howe
  • 420
  • 2
    what do you mean by $\textbf{complex}$ sign? – georg Aug 26 '22 at 17:21
  • @georg See theorem 4.1 https://www.cs.cornell.edu/courses/cs322/2008sp/stuff/TrefethenBau_Lec4_SVD.pdf (complex scalar factors of absolute value 1) – Ryan Howe Aug 27 '22 at 18:11
  • thanks. Is this also true for eigenvalue problem, where the matrix A has complex entries, or is it just $\pm$? – georg Aug 27 '22 at 19:29
  • 1
    Yes: the columns of $U$ and $V$ are eigenvectors of the matrices $AA^{*}$ and $A^{*}A$ respectively, so the same argument applies. It's because of how Gram-Schmidt works. – Ryan Howe Aug 28 '22 at 00:06
2

In addition to the two reasons mentioned by Alex R. above (the columns of $U$, $V$ may be permuted along with the corresponding singular values $\sigma_i$ on the diagonal of $\Sigma$, and repeated singular values allow rotations within their subspaces), there is a third reason, connected with the orthogonal complement.

Namely, suppose $A$ is an $m \times n$ matrix (assume $m \ge n$). Then a singular value decomposition $A = U \Sigma V^T$ is obtained from the orthogonal decomposition $S = Q D Q^T$ of the positive semi-definite matrix $S = A^T A$ by the following steps:

  1. We take $V=Q$.
  2. Set $\Sigma$ to be $\sqrt D$ padded with $m - n$ extra zero rows.
  3. If the first $r$ singular values in $\Sigma$ are non-zero, define the first $r$ columns $u_1, \ldots, u_r$ of $U$ to be the first $r$ columns of $AV$ divided by $\sigma_1, \ldots, \sigma_r$ respectively.
  4. The remaining $m - r$ columns of $U$ can then be chosen to be arbitrary orthonormal vectors from the orthogonal complement $\operatorname{span}^\perp(u_1, \ldots, u_r)$ of the subspace spanned by $u_1, \ldots, u_r$.

So we have plenty of options to choose those $m - r$ columns of $U$ (infinitely many in most cases).
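The four steps above can be sketched numerically as follows (a minimal illustration, not a robust SVD routine; the function name `svd_from_eig`, the random completion in step 4, and the rank tolerance are all choices made here for the example):

```python
import numpy as np

def svd_from_eig(A, rng=None):
    """Build A = U @ Sigma @ V.T (m >= n) from the eigendecomposition
    of S = A.T @ A, following the four steps above. The last m - r
    columns of U are an *arbitrary* orthonormal completion."""
    m, n = A.shape
    rng = rng or np.random.default_rng(0)

    # Orthogonal decomposition S = Q D Q^T, eigenvalues in decreasing order.
    w, Q = np.linalg.eigh(A.T @ A)
    order = np.argsort(w)[::-1]
    w, Q = w[order], Q[:, order]

    V = Q                                                     # step 1
    sing = np.sqrt(np.clip(w, 0.0, None))
    Sigma = np.vstack([np.diag(sing), np.zeros((m - n, n))])  # step 2

    r = int(np.sum(sing > 1e-10 * max(sing[0], 1.0)))  # numerical rank
    U = np.zeros((m, m))
    U[:, :r] = (A @ V)[:, :r] / sing[:r]                      # step 3

    # Step 4: complete U with ANY orthonormal basis of the orthogonal
    # complement; here, one choice obtained by orthogonalizing random vectors.
    if r < m:
        M = np.hstack([U[:, :r], rng.standard_normal((m, m - r))])
        Qfull, _ = np.linalg.qr(M)
        U[:, r:] = Qfull[:, r:]
    return U, Sigma, V

A = np.random.default_rng(3).standard_normal((5, 3))
U, Sigma, V = svd_from_eig(A)
assert np.allclose(U @ Sigma @ V.T, A)
assert np.allclose(U.T @ U, np.eye(5))
```

Re-running step 4 with a different random seed yields a different but equally valid $U$, which is exactly the non-uniqueness described above.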