
Can anyone present an elegant, elementary proof of the relationship between the eigenvalues of a positive definite matrix and those of its Cholesky factor?

More formally, suppose $\mathbf{A}$ is an $n\times n$ positive definite matrix and let $\mathbf{A} = \mathbf{R}^\top \mathbf{R}$ be its Cholesky decomposition. Establish the relationship between the eigenvalues of $\mathbf{A}$ and those of $\mathbf{R}$.

EDIT (Additional remarks): My question specifically asks whether it is possible to find an equation or function, say $f$, that relates the eigenvalues, i.e., $f\left(\lambda_i(\mathbf{R})\right) = \lambda_i(\mathbf{A})$, with uniqueness up to ordering considered if necessary.

venrey

3 Answers


There is no such relation. If the spectrum of $A$ were a function of the spectrum of $R$, it would imply that $$ A=\pmatrix{1&0\\ t&1}\pmatrix{1&t\\ 0&1}=\pmatrix{1&t\\ t&t^2+1} $$ has a constant spectrum, because the upper triangular factor $R=\pmatrix{1&t\\ 0&1}$ has spectrum $\{1,1\}$ for every $t$. But this is obviously not the case, because our $A$ here has the non-constant trace $t^2+2$.
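For a quick numerical confirmation, here is a minimal sketch (assuming NumPy; the variable names are mine) that prints the spectra of $R$ and $A=R^\top R$ for a few values of $t$:

```python
# Counterexample check: the eigenvalues of R are constant, those of A are not.
import numpy as np

for t in (0.0, 1.0, 2.0):
    R = np.array([[1.0, t],
                  [0.0, 1.0]])                   # upper-triangular Cholesky factor
    A = R.T @ R                                  # [[1, t], [t, t^2 + 1]]
    print(t,
          np.sort(np.linalg.eigvals(R).real),    # always [1, 1]
          np.sort(np.linalg.eigvalsh(A)))        # varies with t (trace = t^2 + 2)
```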

In general, if $A=R^TR$ (regardless of whether this is a Cholesky decomposition or not) for a real square matrix $R$, the eigenvalues of $A$ are the squared singular values of $R$; and since $A$ is symmetric positive semidefinite, its singular values coincide with its eigenvalues. If we order the eigenvalues of $R$ by decreasing modulus, then Weyl's inequality at least relates the eigenvalues of $R$ and $A$ as follows: $$ \prod_{i=1}^k|\lambda_i(R)|^2 \le\prod_{i=1}^k\sigma_i(R)^2 =\prod_{i=1}^k\sigma_i(A) =\prod_{i=1}^k\lambda_i(A) \quad\text{for } k=1,2,\ldots,n $$ (with equality when $k=n$, because $|\det R|^2=\det A$).
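Here is a small numerical sketch of both facts (assuming NumPy; the random matrix is just for illustration):

```python
# For a real square R: eig(R^T R) equals the squared singular values of R, and the
# partial products of |lambda_i(R)|^2 are bounded by those of lambda_i(A).
import numpy as np

rng = np.random.default_rng(0)
n = 4
R = rng.standard_normal((n, n))                  # arbitrary real square matrix
A = R.T @ R

sigma = np.linalg.svd(R, compute_uv=False)       # sigma_1 >= ... >= sigma_n
lam_A = np.sort(np.linalg.eigvalsh(A))[::-1]     # lambda_1 >= ... >= lambda_n
print(np.allclose(sigma**2, lam_A))              # True: eigenvalues of A are sigma_i(R)^2

abs_lam_R = np.sort(np.abs(np.linalg.eigvals(R)))[::-1]   # |lambda_i(R)|, decreasing
for k in range(1, n + 1):
    print(k, np.prod(abs_lam_R[:k])**2 <= np.prod(lam_A[:k]) + 1e-12)
```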

user1551
  • can you give a more detailed version of the proof or at least a reference text that explains your last line regarding the 'general' case? – venrey Jun 04 '21 at 22:39
  • @venrey Every textbook that covers the singular value decomposition should mention that the singular values of a (possibly non-square) matrix $M$ are the square roots of the eigenvalues of $M^\ast M$ (when $M$ is tall) or $MM^\ast$ (when $M$ is fat). I don't have a good reference at hand, but you may try Matrix Analysis by Horn and Johnson, Linear Algebra and Its Applications by Gilbert Strang, another textbook of the same title by Peter Lax, or Wikipedia. – user1551 Jun 05 '21 at 14:51

For a positive definite matrix $A$, with $Q$ as its orthogonal eigenvector matrix and $\Lambda$ as its diagonal eigenvalue matrix, we have

$$ A = Q \Lambda Q^T $$

Since all eigenvalues of $A$ are positive, this can be rewritten as:

$$ A = (Q \sqrt{\Lambda}) (\sqrt{\Lambda} Q^T) $$

So for $A = R^TR$, one possible choice of $R$ is

$$ R = \sqrt{\Lambda} Q^T $$

We can also left-multiply this $R$ by any orthogonal matrix $U$ without changing the condition $A = R^TR$, because

$$ A = (UR)^TUR = R^T(U^TU)R = R^TR $$

In particular, taking $U = Q$ rewrites $R$ as $Q\sqrt{\Lambda} Q^T$, and the eigenvalues of this (symmetric) choice of $R$ are the square roots of the eigenvalues of $A$.
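A minimal numerical illustration of this point (assuming NumPy): the symmetric square root $Q\sqrt{\Lambda}Q^T$ does have eigenvalues $\sqrt{\lambda_i(A)}$, whereas the triangular Cholesky factor of the same $A$ generally does not.

```python
# Contrast the symmetric square root with the triangular Cholesky factor.
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((3, 3))
A = M.T @ M + 3 * np.eye(3)              # a generic positive definite matrix

lam, Q = np.linalg.eigh(A)               # A = Q diag(lam) Q^T
S = Q @ np.diag(np.sqrt(lam)) @ Q.T      # symmetric square root, S^T S = A
R = np.linalg.cholesky(A).T              # upper-triangular factor, A = R^T R

print(np.sort(np.linalg.eigvalsh(S)))    # equals sqrt(lam)
print(np.sort(np.sqrt(lam)))
print(np.sort(np.diag(R)))               # eigenvalues of triangular R: different in general
```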

artha
  • How can you be sure that $\mathbf{R}$ is upper (or lower) triangular, as it should be for a Cholesky factor? – venrey Sep 03 '18 at 16:43
  • Ah yes, you're right. You can only write $R$ as $\sqrt {D} L^T$. So I think we can only comment on the signs of the eigenvalues of $R$ matching those of $A$, but not the values themselves. – artha Sep 04 '18 at 04:14
  • Can you formalize your argument that "the signs of the eigenvalues of $\mathbf{R}$ match those of $\mathbf{A}$ but not the values themselves"? – venrey Sep 04 '18 at 04:17
  • As the other answer states, there is no such relationship. – Ben Grossmann Jun 23 '20 at 10:57
  • If I understand this correctly, and to clarify the above: the singular values coincide with the eigenvalues, in general, only for symmetric positive definite matrices. I think people confuse $L$ with $A$ in this respect. Since the $L$ in the Cholesky decomposition is not symmetric, even for a symmetric positive definite $A$, the eigenvalues of $L$ are not the square roots of the eigenvalues of $A$. Instead, the singular values of $L$ are the square roots of the eigenvalues of $A$. Right? – Jap88 Sep 05 '21 at 19:28

Start with the Cholesky form $$A = R^TR$$ and write $R = DL$, where $D$ is the diagonal matrix with $D_{i, i} = R_{i, i}$. Then $L = D^{-1}R$ is upper triangular with $L_{i, i} = 1$ (each row of $R$ is divided by its diagonal entry), which means $\det(L) = 1$.

Now $A = (DL)^T(DL) = L^TD^2L$, so $\det(A) = \det(L)^2\det(D)^2 = \det(D)^2$, and also $\det(A) = \lambda_1\lambda_2\cdots\lambda_n$.

Basically: $\prod_{i = 1}^n(R_{i, i})^2 = \prod_{i = 1}^n(D_{i, i})^2 = \det(D)^2 = \prod_{i = 1}^n\lambda_i$.
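A quick numerical check of this identity (assuming NumPy; `np.linalg.cholesky` returns the lower-triangular factor, so its transpose plays the role of $R$ here):

```python
# Check: product of squared Cholesky diagonal entries = product of eigenvalues = det(A).
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))
A = M.T @ M + 4 * np.eye(4)              # positive definite

R = np.linalg.cholesky(A).T              # upper triangular, A = R^T R
lhs = np.prod(np.diag(R))**2
print(np.isclose(lhs, np.prod(np.linalg.eigvalsh(A))),
      np.isclose(lhs, np.linalg.det(A)))
```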

Alto Lagato