
Question: Is it any easier to find the eigenvalues of a matrix $M$ if we have a root $S$ of it, i.e. $S \cdot S^T = M$? (Please note that $S$ doesn't have to be triangular, so this is not Cholesky.)

If the answer is "no, in general", let's see if there is a solution in my particular case, where the matrix $M$ is composed from an already decomposed matrix:

Say that we have a symmetric matrix $K$ (a covariance matrix). We have its eigendecomposition:

$$K = Q \Lambda Q^T$$

where $\Lambda$ is a diagonal matrix with the eigenvalues on the diagonal and $Q$ is an orthogonal matrix ($Q^T = Q^{-1}$). Let's define a root of the matrix $K$:

$$R = Q {\sqrt \Lambda} Q^T$$

(Now $K = R \cdot R$, and $R$ is symmetric as well.)

Now, the main question: What can we say about the eigenvalues of $M = R \cdot (I - {1 \over n} J) \cdot R$?

($I$ is an identity matrix, $J$ is matrix of all 1's, $n$ is the number of rows and columns of these matrices.) Can the eigenvalues of $M$ be somehow easily computed from $\Lambda$ or $\sqrt{\Lambda}$, or do I have to do the costly eigenvalue decomposition again for $M$?

PS: Not sure if this helps, but since $(I - {1 \over n} J)$ is idempotent (and symmetric), $M = R \cdot (I - {1 \over n} J) (I - {1 \over n} J) \cdot R$, and so if we set $S = R \cdot (I - {1 \over n} J)$, then $M = S \cdot S^T$. (Again, $S$ doesn't have to be triangular, so this is not Cholesky.)
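For concreteness, here is a minimal numerical sketch of this setup (NumPy; the variable names simply mirror $K$, $Q$, $\Lambda$, $R$, $M$, $S$ above, and the random $K$ is only a stand-in for a real covariance matrix):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# A stand-in symmetric positive semidefinite "covariance" matrix K.
A = rng.standard_normal((n, n))
K = A @ A.T

# Eigendecomposition K = Q diag(lam) Q^T and the symmetric root R = Q sqrt(Lambda) Q^T.
lam, Q = np.linalg.eigh(K)
R = Q @ np.diag(np.sqrt(np.clip(lam, 0, None))) @ Q.T

# Centering matrix C = I - (1/n) J; it is symmetric and idempotent (C @ C == C).
C = np.eye(n) - np.ones((n, n)) / n

M = R @ C @ R
S = R @ C                      # then M = S @ S.T, as in the PS

print(np.allclose(M, S @ S.T))       # True
print(np.linalg.eigvalsh(M))         # the eigenvalues in question
```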

Tomas
  • This post from MO and this post are relevant – Ben Grossmann Jun 23 '20 at 10:55
  • This paper, with a publicly available version here, seems to take advantage of the Cholesky decomposition in some way – Ben Grossmann Jun 23 '20 at 11:03
  • @Omnomnomnom Thank you, but please note that the root $S$ is not necessarily a triangular matrix, so it's not exactly Cholesky. – Tomas Jun 23 '20 at 11:34
  • Sure, but applying a $QR$ decomposition to $S^T$ gives you the Cholesky decomposition without too much more effort – Ben Grossmann Jun 23 '20 at 11:50
  • @Omnomnomnom thanks for your references. It's not clear to me how to put all them together to get the eigenvalues of $M$. Probably not so trivial. I'd be grateful for an answer if you know how to. Thanks. – Tomas Jun 23 '20 at 12:44
  • not sure. Long story short, though, unless you're looking for numerical methods of approximating the eigenvalues, there isn't a way to make use of the Cholesky decomposition. Could you clarify what you mean by "the matrix $M$ is composed from an already decomposed matrix"? – Ben Grossmann Jun 23 '20 at 13:06
  • @Omnomnomnom thanks for the note about lack of analytical solutions and need for numerical ones! Yes, sure. I meant by it that $M$ is composed from parts of already decomposed matrix $K$: $M = Q {\sqrt \Lambda} Q^T (I - {1 \over n} J) Q {\sqrt \Lambda} Q^T$ – Tomas Jun 23 '20 at 13:10
  • I see now that you explained that in your question, sorry about that. In any case, that gives me another idea – Ben Grossmann Jun 23 '20 at 13:19

1 Answer


An alternative approach is as follows: you have a matrix of the form
$$ M = Q {\sqrt \Lambda} Q^T \left(I - {1 \over n} J\right) Q {\sqrt \Lambda} Q^T \\ = Q {\sqrt \Lambda} Q^T (I) Q {\sqrt \Lambda} Q^T - \frac 1n Q {\sqrt \Lambda} Q^T (J) Q {\sqrt \Lambda} Q^T \\ = Q \left[\Lambda - \frac 1n [{\sqrt \Lambda} Q^Te][{\sqrt \Lambda} Q^Te]^T\right]Q^T, $$
where $e$ denotes the vector of all $1$s (so that $J = ee^T$). In other words, you are looking for the eigenvalues of a rank-$1$ perturbation of a diagonal matrix: with $D = \Lambda$ and $v = \frac 1{\sqrt n}{\sqrt \Lambda}\,Q^T e$, the eigenvalues of $M$ are exactly the eigenvalues of $D - vv^T$.

From there, you have several methods at your disposal. For instance, the BNS (Bunch–Nielsen–Sorensen) formula gives an expression for the eigenvectors of this matrix. The matrix determinant lemma gives you an expression for the characteristic polynomial, since
$$ \det((D + vv^T) + \lambda I) = \det((D + \lambda I) + vv^T) \\ = (1 + v^T(D + \lambda I)^{-1}v)\det(D + \lambda I) \\ = \det(D + \lambda I) + v^T\operatorname{adj}(D + \lambda I)v $$
(for the $D - vv^T$ case here, replace $vv^T$ by $-vv^T$; only the signs change).
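As a sanity check of this reduction, here is a rough numerical sketch (NumPy; the variable names are ad hoc and the random $K$ is just a placeholder example):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Example K = Q diag(lam) Q^T and R = Q sqrt(Lambda) Q^T, as in the question.
A = rng.standard_normal((n, n))
K = A @ A.T
lam, Q = np.linalg.eigh(K)
sqrt_lam = np.sqrt(np.clip(lam, 0, None))
R = Q @ np.diag(sqrt_lam) @ Q.T

e = np.ones(n)
M = R @ (np.eye(n) - np.outer(e, e) / n) @ R

# Rank-one reduction from the answer: v = (1/sqrt(n)) sqrt(Lambda) Q^T e,
# and eig(M) = eig(Lambda - v v^T).
v = sqrt_lam * (Q.T @ e) / np.sqrt(n)
reduced = np.diag(lam) - np.outer(v, v)

print(np.allclose(np.linalg.eigvalsh(M),
                  np.linalg.eigvalsh(reduced)))   # True
```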

Ben Grossmann
  • Wow, thanks for this idea! Not sure I am able to follow on it yet :-) How did you come to the third part of the first equation, and what do you denote by $e$? – Tomas Jun 23 '20 at 13:47
  • @Tomas Sorry for leaving that out! $e$ is the vector whose entries are all $1$. Noting that $J = ee^T$, I think you'll be able to see how to get from equation 3 to equation 2. – Ben Grossmann Jun 23 '20 at 14:38
  • Thank you! Now I understand the first equation. But I am completely lost in the second one (the $\det$ one). We want the characteristic polynomial $p_M(z) = \det(zI - M)$, right? But how is the equation related to this? What is $D$? Does $\lambda$ denote the same lambda from the decomposition of $K$, or is it the new lambda (the eigenvalues of $M$)? What is $\operatorname{adj}$? How can I use the determinant equation to find out the eigenvalues of $M$? Please be more specific and if possible make smaller, more trivial steps, my matrix algebra isn't on that level yet :-)) – Tomas Jun 23 '20 at 17:54
  • @Tomas I use a $\lambda$ instead of $z$. Here, $D = \Lambda$ and $v = \frac 1{\sqrt{n}}\sqrt{\Lambda}Q^Te$. "adj" denotes the classical adjoint. This just gives us the characteristic polynomial, and we could find eigenvalues by finding its roots. – Ben Grossmann Jun 23 '20 at 18:09
  • @Tomas Honestly if you want to get the most efficient possible method here, I recommend googling "eigenvalues of rank one update" and trying what comes up. – Ben Grossmann Jun 23 '20 at 18:10
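Following up on that last suggestion, here is a rough sketch of the standard secular-equation approach for the reduced problem $\operatorname{eig}(\Lambda - vv^T)$ (NumPy/SciPy; `eig_rank_one_downdate` is just a name chosen here, and the deflation logic a robust solver needs when some $v_i \approx 0$ or diagonal entries coincide is omitted):

```python
import numpy as np
from scipy.optimize import brentq

def eig_rank_one_downdate(d, v):
    """Eigenvalues of diag(d) - v v^T via the secular equation
    f(x) = 1 - sum_i v_i^2 / (d_i - x) = 0.
    Assumes distinct d_i and nonzero v_i (the generic case)."""
    order = np.argsort(d)
    d, v = d[order], v[order]

    def f(x):
        return 1.0 - np.sum(v**2 / (d - x))

    eps = 1e-9 * (1.0 + np.abs(d).max())
    roots = []
    # One eigenvalue lies below d_min, but no lower than d_min - ||v||^2.
    roots.append(brentq(f, d[0] - v @ v - eps, d[0] - eps))
    # One eigenvalue lies strictly between each pair of consecutive d_i.
    for lo, hi in zip(d[:-1], d[1:]):
        roots.append(brentq(f, lo + eps, hi - eps))
    return np.array(roots)

# Check against a dense solver on the reduced problem from the answer.
rng = np.random.default_rng(1)
n = 6
A = rng.standard_normal((n, n))
lam, Q = np.linalg.eigh(A @ A.T)
v = np.sqrt(lam) * (Q.T @ np.ones(n)) / np.sqrt(n)
print(np.allclose(np.sort(eig_rank_one_downdate(lam, v)),
                  np.linalg.eigvalsh(np.diag(lam) - np.outer(v, v))))
```

Each eigenvalue comes from a one-dimensional root solve over a known bracket (one root below the smallest diagonal entry, one between each consecutive pair), which is roughly $O(n^2)$ work overall, versus $O(n^3)$ for a fresh dense eigendecomposition of $M$.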