
$||\Sigma - I||_2 = \max_i|\lambda_i|$, where the $\lambda_i$ are the eigenvalues of $\Sigma - I$

Here $\Sigma$ is a diagonal matrix and $I$ is the identity matrix.

I know that since $\Sigma - I$ is diagonal, its eigenvalues are the values along the diagonal

I also know that the singular values of $\Sigma - I$ are the square roots of the eigenvalues of $(\Sigma - I)^T(\Sigma - I) = (\Sigma - I)^2$

I can't seem to see why this is true. Can someone explain?
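
For what it's worth, the claim does seem to hold numerically. Here is a quick NumPy sketch (the specific diagonal entries are just an illustration; `np.linalg.norm(A, 2)` computes the operator 2-norm):

```python
import numpy as np

# Example diagonal Sigma (values chosen only for illustration).
Sigma = np.diag([3.0, 0.5, 2.0])
A = Sigma - np.eye(3)          # A = Sigma - I = diag(2, -0.5, 1)

two_norm = np.linalg.norm(A, 2)                   # operator 2-norm (largest singular value)
max_abs_eig = np.abs(np.linalg.eigvals(A)).max()  # largest |eigenvalue| of Sigma - I

print(two_norm, max_abs_eig)   # both are 2.0
```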

  • Your question is a special case of the more general fact. – A.Γ. Dec 22 '18 at 21:38
  • I don't think this is true. Take $\Sigma=\begin{bmatrix} 2 & 0\\ 0 & 2\end{bmatrix}$ for example. Then $\Sigma-I$ is the identity matrix, which has $2$-norm $\sqrt 2$ but the maximum eigenvalue is $1$. I think you should consider another norm on the matrix space (for example the operator norm). – Levent Dec 22 '18 at 21:39
  • @A.Γ. You should give an official answer, even if it is essentially a reference to another question plus Levent's comment. – Paul Frost Dec 22 '18 at 22:46
  • @Levent the matrix 2-norm is generally defined as $\sup_{x\neq 0} \lVert Ax\rVert_2/\lVert x\rVert_2$, while the Frobenius norm is the square root of the sum of the squares of the entries. – overfull hbox Dec 23 '18 at 16:05
  • @TylerChen Oh okay, thanks for pointing it out. I thought it meant the Euclidean norm. – Levent Dec 23 '18 at 18:09
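
The distinction drawn in the comments (the Frobenius norm versus the operator 2-norm) can be checked numerically for Levent's example $\Sigma = 2I$; a short NumPy sketch:

```python
import numpy as np

# Levent's example: Sigma = 2I, so Sigma - I is the 2x2 identity matrix.
A = np.eye(2)

print(np.linalg.norm(A, 'fro'))  # Frobenius norm: sqrt(2) ~ 1.414
print(np.linalg.norm(A, 2))      # operator 2-norm: 1.0, equal to max |eigenvalue|
```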

1 Answer


As @A.Γ noted, this is a special case of a more general fact about normal matrices, but we can prove it without that fact.

Let $A$ be a diagonal matrix with diagonal entries $\lambda_1, \lambda_2, \ldots, \lambda_n$. We take $\lVert{A}\rVert_2 = \sup_{x\neq 0} \lVert{Ax}\rVert_2 / \lVert{x}\rVert_2 = \sup_{\lVert x \rVert_2 = 1} \lVert{Ax}\rVert_2$.

Taking $x$ to be a unit eigenvector corresponding to the eigenvalue of largest magnitude, say $\lambda_k$ with $|\lambda_k| = \max_i|\lambda_i|$, gives $$ \lVert{Ax}\rVert_2 = \lVert{\lambda_k x}\rVert_2 = |\lambda_k| \lVert{x}\rVert_2 = \max_i|\lambda_i| $$

So, for any square matrix, the matrix 2-norm is at least the magnitude of its largest-magnitude eigenvalue.
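
In general this lower bound can be strict: a nilpotent matrix has all eigenvalues equal to $0$ but a nonzero 2-norm. A quick NumPy check (illustrative only, not part of the proof):

```python
import numpy as np

# A nilpotent matrix: both eigenvalues are 0, yet the 2-norm is 1,
# so for non-diagonal matrices the inequality can be strict.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])

print(np.linalg.norm(A, 2))                # 1.0
print(np.abs(np.linalg.eigvals(A)).max())  # 0.0
```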

Now, let $x=[x_1,x_2,\ldots, x_n]^T$ be any unit vector. Then $Ax = [\lambda_1 x_1, \lambda_2 x_2, \ldots, \lambda_n x_n]^T$. Therefore, $$ \lVert{Ax}\rVert_2^2 = \sum_{i=1}^{n} \lambda_i^2 x_i^2 $$

Since $\lVert x \rVert_2 = 1$ we have $\sum_{i=1}^{n} x_i^2 = 1$, and each $\lambda_i^2 \leq (\max_j|\lambda_j|)^2$. Therefore, $$ \lVert{Ax}\rVert_2^2 = \sum_{i=1}^{n} \lambda_i^2 x_i^2 \leq \sum_{i=1}^{n} (\max_j|\lambda_j|)^2 x_i^2 = (\max_j|\lambda_j|)^2 \sum_{i=1}^{n} x_i^2 = (\max_j|\lambda_j|)^2 $$

Taking the supremum over unit vectors $x$ shows that, for diagonal matrices, the matrix 2-norm is also bounded above by the magnitude of the largest-magnitude eigenvalue. Combined with the lower bound above, the two quantities must be equal.
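
As a numerical sanity check, the two quantities agree for a random diagonal matrix (a NumPy sketch; the size and entries are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random diagonal matrix with entries of both signs.
A = np.diag(rng.uniform(-5, 5, size=6))

two_norm = np.linalg.norm(A, 2)
max_abs_eig = np.abs(np.diag(A)).max()  # eigenvalues of a diagonal matrix are its diagonal entries

print(np.isclose(two_norm, max_abs_eig))  # True
```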