I stumbled onto a method for orthogonally diagonalizing a symmetric matrix with real entries and I was wondering what advantages (if any at all) it has over the eigenvector method.

It hinges on the fact that every real symmetric matrix may be viewed as a symmetric bilinear form (a dot product that need not be positive definite) on some vector space. The idea is to use the Gram-Schmidt process to produce zeros in every off-diagonal entry.
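Here is a minimal sketch of what I mean in code (the helper name congruence_diagonalize is mine, and it assumes the Gram-Schmidt idea is realized by the usual simultaneous row-and-column elimination, with no zero pivots arising):

import numpy as np

def congruence_diagonalize(A):
    # Return (P, D) with P^T A P = D diagonal, by pairing each row
    # operation with the matching column operation so symmetry is kept.
    # Sketch only: assumes every pivot D[k, k] is nonzero.
    D = np.array(A, dtype=float)
    n = D.shape[0]
    P = np.eye(n)
    for k in range(n - 1):
        for i in range(k + 1, n):
            m = D[i, k] / D[k, k]
            D[i, :] -= m * D[k, :]   # row operation E
            D[:, i] -= m * D[:, k]   # matching column operation E^T
            P[:, i] -= m * P[:, k]   # accumulate the transposes into P
    return P, D

A = np.array([[3, -1, 0], [-1, 2, -1], [0, -1, 3]])
P, D = congruence_diagonalize(A)
print(np.round(D, 6))               # diag(3, 5/3, 12/5)
print(np.allclose(P.T @ A @ P, D))  # True

Note that this reaches a diagonal matrix in finitely many exact steps, unlike an eigendecomposition.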

What is this method called? What are its practical applications (besides computing powers of matrices)? A description of the method is here, since I cannot embed pictures directly into the post.

  • Here you find a diagonal matrix that is congruent to the original matrix $A$. Note that a symmetric matrix can always be diagonalized by an orthogonal matrix, which means that $A$ is both similar and congruent to the diagonal matrix made of eigenvalues (see the check after these comments). Advantage over the eigendecomposition: finitely many steps. Major drawback: one is usually more interested in similarity. – Jean-Claude Arbaut May 12 '19 at 20:38
  • Similarity is equivalent to a change of basis, so the eigenvalues do not depend on the basis. The eigenvectors do, however. See https://math.stackexchange.com/questions/2795340/do-eigenvalues-depend-on-the-choice-of-basis – Jean-Claude Arbaut May 12 '19 at 20:43
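To make the first comment concrete, here is a small check of mine (using the matrix $A$ from the answer below): numpy's eigh returns an orthogonal eigenvector matrix $Q$, so the congruence $Q^\top A Q$ and the similarity $Q^{-1} A Q$ are the same transform, and both give the diagonal matrix of eigenvalues.

import numpy as np

A = np.array([[3., -1., 0.], [-1., 2., -1.], [0., -1., 3.]])

# For a symmetric matrix, eigh returns the eigenvalues and an
# orthogonal matrix Q whose columns are eigenvectors.
w, Q = np.linalg.eigh(A)

# Since Q is orthogonal, Q^T = Q^{-1}, so congruence and similarity
# by Q coincide.
print(np.allclose(Q.T @ A @ Q, np.diag(w)))               # True (congruence)
print(np.allclose(np.linalg.inv(Q) @ A @ Q, np.diag(w)))  # True (similarity)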

1 Answer

It is a decomposition similar to the Cholesky decomposition, called the $LDL^T$ decomposition.

import numpy as np
from scipy.linalg import ldl

A = np.array([[3,-1,0],[-1,2,-1],[0,-1,3]])
A

array([[ 3, -1,  0],
       [-1,  2, -1],
       [ 0, -1,  3]])

lu, d, perm = ldl(A, lower=0)  # lower=0 requests the upper variant, A = U D U^T

lu

array([[ 1.        , -0.6       ,  0.        ],
       [ 0.        ,  1.        , -0.33333333],
       [ 0.        ,  0.        ,  1.        ]])

d

array([[2.4       , 0.        , 0.        ],
       [0.        , 1.66666667, 0.        ],
       [0.        , 0.        , 3.        ]])

Note that $2.4 = \frac{12}{5}$ and $1.\overline{6} = \frac{5}{3}$.
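A quick sanity check of the above (my addition, not part of the original answer): the factors reconstruct $A$, and the pivots are exact rationals.

# The factorization satisfies A = lu @ d @ lu.T; perm tells how to
# permute the rows of lu to triangular form (identity here, since the
# lu printed above is already triangular).
print(perm)                           # [0 1 2]
print(np.allclose(lu @ d @ lu.T, A))  # True

from fractions import Fraction
print(Fraction(d[0, 0]).limit_denominator())  # 12/5
print(Fraction(d[1, 1]).limit_denominator())  # 5/3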