
I am trying to approximate an NPD matrix with the nearest PD form and compute its Cholesky decomposition.

I know that the usual method is to perform an eigenvalue decomposition, zero out the negative eigenvalues, and reassemble the matrix from the modified factors. However, I don't want to do any eigenvalue decompositions; I want to deal with this issue within the Cholesky algorithm itself.

For those of you who are familiar with the Cholesky algorithm: there is a step where each diagonal entry of the lower triangular factor is computed as the square root of some value (let's say x). If x < 0, this means the matrix is NPD.

A simple workaround would be to set x = 0 or x = 10^-10 at that step. However, this lacks mathematical rigor.
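For concreteness, the workaround described above can be sketched as a small modification of the standard outer-product Cholesky loop. This is only an illustration of the ad-hoc pivot clamping, not a rigorous method; the function name and the `eps` parameter are my own choices.

```python
import numpy as np

def cholesky_clamped(A, eps=0.0):
    """Column-by-column Cholesky of a symmetric A.
    If the quantity under the square root (the pivot) is nonpositive,
    clamp it to eps instead of failing -- the ad-hoc workaround."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    L = np.zeros_like(A)
    for j in range(n):
        # the value "x" from the question: diagonal minus accumulated squares
        x = A[j, j] - L[j, :j] @ L[j, :j]
        if x <= 0.0:      # matrix is not positive definite here
            x = eps       # clamp instead of raising an error
        L[j, j] = np.sqrt(x)
        for i in range(j + 1, n):
            if L[j, j] > 0.0:
                L[i, j] = (A[i, j] - L[i, :j] @ L[j, :j]) / L[j, j]
            # if the pivot was clamped to zero, leave the column at zero
    return L
```

On a genuinely positive definite matrix this reduces to ordinary Cholesky; on an indefinite one it silently produces a factor of *some* nearby PSD matrix, with no guarantee of closeness.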

My question is: has any of you heard of such a method, or something similar to it, that is either published or at least somewhat proven?

1 Answer


You should be a bit more precise about what you mean by NPD. My guess is: a symmetric/Hermitian (so, indefinite) matrix.

There is a Cholesky factorization for positive semidefinite matrices in a paper by N. J. Higham, "Analysis of the Cholesky Decomposition of a Semi-definite Matrix". I don't know of any variant that works on indefinite matrices and finds the closest positive (semi)definite matrix, but read this paper and see if you can work something out.
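To give a flavor of the semidefinite case, here is a sketch of Cholesky with complete (diagonal) pivoting, in the spirit of the analysis in that paper: at each step the largest remaining diagonal entry is brought to the pivot position, and the factorization stops cleanly when the pivot becomes numerically nonpositive. This is my own illustrative implementation, not Higham's code, and the stopping tolerance is an arbitrary choice.

```python
import numpy as np

def psd_cholesky_pivoted(A, tol=1e-12):
    """Pivoted Cholesky of a symmetric positive semidefinite A.
    Returns a permutation p and a lower-trapezoidal L such that
    A[np.ix_(p, p)] ~= L @ L.T, stopping at the numerical rank."""
    A = np.array(A, dtype=float)  # work on a copy
    n = A.shape[0]
    p = np.arange(n)
    for k in range(n):
        # bring the largest remaining diagonal entry to position k
        j = k + int(np.argmax(np.diag(A)[k:]))
        A[[k, j], :] = A[[j, k], :]
        A[:, [k, j]] = A[:, [j, k]]
        p[[k, j]] = p[[j, k]]
        if A[k, k] <= tol:          # numerically semidefinite: done
            A[k:, k:] = 0.0
            break
        A[k, k] = np.sqrt(A[k, k])
        A[k+1:, k] /= A[k, k]
        # Schur complement update of the trailing block
        A[k+1:, k+1:] -= np.outer(A[k+1:, k], A[k+1:, k])
    return p, np.tril(A)
```

Note that this handles exact (or near) semidefiniteness gracefully, but it does not repair an indefinite matrix: a genuinely negative pivot would simply terminate the factorization early.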

You might also be interested in Bunch-Parlett's symmetric indefinite decomposition described in their classic paper "Direct Methods for Solving Symmetric Indefinite Systems of Linear Equations".

To the best of my knowledge, closest-approximation problems of this kind have come into focus only relatively recently, and even less demanding variants remain unsolved, so I doubt that you'll find ready-made results for exactly what you're interested in.

Edit

Of course, if you use the eigenvalue decomposition, you can get the closest semidefinite matrix. However, note that there is no need to multiply the obtained factors. Let's say you have $A = U \Lambda U^T$, where $U$ is orthogonal and $\Lambda$ is diagonal. Obtain a nonnegative $\Lambda'$ by replacing the negative elements of $\Lambda$ by zero, which gives the closest semidefinite approximation of $A$: $A' = U \Lambda' U^T$.

Now, take QR of $(U\sqrt{\Lambda'})^T$. This will give you orthogonal $Q$ and upper triangular $R$ such that $QR = (U\sqrt{\Lambda'})^T$. Now, you have: $$A' = (U\sqrt{\Lambda'}) (U\sqrt{\Lambda'})^T = (QR)^T (QR) = R^T R,$$ which is a Cholesky(-like) decomposition you want.
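The two steps above (clip the eigenvalues, then take a QR instead of multiplying the factors back together) fit in a few lines of numpy; the function name is mine.

```python
import numpy as np

def nearest_psd_cholesky(A):
    """Clip negative eigenvalues of symmetric A, then obtain R with
    R^T R = A' via QR of (U sqrt(Lam'))^T, as described above."""
    w, U = np.linalg.eigh(A)               # A = U diag(w) U^T
    w_clipped = np.clip(w, 0.0, None)      # replace negatives by zero
    B = (U * np.sqrt(w_clipped)).T         # B = (U sqrt(Lam'))^T, B^T B = A'
    Q, R = np.linalg.qr(B)                 # B = QR, Q orthogonal, R triangular
    return R                               # A' = R^T R
```

(The `R` returned by `numpy.linalg.qr` may have negative diagonal entries; flip the signs of the corresponding rows if you insist on a nonnegative diagonal.)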

I don't know why you want to avoid the eigenvalue decomposition. This looks quite clean to me.

Vedran Šego
  • Hi Vedran. Thank you very much for your reply. I meant to say a Non-Positive-Definite matrix, or specifically a symmetric matrix that has some very small non-positive eigenvalues. – almostcutmyhair Jun 17 '13 at 23:21
  • So, indefinite (almost semidefinite) matrix. I asked because it was not completely clear if you're troubled only by zeroes, or by negatives as well. Anyway, I've updated my answer with the eigenvalue decomposition approach which is somewhat more stable than what you mentioned (it avoids multiplication of the eigenvalue decomposition factors). I don't know of a better way. – Vedran Šego Jun 17 '13 at 23:29
  • Thank you very much Vedran. Your arguments are definitely clear and make perfect sense. The only reason I was trying to avoid eigenvalue decomposition is because it is very slow compared to Cholesky decomposition, unless I'm unaware of a method that speeds up eigenvalue decomposition as much as Cholesky... – almostcutmyhair Jun 18 '13 at 13:43
  • You are right about that. However, Cholesky exploits properties that your matrix just doesn't have. It's been a while since I last read Bunch-Parlett's paper, but - if I remember right - you cannot always get $RJR^T$, where $R$ is upper triangular and $J$ is a signature matrix, which would be like a Cholesky for the indefinite matrices. If I'm not mistaken, either $J$ has to be quasidiagonal (block diagonal with the diagonal blocks of order at most 2) or $R$ has to be upper quasitriangular (block triangular with the diagonal blocks of order at most 2). My point being: better properties, more speed. – Vedran Šego Jun 18 '13 at 14:03
  • If you have a symmetric indefinite decomposition $A = R J R^T$, with $J = {\rm diag}(\pm1)$ a signature matrix and $R$ anything (not necessarily (quasi)triangular), you get the eigenvalue decomposition from that by using the hyperbolic SVD of $R$ (by Bojanczyk et al. or by Zha; there is an algorithm by Slapničar, I think). However, I don't think any of these can compare to the speed of Cholesky, and the closest approximation would probably be more complex than just replacing some elements with zeroes. – Vedran Šego Jun 18 '13 at 14:10
  • You may want to take a look at the indefinite QR (by Sanja Singer). It will not give you a unitary $Q$ (it will have similar, but a bit weaker properties) and $R$ will not be triangular (but quasitriangular). That's used when you have $A = G^* J G$ (again, $J$ is a signature matrix) to obtain a symmetric indefinite decomposition of $A$ without multiplication of $G^*$, $J$ and $G$. The fact that this QR is much less nice than the Euclidean one also speaks that the problem is genuinely different. Sorry for the off-topic. I like matrix decompositions in the nonstandard scalar product spaces. :-) – Vedran Šego Jun 18 '13 at 14:16
  • Thanks! I wish those methods were as fast as Cholesky, since I am practically dealing with about 5000x5000 matrices... – almostcutmyhair Jun 18 '13 at 20:55