
I need to estimate $\mathbf{y}$ defined as: $$ \mathbf{y} = \mathbf{P}\mathbf{B}^T(\mathbf{B}\mathbf{P}\mathbf{B}^T+\mathbf{R})^{-1} \mathbf{x}, $$ where $\mathbf{P}$ and $\mathbf{R}$ are diagonal matrices with positive entries. I want to implement the multiplication by the matrix inverse using a preconditioned conjugate gradient (PCG) method, since I can find a good preconditioner in that case and the size of the matrices prevents a direct computation of the solution. However, I know (thanks to the Matrix Cookbook) that this expression can be rewritten as: $$ \mathbf{y} = (\mathbf{P}^{-1}+ \mathbf{B}^T \mathbf{R}^{-1} \mathbf{B})^{-1} \mathbf{B}^T \mathbf{R}^{-1} \mathbf{x}. $$ Again, I can apply preconditioned conjugate gradient to compute the matrix inversion, since I have a decent preconditioner at my disposal in this case as well.
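For concreteness, here is a small numerical sketch (my own illustration, not part of the question) checking that the two formulations give the same $\mathbf{y}$, with each inner solve done by a minimal PCG loop using a Jacobi (diagonal) preconditioner as a hypothetical stand-in for the "good preconditioners" mentioned above. The matrices `B`, `p`, `r` and the vector `x` are random placeholders:

```python
import numpy as np

def pcg(matvec, b, M_inv, tol=1e-12, maxiter=500):
    """Minimal preconditioned conjugate gradient for SPD systems.
    matvec: v -> A v,  M_inv: v -> M^{-1} v (the preconditioner apply)."""
    x = np.zeros_like(b)
    r = b - matvec(x)
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxiter):
        Ap = matvec(p)
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

rng = np.random.default_rng(0)
m, n = 40, 60
B = rng.standard_normal((m, n))
p = rng.uniform(0.5, 2.0, n)   # diag(P), positive
r = rng.uniform(0.5, 2.0, m)   # diag(R), positive
x = rng.standard_normal(m)

# Formulation 1: y = P B^T (B P B^T + R)^{-1} x  -- SPD system of size m
A1 = B @ (p[:, None] * B.T) + np.diag(r)
d1 = np.diag(A1)               # Jacobi preconditioner
z1 = pcg(lambda v: A1 @ v, x, lambda v: v / d1)
y1 = p * (B.T @ z1)

# Formulation 2: y = (P^{-1} + B^T R^{-1} B)^{-1} B^T R^{-1} x  -- SPD system of size n
A2 = np.diag(1.0 / p) + B.T @ ((1.0 / r)[:, None] * B)
d2 = np.diag(A2)
y2 = pcg(lambda v: A2 @ v, B.T @ (x / r), lambda v: v / d2)

print(np.allclose(y1, y2, atol=1e-6))
```

The two solves target systems of different sizes ($m$ vs. $n$), which is one practical reason to prefer one formulation over the other.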

My question is straightforward: Is there a method for finding $\mathbf{y}$ (typically a scheme based on conjugate gradient) able to benefit from the insights of the two preconditioners at the same time?

I am aware of the multipreconditioned conjugate gradient method (which is related to the block conjugate gradient method), but it is unfortunately not applicable here. Any help would be appreciated, thank you.

Denys