The Conjugate Gradient algorithm (CG) is an iterative scheme, based on matrix-vector multiplications, for solving equations like $$Ax = b$$ without having to calculate an explicit inverse $A^{-1}$ or some factorization of $A$.
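To make the discussion concrete, here is a minimal sketch of such a solve using SciPy's `scipy.sparse.linalg.cg`; the toy symmetric positive definite (SPD) matrix is my own construction, purely for illustration:

```python
import numpy as np
from scipy.sparse.linalg import cg

rng = np.random.default_rng(0)
n = 100
Q = rng.standard_normal((n, n))
A = Q @ Q.T + n * np.eye(n)       # symmetric positive definite by construction
b = rng.standard_normal(n)

x, info = cg(A, b)                # info == 0 signals convergence
print(np.linalg.norm(A @ x - b))  # small residual; A^{-1} was never formed
```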
I wonder about more complicated things we might want to calculate, for example $$(A^{-1}+C_1)(B^{-1}+C_2)\, x.$$
Wherever we encounter a term $M^{-1}v$ (and the matrix $M$ fulfills the requirements posed by the CG algorithm), we can calculate it with CG.
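A rough sketch of what I mean, with a hypothetical helper `apply_inv` that replaces each $M^{-1}v$ by a CG solve, applied right-to-left to the expression above; the SPD matrices are again toy constructions of mine:

```python
import numpy as np
from scipy.sparse.linalg import cg

def apply_inv(M, v):
    """Return M^{-1} v via a CG solve; M must meet CG's requirements (SPD)."""
    y, info = cg(M, v)
    if info != 0:
        raise RuntimeError("CG did not converge")
    return y

def compound_apply(A, B, C1, C2, x):
    # Evaluate (A^{-1} + C1)(B^{-1} + C2) x from right to left,
    # replacing each M^{-1} v with a CG solve instead of forming an inverse.
    t = apply_inv(B, x) + C2 @ x      # t = (B^{-1} + C2) x
    return apply_inv(A, t) + C1 @ t   # (A^{-1} + C1) t

# Toy SPD matrices purely for illustration.
rng = np.random.default_rng(1)
n = 50
def spd(n):
    Q = rng.standard_normal((n, n))
    return Q @ Q.T + n * np.eye(n)

A, B = spd(n), spd(n)
C1, C2 = rng.standard_normal((n, n)), rng.standard_normal((n, n))
x = rng.standard_normal(n)
y = compound_apply(A, B, C1, C2, x)
```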
While we can be certain that this approach works, is it efficient or wasteful?
Does there exist some framework or extension of CG which takes such expressions into account?
By linearity, if we encounter $( {M_1}^{-1} + {M_2}^{-1} )v$ we can expand it as $${M_1}^{-1}v + {M_2}^{-1}v$$ and calculate the two terms independently of each other.
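For instance, a sketch of the expanded form as two independent CG solves (the matrices are again toy SPD constructions of mine):

```python
import numpy as np
from scipy.sparse.linalg import cg

rng = np.random.default_rng(2)
n = 50
def spd(n):
    Q = rng.standard_normal((n, n))
    return Q @ Q.T + n * np.eye(n)

M1, M2 = spd(n), spd(n)
v = rng.standard_normal(n)

# Two independent CG solves; they share no state, so they could run in parallel.
y1, info1 = cg(M1, v)
y2, info2 = cg(M2, v)
assert info1 == 0 and info2 == 0
y = y1 + y2                       # = (M1^{-1} + M2^{-1}) v up to CG tolerance
```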
If we encounter $(M_1)^{-1}(M_2)^{-1}v$, we can either calculate $$(M_1)^{-1}\big((M_2)^{-1}v\big),$$ in other words a two-step calculation, or treat it as $(M_2M_1)^{-1}v$, where the matrix to be inverted is either kept lazily decoupled inside the CG algorithm (applying $M_2$ and $M_1$ factor by factor in each matrix-vector product) or explicitly calculated beforehand. Which would be most efficient?
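A sketch comparing the two routes. One caveat worth noting: even if $M_1$ and $M_2$ are both symmetric positive definite, the product $M_2M_1$ is in general not symmetric, so plain CG is not guaranteed to apply to the one-shot variant; in the sketch below a direct solve stands in for that branch.

```python
import numpy as np
from scipy.sparse.linalg import cg

rng = np.random.default_rng(3)
n = 50
def spd(n):
    Q = rng.standard_normal((n, n))
    return Q @ Q.T + n * np.eye(n)

M1, M2 = spd(n), spd(n)
v = rng.standard_normal(n)

# Option 1: two-step calculation; each solve is on an SPD matrix, so CG applies.
w, info_w = cg(M2, v)             # w  = M2^{-1} v
y1, info_1 = cg(M1, w)            # y1 = M1^{-1} (M2^{-1} v)

# Option 2: form M2 @ M1 explicitly. The product of two SPD matrices is in
# general not symmetric, so plain CG is not guaranteed here; a direct solve
# (or a nonsymmetric Krylov method) stands in for it in this sketch.
y2 = np.linalg.solve(M2 @ M1, v)  # y2 = (M2 M1)^{-1} v

print(np.linalg.norm(y1 - y2))    # the two routes agree up to CG tolerance
```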