Let $\Sigma$ be a symmetric, positive definite $p\times p$ covariance matrix, and let $f(\Sigma)$ be its Cholesky factor. That is, $f(\Sigma)$ is a lower triangular $p\times p$ matrix such that $\Sigma = f(\Sigma) f(\Sigma)^{\top}$. Further, let $\Lambda := \operatorname{diag}(\Sigma)^{1/2}$ be the diagonal matrix holding the standard deviations given by $\Sigma$ on its diagonal, and finally, let $P = \Lambda^{-1} \Sigma \Lambda^{-1}$ denote the correlation matrix.
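To make the setup concrete, here is a minimal NumPy sketch of these objects; the particular matrix `Sigma` is just an arbitrary example and not part of the question.

```python
import numpy as np

# Arbitrary symmetric positive definite example covariance matrix.
Sigma = np.array([[4.0, 1.2, 0.6],
                  [1.2, 2.0, 0.5],
                  [0.6, 0.5, 1.0]])

L = np.linalg.cholesky(Sigma)                    # f(Sigma): lower triangular, Sigma = L @ L.T
Lam_inv = np.diag(1.0 / np.sqrt(np.diag(Sigma))) # inverse of the standard-deviation matrix
P = Lam_inv @ Sigma @ Lam_inv                    # correlation matrix

assert np.allclose(L @ L.T, Sigma)
assert np.allclose(np.diag(P), 1.0)
```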
I am wondering whether, with $\mathcal{P} := P - I_p + \Lambda$ (i.e. the matrix carrying the correlations off the diagonal and the standard deviations on the diagonal), the derivative $$ \frac{\mathrm{d}\operatorname{vec}\left( \mathcal{P} \right)}{\mathrm{d} \operatorname{vec} \left( f(\Sigma) \right)} $$ is known in closed form, where $\operatorname{vec}$ is the vectorization operator and $I_p$ the $p\times p$ identity matrix.
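For reference, this is the Jacobian I mean, written as a finite-difference approximation that any closed form could be checked against. It is only a numerical sketch with my own ad-hoc choices (the helper `calP_from_L`, column-major `vec`, and the step size); the derivative is taken with respect to the full vectorization of the lower-triangular factor, so the columns belonging to the structurally zero upper-triangular entries appear as well.

```python
import numpy as np

def calP_from_L(L):
    """Map a lower-triangular factor L to vec(P - I + Lambda), column-major."""
    Sigma = L @ L.T
    s = np.sqrt(np.diag(Sigma))
    P = Sigma / np.outer(s, s)                       # correlation matrix
    return (P - np.eye(len(s)) + np.diag(s)).ravel(order="F")

def finite_difference_jacobian(L, eps=1e-6):
    """Approximate d vec(calP) / d vec(L), one column of vec(L) at a time."""
    p = L.shape[0]
    base = calP_from_L(L)
    J = np.zeros((p * p, p * p))
    for j in range(p * p):                           # perturb the j-th entry of vec(L)
        E = np.zeros((p, p))
        E[np.unravel_index(j, (p, p), order="F")] = eps
        J[:, j] = (calP_from_L(L + E) - base) / eps
    return J
```

For the example `Sigma` above, `finite_difference_jacobian(np.linalg.cholesky(Sigma))` gives the $p^2 \times p^2$ approximation.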
I found answers to related questions, for example here, here, and here; however, due to my limited knowledge of matrix calculus, I don't know how to combine these results, nor whether a closed-form solution exists.
- Do you have any reference where I could read up on the techniques you used?
- (With your notation) I am using your result to compute $\partial (Tp) / \partial (Lx) = T \, (\partial p / \partial x) \, D$ [Eq. 1], where $T$ is similar to an elimination matrix but with a different ordering, $L$ is an elimination matrix, and $D$ is the corresponding duplication matrix. Using your main result I implemented Eq. 1, but on test cases I get results that differ from the true Jacobian, though they are close. Is that because Eq. 1 is wrong, or is there a mistake in your derivation? (A sketch of the check I am running is included below.)
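For concreteness, a minimal sketch of the check behind Eq. 1, assuming NumPy. The exact ordering used by $T$ is not spelled out above, so the sketch simply takes $T$ to be the plain elimination matrix as a placeholder; it is a template for the check, not the exact implementation.

```python
import numpy as np

def elimination_matrix(p):
    """E with E @ vec(A) = vech(A): keeps the lower-triangular entries, column-major."""
    pairs = [(i, j) for j in range(p) for i in range(j, p)]
    E = np.zeros((len(pairs), p * p))
    for k, (i, j) in enumerate(pairs):
        E[k, j * p + i] = 1.0
    return E

def duplication_matrix(p):
    """D with D @ vech(A) = vec(A) for *symmetric* A."""
    pairs = [(i, j) for j in range(p) for i in range(j, p)]
    index = {ij: k for k, ij in enumerate(pairs)}
    D = np.zeros((p * p, len(pairs)))
    for j in range(p):
        for i in range(p):
            D[j * p + i, index[(max(i, j), min(i, j))]] = 1.0
    return D

def rhs_of_eq1(J_full, p):
    """T (dp/dx) D, with T taken as the plain elimination matrix (placeholder)."""
    return elimination_matrix(p) @ J_full @ duplication_matrix(p)
```

One thing that may be worth ruling out as the source of the discrepancy: $D\,\operatorname{vech}(A) = \operatorname{vec}(A)$ holds only for symmetric $A$; if $x$ is the vectorized lower-triangular Cholesky factor, the map from $\operatorname{vech}(x)$ back to $\operatorname{vec}(x)$ is $E^{\top}$ (the transpose of the elimination matrix) rather than $D$, which would change the right factor in Eq. 1.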