
I am looking to understand a proof that the convolution of the pdfs of two multivariate Gaussians $N(\mu_1, \Sigma_1)$ and $N(\mu_2, \Sigma_2)$ is given by $N(\mu_1 + \mu_2, \Sigma_1 + \Sigma_2)$.

I have looked through similar questions (for example, here and here), and I found a derivation given in this document.

On page 3, the derivation states that $$ (x-a)^T A^{-1} (x-a) + (x-b)^T B^{-1} (x-b) = (x-c)^T(A^{-1} + B^{-1})(x-c) + (a-b)^T C (a-b), $$ for some vector $c$ and matrix $C$. Later in the derivation they seem to use $C = (A+B)^{-1}$, but it is never stated what $c$ is.

What must $c$ be for this equality to hold? And how would one arrive at such an equality in the first place? For univariate Gaussians, for example, the idea is simply to complete the square during the convolution. Is there a similar motivation here?

  • You can differentiate both sides with respect to $x$, then match terms that do not depend on $x$ and solve for $c$. – Nick Alger Oct 14 '22 at 23:08
  • In general, $A^{-1}+B^{-1}\neq (A+B)^{-1}$. (This is not even true in 1-D.) The usual way of showing the claim is to check that the moment generating function of an $n$-dimensional Gaussian vector $X\sim\mathcal{N}(\mu,\Sigma)$ is given by $$\mathbb{E}[e^{\xi\cdot X}]=\int_{\mathbb{R}^n}e^{\xi\cdot x}f_X(x)\,\mathrm{d}x=\exp(\xi\cdot\mu+\tfrac{1}{2}\xi^{\top}\Sigma\xi), \qquad\xi\in\mathbb{R}^n.$$ So, if $X_i\sim\mathcal{N}(\mu_i,\Sigma_i)$, $i=1,2$, are independent, then $X_1+X_2$ has PDF of the form $f_1*f_2$ and $M_{X_1+X_2}(\xi)=M_{X_1}(\xi)M_{X_2}(\xi)$, from which the claim will follow. – Sangchul Lee Oct 14 '22 at 23:17
  • @NickAlger Thanks for this. I realized that page 4 actually gives a bit more details for this, and then it's just a bunch of gross matrix algebra to see that the equality holds! – Eudoneah Oct 15 '22 at 05:46
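
To complement the MGF argument in the comments, here is a quick Monte Carlo sanity check of the claim itself; the parameters and sample size below are arbitrary choices of mine, not taken from the linked document. Sampling $X_1 \sim \mathcal{N}(\mu_1,\Sigma_1)$ and $X_2 \sim \mathcal{N}(\mu_2,\Sigma_2)$ independently, the empirical mean and covariance of $X_1 + X_2$ should be close to $\mu_1+\mu_2$ and $\Sigma_1+\Sigma_2$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary example parameters (not from the linked document).
mu1, mu2 = np.array([1.0, -2.0]), np.array([0.5, 3.0])
L1 = np.array([[1.0, 0.0], [0.5, 1.5]])   # Sigma1 = L1 @ L1.T
L2 = np.array([[2.0, 0.0], [-0.3, 0.8]])  # Sigma2 = L2 @ L2.T
Sigma1, Sigma2 = L1 @ L1.T, L2 @ L2.T

# Draw independent samples of X1 ~ N(mu1, Sigma1) and X2 ~ N(mu2, Sigma2).
N = 1_000_000
X1 = rng.multivariate_normal(mu1, Sigma1, size=N)
X2 = rng.multivariate_normal(mu2, Sigma2, size=N)
S = X1 + X2

# Empirical mean and covariance of the sum should approximate
# mu1 + mu2 and Sigma1 + Sigma2.
print(S.mean(axis=0))           # ~ [1.5, 1.0]
print(np.cov(S, rowvar=False))  # ~ Sigma1 + Sigma2
print(Sigma1 + Sigma2)
```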

1 Answer


The equality is obtained with $c = (A^{-1} + B^{-1})^{-1}(A^{-1}a + B^{-1}b)$ and $C = (A+B)^{-1} = A^{-1}(A^{-1} + B^{-1})^{-1}B^{-1}$. This is actually given on page 4 of the document (which I somehow missed).

One can verify the equality by direct matrix algebra, although it isn't very fun to do.

The idea does come from completing the square. The quadratic terms in $x$ on both sides clearly match. Then $c$ is chosen to make the linear terms match: equating them gives $(A^{-1} + B^{-1})c = A^{-1}a + B^{-1}b$, i.e. $c = (A^{-1} + B^{-1})^{-1}(A^{-1}a + B^{-1}b)$. Finally, the last term on the right-hand side is what is left over from completing the square.
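
If you don't want to grind through the matrix algebra by hand, here is a small NumPy sketch (with random test data of my own choosing) that checks the quadratic-form identity with this $c$, and the factorization $(A+B)^{-1} = A^{-1}(A^{-1}+B^{-1})^{-1}B^{-1}$, numerically:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

def random_spd(n):
    """Random symmetric positive definite matrix."""
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A, B = random_spd(n), random_spd(n)
a, b, x = rng.standard_normal(n), rng.standard_normal(n), rng.standard_normal(n)

Ainv, Binv = np.linalg.inv(A), np.linalg.inv(B)

# c = (A^{-1} + B^{-1})^{-1} (A^{-1} a + B^{-1} b)
c = np.linalg.solve(Ainv + Binv, Ainv @ a + Binv @ b)

lhs = (x - a) @ Ainv @ (x - a) + (x - b) @ Binv @ (x - b)
rhs = (x - c) @ (Ainv + Binv) @ (x - c) + (a - b) @ np.linalg.inv(A + B) @ (a - b)

print(np.isclose(lhs, rhs))  # True: the two quadratic forms agree

# (A + B)^{-1} = A^{-1} (A^{-1} + B^{-1})^{-1} B^{-1}
print(np.allclose(np.linalg.inv(A + B),
                  Ainv @ np.linalg.inv(Ainv + Binv) @ Binv))  # True
```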