
By completing the square, a standard result is that the moment generating function (MGF) $\mathbb E e^{\beta X}$ of the standard Gaussian $X \sim N(0,1)$ is $e^{\beta^2/2}$.
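For reference, the completing-the-square computation behind this standard result is:

$$\mathbb E e^{\beta X} = \int_{-\infty}^{\infty} e^{\beta x}\,\frac{e^{-x^2/2}}{\sqrt{2\pi}}\,dx = e^{\beta^2/2}\int_{-\infty}^{\infty}\frac{e^{-(x-\beta)^2/2}}{\sqrt{2\pi}}\,dx = e^{\beta^2/2},$$

where the last integral equals $1$ because the integrand is the PDF of $N(\beta, 1)$.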

Is there a quick argument to show that if the MGF is $e^{\beta^2/2}$ then the distribution must be Gaussian? Of course, there's a general theory which says that the MGF determines the distribution uniquely. I'm wondering: is there a short calculation which lets us derive the PDF from the MGF, just in this particular case?

For example, for the analogous problem with characteristic functions, there is a nice symmetry between the Fourier transform and the inverse Fourier transform.

  • 1
    Well you can always analytically continue the MGF to the imaginary axis and then inverse Fourier transform, but I feel like that would be missing your point. For discrete distributions you can use the simple relationship between the MGF and the probability generating function and then identify the distribution by taking derivatives. For continuous distributions I don't know that there's a better way than just doing the analogue of inverse Fourier/Laplace transforming. – spaceisdarkgreen Jan 25 '17 at 07:25
  • Found a similar question http://math.stackexchange.com/questions/353490/deducing-a-probability-distribution-from-its-moment-generating-function – spaceisdarkgreen Jan 25 '17 at 07:44
    @spaceisdarkgreen the Post inversion formula looks pretty interesting; I wonder if a variation applies to MGFs. However, the $k$th derivative of $e^{\beta^2/2}$ has a complicated polynomial in front... though maybe that doesn't matter in the limit. (Also interesting: there's a probabilistic interpretation of the Post inversion formula involving the strong law of large numbers with exponential random variables.) – Martin Gale Jan 25 '17 at 08:09
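The analytic-continuation route mentioned in the comments can be checked numerically: substituting $\beta = it$ in $e^{\beta^2/2}$ gives the characteristic function $\varphi(t) = e^{-t^2/2}$, and applying the Fourier inversion formula $f(x) = \frac{1}{2\pi}\int e^{-itx}\varphi(t)\,dt$ should recover the standard normal PDF. A minimal sketch (the grid sizes are arbitrary choices, not from the original discussion):

```python
import numpy as np

# Analytic continuation of M(beta) = exp(beta^2/2) to the imaginary
# axis gives the characteristic function phi(t) = M(it) = exp(-t^2/2).
t = np.linspace(-20, 20, 4001)
dt = t[1] - t[0]
phi = np.exp(-t**2 / 2)

def pdf_from_cf(x):
    """Invert the characteristic function at x via a Riemann sum
    approximation of (1/2pi) * integral of exp(-i*t*x) * phi(t) dt."""
    integrand = np.exp(-1j * t * x) * phi
    return (np.sum(integrand) * dt).real / (2 * np.pi)

# Compare against the standard normal PDF on a few points.
xs = np.linspace(-3, 3, 13)
recovered = np.array([pdf_from_cf(x) for x in xs])
gaussian = np.exp(-xs**2 / 2) / np.sqrt(2 * np.pi)

print(np.max(np.abs(recovered - gaussian)))  # should be very small
```

Because $\varphi$ decays rapidly, the truncated integral agrees with the Gaussian density to high precision; this confirms the inversion works, though it is exactly the generic Fourier approach rather than a Gaussian-specific shortcut.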

0 Answers