Questions tagged [moment-generating-functions]

For questions relating to moment-generating functions (m.g.f.), which are a way to find moments such as the mean $(\mu)$ and the variance $(\sigma^2)$. Finding an m.g.f. for a discrete random variable involves summation; for continuous random variables, integration is used.

A moment generating function (MGF) is a single expected-value function whose derivatives at $t=0$ produce the moments of the random variable.

Definition: Let $X$ be a random variable with probability mass function (or probability density function) $f(x)$ and support $S$. Then:

$$M_X(t) = E(e^{tX})=\sum\limits_{x\in S} e^{tx}f(x)$$ or, for a continuous random variable, $$M_X(t) = E(e^{tX}) = \int_{S} e^{tx} f(x) \, \mathrm{d}x$$

is the MGF of $X$, as long as the summation (or integral) is finite for some interval of $t$ around $0$.

That is, $M(t)$ is the MGF of $X$ if there is a positive number $h$ such that the expectation above exists and is finite for $-h<t<h$.

Note: There are two main reasons why MGFs are so important.

  • The MGF of $X$ gives us all moments of $X$.
  • The MGF (if it exists) uniquely determines the distribution: if two random variables have the same MGF, then they must have the same distribution.

Thus, if you find the MGF of a random variable, you have indeed determined its distribution.
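As a minimal sketch of the first point above (an illustrative example, not part of the tag wiki): the moments fall out of the MGF by differentiating at $t=0$. Here this is done symbolically for an Exponential distribution with rate $\lambda=2$, whose MGF $M(t)=\lambda/(\lambda-t)$ is assumed known.

```python
# Sketch: recover the mean and variance of an Exponential(rate=2) variable
# from its MGF, M(t) = lam / (lam - t) for t < lam, by differentiating at t = 0.
import sympy as sp

t = sp.symbols('t')
lam = 2
M = lam / (lam - t)                       # MGF of Exponential(lam)

mean = sp.diff(M, t, 1).subs(t, 0)        # E[X]   = M'(0)
second = sp.diff(M, t, 2).subs(t, 0)      # E[X^2] = M''(0)
variance = sp.simplify(second - mean**2)  # Var(X) = E[X^2] - (E[X])^2

print(mean)      # 1/2
print(variance)  # 1/4
```

This matches the known mean $1/\lambda$ and variance $1/\lambda^2$ of the exponential distribution.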

1359 questions
34
votes
4 answers

How to prove: Moment Generating Function Uniqueness Theorem

Many results are based on the fact of the Moment Generating Function (MGF) Uniqueness Theorem, that says: If $X$ and $Y$ are two random variables and equality holds for their MGF's: $m_X(t) = m_Y(t)$ then $X$ and $Y$ have the same probability…
28
votes
3 answers

Distribution of the difference of two normal random variables.

If $U$ and $V$ are independent identically distributed standard normal, what is the distribution of their difference? I will present my answer here. I am hoping to know if I am right or wrong. Using the method of moment generating functions, we…
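A hedged sketch of the MGF method this question alludes to (the asker's own working is truncated above, so this is a generic reconstruction): by independence, $M_{U-V}(t) = M_U(t)\,M_V(-t) = e^{t^2/2}\cdot e^{t^2/2} = e^{t^2}$, the MGF of $\mathcal N(0,2)$. A quick Monte Carlo sanity check:

```python
# Sketch: check empirically that E[e^{t(U-V)}] matches e^{t^2},
# the MGF of N(0, 2), for independent standard normals U, V.
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(1_000_000)
v = rng.standard_normal(1_000_000)
d = u - v

t = 0.5
empirical = np.mean(np.exp(t * d))   # estimates E[e^{t(U-V)}]
closed_form = np.exp(t**2)           # MGF of N(0, 2) at t
print(empirical, closed_form)        # both close to e^{0.25} ≈ 1.284
```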
21
votes
2 answers

Tail bounds for maximum of sub-Gaussian random variables

I have a question similar to this one, but am considering sub-Gaussian random variables instead of Gaussian. Let $X_1,\ldots,X_n$ be centered $1$-sub-Gaussian random variables (i.e. $\mathbb{E} e^{\lambda X_i} \le e^{\lambda^2 /2}$), not necessarily…
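The standard MGF route for this kind of question (a sketch, not the accepted answer) combines Chernoff's bound with a union bound: $\Pr[\max_i X_i \ge t] \le n\, e^{-st}\,\mathbb{E}e^{sX_i} \le n\, e^{-st+s^2/2}$, minimized at $s=t$ to give $n\,e^{-t^2/2}$. A numeric illustration with Gaussians, which are $1$-sub-Gaussian:

```python
# Sketch: union bound + Chernoff gives Pr[max_i X_i >= t] <= n e^{-t^2/2}
# for centered 1-sub-Gaussian X_i; illustrated with standard normals.
import math
import numpy as np

n, t = 1000, 4.0
bound = n * math.exp(-t**2 / 2)          # union + Chernoff bound, ~0.335

rng = np.random.default_rng(1)
samples = rng.standard_normal((5_000, n)).max(axis=1)
empirical = np.mean(samples >= t)        # empirical tail of the maximum
print(empirical, bound)                  # empirical tail <= bound
```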
20
votes
3 answers

Moment Generating Function of Poisson

I'm unable to understand the proof behind determining the Moment Generating Function of a Poisson which is given below: $N \sim \mathrm{Poiss}(\lambda)$ $$ E[e^{\theta N}] = \sum\limits_{k=0}^\infty e^{\theta k} \frac{e^{-\lambda}\lambda^k }{k!} =…
20
votes
2 answers

Deriving Moment Generating Function of the Negative Binomial?

My textbook did the derivation for the binomial distribution, but omitted the derivations for the Negative Binomial Distribution. I know it is supposed to be similar to the Geometric, but it is not only limited to one success/failure. (i.e the way I…
19
votes
4 answers

Finding the Moment Generating function of a Binomial Distribution

Suppose $X$ has a $\rm{Binomial}(n,p)$ distribution. Then its moment generating function is \begin{align} M(t) &= \sum_{x=0}^n e^{xt}{n \choose x}p^x(1-p)^{n-x} \\ &=\sum_{x=0}^{n} {n \choose x}(pe^t)^x(1-p)^{n-x} \\ &=(pe^t+1-p)^n \end{align} Can…
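The last step in the derivation above is the binomial theorem with $a=pe^t$ and $b=1-p$. A quick numeric check of the sum against the closed form (parameter values are illustrative):

```python
# Sketch: verify sum_x e^{xt} C(n,x) p^x (1-p)^{n-x} == (p e^t + 1 - p)^n
# for one choice of n, p, t.
from math import comb, exp

n, p, t = 10, 0.3, 0.7
series = sum(exp(x * t) * comb(n, x) * p**x * (1 - p)**(n - x)
             for x in range(n + 1))
closed = (p * exp(t) + 1 - p) ** n
print(abs(series - closed) < 1e-12)  # True
```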
16
votes
5 answers

Deducing a probability distribution from its moment-generating function

It's pretty trivial to get a moment-generating function from a p.d.f. (provided $\sum e^{tx}f(x)$ isn't too difficult to evaluate), but since moment-generating functions uniquely determine a probability distribution function, is there a way to…
13
votes
1 answer

How to find probability distribution function given the Moment Generating Function

After searching, I found two questions like mine, but didn't see my answer to my question. Finding a probability distribution given the moment generating function Finding probability using moment-generating functions My question is how to find…
13
votes
2 answers

Distribution of Dot-Product of Two Independent Multivariate Gaussian Vectors

Let $X,Y\stackrel{\text{i.i.d.}}{\sim}\mathcal{N}(0,I_d)$, where $I_d$ is the $d$-dimensional identity matrix. What is the distribution of $\langle X,Y\rangle=X^TY$? Approach 1: So far I know that for any $i\in\{1,...,d\}$ the MGF of $X_iY_i$…
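A sketch of the MGF the asker mentions (reconstructed, since the excerpt is truncated): conditioning on $Y_i$ gives $\mathbb E[e^{tX_iY_i}] = \mathbb E[e^{t^2Y_i^2/2}] = (1-t^2)^{-1/2}$ for $|t|<1$. A Monte Carlo check of that formula:

```python
# Sketch: check E[e^{t X Y}] = (1 - t^2)^{-1/2} for independent
# standard normals X, Y, at a value of t well inside (-1, 1).
import numpy as np

rng = np.random.default_rng(2)
x, y = rng.standard_normal((2, 2_000_000))
t = 0.3
empirical = np.mean(np.exp(t * x * y))   # estimates the MGF of XY at t
closed = (1 - t**2) ** -0.5              # (1 - 0.09)^{-1/2} ≈ 1.048
print(empirical, closed)
```

Note that $t$ must stay well below $1$ here; near $t=1$ the second moment of $e^{tXY}$ blows up and the Monte Carlo estimate becomes unstable.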
13
votes
0 answers

The Expectation of a function of independent random variables

Assume we have for some index $i>n$ ($n \in \mathbb{N} $) the following ${\it Independent \ Random \ Variables}$ $$h_i \sim \text {i.i.d }\ \ \mathcal{CN}(0,1) \ \ \text{ Complex Gaussian}$$ $$\Omega_i \sim \text {i.i.d with pdf }\ \…
12
votes
4 answers

Methods for Finding Raw Moments of the Normal Distribution

I'm having some trouble with finding raw moments for the normal distribution. Right now I am trying to find the 4th raw moment on my own. So far, I know of two methods: I can take the 4th derivative of the moment generating function for the…
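The first method the asker names (fourth derivative of the MGF) can be carried out symbolically; this sketch uses the known normal MGF $M(t)=e^{\mu t + \sigma^2 t^2/2}$:

```python
# Sketch: fourth raw moment of N(mu, sigma^2) via M''''(0),
# where M(t) = exp(mu t + sigma^2 t^2 / 2).
import sympy as sp

t, mu, sigma = sp.symbols('t mu sigma')
M = sp.exp(mu * t + sigma**2 * t**2 / 2)
fourth = sp.expand(sp.diff(M, t, 4).subs(t, 0))
print(fourth)   # equals mu**4 + 6*mu**2*sigma**2 + 3*sigma**4
```

For $\mu=0$ this reduces to the familiar $3\sigma^4$.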
12
votes
1 answer

What is the meaning of the cumulant generating function itself?

If we define the characteristic function for a random variable $X$ as $\Phi(t)=E[e^{itX}]$ then it seems like we can think of it as essentially a spectral decomposition that measures the contributions of different frequencies to the probability…
12
votes
1 answer

About the "Cantor volume" of the $n$-dimensional unit ball

A simple derivation for the Lebesgue measure of the euclidean unit ball in $\mathbb{R}^n$ follows from computing $$ \int_{\mathbb{R}^n}e^{-\|x\|^2}\,dx $$ in two different ways. See, for instance, Keith Ball, An Elementary Introduction to Modern…
11
votes
1 answer

Inequality on the moment generating function of a centered random variable which is bounded above

I am stuck in the first part of problem 2 of the chapter 8 (error estimation) of the book "A probabilistic theory of pattern recognition" by Devroye: Show that for any $s>0$, and any random variable $X$ with $\mathbf EX=0,\mathbf EX^2=\sigma^2,…
11
votes
1 answer

Tail Lower Bounds using Moment Generating Functions

Given a random variable $X>0$ with Moment Generating Function $m(s)=E[e^{sX}]$ I'm interested in finding a lower bound $$\Pr[X \ge t] \ge 1-\varepsilon(t),$$ where $t>E[X]$. A classic technique for finding upper bounds for $\Pr[X \ge t]$ is using…
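The "classic technique" the excerpt refers to is the Chernoff upper bound $\Pr[X \ge t] \le e^{-st}\,m(s)$, optimized over $s$. A sketch for $X\sim\text{Exponential}(1)$, where $m(s)=1/(1-s)$ for $s<1$ and the exact tail $e^{-t}$ is available for comparison (this illustrates the upper-bound technique only, not the lower bound the question asks for):

```python
# Sketch: Chernoff upper bound Pr[X >= t] <= e^{-st} m(s) for X ~ Exp(1),
# with m(s) = 1/(1-s) for 0 < s < 1, optimized over s on a grid.
import math

def chernoff_bound(t, s):
    return math.exp(-s * t) / (1 - s)   # e^{-st} * m(s)

t = 5.0
exact = math.exp(-t)                             # exact tail, e^{-5}
best = min(chernoff_bound(t, s / 100) for s in range(1, 100))
print(exact, best)  # the optimized bound upper-bounds the exact tail
```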