6

I was asked to derive the mean and variance of the negative binomial distribution using its moment generating function.

However, I am not sure how to use the MGF to actually solve for the mean and variance.

JKnecht
  • 6,637
joe
  • 99
  • Have you found the mgf? Or are you just given it? Or do you need to find it first? – André Nicolas Mar 16 '16 at 20:15
  • we are given the MGF. As for what I've tried: I have no idea; I don't even know where to start with this one. – joe Mar 16 '16 at 22:59
  • @joe Since you are new I wanted to let you know that you should upvote the answers you get if you find them helpful and accept one answer. If there is something lacking in an answer you should ask about what you think is missing. – JKnecht Mar 22 '16 at 07:58

2 Answers

11

The moment generating function of a random variable $X$ is defined by

$$ M_X(t) = E(e^{tX}) = \begin{cases} \sum_i e^{tx_i}p_X(x_i), & \text{(discrete case)} \\ \\ \int_{-\infty}^{\infty} e^{tx}f_X(x)dx, & \text{(continuous case)} \end{cases} $$

If we expand $e^{tX}$ formally as a power series and take the expectation term by term,

$M_X(t) = E(e^{tX}) = 1 + tE(X) + \frac{t^2}{2!}E(X^2)+...+ \frac{t^k}{k!}E(X^k)+...$

and the $k$th moment of $X$ is given by

$$E(X^k) = M_X^{(k)}(0), \qquad k = 1, 2, \ldots$$

where

$$M_X^{(k)}(0) = \left.\frac{d^k}{dt^k} M_X(t)\right|_{t=0}$$
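
As a quick sanity check of this machinery: for a Bernoulli($p$) variable, $M_X(t) = 1 - p + pe^t$, so $M_X'(0) = p = E(X)$ and $M_X''(0) = p = E(X^2)$, consistent with $X^2 = X$.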


For the negative binomial we have the moment generating function

$M_X(t)=E(e^{tX})=(pe^t)^r [1-(1-p)e^t]^{-r}$
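
(For reference, this is the parameterization in which $X$ counts the number of trials up to and including the $r$th success, with pmf $p_X(x) = \binom{x-1}{r-1}p^r(1-p)^{x-r}$ for $x = r, r+1, \ldots$; the MGF then follows from the negative binomial series $\sum_{x=r}^{\infty}\binom{x-1}{r-1}z^{x-r} = (1-z)^{-r}$ with $z = (1-p)e^t$, valid for $(1-p)e^t < 1$.)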

and we want to calculate (writing $M_X(t)=M(t)$ from here on)

$\mu = E[X] = M'(0)$

$\sigma^2= E[X^2] - (E[X])^2 = M''(0)-[M'(0)]^2$


$M'(t)= (1-p) r e^t (p e^t)^r (1-(1-p) e^t)^{-r-1}+p r e^t (p e^t)^{r-1} (1-(1-p) e^t)^{-r}$

Evaluating at $t = 0$, where $e^t = e^0 = 1$:

$M'(0)= (1-p) r (p)^r (1-(1-p))^{-r-1}+p r (p)^{r-1} (1-(1-p))^{-r}$

$M'(0)= (1-p) r (p)^r (p)^{-r-1}+p r (p)^{r-1} (p)^{-r}$

$M'(0)= (1-p) r (p)^{-1}+p r (p)^{-1}$

$M'(0)= \frac{(1-p)r}{p} + r$

$M'(0)= \frac{(1-p)r}{p} + \frac{pr}{p}$

$M'(0)= \frac{(1-p)r + pr}{p}$

$M'(0)= \frac{r}{p}$

$\therefore \mu = \frac{r}{p}$
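
(As a check, $r = 1$ gives the geometric distribution, i.e. the number of trials up to the first success, whose mean is the familiar $1/p$.)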


$\sigma^2= E[X^2] - (E[X])^2 = M''(0)-[M'(0)]^2$

To compute $M''(t)$, first note that $M'(t)$ simplifies: since $pe^t(pe^t)^{r-1} = (pe^t)^r$, factoring out $r(pe^t)^r[1-(1-p)e^t]^{-r-1}$ leaves $(1-p)e^t + [1-(1-p)e^t] = 1$, so

$M'(t) = r(pe^t)^r[1-(1-p)e^t]^{-r-1}$

Differentiating this (using $\frac{d}{dt}(pe^t)^r = r(pe^t)^r$) gives

$M''(t) = r(pe^t)^r(-r-1)[1-(1-p)e^t]^{-r-2}[-(1-p)e^t]+r^2(pe^t)^{r-1}(pe^t)[1-(1-p)e^t]^{-r-1}$

$M''(0) = r(p)^r(-r-1)[1-(1-p)]^{-r-2}[-(1-p)]+r^2(p)^{r-1}(p)[1-(1-p)]^{-r-1}$

It remains to simplify this and apply the variance formula: $M''(0) = \frac{r(r+1-p)}{p^2}$, so $\sigma^2 = M''(0) - \left(\frac{r}{p}\right)^2 = \frac{r(1-p)}{p^2}$.
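
If you want to double-check the algebra, here is a minimal symbolic sketch using SymPy (the library choice and variable names are mine; any CAS that can differentiate will do):

    import sympy as sp

    t, p, r = sp.symbols('t p r', positive=True)

    # MGF of the negative binomial (number of trials until the r-th success)
    M = (p * sp.exp(t))**r * (1 - (1 - p) * sp.exp(t))**(-r)

    mean = sp.simplify(sp.diff(M, t).subs(t, 0))    # M'(0) = E[X]
    EX2 = sp.simplify(sp.diff(M, t, 2).subs(t, 0))  # M''(0) = E[X^2]
    var = sp.simplify(EX2 - mean**2)                # Var[X] = E[X^2] - E[X]^2

    print(mean)  # expected: r/p
    print(var)   # expected: r*(1 - p)/p**2

This reproduces $\mu = \frac{r}{p}$ and $\sigma^2 = \frac{r(1-p)}{p^2}$.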

JKnecht
  • 6,637
2

The key idea is that if $M_X(t) = \operatorname{E}[e^{tX}]$ is the moment generating function for a random variable $X$, then we have $$\left[\frac{d^k M_X(t)}{dt^k} \right]_{t=0} = \operatorname{E}[X^k], \quad k = 0, 1, 2, \ldots.$$ That is to say, the $k^{\rm th}$ derivative of $M_X(t)$ with respect to $t$, evaluated at $t = 0$, is equal to the $k^{\rm th}$ raw moment of $X$, whenever such moments exist. So if you are given the negative binomial MGF, all you need to do to calculate $\operatorname{E}[X]$ is to take the derivative of the MGF, and evaluate it at $t = 0$.

To get the variance, recall that $$\operatorname{Var}[X] = \operatorname{E}[X^2] - \operatorname{E}[X]^2,$$ so you would evaluate the second derivative $M_X''(t)$ at $t = 0$ and subtract the square of the previous result.
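
As a concrete illustration of this recipe with a simpler MGF: for a Poisson($\lambda$) variable, $M_X(t) = e^{\lambda(e^t - 1)}$, so $M_X'(t) = \lambda e^t M_X(t)$ gives $\operatorname{E}[X] = M_X'(0) = \lambda$, and $M_X''(t) = (\lambda e^t + \lambda^2 e^{2t})M_X(t)$ gives $\operatorname{E}[X^2] = \lambda + \lambda^2$, hence $\operatorname{Var}[X] = \lambda + \lambda^2 - \lambda^2 = \lambda$.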

heropup
  • 143,828