1

Suppose that $X$ and $Y$ are random variables both taking values in $\{0, 1, 2, \dots\}$. Further, suppose that $X$ and $Y$ have the same mgf for all $t$ in a neighborhood of $0$. Since $M_X(t)=\mathbb E[e^{tX}]=\sum_{j=0}^{\infty}e^{tj}f_X(j)$, and similarly for $Y$, it then holds that $$\sum_{j=0}^{\infty}e^{tj}f_X(j)-\sum_{j=0}^{\infty}e^{tj}f_Y(j)=0$$

$$\Rightarrow\sum_{j=0}^{\infty}e^{tj}[f_X(j)-f_Y(j)]=0$$ $$\Rightarrow \sum_{j=0}^{\infty}e^{tj}c_j=0 $$ with $c_j:=f_X(j)-f_Y(j)$.

Is there a way to justify that $c_j=0$ for all $j \in \{0,1,\dots\}$?

Alif
  • 417

2 Answers

1

First, notice that from the definition of $c_j$ we have $-1 \le c_j \le 1$ for all $j$. Now suppose $c_0 \ne 0$, and WLOG $c_0 > 0$. Then we have

\begin{align*} 0 &= \sum_{j=0}^\infty e^{tj}c_j = c_0 + \sum_{j=1}^\infty e^{tj}c_j \end{align*}

so $c_0 = -\sum_{j=1}^\infty e^{tj}c_j$ for all $t \in \mathbb{R}$. Since each $c_j$ is at least $-1$ this implies $$ 0 < c_0 \le \sum_{j=1}^\infty e^{tj} = \sum_{j=1}^\infty (e^t)^j,$$

but this is a geometric series (convergent for $t < 0$) whose sum can be made arbitrarily small by choosing $t \ll 0$, which contradicts the fact that $c_0$ is a fixed positive number. This implies $c_0 = 0$, and the same reasoning, applied inductively, shows that $c_j = 0$ for all $j$.
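To spell out that last bound: for $t < 0$ we have $e^t < 1$, so the geometric series converges and

$$\sum_{j=1}^\infty (e^t)^j = \frac{e^t}{1-e^t} \longrightarrow 0 \quad \text{as } t \to -\infty,$$

so the fixed constant $c_0 > 0$ cannot be bounded above by this quantity for every $t$.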

user6247850
  • 14,184
  • Why does the last step imply that $c_0=0$? The value of the geometric series is still $>0$ – Alif Nov 18 '20 at 01:05
  • If I consider the mgf only in a small interval around $0$, then the argument would not hold? – Alif Nov 18 '20 at 01:08
  • Yes, that is correct: this argument will not work if the mgfs are only defined near $0$. – user6247850 Nov 18 '20 at 01:10
  • Is there another way to make this work around 0? – Alif Nov 18 '20 at 01:12
  • Yes, but it is no longer an easy proof: See https://math.stackexchange.com/questions/458680/how-to-prove-moment-generating-function-uniqueness-theorem?rq=1 – user6247850 Nov 18 '20 at 01:19
  • https://www.lewiswalsh.net/blog/uniqueness-of-moment-generating-functions – Alif Nov 18 '20 at 01:28
  • What do you think? How does he get to this derivative in the end? I think it is wrong. – Alif Nov 18 '20 at 01:29
  • It's not exactly wrong, he's just replacing $e^t$ with $x$, which is fine since $e^t$ ranges over $(0,\infty)$. The issue is the same as the issue with my proof: it requires sending $t$ to $-\infty$ to get that the derivative formula holds at $0$. – user6247850 Nov 18 '20 at 02:11
0

The series $g(z) = \mathbb E[z^X] = \sum_{j=0}^\infty z^j f_X(j)$ converges absolutely to an analytic function for $|z| < 1$, and the coefficients $f_X(j)$ can be obtained from the values of $g$ and its derivatives at $z=0$:

$$ f_X(j) = \frac{g^{(j)}(0)}{j!} $$
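To see where this comes from, differentiate the series term by term (valid inside the radius of convergence):

$$ g^{(j)}(z) = \sum_{k=j}^\infty \frac{k!}{(k-j)!}\, z^{k-j} f_X(k), $$

and at $z = 0$ only the $k = j$ term survives, giving $g^{(j)}(0) = j!\, f_X(j)$.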

$g(z)$ for $0 < z < 1$ can be obtained from the moment generating function: $$ g(e^{-t}) = \mathbb E[e^{-tX}] = M_X(-t) = \sum_{j=0}^\infty e^{-tj} f_X(j), \quad t > 0.$$

So the moment generating function determines $g$ on an interval of $z$ values; even if the mgf is only known for $t$ near $0$, this pins down $g$ on some interval $(e^{-\delta},1)$, and since $g$ is analytic on $|z|<1$, the identity theorem determines $g$ on the whole disc, which in turn determines the distribution.
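As a concrete illustration, take $X \sim \mathrm{Poisson}(\lambda)$: then $M_X(t)=e^{\lambda(e^t-1)}$, so $g(z)=\mathbb E[z^X]=e^{\lambda(z-1)}$, and

$$ f_X(j) = \frac{g^{(j)}(0)}{j!} = \frac{\lambda^j e^{-\lambda}}{j!}, $$

which recovers the Poisson pmf from the mgf alone.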

Robert Israel
  • 470,583