3

I see the following all the time in statistics: $$E[e^{\lambda X}] = E[\sum_{k = 1}^\infty \lambda^kX^k/k!] = \sum_{k = 1}^\infty \frac{\lambda^k}{k!}E[X^k]$$

where $X: \Omega \to \mathbb{R}$ is a random variable on a sample space $\Omega$.

I would like to know how to justify this in detail.

Things I've tried

The Monotone and Dominated Convergence Theorems don't seem to apply directly, because the partial sums can alternate in sign and are not obviously dominated.

The closest I have gotten is by noting that the radius of convergence of the power series for $e^x$ is $R = \infty$, so we have uniform convergence of the series on any bounded set. This yields something like (letting $\lambda = 1$):

$$ E[e^X] = \int_\Omega e^{X(w)}dP(w) = \lim_{L \to \infty}\int_{\Omega_L}\sum_k \frac{X(w)^k}{k!}dP(w) = \lim_{L \to \infty}\sum_k \int_{\Omega_L}\frac{X(w)^k}{k!}dP(w) $$

where $\Omega_L = \{w : |X(w)| < L\}$.

If we could move the limit inside the series on the right-hand side, we'd be done. The final expression can be written as an iterated integral with respect to counting measure:

$$\lim_{L \to \infty}\int_{\mathbb{N}} g_L(k)\,d\mu(k)$$

with $g_L(k) = \int_{\Omega_L}\frac{X(w)^k}{k!}dP(w)$.

If all moments of $X$ are finite, then I think the Dominated Convergence Theorem lets us interchange the limit and the integral, as follows: $|g_L(k)| \leq \int_\Omega \frac{|X(w)|^k}{k!}dP(w) =: f(k)$, and by Tonelli's theorem:

$$ \int_{\mathbb{N}} f(k)\,d\mu(k) = E[e^{|X|}] < \infty $$

This kind of argument seems highly overcomplicated, and, in any case, I don't think it should be necessary to assume all moments are finite. If someone could set me straight here, I'd really appreciate it.
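For what it's worth, a quick numerical check (not a proof, and the distribution here is my own choice for illustration) suggests the identity does hold when everything is finite: for a standard normal $X$ with $\lambda = 1$, the odd moments vanish, $E[X^{2m}] = (2m-1)!!$, and $E[e^X] = e^{1/2}$, so the two sides can be compared directly.

```python
import math

# Standard normal moments: E[X^k] = 0 for odd k, (k-1)!! for even k.
def normal_moment(k):
    if k % 2 == 1:
        return 0.0
    return math.prod(range(k - 1, 0, -2)) if k > 0 else 1.0

# Partial sum of sum_k E[X^k] / k!  (i.e. lambda = 1), truncated far out.
series = sum(normal_moment(k) / math.factorial(k) for k in range(40))

# Closed form: E[e^X] = exp(1/2) for X ~ N(0, 1).
mgf = math.exp(0.5)

print(series, mgf)  # agree to machine precision
```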

Ben Doner
    Write sum as $\lim_{N \rightarrow \infty} \sum_{k=1}^N\dots$. Then expectation commutes with limit, then expectation is linear, so expectation commutes with addition and with scalar multiplication. – Eric Towers Apr 03 '24 at 22:40
  • Expectation doesn’t commute with limits unconditionally – Ben Doner Apr 03 '24 at 23:15
  • If not all moments of $X$ are finite, how is $\sum_{k=0}^{\infty}\frac{\lambda^k}{k!}\mathbb{E}[X^k]$ defined in the first place? Anyway, this is possible essentially when $\mathbb{E}[e^{\varepsilon|X|}]<\infty$ for some $\varepsilon>0$. – Sangchul Lee Apr 04 '24 at 02:56
  • @SangchulLee Hmm, $E[e^{\varepsilon|X|}] < \infty$ seems like a pretty strong assumption? It seems roughly like asking for $X$ to be sub-exponential – Ben Doner Apr 04 '24 at 03:26
  • Surprisingly, the convergence of the series $\sum_{k=0}^{\infty}\frac{\lambda^k}{k!}\mathbb{E}[X^k]$ for some $\lambda\neq 0$ implies that $\mathbb{E}[e^{\varepsilon|X|}]<\infty$ for some $\varepsilon>0$. So, this seemingly strong condition, $\mathbb{E}[e^{\varepsilon|X|}]<\infty$, is essentially equivalent to the validity of the series expansion in your question, see my answer below. – Sangchul Lee Apr 04 '24 at 04:00

2 Answers

3

I think your argument (which I don't think is overcomplicated) is correct, see a related discussion here.

The statement in your question (without any additional assumptions on $X$) seems only to be valid if the moment generating function satisfies $E[e^{\lambda X}] < \infty$ for all $\lambda$ in an open interval $I$ around $0$. As stated in this excellent answer (highly recommended), it then follows automatically that all moments exist and are finite. Then for all $\lambda \in I$ you have $$ E[e^{\lambda X}] = \sum_{k=0}^{\infty}\frac{\lambda^{k}}{k!}E[X^{k}], $$ where the sum starts at $0$, not $1$ as in your question. Without this condition you can get counterexamples, e.g. all moments exist and are finite, but the MGF is infinite for every $\lambda \neq 0$.
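A standard counterexample of this kind is the lognormal distribution: $X = e^Z$ with $Z \sim N(0,1)$ has $E[X^k] = e^{k^2/2}$, finite for every $k$, yet the moment series diverges for every $\lambda \neq 0$, so the MGF is infinite on every open interval around $0$. A quick sketch (the moment formula is a standard fact, not taken from this thread):

```python
import math

# Lognormal X = exp(Z), Z ~ N(0,1): E[X^k] = exp(k^2 / 2) is finite for all k,
# but the series terms lambda^k * E[X^k] / k! blow up for any lambda > 0:
# the ratio term(k+1)/term(k) = lambda * exp(k + 1/2) / (k + 1) -> infinity.
lam = 0.01
def term(k):
    return lam**k * math.exp(k * k / 2) / math.factorial(k)

terms = [term(k) for k in range(30)]
print(terms[10], terms[29])  # later terms dwarf earlier ones
```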

3

It is not always possible to expand $\mathbb{E}[e^{\lambda X}]$ into a power series in $\lambda$, implying that we need some condition for the proposed equality to hold.

In this regard, we have:

Theorem 1. Let $X$ be a real-valued random variable. Then the following are equivalent:

  1. There exists an open interval $I$ containing $0$ such that $\mathbb{E}[e^{\lambda X}] < \infty$ for all $\lambda \in I$.
  2. There exists $\varepsilon > 0$ such that $\mathbb{E}[e^{\varepsilon|X|}] < \infty$.
  3. $\mathbb{E}[X^k]$ is finite for all $k \geq 0$ and $\sum_{k=0}^{\infty} \frac{\lambda^k}{k!}\mathbb{E}[X^k]$ converges for some $\lambda \neq 0$.

We also have:

Theorem 2. Let $X$ be a real-valued random variable. Then

$$ \mathbb{E}[e^{\lambda X}] = \sum_{k=0}^{\infty} \frac{\lambda^k}{k!}\mathbb{E}[X^k] $$

whenever the right-hand side converges.
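As a quick finite check of Theorem 2 (the two-point distribution below is an arbitrary choice for illustration), take $P(X = -1) = P(X = 2) = 1/2$, so $\mathbb{E}[X^k] = \frac{1}{2}\left((-1)^k + 2^k\right)$ and both sides can be computed exactly:

```python
import math

# X takes values -1 and 2 with probability 1/2 each.
values, probs = [-1.0, 2.0], [0.5, 0.5]
lam = 0.7

# Left side: E[e^{lambda X}] computed directly.
lhs = sum(p * math.exp(lam * x) for x, p in zip(values, probs))

# Right side: sum_k lambda^k / k! * E[X^k], truncated far past convergence.
def moment(k):
    return sum(p * x**k for x, p in zip(values, probs))

rhs = sum(lam**k / math.factorial(k) * moment(k) for k in range(60))
print(lhs, rhs)  # agree to machine precision
```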


Proof of Theorem 1. We show that $(1) \implies (2) \implies (3) \implies (1)$.

$(1) \implies (2)$ : Choose $\varepsilon > 0$ so that $\pm \varepsilon \in I$. Then

$$ \mathbb{E}[e^{\varepsilon|X|}] \leq \mathbb{E}[e^{\varepsilon X} + e^{-\varepsilon X}] < \infty. $$

$(2) \implies (3)$ : Tonelli's Theorem allows us to interchange the order of expectation and infinite summation when all the summands are non-negative, regardless of the convergence. Hence,

$$ \mathbb{E}[e^{\varepsilon|X|}] = \mathbb{E}\left[ \sum_{k=0}^{\infty} \frac{\varepsilon^k}{k!}|X|^k \right] = \sum_{k=0}^{\infty} \frac{\varepsilon^k}{k!}\mathbb{E}[|X|^k]. $$

Now the desired claim follows by noting that $\mathbb{E}[e^{\varepsilon|X|}] < \infty$ and that $|\mathbb{E}[X^k]| \leq \mathbb{E}[|X|^k]$ for any $k \geq 0$.

$(3) \implies (1)$ : Write $R = |\lambda|$. Then by the standard theory of power series, we know that

$$ f(z) = \sum_{k=0}^{\infty} \frac{z^k}{k!}\mathbb{E}[X^k] $$

converges for any $z \in (-R, R)$. Then for any such $z$, invoking Tonelli's Theorem gives

$$ \mathbb{E}[e^{zX}] \leq 2 \mathbb{E}[\cosh(zX)] = 2 \sum_{k=0}^{\infty} \frac{z^{2k}}{(2k)!}\mathbb{E}[X^{2k}] = f(z) + f(-z), $$

which is finite. So, we can set $I = (-R, R)$.


Proof of Theorem 2. Suppose $\lambda$ is such that $\sum_{k=0}^{\infty} \frac{\lambda^k}{k!}\mathbb{E}[X^k]$ converges. Since the statement is trivial when $\lambda = 0$, we can assume $\lambda \neq 0$. Moreover, replacing $X$ and $\lambda$ by $-X$ and $-\lambda$, if necessary, we can assume $\lambda > 0$ and we do so.

Now by mimicking the previous proof, we know that $\mathbb{E}[e^{|zX|}] < \infty$ for any $z$ satisfying $|z| < \lambda$, which in turn implies that we can invoke Fubini's Theorem to have

$$ \mathbb{E}[e^{zX}] = \sum_{k=0}^{\infty} \frac{z^k}{k!}\mathbb{E}[X^k] $$

whenever $|z| < \lambda$. Now, let $X_+ = \max\{0, X\}$ and $X_- = \max\{0, -X\}$ denote the positive and negative parts of $X$, respectively. Then by the Monotone and Dominated Convergence Theorems together with Abel's Theorem,

\begin{align*} \mathbb{E}[e^{\lambda X}] &= \mathbb{E}[e^{\lambda X_+}\mathbf{1}_{\{X \geq 0\}}] + \mathbb{E}[e^{-\lambda X_-}\mathbf{1}_{\{X < 0\}}] \\ &= \underbrace{\lim_{z \to \lambda^-} \mathbb{E}[e^{z X_+}\mathbf{1}_{\{X \geq 0\}}]}_{\text{by MCT}} + \underbrace{\lim_{z \to \lambda^-} \mathbb{E}[e^{-z X_-}\mathbf{1}_{\{X < 0\}}]}_{\text{by DCT}} \\ &= \lim_{z \to \lambda^-} \mathbb{E}[e^{z X}] \\ &= \lim_{z \to \lambda^-} \sum_{k=0}^{\infty} \frac{z^k}{k!}\mathbb{E}[X^k] \\ &= \sum_{k=0}^{\infty} \frac{\lambda^k}{k!}\mathbb{E}[X^k] \tag{by Abel's Theorem} \end{align*}
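The first equality in the display above, splitting $e^{\lambda X}$ along the sign of $X$, can be sanity-checked on any finite distribution; the four-point distribution below is an arbitrary choice for illustration:

```python
import math

# Arbitrary finite distribution for checking the positive/negative split.
values = [-2.0, -1.0, 1.0, 3.0]
probs = [0.1, 0.4, 0.3, 0.2]
lam = 0.5

# E[e^{lambda X}] directly.
direct = sum(p * math.exp(lam * x) for x, p in zip(values, probs))

# E[e^{lambda X_+} 1{X >= 0}] + E[e^{-lambda X_-} 1{X < 0}], where
# X_+ = max(X, 0) and X_- = max(-X, 0).
split = (sum(p * math.exp(lam * max(x, 0.0))
             for x, p in zip(values, probs) if x >= 0)
         + sum(p * math.exp(-lam * max(-x, 0.0))
               for x, p in zip(values, probs) if x < 0))
print(direct, split)  # identical
```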

Sangchul Lee