3

Given the moment generating function of a random variable $X$, $$M_X (t) = \mathbb{E}[e^{tX}],$$ one defines the cumulant generating function, $$K(t) = \log \mathbb{E}[e^{tX}].$$ One then states that the right-hand side (RHS) can be written as a series, $$K(t) = k_1 \frac{t}{1!} + k_2\frac{t^2}{2!} + \cdots + k_n \frac{t^n}{n!} + \cdots,$$ where the quantities $k_i$ are the cumulants of $X$.

I do not see how one gets the RHS. First, I tried to expand $e^{tX}$ inside the expectation and use the linearity of the expectation, $$K(t) = \log(1+t\mathbb{E}[X] + \frac{t^2}{2!}\mathbb{E}[X^2] + \cdots).$$ I also considered the Taylor expansion of $\log(1+t\mathbb{E}[X])$ at zero, $$\log(1+t\mathbb{E}[X]) = \frac{t}{1!} ((1-1)!\mathbb{E}[X]) + \frac{t^2}{2!} (-(2-1)!\mathbb{E}[X]^2) + \frac{t^3}{3!}((3-1)!\mathbb{E}[X]^3) + \cdots.$$ The expressions in parentheses do not all seem to be cumulants: the first one is correct, but the second one is not. Can someone provide a hint or explain how one obtains the expression for the cumulant generating function? Many thanks.

RobPratt
user996159

2 Answers

4

Isn't it the case that the cumulants are defined to be the coefficients $k_1,k_2,\cdots$?

But you have: $$\begin{align}K(t)&=\log(1+\underset{\psi}{\underbrace{t\Bbb E[X]+\frac{t^2}{2!}\Bbb E[X^2]+\cdots}})\\&=\psi-\frac{1}{2}\psi^2+\frac{1}{3}\psi^3-\cdots\\&=t\Bbb E[X]+\frac{1}{2}t^2\Bbb E[X^2]+\frac{1}{6}t^3\Bbb E[X^3]+\cdots-\frac{1}{2}\left(t^2\Bbb E[X]^2+t^3\Bbb E[X]\Bbb E[X^2]+\cdots\right)\\&\quad+\frac{1}{3}\left(t^3\Bbb E[X]^3+\frac{3}{2}t^4\Bbb E[X^2]\Bbb E[X]^2+\cdots\right)+\cdots\\&=t\Bbb E[X]+\frac{1}{2}t^2\left(\Bbb E[X^2]-\Bbb E[X]^2\right)+\frac{1}{6}t^3\left(\Bbb E[X^3]-3\Bbb E[X]\Bbb E[X^2]+2\Bbb E[X]^3\right)+O(t^4)\end{align}$$

We find $k_1=\Bbb E[X]$, $k_2=\Bbb E[X^2]-\Bbb E[X]^2$ and $k_3=\Bbb E[X^3]-3\Bbb E[X]\Bbb E[X^2]+2\Bbb E[X]^3$.

You just have to be a bit more careful :) Tediously expanding this further and further will find all coefficients. There's probably even a closed form. Your mistake was in omitting the $+\frac{t^2}{2!}\Bbb E[X^2]+\cdots$ portion; dropping it does not give a valid Taylor expansion of $K(t)$.
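One can check this expansion symbolically, e.g. with sympy (a rough sketch; the symbols `m1`, `m2`, `m3` are my own placeholders for the raw moments $\Bbb E[X]$, $\Bbb E[X^2]$, $\Bbb E[X^3]$):

```python
import sympy as sp

t, m1, m2, m3 = sp.symbols("t m1 m2 m3")

# MGF truncated after the t^3 term (higher moments do not affect k_1..k_3)
M = 1 + m1*t + m2*t**2/2 + m3*t**3/6

# cumulant generating function, expanded to order t^3
K = sp.expand(sp.log(M).series(t, 0, 4).removeO())

# the cumulant k_n is n! times the coefficient of t^n in K(t)
for n in (1, 2, 3):
    print(n, sp.expand(sp.factorial(n) * K.coeff(t, n)))
# k1 = m1, k2 = m2 - m1**2, k3 = m3 - 3*m1*m2 + 2*m1**3
```

which reproduces the coefficients $k_1$, $k_2$, $k_3$ above.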

FShrike
  • Great answers, thanks. If I take the exponential distribution as an example, I encounter a problem though. In this case, $M_X(t) = \frac{\lambda}{\lambda-t} = \frac{1}{1-\frac{t}{\lambda}}.$ Then, given the Taylor expansion $\frac{1}{1-x} = 1+x+x^2+O(x^3),$ we get $K(t) = \log(1+\frac{t}{\lambda} + \frac{t^2}{\lambda^2} + O(t^3)).$ I am again tempted to think that using only the linear term should suffice, but then I get $K(t) = \frac{t}{\lambda} - \frac{t^2}{2}\frac{1}{\lambda^2} + \frac{t^3}{3}\frac{1}{\lambda^3}+\cdots.$ Everything is OK except for a minus sign in the cumulants. Why? – user996159 May 20 '23 at 13:36
  • Accounting for all the nonlinear terms leads to a complicated expression which I do not see how to handle. – user996159 May 20 '23 at 13:40
  • There is no reason to say that discounting the nonlinear terms will give you the correct answer. If you want the squared coefficient, that’s going to (possibly) involve the squared term of the original function… but notice $\log(1/(1-t/\lambda))=-\log(1-t/\lambda)=t/\lambda+\frac{1}{2}(t/\lambda)^2+\cdots$ is an easier way to go – FShrike May 20 '23 at 14:33
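For the exponential example discussed in the comments above, a quick symbolic check (a minimal sympy sketch; the symbol `lam` for the rate $\lambda$ is my own choice) confirms that all the cumulants come out with a plus sign, $k_n=(n-1)!/\lambda^n$:

```python
import sympy as sp

t = sp.symbols("t")
lam = sp.symbols("lambda", positive=True)

# cumulant generating function of the exponential distribution
K = sp.log(lam / (lam - t))

# t/lam + t**2/(2*lam**2) + t**3/(3*lam**3) + t**4/(4*lam**4)
K_series = K.series(t, 0, 5).removeO()

# k_n = n! * [t^n] K(t) = (n-1)!/lam**n
for n in range(1, 5):
    print(n, sp.factorial(n) * K_series.coeff(t, n))
```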
3

A general formula (once the definition is accepted)

As suggested by FShrike, the first thing is to see that the cumulants are originally defined as the coefficients of the $\log$ expansion series above. Secondly, if you want to calculate them directly, you can use the following method and work it out by hand.

You can find a generic formula below.

The central moment of order $n$ is:

$$ \mu_n \triangleq \mathbb{E}((X-\mathbb{E}[X])^n) $$

Expanding, you get: $$\mu_n =\mathbb{E}\left(\sum_{i=0}^n\binom{n}{i}X^i\left(-\mathbb{E}(X)\right)^{n-i}\right)=\sum_{i=0}^n\binom{n}{i}(-1)^{n-i}\,\mathbb{E}(X^i)\,\mathbb{E}(X)^{n-i} $$

First, you have:

$$ \mathbb{E}(e^{tX})=1+\sum_{n=1}^\infty\dfrac{t^n\,\mathbb{E}(X^n)}{n!}$$

Then take the $\log$ of it:

$$ K(t)=\log(\mathbb{E}(e^{tX}))$$

The first term of the expansion is the one you mention.

It is formally

$$ K(t)= \log\left(1+\sum_{n=1}^\infty\dfrac{t^n\,\mathbb{E}(X^n)}{n!}\right)$$

$$ K(t) = \sum_{k=1}^\infty \dfrac{(-1)^{k+1}}{k} \left(\sum_{n=1}^\infty\dfrac{t^n\,\mathbb{E}(X^n)}{n!}\right)^k$$

We expect the low-order coefficients to reproduce the mean and the central moments $\mu_2,\mu_3$:

$$K(t)= t\,\mathbb{E}(X)+\dfrac{t^2}{2!}\,\mu_2+\dfrac{t^3}{3!}\,\mu_3+O(t^4) $$

To show it using Taylor series, differentiate under the expectation (satisfying the conditions of "When can we interchange the derivative with an expectation?"):

$$K(0)=0$$ $$K'(0)=\dfrac{\mathbb{E}(X)}{\mathbb{E}(1)}=\mathbb{E}(X)=\text{First Cumulant}$$ $$K''(0)= \dfrac{\mathbb{E}(X^2)\mathbb{E}(1)-\mathbb{E}(X)^2}{\mathbb{E}(1)^2}=\mathbb{E}(X^2)-\mathbb{E}(X)^2= \text{Second Cumulant}$$
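This differentiation approach can be illustrated on a concrete MGF; the normal MGF $M(t)=e^{\mu t+\sigma^2t^2/2}$ below is just my choice of test case (a minimal sympy sketch):

```python
import sympy as sp

t, mu = sp.symbols("t mu", real=True)
sigma = sp.symbols("sigma", positive=True)

# MGF of N(mu, sigma^2), used here only as a concrete test case
M = sp.exp(mu*t + sigma**2 * t**2 / 2)
K = sp.log(M)

# k_n = K^(n)(0); for the normal only the first two cumulants are nonzero
for n in (1, 2, 3, 4):
    print(n, sp.simplify(sp.diff(K, t, n).subs(t, 0)))
# expected: mu, sigma**2, 0, 0
```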

Generally

Using the Faà di Bruno formula:

$$ \left.\dfrac{d^n}{dt^n}\log\left(\mathbb{E}(e^{tX})\right)\right|_{t=0}=\sum \dfrac{n!}{m_1!\cdots m_n!}\cdot \log^{(k)}\left(\mathbb{E}(1)\right)\prod_{j=1}^n \left( \dfrac{1}{j!}\left.\dfrac{d^j}{dt^j}\mathbb{E}(e^{tX})\right|_{t=0}\right)^{m_j}, \qquad k=m_1+\cdots+m_n $$

And

$$\dfrac{d^n}{dx^n}\log(x)=\dfrac{(-1)^{n-1}(n-1)!}{x^n},$$ so that $$\left.\dfrac{d^n}{dx^n}\log(x)\right|_{x=1}=(-1)^{n-1}(n-1)! $$

$$\dfrac{d^n}{dt^n}e^{tX}=X^ne^{tX}$$

$$\left.\dfrac{d^n}{dt^n}e^{tX}\right|_{t=0}=X^n$$

So

$$ \left.\dfrac{d^n}{dt^n}\log\left(\mathbb{E}(e^{tX})\right)\right|_{t=0}=\sum \dfrac{n!}{m_1!\cdots m_n!}\cdot (-1)^{k-1}(k-1)!\prod_{j=1}^n \left(\dfrac{ \mathbb{E}(X^j)}{j!}\right)^{m_j} $$

where the sum runs over all nonnegative integers $m_1,\dots,m_n$ with $$ 1\cdot m_1 + \cdots + n \cdot m_n =n $$ and $$ m_1+\cdots+m_n=k.$$

You may end up with some calculus, but from there it is just bookkeeping; a sketch of that computation in code follows.
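Here is a minimal Python sketch of that bookkeeping (the function name `cumulant_from_moments` and the test moments are illustrative choices, not from the answer): it enumerates the multiplicities $m_1,\dots,m_n$ satisfying the constraints above and sums the last formula.

```python
from fractions import Fraction
from itertools import product
from math import factorial

def cumulant_from_moments(raw_moments):
    """k_n from the raw moments [E[X], E[X^2], ..., E[X^n]], by summing the
    formula above over all multiplicities m_1..m_n with 1*m_1 + ... + n*m_n = n."""
    n = len(raw_moments)
    total = Fraction(0)
    for m in product(*(range(n // j + 1) for j in range(1, n + 1))):
        if sum(j * mj for j, mj in enumerate(m, start=1)) != n:
            continue
        k = sum(m)  # k = m_1 + ... + m_n
        term = Fraction(factorial(n) * (-1) ** (k - 1) * factorial(k - 1))
        for j, mj in enumerate(m, start=1):
            term *= (Fraction(raw_moments[j - 1]) / factorial(j)) ** mj
            term /= factorial(mj)
        total += term
    return total

# sanity check with E[X] = 1, E[X^2] = 3, E[X^3] = 10:
print(cumulant_from_moments([1]))         # k_1 = 1
print(cumulant_from_moments([1, 3]))      # k_2 = 3 - 1^2 = 2
print(cumulant_from_moments([1, 3, 10]))  # k_3 = 10 - 3*1*3 + 2*1^3 = 3
```

The sanity check reproduces the expressions for $k_2$ and $k_3$ found in the other answer.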

EDX