Use this tag for questions about cumulants, quantities that provide an alternative to the moments of a probability distribution.
In probability theory and statistics, cumulants of a probability distribution are quantities that provide an alternative to moments of the distribution. Moments determine cumulants in the sense that any two probability distributions whose moments are identical have identical cumulants, and, similarly, cumulants determine moments.
The first cumulant is the mean, the second cumulant is the variance, and the third cumulant is the same as the third central moment, but fourth- and higher-order cumulants are not equal to central moments. In some cases, theoretical treatments of problems in terms of cumulants are simpler than treatments using moments. In particular, when two or more random variables are statistically independent, the $n$th-order cumulant of their sum is equal to the sum of their $n$th-order cumulants. Third- and higher-order cumulants of a normal distribution are zero; it is the only distribution with that property.
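The additivity property can be checked exactly for small discrete distributions. The sketch below (illustrative distributions chosen here, not from the source) uses the fact stated above that the third cumulant equals the third central moment, and verifies that $\kappa_3(X+Y) = \kappa_3(X) + \kappa_3(Y)$ for independent $X$ and $Y$:

```python
# Represent a discrete distribution as {value: probability}.
def raw_moment(dist, n):
    return sum(p * x**n for x, p in dist.items())

def central_moment(dist, n):
    mu = raw_moment(dist, 1)
    return sum(p * (x - mu)**n for x, p in dist.items())

def convolve(dx, dy):
    """Distribution of X + Y for independent X and Y."""
    out = {}
    for x, px in dx.items():
        for y, py in dy.items():
            out[x + y] = out.get(x + y, 0.0) + px * py
    return out

X = {0: 0.7, 1: 0.3}            # Bernoulli(0.3)
Y = {0: 0.25, 1: 0.5, 2: 0.25}  # sum of two fair coin flips
S = convolve(X, Y)

# The third cumulant is the third central moment; cumulants add
# across independent summands:
k3 = lambda d: central_moment(d, 3)
assert abs(k3(S) - (k3(X) + k3(Y))) < 1e-12
```

Running the same check with `central_moment(d, 4)` would fail: fourth central moments are not additive, which is exactly why the fourth cumulant differs from the fourth central moment.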
Cumulants of a random variable $X$ can be defined using the cumulant generating function $K(t),$ which is the natural logarithm of the moment generating function. With $M(t)$ as the moment generating function, $M(t) = \mathrm E \left[ e^{tX} \right].$ Taking the natural logarithm of both sides yields $K(t) = \log \mathrm E \left[ e^{tX} \right].$
Cumulants $\kappa_n$ are obtained from a power series expansion of the cumulant generating function: $$K(t) = \sum_{n=1}^{\infty} \kappa_n \frac{t^n}{n!} = \mu t + \sigma^2\frac {t^2}2 + \cdots.$$
That expansion is a Maclaurin series, so the $n$th cumulant can be obtained by differentiating the above expansion $n$ times and evaluating the result at zero: $\kappa_n = K^{(n)}(0).$
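As a concrete check of this definition, the sketch below takes the known closed-form CGF of a Bernoulli($p$) variable (an illustrative choice, not from the source) and approximates $K'(0)$ and $K''(0)$ by central finite differences, recovering the mean $p$ and variance $p(1-p)$:

```python
from math import log, exp

# CGF of a Bernoulli(p) variable: K(t) = log(1 - p + p*e^t).
p = 0.3
K = lambda t: log(1 - p + p * exp(t))

h = 1e-4
# Central finite differences approximate the derivatives at t = 0.
k1 = (K(h) - K(-h)) / (2 * h)          # ~ K'(0)  = mean = p
k2 = (K(h) - 2 * K(0) + K(-h)) / h**2  # ~ K''(0) = variance = p*(1-p)

assert abs(k1 - p) < 1e-6
assert abs(k2 - p * (1 - p)) < 1e-6
```

The step size `h` is a compromise: too large and the truncation error of the difference formulas dominates, too small and floating-point round-off in `K` is amplified by the division by `h**2`.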
If the moment generating function does not exist, cumulants can be defined in terms of the relationship between cumulants and moments.
Just as for moments, where joint moments are used for collections of random variables, it is possible to define joint cumulants.
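The simplest joint cumulant is the second-order one, $\kappa(X, Y) = \mathrm E[XY] - \mathrm E[X]\,\mathrm E[Y]$, which is just the covariance. A minimal sketch over a toy joint distribution (the probabilities below are hypothetical):

```python
# Joint distribution of (X, Y) as {(x, y): probability}.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

Ex  = sum(p * x for (x, y), p in joint.items())
Ey  = sum(p * y for (x, y), p in joint.items())
Exy = sum(p * x * y for (x, y), p in joint.items())

# Second-order joint cumulant = covariance of X and Y.
cov = Exy - Ex * Ey
```

If $X$ and $Y$ were independent, this joint cumulant would be zero; here the mass concentrated on $(0,0)$ and $(1,1)$ makes it positive.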