
$\newcommand{\E}{\mathbb{E}}$

Near the end of "Finite Operator Calculus" (1976), G.C. Rota writes:

> Note that one can define cumulants relative to any sequence of binomial type, e.g. the factorial cumulants (Kendall and Stuart).

The book he references is listed in the bibliography as "M. G. Kendall and A. Stuart, *The Advanced Theory of Statistics*, Vol. I, Griffin, London, 1963". I was only able to find the 1945 edition, in abysmal quality, and I could not find any mention of "factorial cumulants" there (for instance, the index has no entry for "factorial cumulants" or "cumulants, factorial"). Perhaps they were added in the 1963 edition.

I am in great need of understanding this phrase by Rota, since it may bear directly on my current research topic, so any references or explanations would be greatly appreciated, especially if there is an established direct connection with free probability.

Edit:

My own guess is the following (based on the definition of factorial cumulants found in this article: https://arxiv.org/pdf/1012.0750.pdf). Let $b_{k}(z)$ be a polynomial sequence of binomial type (for instance, the lower (falling) factorial sequence, as in the article). Define the moment generating function of $X$ as $$M_{b}(z)[X]:=\sum_{k=0}^{\infty}\frac{\mathbb{E}[b_{k}(X)]}{k!}z^k.$$ Then define $$S_b(z)[X]=\ln M_{b}(z)[X]$$ as the new cumulant generating function, so that $$\frac{d^{k}}{dz^{k}}\Bigg|_{z=0}S_{b}(z)[X]=\kappa_{b,k}(X),$$ and call these new objects the cumulants associated to a given binomial type sequence. But why does the sequence have to be of binomial type at all?
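To sanity-check this definition, here is a minimal SymPy sketch (my own, not from the article) computing the factorial cumulants of a Poisson variable with the falling factorial sequence $b_k(x)=(x)_k$, using the standard fact that $\E[(X)_k]=\lambda^k$ for $X\sim\mathrm{Poisson}(\lambda)$. One then expects $S_b(z)[X]=\lambda z$, i.e. the first factorial cumulant is $\lambda$ and the rest vanish.

```python
import sympy as sp

z, lam = sp.symbols('z lam')
N = 6  # truncation order

# Factorial moments of Poisson(lam): E[(X)_k] = lam**k (a standard fact).
fact_moments = [lam**k for k in range(N)]

# Truncated factorial-moment generating function M_b(z)[X]
M = sum(m * z**k / sp.factorial(k) for k, m in enumerate(fact_moments))

# Factorial cumulant generating function S_b(z)[X] = log M_b(z)[X]
S = sp.log(M)
series = sp.series(S, z, 0, N).removeO()

# k-th factorial cumulant = k! * (coefficient of z^k in the series)
kappas = [sp.simplify(series.coeff(z, k) * sp.factorial(k)) for k in range(1, N)]
print(kappas)  # [lam, 0, 0, 0, 0]
```

The output confirms that only the first factorial cumulant of a Poisson variable is nonzero, mirroring how ordinary cumulants single out the Gaussian.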

Edit 2:

I suppose the reasoning behind this is the following: let $M_{b}(z)[X], M_{b}(z)[Y]$ be generating functions for the same binomial type sequence. Then their product is $$M_{b}(z)[X]M_{b}(z)[Y]=\sum_{n=0}^{\infty}z^{n}\sum_{k+l=n}\frac{\E[b_k(X)]\E[b_{l}(Y)]}{k! l!}=$$ $$=\sum_{n=0}^{\infty}\frac{z^n}{n!}\sum_{k=0}^{n}{n \choose k}\E[b_k(X)]\E[b_{n-k}(Y)].$$ If we denote $m_{j}(X)=\E[X^j]$, then $\E[b_k(X)]=\sum_{l}{b_{kl}m_{l}(X)}$ where $b_k(x)=\sum_{l}b_{kl}x^l$, and $m_{k}(X)$ is itself a binomial type sequence of sorts, in the sense that for independent $X,Y$ $$m_{n}(X+Y)=\sum_{k=0}^{n}{n \choose k}m_{k}(X)m_{n-k}(Y),$$ so $\E[b_{k}(X)]$ is an "umbral composition" $b_{k}(\vec{m}(X))$. Then $$M_{b}(z)[X]M_{b}(z)[Y]=\sum_{n=0}^{\infty}\frac{z^n}{n!}\sum_{k=0}^{n}{n \choose k}b_{k}(\vec{m}(X))b_{n-k}(\vec{m}(Y))\cong $$ $$\cong \sum_{n=0}^{\infty}\frac{z^n}{n!}b_{n}(\vec{m}(X+Y))=M_{b}(z)[X+Y]$$
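The binomial-type property being used in the last step, $b_n(x+y)=\sum_{k=0}^{n}\binom{n}{k}b_k(x)\,b_{n-k}(y)$, can at least be verified symbolically for the falling factorials, where it is Vandermonde's identity. A small SymPy check (my own sketch):

```python
import sympy as sp

x, y = sp.symbols('x y')

def falling(v, n):
    """Falling factorial (v)_n = v (v-1) ... (v-n+1)."""
    return sp.prod([v - i for i in range(n)]) if n else sp.Integer(1)

# Vandermonde's identity: (x+y)_n = sum_k C(n,k) (x)_k (y)_{n-k},
# i.e. the falling factorials form a sequence of binomial type.
for n in range(6):
    lhs = sp.expand(falling(x + y, n))
    rhs = sp.expand(sum(sp.binomial(n, k) * falling(x, k) * falling(y, n - k)
                        for k in range(n + 1)))
    assert lhs == rhs
print("binomial-type identity verified up to n = 5")
```

This is exactly the polynomial identity that lets the binomial convolution of the $\E[b_k(\cdot)]$ collapse into a single $b_n$ evaluated at the umbral sum.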

(I write $\cong$ in the last transition because I am not entirely sure about the whole calculation and reasoning, and am not yet comfortable with the language of umbral calculus.) If this reasoning is correct, then the virtue of generalized binomial-type moments is the following: just as for regular moments, the sum of independent random variables corresponds to the multiplication of their moment generating functions. Based on this, I guess the next step (besides checking the rigor of the computation above) is to prove that generalized binomial-type cumulants are additive with respect to sums of independent random variables, and that a corresponding central limit theorem exists for every binomial type sequence. But I have also read that a separate notion of independence of random variables can be devised from alternative cumulants, so I am a bit confused by now.
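The claimed additivity can at least be tested numerically in the falling factorial case, using the standard identity $\sum_{k}(x)_k z^k/k!=(1+z)^x$, which gives $M_b(z)[X]=\E[(1+z)^X]$ (the probability generating function evaluated at $1+z$). A Monte Carlo sketch with distributions of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# For the falling factorials, sum_k (x)_k z^k / k! = (1+z)^x, so
# M_b(z)[X] = E[(1+z)^X]; the factorial cumulant generating function
# should then be additive over independent summands.
X = rng.poisson(2.0, n)          # Poisson(2)
Y = rng.binomial(10, 0.3, n)     # Binomial(10, 0.3), independent of X

z = 0.1
lhs = np.log(np.mean((1 + z) ** (X + Y)))                          # S_b(z)[X+Y]
rhs = np.log(np.mean((1 + z) ** X)) + np.log(np.mean((1 + z) ** Y))  # S_b(z)[X] + S_b(z)[Y]
print(lhs, rhs)  # should agree up to Monte Carlo error
```

The two values agree to a few decimal places, which is consistent with (though of course no substitute for) a proof of additivity.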

Edit 3: I cross-posted this to MathOverflow.
