Is the moment sequence of a random variable defined on $[0,1]$ square-summable when there exists a probability density function?
-
Not necessarily: take the r.v. that is always $1$ as a simple counterexample; every moment equals $1$, so the moment sequence is not square-summable. But it is true in certain situations, including the uniform variable on $[0,1]$, where the $n$th moment is $1/(n+1)$. – Ian Oct 08 '15 at 14:11
-
Thanks for your answer. Does imposing the constraint that there has to exist a PDF change the situation? – Fiego24 Oct 08 '15 at 14:22
2 Answers
I can give a proof with one additional assumption on $f$. First consider Hölder's inequality for an arbitrary $p \in [1,\infty]$, with Hölder conjugate $q$:
$$\left ( \int_0^1 x^n f(x) dx \right )^2 \leq \| x^n \|^2_p \| f \|^2_q \leq C \left ( \int_0^1 x^{np} dx \right )^{2/p} = C (np+1)^{-2/p}.$$
For the squared moments to be summable (by comparison with the $p$-series $\sum_n n^{-2/p}$) we want this exponent to be strictly less than $-1$, so we need to put a power $p<2$ on the $x^n$ factor. To do this we need to put the conjugate power $q>2$ on the $f$ factor; thus the result holds provided $f \in L^q$ for some $q>2$.
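As a quick numerical sanity check (my own sketch, not part of the argument above): the uniform density $f \equiv 1$ lies in $L^q$ for every $q$, so take $q = 3$, i.e. $p = 3/2$. Its squared moments $1/(n+1)^2$ stay below the Hölder bound $(np+1)^{-2/p}$ (here $C = \|f\|_3^2 = 1$), and their sum converges to $\pi^2/6$:

```python
import math

# Uniform density f(x) = 1 on [0, 1]: the n-th moment is 1/(n+1).
# Hölder bound from above with p = 3/2, q = 3, and C = ||f||_3^2 = 1:
#   (E X^n)^2 <= (n*p + 1)^(-2/p).
p = 1.5
for n in range(200):
    moment_sq = (1.0 / (n + 1)) ** 2
    bound = (n * p + 1) ** (-2.0 / p)
    assert moment_sq <= bound

# The squared moments are summable; the partial sums approach pi^2/6.
partial = sum(1.0 / (n + 1) ** 2 for n in range(200000))
print(round(partial, 4))  # 1.6449
```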
If we relax this assumption too much, we can pack in enough mass near $1$ to make the result fail. For instance, I just checked (with computer assistance) that the moments of the PDF $\frac{(1-x)^{-1/2}}{2}$ on $(0,1)$ are not square summable: the resulting series behaves asymptotically like a harmonic series. This PDF is not even in $L^2$. I am not sure what happens if you use a PDF which is "critical", i.e. in $L^2$ but not in $L^p$ for any $p>2$. (Note that such densities do exist, just as there are $f \in L^1$ with $f \not\in L^p$ for any $p > 1$.)
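The harmonic behavior can be checked directly (my own computation, using the standard Beta-function identity $\int_0^1 x^n (1-x)^{-1/2}\,dx = B(n+1,\tfrac12)$): the $n$th moment of $\frac{(1-x)^{-1/2}}{2}$ is $\frac12 B(n+1,\tfrac12) \sim \frac{\sqrt{\pi}}{2}\, n^{-1/2}$, so the squared moments decay like $\frac{\pi}{4n}$:

```python
import math

def moment(n):
    # n-th moment of the PDF (1/2) * (1-x)^(-1/2) on (0, 1):
    #   (1/2) * B(n+1, 1/2) = (1/2) * Gamma(n+1) * Gamma(1/2) / Gamma(n+3/2),
    # evaluated via lgamma so that large n does not overflow.
    return 0.5 * math.exp(math.lgamma(n + 1) + math.lgamma(0.5) - math.lgamma(n + 1.5))

# n * moment(n)^2 tends to pi/4 ≈ 0.7854, i.e. a harmonic tail for the
# squared moments, so their sum diverges.
for n in (10, 100, 1000, 10000):
    print(n, n * moment(n) ** 2)
```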
The sum of the squared moments is \begin{eqnarray*} \sum_{k=0}^\infty \mathbb{E}(X^k)^2&=& \sum_{k=0}^\infty \left(\int_0^1 x^k\,\mu(dx)\right)^2\\[5pt] &=& \sum_{k=0}^\infty \int_0^1\int_0^1 x^k y^k\,\mu(dx)\,\mu(dy)\\[5pt] &=& \int_0^1\int_0^1 {1\over 1-xy}\,\mu(dx)\,\mu(dy), \end{eqnarray*} where the interchange of sum and integrals in the last step is justified by Tonelli's theorem (all terms are non-negative) together with the geometric series $\sum_{k=0}^\infty (xy)^k = \frac{1}{1-xy}$ for $0 \le xy < 1$. This will be finite if $\mu$ doesn't put too much mass at the upper end of the interval $[0,1]$.
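As a consistency check on this identity (my own sketch, taking $\mu$ to be the uniform distribution): the left side is $\sum_{k\ge 0} 1/(k+1)^2 = \pi^2/6$, and after integrating out $x$ via $\int_0^1 \frac{dx}{1-xy} = -\frac{\log(1-y)}{y}$ the right side becomes $\int_0^1 -\frac{\log(1-y)}{y}\,dy$, which a simple midpoint rule reproduces:

```python
import math

# Left side: for uniform mu, E(X^k) = 1/(k+1), so the sum of squared
# moments is sum 1/(k+1)^2 (a partial sum, close to pi^2/6).
lhs = sum(1.0 / (k + 1) ** 2 for k in range(200000))

# Right side: the double integral of 1/(1-xy) over [0,1]^2 reduces, after
# integrating out x, to the 1-d integral of -log(1-y)/y; approximate it
# with a composite midpoint rule (the log singularity at y = 1 is mild).
N = 200000
h = 1.0 / N
rhs = h * sum(-math.log(1.0 - y) / y for y in ((i + 0.5) * h for i in range(N)))

print(lhs, rhs)  # both close to pi^2/6 ≈ 1.6449
```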