
Suppose we are given the moments of a random variable $X$. Can we determine, based on these, whether the random variable is continuous or not?

We also assume that the moments of $X$ completely determine the distribution of $X$.

In other words, do the moments of a continuous random variable behave fundamentally differently from the moments of, say, a discrete random variable?

Thanks, looking forward to your ideas.

Edit: It seems there was some confusion about the question. Let me demonstrate with an example what I have in mind.

Suppose we are given the moments of some random variable $X$: \begin{align} E[X^n]=\frac{1}{1+n}, \end{align} for $n \ge 0$.

Can we determine if the distribution of $X$ is continuous or not?

In this example, I took $X$ to be continuous uniform on $(0,1)$.
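
As a quick sanity check (a minimal sympy sketch, purely illustrative), the uniform density on $(0,1)$ does indeed have these moments:

```python
import sympy as sp

x = sp.symbols('x')

# E[X^n] = \int_0^1 x^n dx for X uniform on (0,1); should equal 1/(n+1)
for n in range(6):
    moment = sp.integrate(x**n, (x, 0, 1))
    print(n, moment, sp.Rational(1, n + 1))
```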

Some Thoughts: Since we know the moments, we can reconstruct the characteristic function of $X$ (I think this can be done, right? If not, let us assume it) \begin{align} \phi_X(t) =\sum_{n=0}^\infty \frac{i^n E[X^n]}{n!} t^n \end{align}

We also know that $X$ has a pdf if $\phi_X(t) \in L_1$ (integrability of the characteristic function is sufficient, though not necessary, for the existence of a density).

So it seems it would be enough to show that \begin{align} \int_{-\infty}^\infty \left| \sum_{n=0}^\infty \frac{i^n E[X^n]}{n!} t^n \right| dt \end{align} is finite. However, I don't think this approach would work, as we cannot switch the integration and the summation.
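
For what it's worth, here is a rough numerical sketch (in Python/numpy, not a proof) of the reconstruction idea for this particular moment sequence: the partial sums of the series above converge to $\phi_X(t) = (e^{it}-1)/(it)$, the characteristic function of the uniform distribution on $(0,1)$. Note that $|\phi_X(t)|$ decays only like $1/|t|$, so it is not integrable even though $X$ has a density, which is why the criterion above only works in the "sufficient" direction.

```python
import numpy as np

# Given moments: E[X^n] = 1/(n+1) for n >= 0.
def moment(n):
    return 1.0 / (n + 1)

# Truncated series for the characteristic function:
#   phi(t) ~ sum_{n=0}^{N} i^n E[X^n] t^n / n!
def phi_series(t, N=60):
    total = np.zeros_like(t, dtype=complex)
    coeff = 1.0 + 0.0j          # holds i^n / n!
    for n in range(N + 1):
        total += coeff * moment(n) * t**n
        coeff *= 1j / (n + 1)   # update to i^(n+1) / (n+1)!
    return total

# Known characteristic function of Uniform(0,1): (e^{it} - 1) / (it).
def phi_uniform(t):
    return (np.exp(1j * t) - 1.0) / (1j * t)

t = np.linspace(0.1, 20.0, 200)
print("max |series - exact|:", np.max(np.abs(phi_series(t) - phi_uniform(t))))
# |phi(t)| = 2|sin(t/2)|/|t| decays only like 1/|t|, hence phi is not in L^1,
# even though X (uniform on (0,1)) does have a density.
```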

Boby

2 Answers


I doubt that there are any feasible universal conditions, for two reasons:

  1. If the moment problem is indeterminate, then there can be both discrete and continuous random variables with the same moments. For example, it is known that there is an infinite family of discrete random variables having the same moments as the log-normal distribution (see e.g. Stoyanov, Counterexamples in Probability); a numerical sketch of such a matching discrete distribution is given after this list.

  2. One can approximate a continuous distribution by discrete ones and vice versa, so the moments of a discrete distribution can be arbitrarily close to those of a continuous distribution (for instance, a distribution supported on $m$ Gauss-quadrature nodes of the continuous distribution matches its moments up to order $2m-1$ exactly).
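
To make point 1 concrete (a small numerical sketch, using the standard fact that the standard log-normal has moments $E[X^n]=e^{n^2/2}$): the purely atomic distribution supported on $\{e^k : k \in \mathbb{Z}\}$ with weights proportional to $e^{-k^2/2}$ has exactly the same moment sequence, because shifting the summation index by the integer $n$ leaves $\sum_k e^{-(k-n)^2/2}$ unchanged.

```python
import numpy as np

# Discrete distribution on {e^k : k integer} with weights ~ exp(-k^2/2).
k = np.arange(-50, 51)
log_w = -k**2 / 2.0          # unnormalized log-weights
Z = np.exp(log_w).sum()      # normalizing constant

for n in range(6):
    # E[X^n] = (1/Z) * sum_k e^{n k} e^{-k^2/2}
    discrete_moment = np.exp(n * k + log_w).sum() / Z
    lognormal_moment = np.exp(n**2 / 2.0)   # moments of the standard log-normal
    print(n, discrete_moment, lognormal_moment)
```

So here a discrete and a continuous distribution share the entire moment sequence, and no function of the moments alone can tell them apart.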

Of course, it is possible to formulate infinitely many sufficient conditions for a distribution to be discrete. Example: Let $\mu_n = \mathsf{E}[X^n]$. If $\mu_8 - 10\mu_6 + 33\mu_4 - 40\mu_2 + 16=0$, then $X$ is discrete (moreover, $X\in\{\pm1, \pm2\}$ a.s.).
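
The reason this works (a short symbolic check with sympy; the polynomial is the one implicit in the condition above): the left-hand side equals $E[(X^2-1)^2(X^2-4)^2]$, the expectation of a nonnegative polynomial in $X$, so it can vanish only if $(X^2-1)(X^2-4)=0$ almost surely.

```python
import sympy as sp

x = sp.symbols('x')

# The nonnegative polynomial whose expectation is the quoted combination:
p = sp.expand(((x**2 - 1) * (x**2 - 4))**2)
print(p)   # x**8 - 10*x**6 + 33*x**4 - 40*x**2 + 16

# Check the condition for X uniform on {-2, -1, 1, 2}:
support = [-2, -1, 1, 2]
def mu(n):
    return sp.Rational(1, 4) * sum(s**n for s in support)

print(mu(8) - 10*mu(6) + 33*mu(4) - 40*mu(2) + 16)   # 0
```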

zhoraster

It's not clear what this is asking, for if the moments determine the distribution, then you simply look at the resulting distribution and see whether it's continuous.

But of course, the moments don't determine the distribution: two distributions on the reals that differ at a single point have the same moments. And if one is continuous, the other will be discontinuous.

Maybe you're asking: "Given a set of moments for a distribution, is there a way to determine whether some other equivalent distribution (equivalent in the sense of having the same moments) is continuous everywhere?"

John Hughes
  • What does the phrase "two distributions on the reals that differ at a single point" mean? – Did Dec 27 '16 at 23:35
  • Example: $p_1(x) = 1$ for $0 \le x \le 1$ and $0$ otherwise; $p_2(x) = 1$ for $0 \le x < 1$ and $0$ otherwise. Each of these is a probability distribution, being nonnegative and having integral 1 over the reals, but they differ at $x = 1$. – John Hughes Dec 27 '16 at 23:36
  • These are the same distribution so if this is the problem you have in mind, it is irrelevant. – Did Dec 27 '16 at 23:37
  • @Winther: I think you're right, but since we both had the same misunderstanding, I'm leaving this as a partial answer for those who might be similarly confused. – John Hughes Dec 27 '16 at 23:38
  • "It's not clear what this is asking" Actually the question is rather clear: assume the sequence of moments ensures uniqueness (for example, because Carleman's condition holds); then does the corresponding measure (which we do not know explicitly) have atoms or not? – Did Dec 27 '16 at 23:39
  • According to https://en.wikipedia.org/wiki/Probability_distribution#Kolmogorov_definition, a distribution is a function; the two I've defined are definitely different functions. I understand that with the theory of distributions (i.e., something like "linear operators on certain function spaces"), these two might be considered "equivalent", but as functions, they're different, as you well know. Does the OP know the same definition of distribution that you do? If any two functions that differ on a measure 0 set are "the same", then does it make sense to ask if one is continuous? – John Hughes Dec 27 '16 at 23:42
  • @JohnHughes when I say continuous I mean "absolutely continuous" with respect to Lebesgue measure. – Boby Dec 27 '16 at 23:46
  • @JohnHughes Distributions à la Schwartz are (of course) off-topic here. No, a distribution (in the probabilistic sense) is not a function, hence yes, the PDF (check this acronym) $\mathbf 1_{[0,1]}$ and the PDF $\mathbf 1_{(0,1)}$ correspond to the same distribution. The question is to determine if the CDF (not the PDF) is continuous (or maybe absolutely continuous, this part is unclear). (Flagged your non-constructive comment.) – Did Dec 27 '16 at 23:50