Questions tagged [moment-problem]

The moment problem arises from trying to invert the mapping that takes a measure $μ$ to its sequence of moments, and from resolving whether such a measure is uniquely determined by its moments. Please do not use this tag just because moments are involved.

169 questions
24
votes
2 answers

Do moments define distributions?

Do moments define distributions? Suppose I have two random variables $X$ and $Y$. If I know $E\left[X^k\right] = E\left[Y^k\right]$ for every $k \in \mathbb N$, can I say that $X$ and $Y$ have the same distribution?
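The classical counterexample behind this question is the lognormal distribution, which is moment-indeterminate: the perturbed density $f(x)\bigl(1+\sin(2\pi\ln x)\bigr)$ is known to share all integer moments with the lognormal density $f$. The sketch below (function names are my own, not from the question) checks this numerically with a plain trapezoid rule after the substitution $y = \ln x$.

```python
import math

def normal_pdf(y):
    return math.exp(-y * y / 2) / math.sqrt(2 * math.pi)

def moment(k, perturbed=False, lo=-12.0, hi=16.0, n=100_000):
    # k-th moment of the lognormal (mu=0, sigma=1) via y = ln x:
    #   E[X^k] = \int e^{ky} phi(y) dy = e^{k^2/2}.
    # The perturbed density multiplies the integrand by (1 + sin(2*pi*y)),
    # which integrates against e^{ky} phi(y) to (numerically) zero for integer k.
    h = (hi - lo) / n
    total = 0.0
    for i in range(n + 1):
        y = lo + i * h
        w = 0.5 if i in (0, n) else 1.0   # trapezoid endpoint weights
        val = math.exp(k * y) * normal_pdf(y)
        if perturbed:
            val *= 1.0 + math.sin(2 * math.pi * y)
        total += w * val
    return total * h

for k in range(5):
    print(k, moment(k), moment(k, perturbed=True))  # both columns ≈ e^{k^2/2}
```

So equality of all moments does not force equality of distributions without an extra determinacy condition (e.g. Carleman's condition, or a finite moment generating function near 0).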
12
votes
2 answers

Can one tell based on the moments of a random variable whether it is continuous or not

Suppose we are given moments of a random variable $X$. Can we determine based on this if the random variable is continuous or not? We also assume that the moments of $X$ completely determine the distribution of $X$. In other words, do moments of…
Boby
  • 6,381
10
votes
1 answer

Constructing a probability measure on the Hypercube with given moments

Let $H = [-1, 1]^d$ be the $d$-dimensional hypercube, and let $\mu \in \text{int} H$. Under these conditions, I can explicitly construct a tractable probability measure $P$, supported on $H$, which has $\mu$ as its mean. For my purposes,…
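For the mean-only part of this question, one tractable construction (a sketch with my own naming, not necessarily the asker's) is the product measure on the vertices of $H$: draw coordinate $i$ as $+1$ with probability $(1+\mu_i)/2$ and $-1$ otherwise, independently, so each coordinate has mean exactly $\mu_i$.

```python
import random

def sample_vertex(mu, rng=random):
    # Product measure on the vertices {-1, +1}^d: coordinate i is +1
    # with probability (1 + mu_i) / 2, hence E[X_i] = mu_i exactly.
    return [1 if rng.random() < (1 + m) / 2 else -1 for m in mu]

# Monte Carlo check that the empirical mean approaches mu.
random.seed(0)
mu = [0.3, -0.5, 0.0]
n = 100_000
sums = [0.0] * len(mu)
for _ in range(n):
    for i, xi in enumerate(sample_vertex(mu)):
        sums[i] += xi
print([s / n for s in sums])  # each entry ≈ the corresponding mu_i
```

Matching higher moments as well is where the question becomes a genuine truncated moment problem; the product construction above only controls the mean.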
10
votes
1 answer

Does the condition $E[X]=E[X^2]=E[X^3]$ determine the distribution of $X$?

This is a question out of pure curiosity, motivated by this posting. Here I checked that if a $\mathbb{R}$-valued random variable $X$ has finite $4$-th moment and $E[X^2]=E[X^3]=E[X^4]$ then $X$ is a Bernoulli random variable. Indeed, this follows…
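The algebraic step behind the claim in this excerpt (with $E[X^2]=E[X^3]=E[X^4]$ finite and equal) fits in one line:

$$E\left[X^2(X-1)^2\right] = E[X^4] - 2E[X^3] + E[X^2] = 0,$$

and since $X^2(X-1)^2 \geq 0$, this forces $X^2(X-1)^2 = 0$ almost surely, i.e. $X \in \{0,1\}$ a.s., which is precisely a Bernoulli random variable.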
9
votes
1 answer

Do the moments characterize a distribution with compact support?

Given a random variable $X=(X_1,...,X_p)$ with $P(X \in M) = 1$ for compact $M$, do the values of $E[X_1]$, $E[X_2], ..., E[X_1^2], E[X_1 X_2],..., E[X_1^3],...,E[X_1 X_2 X_3]... $ determine the distribution $F(x)=P(X_1 \leq x_1 \wedge \dots\wedge…
9
votes
1 answer

If all powers of two random variables are uncorrelated, are they independent?

Let $X$ and $Y$ be random variables on a common probability space. If $$\def\E{\mathbb E}\E[X^nY^m]=\E[X^n]\,\E[Y^m]<\infty $$ for all integers $n,m\ge 0$, does it follow that $X$ and $Y$ are independent? I strongly suspect the answer is no. In the…
9
votes
1 answer

Two random variables with same moments

Reading http://www.dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/Chapter10.pdf, pages 368–370, it states "if we delete the hypothesis that [they] have finite range in the above theorem, then the conclusion is no longer necessarily…
8
votes
0 answers

Can expected values of the form $E[X^r e^{-sX}]$, for arbitrary $r>0$, uniquely determine the PDF of a random variable $X≥0$?

It is known that the probability distribution of a continuous non-negative random variable, $X$, is uniquely determined by its associated Laplace transform, $$L(s) = E[e^{-sX}] = \int_0^\infty e^{-sx} f(x) dx,$$ for $s ≥ 0$. It also seems that: (A)…
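For integer $r$ the quantity in the title is already determined by the Laplace transform, since differentiating under the integral sign gives the standard identity

$$E\left[X^r e^{-sX}\right] = (-1)^r \frac{d^r}{ds^r} L(s),$$

so the substance of the question lies in non-integer $r > 0$, where no such direct reduction applies.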
6
votes
1 answer

Show that there exists no non-negative r.v. such that $\mathbb{E}[{\rm sign}(X - \frac{k+1}{2}) X^k] =0, \forall k \in \mathbb{N}_0$

Consider the following system of infinitely many equations: \begin{align} \mathbb{E} \left[ {\rm sign} \left(X - \frac{k+1}{2} \right) X^k \right] =0, \forall k \in \mathbb{N}_0 \end{align} where the random variable is $X \ge 0$. We assume that…
Boby
  • 6,381
6
votes
2 answers

Existence of random variable given first $k$ moments

A sequence of real numbers $\{m_k\}$ is the list of moments of some real random variable if and only if the infinite Hankel matrix $$\left(\begin{matrix} m_0 & m_1 & m_2 & \cdots \\ m_1 & m_2 & m_3 & \cdots \\ m_2 & m_3 & m_4 & \cdots …
keej
  • 1,277
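The Hankel criterion quoted in this excerpt can be checked concretely for truncated moment sequences. Below is a pure-Python sketch (illustrative names, small matrices only) that builds the Hankel matrix from $m_0,\dots,m_{2n}$ and tests positive definiteness via Sylvester's criterion; for full positive semidefiniteness one would need all principal minors, not only the leading ones.

```python
def hankel(moments, n):
    # (n+1) x (n+1) Hankel matrix H[i][j] = m_{i+j}; needs moments m_0..m_{2n}.
    return [[moments[i + j] for j in range(n + 1)] for i in range(n + 1)]

def det(a):
    # Cofactor expansion along the first row; fine for the small sizes here.
    if len(a) == 1:
        return a[0][0]
    total = 0.0
    for j in range(len(a)):
        minor = [row[:j] + row[j + 1:] for row in a[1:]]
        total += (-1) ** j * a[0][j] * det(minor)
    return total

def hankel_positive_definite(moments):
    # Sylvester's criterion: every leading principal minor strictly positive.
    n = (len(moments) - 1) // 2
    h = hankel(moments, n)
    return all(det([row[:k + 1] for row in h[:k + 1]]) > 0 for k in range(n + 1))

# Moments of the standard normal: m_{2k} = (2k-1)!!, odd moments vanish.
print(hankel_positive_definite([1, 0, 1, 0, 3, 0, 15]))  # True
print(hankel_positive_definite([1, 0, -1]))              # False: m_2 < 0
```

The normal-moment sequence passes because the underlying measure has infinite support, which makes every truncated Hankel matrix strictly positive definite.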
4
votes
1 answer

Upper bound on $(k+1)$th moment, given $j$th moments for $j \leq k$.

Let $X$ be a bounded random variable, so that $|X| \leq R$. Suppose that you know $E[|X|^j]$, for $j \leq k$. Is there any way to obtain an upper bound on $E[ |X|^{k+1}]$ that is better than the worst-case, i.e., $E[ |X|^{k+1}] \leq R \cdot…
Alan Chung
  • 1,426
4
votes
3 answers

Truncated Stieltjes r-atomic moment problem

Given a certain $n \in \mathbb{N}$, can I construct a discrete positive random variable $X$ that fulfills the following conditions: $$\forall k \in \{1,...,n\}, \quad \mathbb{E}(X^k) = \mu_k$$ for some given $\mu_k$ (parameters). We suppose that there…
lrnv
  • 286
4
votes
0 answers

How to show these moments are determined?

I'm given a sequence of moments $$ S_k=\int_{1}^{\infty}x^k \exp \left(\frac{-x}{\log(x)}\right)dx $$ and I'm told that this sequence is determined. However, I can't find a way to show this. I tried starting out by using Carleman's Condition, i.e.…
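For reference, the criterion named in this excerpt: Carleman's condition states that the Hamburger moment problem for $\{m_k\}$ is determinate whenever

$$\sum_{k=1}^{\infty} m_{2k}^{-1/(2k)} = \infty,$$

and for a measure supported on a half-line (as here, where the integral runs over $[1,\infty)$) the Stieltjes variant $\sum_{k} m_k^{-1/(2k)} = \infty$ suffices. Both hold as long as the moments do not grow too fast.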
4
votes
1 answer

What conditions on the moments make a measure a probability measure?

For a positive Borel measure $\mu$ on the real line, let $\displaystyle{m_n = \int_{-\infty}^\infty x^n d\mu(x)}$, i.e. the $n$th moment of the measure. Are there any conditions on $m_n$ for when $\mu$ can be made into a probability measure, i.e.…
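A hedged remark on the normalization part of this question: if $\mu$ is already a positive Borel measure with finite total mass, being a probability measure is exactly the zeroth-moment condition

$$m_0 = \int_{-\infty}^{\infty} d\mu(x) = 1,$$

so the substantive issue is which sequences $\{m_n\}$ with $m_0 = 1$ arise from some positive measure at all, which is again a Hankel positivity condition.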
4
votes
1 answer

Maximizing expected value with constrained 2nd moment

$$\begin{array}{ll} \text{maximize} & \displaystyle\int_{0}^{1} x \, f(x) \, \mathrm dx\\ \text{subject to} & \displaystyle\int_{0}^{1} f(x) \, \mathrm dx = 1\\ & \displaystyle\int_{0}^{1} x^2 f(x) \, \mathrm dx = 1\\ & f(x) \geq 0 \quad \forall x…
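An observation about this particular instance (my own reasoning, not part of the question): since $x^2 \le x \le 1$ on $[0,1]$, the constraints already pin down the objective,

$$1 = \int_0^1 x^2 f(x)\, dx \;\le\; \int_0^1 x f(x)\, dx \;\le\; \int_0^1 f(x)\, dx = 1,$$

so every feasible $f$ attains the value $1$. Strictly speaking no density satisfies the constraints, since $\int_0^1 (1 - x^2) f(x)\, dx = 0$ with $f \ge 0$ forces all mass onto $\{1\}$; the extremal object is the point mass at $x = 1$.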