Consider the following system of infinitely many equations: \begin{align} \mathbb{E} \left[ {\rm sign} \left(X - \frac{k+1}{2} \right) X^k \right] =0, \quad \forall k \in \mathbb{N}_0, \end{align} where $X \ge 0$ is a random variable. We adopt the convention $0^0=1$.

Question: Can we show that there are no solutions to this system of equations?

This is clearly reminiscent of the famous moment problem. The question arose in estimation theory, where a moment condition of this form appears in the middle of a proof of consistency of an estimator.

What I tried: I considered a few values of $k$ and tried to derive a contradiction.

One can also notice that the above implies \begin{align} \int_0^{\frac{k+1}{2} } x^k \, dP_X(x) =\frac{1}{2} \int_0^{\infty } x^k \, dP_X(x), \quad k \in \mathbb{N}_0. \end{align} However, I was not sure how to use this.
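As a sanity check (not part of the argument), one can compare the two sides of this identity numerically for a candidate distribution. The sketch below is mine; it assumes NumPy/SciPy are available and uses an Exponential(1) variable chosen purely for illustration, showing that a generic $X$ violates the condition for most $k$.

```python
# Numerical sanity check of the rewritten condition
#   int_0^{(k+1)/2} x^k dP_X(x)  =?=  (1/2) int_0^inf x^k dP_X(x)
# for a *candidate* distribution (Exponential(1), chosen arbitrarily).
# This only illustrates the constraint; it is not part of any proof.
import numpy as np
from scipy import integrate

rate = 1.0
pdf = lambda x: rate * np.exp(-rate * x)  # Exponential(1) density

for k in range(6):
    c = (k + 1) / 2
    lhs, _ = integrate.quad(lambda x: x**k * pdf(x), 0, c)       # truncated moment
    full, _ = integrate.quad(lambda x: x**k * pdf(x), 0, np.inf)  # full k-th moment
    print(f"k={k}: truncated moment = {lhs:.4f}, half of full moment = {full/2:.4f}")
```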

Boby
  • @BrianMoehring Thanks for this comment. I overlooked this. Let me update the question to ask whether there are no solutions. – Boby Oct 05 '24 at 02:35
  • Hint: Generalise the equation for any $k∈\mathbb{N}_0$, and show why it is not possible from there. I'll send an answer if no one solves it soon. – keixx Oct 05 '24 at 03:00
  • @keixx What do you mean by generalizing for any $k \in \mathbb{N}_0$? It already holds for this set. Can you clarify? I am glad that there is an answer. – Boby Oct 05 '24 at 12:28
  • No worries, this is a pretty idiosyncratic approach so I'll pop it in the answers. Just confirming, do we use $0^0=1$ or undefined? – keixx Oct 06 '24 at 01:31
  • @keixx I have added the definition and clarified this point. – Boby Oct 07 '24 at 15:56

1 Answer


(Former proof has been deleted.)


@Boby The previous answer didn't hold because Lemma $1$ failed, so I have attempted a different approach. Apologies for the lengthy delay.

We want to show that no non-negative random variable $X$ satisfies the system of equations:

$$\mathbb{E}\left[\text{sign}\left(X-\frac{k+1}{2}\right)X^k\right]=0, \forall k \in \mathbb{N}_0$$

Given that $X\ge0$, the sign function $\text{sign}\left(X-\frac{k+1}{2}\right)$ behaves as follows:

  • $\text{sign}\left(X-\frac{k+1}{2}\right)=-1$ when $X<\frac{k+1}{2}$,
  • $\text{sign}\left(X-\frac{k+1}{2}\right)=0$ when $X=\frac{k+1}{2}$,
  • $\text{sign}\left(X-\frac{k+1}{2}\right)=1$ when $X>\frac{k+1}{2}$.

So the expectation can be written as $$\mathbb{E}\left[\text{sign}\left(X-\frac{k+1}{2}\right)X^k\right]=\int_0^\infty\text{sign}\left(x-\frac{k+1}{2}\right)x^k\,dF_X(x)$$ $$=\left(-\int_0^{\frac{k+1}{2}}x^k\,dF_X(x)\right)+\left(\int^\infty_{\frac{k+1}{2}}x^k\,dF_X(x)\right)=0,$$ where the point $x=\frac{k+1}{2}$ contributes nothing because the sign vanishes there. This simplifies to $$\int_0^{\frac{k+1}{2}}x^k\,dF_X(x)=\int^\infty_{\frac{k+1}{2}}x^k\,dF_X(x).$$

Let us denote $$I_k=\int_0^{\frac{k+1}{2}}x^k\,dF_X(x), \qquad J_k=\int^\infty_{\frac{k+1}{2}}x^k\,dF_X(x),$$ so $I_k=J_k$. Moreover, the $k$-th moment of $X$ is $$\mathbb{E}\left[X^k\right]=I_k+J_k=2I_k=2J_k,$$ so we have $$\mathbb{E}\left[X^k\right]=2\int^\infty_{\frac{k+1}{2}}x^k\,dF_X(x).$$

We will relate the tail probability $P\left(X\ge\frac{k+1}{2}\right)$ to the moment $\mathbb{E}\left[X^k\right]$. Since $x\ge\frac{k+1}{2}$ in the integral, we have $$x^k\ge\left(\frac{k+1}{2}\right)^k, \quad \forall x\ge\frac{k+1}{2}.$$ Therefore $$J_k=\int^\infty_{\frac{k+1}{2}}x^k\,dF_X(x)\ge\left(\frac{k+1}{2}\right)^kP\left(X\ge\frac{k+1}{2}\right).$$ From $\mathbb{E}\left[X^k\right]=2J_k$, $$\frac{1}{2}\mathbb{E}\left[X^k\right]=J_k\ge\left(\frac{k+1}{2}\right)^kP\left(X\ge\frac{k+1}{2}\right),$$ $$P\left(X\ge\frac{k+1}{2}\right)\le\frac{\frac{1}{2}\mathbb{E}\left[X^k\right]}{\left(\frac{k+1}{2}\right)^k}.$$ We now examine how $P\left(X\ge\frac{k+1}{2}\right)$ behaves as $k\rightarrow\infty$.
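As an illustration of the inequality just derived (not part of the proof), one can tabulate $J_k$ against $\left(\frac{k+1}{2}\right)^k P\left(X\ge\frac{k+1}{2}\right)$ for a concrete distribution; the Exponential(1) choice and the helper names below are mine.

```python
# Illustration (not part of the proof): for Exponential(1), check that
#   J_k = int_{(k+1)/2}^inf x^k dF_X(x)  >=  ((k+1)/2)^k * P(X >= (k+1)/2),
# which is the inequality used to bound the tail probability.
import numpy as np
from scipy import integrate

pdf = lambda x: np.exp(-x)   # Exponential(1) density
sf = lambda c: np.exp(-c)    # P(X >= c) for Exponential(1)

for k in range(1, 8):
    c = (k + 1) / 2
    J_k, _ = integrate.quad(lambda x: x**k * pdf(x), c, np.inf)
    lower = c**k * sf(c)
    print(f"k={k}: J_k = {J_k:.5f} >= ((k+1)/2)^k * P(X>=c) = {lower:.5f}")
```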

Case 1: $\mathbb{E}\left[X^k\right]$ is bounded

Suppose $\mathbb{E}\left[X^k\right]\le C$ for all $k$, where $C$ is a constant. This would happen, for instance, if $X\le1$ almost surely. Then: $$P\left(X\ge\frac{k+1}{2}\right)\le\frac{\frac{1}{2}C}{\left(\frac{k+1}{2}\right)^k}$$ As $k\rightarrow\infty$, $\left(\frac{k+1}{2}\right)^k\rightarrow\infty$ faster than any exponential function, so $P\left(X\ge\frac{k+1}{2}\right)\rightarrow0$.
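For instance (my own illustrative numbers, not part of the argument), taking $C=1$, which holds e.g. for $X$ uniform on $[0,1]$ since $\mathbb{E}[X^k]=\frac{1}{k+1}$, the bound decays super-exponentially:

```python
# Illustration (not part of the proof), under the Case 1 assumption E[X^k] <= C.
# Taking C = 1 (which holds e.g. for X uniform on [0,1], where E[X^k] = 1/(k+1)),
# the bound (1/2) * C / ((k+1)/2)^k on P(X >= (k+1)/2) decays super-exponentially.
C = 1.0
for k in range(1, 11):
    c = (k + 1) / 2
    bound = 0.5 * C / c**k
    print(f"k={k:2d}: bound on P(X >= {c}) = {bound:.3e}")
```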

Case 2: $\mathbb{E}\left[X^k\right]$ grows exponentially or faster

Suppose $\mathbb{E}\left[X^k\right]\le De^{ak}$ for some constants $D>0$ and $a>0$. Then $$P\left(X\ge\frac{k+1}{2}\right)\le\frac{\frac{1}{2}De^{ak}}{\left(\frac{k+1}{2}\right)^k}$$ But $\left(\frac{k+1}{2}\right)^k=e^{k\ln\left(\frac{k+1}{2}\right)}$ grows faster than any exponential $e^{ak}$. Therefore, $P\left(X\ge\frac{k+1}{2}\right)\rightarrow0$ as $k\rightarrow\infty$.
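A quick way to see this growth-rate claim numerically (again purely illustrative, with an arbitrary choice $a=3$): the logarithm of $e^{ak}/\left(\frac{k+1}{2}\right)^k$, namely $ak-k\ln\frac{k+1}{2}$, eventually decreases without bound.

```python
# Illustration (not part of the proof): the ratio e^{a k} / ((k+1)/2)^k -> 0,
# since k*ln((k+1)/2) eventually dominates a*k for any fixed a > 0.
# The choice a = 3 is arbitrary.
import math

a = 3.0
for k in [1, 5, 10, 20, 40, 80]:
    log_ratio = a * k - k * math.log((k + 1) / 2)  # log of e^{ak} / ((k+1)/2)^k
    print(f"k={k:3d}: log of e^(ak) / ((k+1)/2)^k = {log_ratio:10.2f}")
```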

In all cases, we have $$\lim_{k\rightarrow\infty}P\left(X\ge\frac{k+1}{2}\right)=0$$

Since $P\left(X\ge\frac{k+1}{2}\right)\rightarrow0$, the tail integral $J_k$ tends to zero unless it is compensated by very large values of $x^k$. However, from $J_k=\frac{1}{2}\mathbb{E}\left[X^k\right]$, we have $$\frac{1}{2}\mathbb{E}\left[X^k\right]\ge\left(\frac{k+1}{2}\right)^kP\left(X\ge\frac{k+1}{2}\right).$$ But as $P\left(X\ge\frac{k+1}{2}\right)\rightarrow0$ and $\left(\frac{k+1}{2}\right)^k\rightarrow\infty$, the RHS could still be finite if $\mathbb{E}\left[X^k\right]$ grows accordingly. However, for $\mathbb{E}\left[X^k\right]$ to remain finite, $X$ cannot have tails heavy enough to compensate for the small probability $P\left(X\ge\frac{k+1}{2}\right)$.

Let us now show that $\mathbb{E}\left[X^k\right]\rightarrow0$ as $k\rightarrow\infty$.

Suppose $X$ is bounded above by some finite $M$. Then for $k$ large enough, $\frac{k+1}{2}>M$, so $P\left(X\ge\frac{k+1}{2}\right)=0$. Thus $J_k=0$ and $\mathbb{E}\left[X^k\right]=2J_k=0$ for all large $k$, which is only possible if $X=0$ almost surely.

If $X$ is unbounded but $\mathbb{E}\left[X^k\right]\rightarrow0$ as $k\rightarrow\infty$, then $X=0$ almost surely.

Why?

Consider for any $\delta>0$, $$\mathbb{E}\left[X^k\right]\ge\mathbb{E}\left[X^k\mathbb{I}_{\{X\ge\delta\}}\right]\ge\delta^kP(X\ge\delta)$$ If $\mathbb{E}\left[X^k\right]\rightarrow0$ as $k\rightarrow\infty$, then $$\delta^kP(X\ge\delta)\le\mathbb{E}\left[X^k\right]\rightarrow0$$ Since $\delta^k$ is a constant raised to the $k$-th power, unless $P(X\ge\delta)=0$, the LHS will not tend to zero. Therefore $P(X\ge\delta)=0$ for any $\delta>0$. Thus: $$P(X>0)=\lim_{\delta\rightarrow0}P(X\ge\delta)=0$$ Therefore $X=0$ almost surely.
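A numerical check of the Markov-type inequality used here (the Exponential(1) distribution and $\delta=2$ below are my own illustrative choices, not part of the proof):

```python
# Illustration (not part of the proof): the Markov-type bound
#   delta^k * P(X >= delta) <= E[X^k]
# checked numerically for Exponential(1) and delta = 2.
import numpy as np
from scipy import integrate

pdf = lambda x: np.exp(-x)   # Exponential(1) density
delta = 2.0

for k in range(1, 6):
    moment, _ = integrate.quad(lambda x: x**k * pdf(x), 0, np.inf)  # E[X^k] = k!
    bound = delta**k * np.exp(-delta)                               # delta^k * P(X >= delta)
    print(f"k={k}: delta^k * P(X>=delta) = {bound:.4f} <= E[X^k] = {moment:.4f}")
```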

Now if $X=0$ almost surely, then for $k=0$:

$$\boxed{\mathbb{E}\left[\text{sign}\left(0-\frac{1}{2}\right)\cdot1\right]=\text{sign}\left(-\frac{1}{2}\right)\cdot1=(-1)\cdot1=-1\neq0}$$ This contradicts the given equation $$\boxed{\mathbb{E}\left[\text{sign}\left(X-\frac{k+1}{2}\right)X^k\right]=0}$$ Therefore $X$ cannot be identically zero, and thus no non-negative random variable $X$ can satisfy the given system of equations.

Q.E.D.

Notes.

  • The key is recognizing that the tail probability $P\left(X\ge\frac{k+1}{2}\right)$ must tend to zero as $k\rightarrow\infty$, given the growth rate of $\left(\frac{k+1}{2}\right)^k$.
  • This leads to $\mathbb{E}\left[X^k\right]\rightarrow0$ as $k\rightarrow\infty$, implying $X=0$ almost surely.
  • However $X=0$ almost surely contradicts the original equation for $k=0$ and hence no such $X$ exists.
keixx
  • Feel free to reach out if you have any queries. – keixx Oct 08 '24 at 09:00
  • Great. Quick question: does Lemma $1$ hold for all non-negative random variables? – Boby Oct 09 '24 at 12:55
  • Lemma $1$ holds for non-negative random variables with finite moments of all orders, as written. Under these conditions, the bulk of the $k$-th moment comes from values of $X$ that are not too large. As $k$ increases, $X^k$ grows rapidly for large $X$, but the probability $dF_X(x)$ of $X$ being very large decreases rapidly. The tail probability $\mathbb{P}\left(X\ge \frac{k+1}{2}\right)$ decreases faster than $X^k$ increases for large $x$, ensuring that the tail's contribution to $\mathbb{E} \left[X^k\right]$ is negligible. – keixx Oct 09 '24 at 21:09
  • For example, by Markov's inequality, $$\mathbb {P}\left(X\ge \frac{k+1}{2}\right) \le \frac {\mathbb{E} \left[X^k\right]}{\left(\frac{k+1}{2}\right)^k}$$ As $k$ increases, $\left(\frac{k+1}{2}\right)^k$ grows faster than any $\mathbb{E}\left[X^k\right]$, making the tail probability tend to zero. – keixx Oct 09 '24 at 21:16
  • Lemma $1$ does not hold for variables with infinite moments (i.e. $\mathbb {E} \left[X^k\right] = \infty$ for some $k$). Lemma $1$ may not hold because the ratio $$\frac{\int^\infty_{\frac{k+1}{2}}x^kdF_X(x)}{\mathbb{E}\left[X^k\right]}$$becomes $\frac{\infty}{\infty}$ which is undefined. – keixx Oct 09 '24 at 21:20
  • Thanks. I guess my question was: how do you show this lemma? I tried, but it's not clear to me that it is true for random variables with finite moments. – Boby Oct 10 '24 at 01:56
  • As in, how I reach my ratio equation? Or... Sorry, my English comprehension isn't very good. – keixx Oct 10 '24 at 04:04
  • Sorry, my English is also not very good. I wanted to say that I was trying to prove your lemma, but could not. The Markov's inequality reasoning that you give is not enough to show this lemma. – Boby Oct 11 '24 at 02:16
  • After thinking it over, yes, you are correct: Lemma 1 can't be adequately proven. I'll try to redo that part of the proof; bear with me for the moment while I think about it, haha. Is this urgent? – keixx Oct 13 '24 at 08:39
  • Hey! Thank you! I think this lemma is wrong. See this, for example: https://math.stackexchange.com/questions/4982445/does-lim-k-to-infty-frac-mathbbexk-1-x-ge-k-mathbbexk – Boby Oct 13 '24 at 21:03
  • Thanks, I have edited in a hopefully corrected answer. Feel free to reach out, and sorry for the delay. – keixx Oct 19 '24 at 08:25
  • I don't follow your new argument. Are you trying to argue by contradiction? What exactly is the contradiction? – Boby Oct 22 '24 at 16:40
  • I do not understand the splitting of $\mathbb{E}\left[X^k\right]$ into cases: in Case 1 it is bounded, and in Case 2 it is stated that it grows exponentially or faster, but what is actually considered is $\mathbb{E}\left[X^k\right]\le De^{ak}$ for some constants $D>0$ and $a>0$. So, as far as I see, it remains to consider the case when $\mathbb{E}\left[X^k\right]$ grows even faster. – Alex Ravsky Oct 29 '24 at 11:43
  • "$\delta^kP(X\ge\delta)\le\mathbb{E}\left[X^k\right]\rightarrow0$ Since $\delta^k$ is a constant raised to the $k$-th power, unless $P(X\ge\delta)=0$, the LHS will not tend to zero. Therefore $P(X\ge\delta)=0$ for any $\delta>0$." As far as I see, we need $\delta\ge 1$ for the conclusion, because if $\delta<1$ then anyway $\lim_{k\to\infty} \delta^kP(X\ge\delta)=0$. – Alex Ravsky Oct 29 '24 at 11:54
  • @Boby The contradiction is supposing there exists a non-negative random variable $X\ge0$ satisfying the given equation? Unless you're talking about something else? – keixx Oct 30 '24 at 07:01
  • @AlexRavsky I see how that could become a problem as this can potentially affect the conclusion that $P\left(X\ge\frac{k+1}{2}\right)\rightarrow0$ as $k\rightarrow\infty$. I assume we would have to split into multiple growth rate cases and go from there? Would you be able to provide a solution? Thanks. – keixx Oct 30 '24 at 07:03