
Let $X$ and $Y$ be two random variables on the same probability space such that $X\sim Y$ and $E[Y|X]=X$ almost surely. Is it true that $X=Y$ almost surely?

This is true in the Gaussian case and I was wondering if it was true in general.

EDIT: I think I got it if $X$ and $Y$ are $L^2$, can someone confirm?

\begin{align*}
E[(Y-X)^2] &= E[Y^2]-2E[YX]+E[X^2]\\
&= E[Y^2]-2E\big[E[YX\mid X]\big]+E[X^2]\\
&= E[Y^2]-2E\big[X\,E[Y\mid X]\big]+E[X^2]\\
&= E[Y^2]-2E[X^2]+E[X^2]\\
&= E[Y^2]-E[X^2]\\
&= 0,
\end{align*}

where the last equality uses $X\sim Y$.

Hence $X=Y$ a.s.
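Not part of the proof, but the key identity $E[(Y-X)^2] = E[Y^2]-E[X^2]$ (valid whenever $E[Y|X]=X$) can be sanity-checked by simulation. Below is a minimal sketch with made-up parameters: we take $Y = X + \varepsilon$ with $\varepsilon$ independent and mean zero, so $E[Y|X]=X$ holds but $Y \not\sim X$; both sides should come out close to $\operatorname{Var}(\varepsilon)$.

```python
# Monte-Carlo check of E[(Y-X)^2] = E[Y^2] - E[X^2] when E[Y|X] = X.
# Here Y = X + eps with eps independent of X and mean 0, so E[Y|X] = X.
# The names n and sigma are illustrative, not from the post.
import random

random.seed(0)
n, sigma = 200_000, 0.5
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ys = [x + random.gauss(0.0, sigma) for x in xs]

lhs = sum((y - x) ** 2 for x, y in zip(xs, ys)) / n            # E[(Y-X)^2]
rhs = sum(y * y for y in ys) / n - sum(x * x for x in xs) / n  # E[Y^2] - E[X^2]
print(lhs, rhs)  # both should be close to sigma^2 = 0.25
```

Note that in this example $E[Y^2]-E[X^2] = \sigma^2 > 0$; the question's hypothesis $X \sim Y$ is exactly what forces this quantity, and hence $E[(Y-X)^2]$, to vanish.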

What about $L^1$?

W. Volante
    Try applying the law of total variance (at least when $X$ and $Y$ have a finite variance). I think you will find $\text{Var}(Y \mid X)=0$ with probability $1$, which should give you your result – Henry Apr 03 '23 at 16:40
    Related: https://math.stackexchange.com/questions/74692/conditional-expectation-and-almost-sure-equality and https://math.stackexchange.com/questions/666843/if-exy-y-almost-surely-and-eyx-x-almost-surely-then-x-y-almost-surel – Henry Apr 03 '23 at 16:43
  • I think you might wanna add that they are defined on the same probability space. For finite mean, the above comment resolves it. – dezdichado Apr 03 '23 at 16:43
    @Henry, When $X$ (and hence $Y$) is assumed to have finite second moment, the proof is easily done by noting that $\mathbb{E}[(Y-X)^2]=0$. So the true obstacle is the case where $X$ has infinite second moment, although the case of finite second moment seems hinting an affirmative answer. I guess we might possibly tweak some of the techniques in this link. – Sangchul Lee Apr 03 '23 at 16:43
  • Thank you all for the $L^2$ case. – W. Volante Apr 03 '23 at 16:45

1 Answer


Here is an argument based on the idea in @mathex's answer:

Lemma. Let $X, Y \in L^1(\mathbf{P})$ be such that $X \sim Y$ and $\mathbf{E}[Y\mid X]=X$. Then, for any $k\in\mathbb{R}$, $$ \mathbf{E}[Y\wedge k\mid X\wedge k] = X\wedge k \quad\text{and}\quad \mathbf{E}[Y\vee k\mid X\vee k] = X\vee k.$$

Proof of Lemma. We only prove the first part. Let $Z = X \wedge k - \mathbf{E}[Y\wedge k\mid X\wedge k]$. Then

$$ \mathbf{E}[Z] = \mathbf{E}[X \wedge k] - \mathbf{E}[Y \wedge k] = 0, $$

since $X \sim Y$ implies $X \wedge k \sim Y \wedge k$.

So it suffices to prove that $Z \geq 0$, because an almost surely nonnegative random variable with zero mean vanishes almost surely. However, this follows from

\begin{align*}
\mathbf{E}[Y\wedge k\mid X\wedge k]
&\leq \mathbf{E}[Y\mid X\wedge k]\wedge k \tag{$\because$ Jensen} \\
&= \mathbf{E}[\mathbf{E}[Y\mid X]\mid X\wedge k]\wedge k \\
&= \mathbf{E}[X\mid X\wedge k]\wedge k \\
&= X \wedge k,
\end{align*}

completing the proof of the lemma. $\square$
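A remark on the Jensen step, added here for completeness: the map $x \mapsto x \wedge k = \min(x,k)$ is concave (a minimum of affine functions), so the conditional Jensen inequality gives, for any sub-$\sigma$-algebra $\mathcal{G}$,

$$ \mathbf{E}[Y \wedge k \mid \mathcal{G}] \leq \mathbf{E}[Y \mid \mathcal{G}] \wedge k, $$

applied above with $\mathcal{G} = \sigma(X \wedge k)$. The tower-property step in the next line uses $\sigma(X \wedge k) \subseteq \sigma(X)$.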

Now we return to the original question. By the lemma, for any $n \geq 1$ and with the truncation function

$$ f_n(x) = (-n) \vee (x \wedge n) = \begin{cases} -n, & x < -n \\ x, & -n \leq x \leq n, \\ n, & x > n \end{cases} $$
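The truncation $f_n$ is just clipping to the interval $[-n, n]$. A minimal sketch (the function name `f_n` mirrors the answer's notation; everything else is illustrative):

```python
# f_n(x) = (-n) ∨ (x ∧ n): clip x to the interval [-n, n].
def f_n(x: float, n: float) -> float:
    return max(-n, min(x, n))

# The three cases of the definition: x < -n, -n <= x <= n, x > n.
print(f_n(-7, 5), f_n(3, 5), f_n(9, 5))  # -5 3 5
```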

we get $\mathbf{E}[f_n(Y) \mid f_n(X)] = f_n(X)$. Since $f_n(X)$ and $f_n(Y)$ are bounded (hence in $L^2$) and $f_n(X) \sim f_n(Y)$, arguing as in the OP's computation we get

$$ f_n(X) = f_n(Y) \quad \text{almost surely.} $$

Now we conclude by letting $n \to \infty$: since $f_n(x) \to x$ pointwise, it follows that $X = Y$ almost surely.

Sangchul Lee