
To be more precise, fix a probability space $(\Omega,\mathcal{F},\mathbb{P})$, and let $X:\Omega\to\mathbb{R}$ be an $L^1$ random variable. Suppose $\Sigma\subseteq\mathcal{F}$ is a sub-$\sigma$-algebra such that $\mathbb{E}[X\mid\Sigma]=0$. Is $X$ then independent of $\Sigma$?

This is true in the trivial cases $\Sigma=\{\emptyset,\Omega\}$ and $\Sigma=\mathcal{F}$ (in the latter case $X=0$ a.s.).

2 Answers


Consider the example I gave here:

Flip a fair coin to determine the amount of your bet: if heads, you bet \$1, if tails you bet \$2. Then flip again: if heads, you win the amount of your bet, if tails, you lose it. (For example, if you flip heads and then tails, you lose \$1; if you flip tails and then heads you win \$2.) Let $X$ be the amount you bet, and let $Y$ be your net winnings (negative if you lost).

Then $X,Y$ are not independent; for instance we have $P(X=2) = 1/2$, $P(Y=1)=1/4$, and $P(X=2, Y=1) = 0$. But taking $\Sigma = \sigma(X)$ we have $E[Y \mid \Sigma] = E[Y \mid X] = 0$. You can see that $P(Y=1 \mid X=1) = P(Y=-1 \mid X=1) = 1/2$, so $$E[Y \mid X=1] = 1 \cdot \frac{1}{2} + (-1) \cdot \frac{1}{2} = 0.$$ Likewise $E[Y \mid X=2] = 0$. The idea is that the value of $X$ does affect the possible values for $Y$, but no matter how the first flip came up, you are still making a (conditionally) fair bet.
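The betting example above is small enough to check by exact enumeration. The following sketch (variable names are my own, not from the answer) lists the four equally likely outcomes, verifies that $E[Y \mid X=x] = 0$ for both values of $x$, and exhibits the failure of independence via $P(X=2, Y=1) = 0 \neq P(X=2)\,P(Y=1)$:

```python
from fractions import Fraction

# The four equally likely outcomes: the first flip sets the bet X,
# the second flip decides whether you win (+X) or lose (-X) that bet.
outcomes = []
for bet_flip in ("H", "T"):
    for win_flip in ("H", "T"):
        x = 1 if bet_flip == "H" else 2      # amount bet
        y = x if win_flip == "H" else -x     # net winnings
        outcomes.append((x, y))

p = Fraction(1, 4)  # each outcome has probability 1/4

# Conditional expectation E[Y | X = x] vanishes for both values of x.
for x0 in (1, 2):
    cond = [(x, y) for (x, y) in outcomes if x == x0]
    e_y_given_x = sum(y for (_, y) in cond) * Fraction(1, len(cond))
    print(f"E[Y | X={x0}] = {e_y_given_x}")  # 0 in both cases

# ...but X and Y are not independent:
p_x2 = sum(p for (x, y) in outcomes if x == 2)               # 1/2
p_y1 = sum(p for (x, y) in outcomes if y == 1)               # 1/4
p_both = sum(p for (x, y) in outcomes if x == 2 and y == 1)  # 0
print(p_x2, p_y1, p_both)
```

Using `Fraction` keeps the probabilities exact, so the check `p_both != p_x2 * p_y1` is not subject to floating-point error.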

This is similar in spirit to the example given by d.k.o. but perhaps a little more elementary.

Nate Eldredge

Let $X\sim N(0,1)$ and let $Y\ge 1$ a.s. be any r.v. independent of $X$. Then $$ \mathsf{E}[(X/Y)\mid \sigma(Y)]=0=\mathsf{E}[X/Y] \quad\text{a.s.} $$ but $(X/Y)$ is not independent of $\sigma(Y)$.
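The same phenomenon can be verified exactly with a discrete stand-in for the Gaussian (my own simplification, not from the answer): take $X$ uniform on $\{-1,+1\}$ and $Y$ uniform on $\{1,2\}$, independent of each other, and set $W = X/Y$. Then $E[W \mid Y=y] = E[X]/y = 0$ for each $y$, yet $W$ determines information about $Y$:

```python
from fractions import Fraction

half = Fraction(1, 2)
# X uniform on {-1, +1}, Y uniform on {1, 2}, independent of each other
# (a two-point stand-in for the N(0,1) variable in the answer).
joint = {}  # (w, y) -> probability, where w = x / y
for x in (-1, 1):
    for y in (1, 2):
        w = Fraction(x, y)
        joint[(w, y)] = joint.get((w, y), 0) + half * half

# E[X/Y | Y = y] = 0 for each y ...
for y0 in (1, 2):
    e = sum(w * p for (w, y), p in joint.items() if y == y0) / half
    print(f"E[X/Y | Y={y0}] = {e}")  # 0 in both cases

# ... yet X/Y is not independent of Y: observing X/Y = 1/2 forces Y = 2.
p_w = sum(p for (w, y), p in joint.items() if w == half)  # P(X/Y = 1/2) = 1/4
p_y = sum(p for (w, y), p in joint.items() if y == 2)     # P(Y = 2)     = 1/2
p_wy = joint.get((half, 2), 0)                            # P(X/Y = 1/2, Y = 2) = 1/4
print(p_w, p_y, p_wy)
```

Since $P(X/Y = 1/2,\, Y = 2) = 1/4 \neq 1/8 = P(X/Y = 1/2)\,P(Y = 2)$, independence fails even though the conditional expectation is identically zero.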

  • The division could've been multiplication (allowing you to drop the $Y \geq 1$ requirement). – Ian Sep 24 '17 at 03:49
  • What do you mean by "a.s."? – James Sep 24 '17 at 03:51
  • @James "a.s." is a standard abbreviation for "almost surely". – Ian Sep 24 '17 at 03:52
  • @James Conditional expectations are defined up to a null set. –  Sep 24 '17 at 04:00
  • @Ian This actually shows that $E[W\mid Z]=EW$ does not imply independence (division by a r.v. greater than 1 ensures that $X/Y$ is integrable). –  Sep 24 '17 at 04:06
  • We need the integrability of $X/Y$ (and $X$) to assert that $$ \mathsf{E}[X/Y\mid \sigma(Y)]=\mathsf{E}[X\mid\sigma(Y)]/Y \quad\text{a.s.} $$. –  Sep 24 '17 at 04:44
  • Sure, but you could have just said $Y$ is bounded and then used multiplication. There is no "secret" to the division. – Ian Sep 24 '17 at 15:29