Let $(X_n)_{n \in \mathbb{N}}$ be a sequence of $(-1,\infty)$-valued random variables. Is it possible to have $$ \forall \, n \colon \ \mathbb{E}[X_n \, \vert \, X_1, \ldots, X_{n-1}] = 0 $$ without the $(X_n)$ being independent?

I know that there are examples which work for two random variables and without the assumption on the range. Typically, one needs uncorrelated but dependent random variables. Do such examples also exist in this specific setting?

  • But in your example the expectation is not constant equal to zero, right? Also, I was not looking for a trivial sequence. I mentioned that I know examples for two random variables exist. – user98187609 Apr 28 '23 at 12:24
  • $E(X|X^{2})=0$ for any symmetric random variable $X$ with finite second moment. – Kavi Rama Murthy Apr 28 '23 at 12:28

2 Answers

Try something like $Y_n =\pm1$ with equal probability iid, with $X_1=Y_1$ and $$X_n=Y_n\left(X_{n-1}+1\right).$$

Clearly you could replace the term in parentheses with most other (integrable) functions of $X_1, \ldots, X_{n-1}$ and still have dependence with a conditional expectation of $0$, thanks to the $Y_n$.
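A quick simulation sketch of the first two steps of this construction (the sample size is an arbitrary choice, not from the answer) shows both properties at once: the conditional mean of $X_2$ given $X_1$ is $0$, while the conditional distribution of $X_2$ visibly depends on $X_1$.

```python
import random

random.seed(0)

# Henry's construction for the first two steps:
# Y_1, Y_2 = ±1 fair and independent, X_1 = Y_1, X_2 = Y_2 * (X_1 + 1).
N_PATHS = 200_000
sums = {-1: 0.0, 1: 0.0}    # running sum of X_2, keyed by the value of X_1
sq = {-1: 0.0, 1: 0.0}      # running sum of X_2^2, keyed by the value of X_1
counts = {-1: 0, 1: 0}

for _ in range(N_PATHS):
    x1 = random.choice([-1, 1])              # X_1 = Y_1
    x2 = random.choice([-1, 1]) * (x1 + 1)   # X_2 = Y_2 * (X_1 + 1)
    sums[x1] += x2
    sq[x1] += x2 * x2
    counts[x1] += 1

cond_mean = {x: sums[x] / counts[x] for x in sums}  # ≈ 0 for both values of X_1
cond_var = {x: sq[x] / counts[x] for x in sq}       # 0 vs. ≈ 4: dependence
```

Given $X_1=-1$ the next step is identically $0$, while given $X_1=1$ it is $\pm 2$ with equal probability; the conditional mean is $0$ either way, but the conditional law changes with $X_1$, so the sequence is not independent.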

Henry
  • Ah @Henry you were quicker. Indeed this is a nice example. I was thinking of a more general one. – Mr. Gandalf Sauron Apr 28 '23 at 10:44
  • The resulting $X_n$ satisfies $\frac{X_n+n}2 \sim \mathrm{Bin}(n, \frac12)$, i.e. it has the same marginal distribution as the position of a standard random walk after $n$ steps, though not the same conditional distribution – Henry Apr 28 '23 at 11:24
  • @geetha290krm Well, does it really "capture dependency" or correlation if the $X_{n}$, $\forall n>2$, are independent random variables and just $(X_{1},X_{2})$ are dependent? Henry's example shows that the process $(X_{n})$ need not be independent of the past but can still have conditional expectation $0$ given the past. Examples of this type often come up in gambling games in fair casinos: whatever winnings or results have occurred in the past, your expected profit given the past is $0$. – Mr. Gandalf Sauron Apr 28 '23 at 11:42

I am afraid Henry already posted an answer. But to elaborate: take any sequence $(X_{n})$ of random variables and let $Y$ be a random variable independent of the $X_{n}$'s with $E[Y]=0$.

Then define $Z_{1}=X_{1}$, $Z_{2}=X_{2}$, and $Z_{3}=Yf(X_{2},X_{1})$, where $f$ is an arbitrary measurable function from $\Bbb{R}^{2}$ to $\Bbb{R}$ (chosen so that $Z_{3}$ is integrable).

Then $E(Z_{3}|Z_{1},Z_{2})=E(Yf(X_{2},X_{1})|Z_{1},Z_{2})=f(X_{2},X_{1})E(Y|Z_{1},Z_{2})=f(X_{2},X_{1})E(Y)=0$, where the second equality pulls out the $\sigma(Z_{1},Z_{2})$-measurable factor and the third holds because $Y$ is independent of $(Z_{1},Z_{2})$. So you can take $f$ to be any function you like, for example $f(x,y)=x^{2}+y^{2}$, or $f(x,y)=x^{\pi}y^{e}(x+y)^{12345}$ for positive arguments, and you'll find that $Z_{3}$ is not independent of $(X_{1},X_{2})$ but the conditional expectation is still $0$.

Note that, directly by the definition of conditional expectation, if $X$ is $\mathcal{G}$-measurable then $E(Y\cdot X|\mathcal{G})=XE(Y|\mathcal{G})$ ("pulling out what is known").
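The same calculation can be checked numerically. The sketch below makes concrete choices not in the answer: $X_{1}, X_{2}$ uniform on $\{1,2\}$, $Y=\pm1$ fair, and $f(x,y)=x^{2}+y^{2}$ (one of the examples above).

```python
import random

random.seed(1)

def f(a, b):
    # any measurable f works; x^2 + y^2 is one of the examples above
    return a * a + b * b

N = 100_000
totals = {}  # (x1, x2) -> (running sum of Z_3, count)
for _ in range(N):
    x1 = random.choice([1, 2])   # hypothetical choice of X_1
    x2 = random.choice([1, 2])   # hypothetical choice of X_2
    y = random.choice([-1, 1])   # Y independent of (X_1, X_2), E[Y] = 0
    z3 = y * f(x2, x1)           # Z_3 = Y * f(X_2, X_1)
    s, c = totals.get((x1, x2), (0.0, 0))
    totals[(x1, x2)] = (s + z3, c + 1)

# E[Z_3 | X_1, X_2] ≈ 0 in every cell, yet |Z_3| = f(X_2, X_1) is a
# deterministic, non-constant function of (X_1, X_2): dependence.
cond_means = {k: s / c for k, (s, c) in totals.items()}
```

Each of the four conditional means is near $0$, while $|Z_{3}|$ is determined by $(X_{1},X_{2})$, so $Z_{3}$ cannot be independent of them.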