
Let $S_n := X_1 + \dots + X_n$ for some i.i.d. r.v. $X_i$ (the specifics are not so important) and let $f$ be a function satisfying

\begin{align} Ef(x+X_1) = f(x) \quad \text{for all } x. \end{align}

Then, as usual, the following should hold:

\begin{align} E\left(f(S_n + X_{n+1}) \,\vert\, X_1, \dots, X_n\right) = f(S_n) \quad \text{(correct?)} \end{align}

Now, what if we additionally have some r.v. $Y_1, \dots, Y_n$ depending on $X_1, \dots, X_n$ in some way but independent of $X_{n+1}$, and consider

\begin{align} E\left( f(S_n + X_{n+1}) \,\vert\, X_1,\dots, X_n, Y_1, \dots, Y_n\right); \end{align}

is this still equal to $f(S_n)$?
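For instance (just to have something concrete in mind; assume additionally that $X_1$ is integrable with $EX_1 = 0$, which is not required above), take $f(x) = x$: then $Ef(x + X_1) = x + EX_1 = x = f(x)$, and the first display is just the martingale property of the random walk $S_n$.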

It is weird that I have never thought about this before... Also, this question could probably be asked in a more general form, but I thought giving a hands-on example might make it more palpable.

Edit: As was correctly pointed out, I should have been clearer: the $Y_1, \dots, Y_n$ are not stochastically independent of the $X_1, \dots, X_n$, and they are not measurable functions of the $X_1, \dots, X_n$.

MMM
  • By "depending on $X_1,\ldots,X_n$ in some way", do you mean that the $Y_i$ are functions of the $X_i$? Or just that they're not independent of them? – joriki Mar 02 '20 at 20:45
  • That they are not independent. Thanks for clarifying that. – MMM Mar 02 '20 at 20:55
  • $Y_n$ can be independent of $X_{n+1}$, and $X_n$ can be independent of $X_{n+1}$, while $X_{n+1}$ depends on $(X_n,Y_n)$. – Michael Mar 03 '20 at 01:55
  • Thanks for pointing this out. So, with the additional condition that $X_{n+1}$ is independent of $(X_1, \dots, X_n, Y_1, \dots, Y_n)$, the statement is true, correct? – MMM Mar 03 '20 at 11:08
  • @BenC. : Yes, the d.k.o. answer treats the case when $X_{n+1}$ is independent of $(X_1, ..., X_n, Y_1,..., Y_n)$. – Michael Mar 03 '20 at 21:58

2 Answers


In general, if $\mathcal{G}$ is a $\sigma$-field s.t. $S$ is $\mathcal{G}$-measurable and $X$ is independent of $\mathcal{G}$, then for any integrable function $\varphi$, $\mathsf{E}[\varphi(S,X)\mid \mathcal{G}]=g(S)$ a.s., where $g(s):=\mathsf{E}\varphi(s,X)$. Apply this result to your case with $\mathcal{G}=\sigma\{X_1,\ldots,X_n,Y_1,\ldots,Y_n\}$, assuming that $X_{n+1}$ is independent of $\mathcal{G}$ (see @Michael's comments and examples 1, 2, and 3).
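A minimal illustration, in the spirit of @Michael's comment, of why this independence is genuinely an extra assumption (the specific construction below is only for illustration and ignores the OP's informal requirement that the $Y_i$ depend on the $X_i$): take $n=1$, let $X_1, Y_1$ be independent fair $\pm 1$ signs, and set $X_2 := X_1 Y_1$. Then $X_2$ is again a fair $\pm 1$ sign, independent of $X_1$ and independent of $Y_1$ (so the $X_i$ are i.i.d. and $Y_1$ is independent of $X_{n+1} = X_2$), yet $X_2$ is a deterministic function of $(X_1, Y_1)$ and hence not independent of $\mathcal{G} = \sigma\{X_1, Y_1\}$. With $f(x) = x$ (harmonic here, since $EX_1 = 0$),
$$\mathsf{E}[f(S_1 + X_2)\mid X_1, Y_1] = X_1 + X_1 Y_1 \ne X_1 = f(S_1)$$
in general, so the conclusion really can fail without the independence assumption.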

  • Thanks! Makes sense... I guess it is considered common knowledge then? – MMM Mar 02 '20 at 21:02
  • @BenC. See, for example Lemma 6.2.1 on page 236 here. – Mar 02 '20 at 21:05
  • Oh wow, perfect! Thanks a lot for the quick response! – MMM Mar 02 '20 at 21:10
  • Wait a minute, wouldn't I be stuck with $Ef(S_n, X_{n+1})$ in my case, which is not necessarily $f(S_n)$? My function $f$ is only specified to satisfy $E(f(x+X_{n+1})) = f(x)$ for real numbers $x$. – MMM Mar 02 '20 at 21:16
  • In your case $\varphi(S_n,X_{n+1})=f(S_n+X_{n+1})$ and $\mathsf{E}\varphi(S_n,X_{n+1})=f(S_n)$... –  Mar 02 '20 at 22:11
  • With the problem as stated, there is no reason that $X_{n+1}$ should be independent of $\sigma(X_1,...,X_n,Y_1,...,Y_n)$ (see my comment above). – Michael Mar 03 '20 at 01:59
  • @Michael Thanks for pointing that out. – Mar 03 '20 at 08:15

If $Y_1,\ldots,Y_n$ are measurable functions of $X_1,\ldots,X_n$, then $\sigma(X_1,\ldots,X_n,Y_1,\ldots,Y_n) = \sigma(X_1,\ldots,X_n)$. Hence $$\mathbb E[f(S_n + X_{n+1})\mid X_1,\ldots,X_n,Y_1,\ldots,Y_n] = \mathbb E[f(S_n + X_{n+1}) \mid X_1,\ldots,X_n ] = \int f(x+y)\, dQ_{X_{n+1}}(y)\,\Big|_{x=S_n} = f(S_n), $$ where $Q_{X_{n+1}}$ denotes the law of $X_{n+1}$ and the last equality uses $Ef(x+X_1)=f(x)$ together with $X_{n+1} \overset{d}{=} X_1$.
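As a quick numerical sanity check of this computation (purely illustrative; the choices $f(x) = x$, $X_i \sim N(0,1)$ and $Y_i = X_i^2$ below are assumptions, picked so that the $Y_i$ are measurable functions of the $X_i$ as in this answer), one can freeze a realisation of $X_1, \dots, X_n$ and average $f(S_n + X_{n+1})$ over fresh independent draws of $X_{n+1}$:

```python
import numpy as np

# Illustrative Monte Carlo check (assumed setup, not from the thread):
# f(x) = x is harmonic for centred steps, X_i ~ N(0, 1), Y_i = X_i**2.
rng = np.random.default_rng(0)
n = 5
f = lambda x: x

X = rng.standard_normal(n)     # one fixed realisation of X_1, ..., X_n
Y = X**2                       # the Y_i are measurable functions of the X_i
S_n = X.sum()

# Since X_{n+1} is independent of sigma(X_1..X_n, Y_1..Y_n), the conditional
# expectation E[f(S_n + X_{n+1}) | X, Y] reduces to integrating over the law
# of X_{n+1}, which we approximate by averaging over fresh draws.
X_next = rng.standard_normal(1_000_000)
print(f(S_n + X_next).mean(), f(S_n))   # should agree up to Monte Carlo error
```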

Mick
  • Yeah, sorry. It was meant to mean that $Y_1, \dots,Y_n$ are not stochastically independent of the $X_1, \dots, X_n$. – MMM Mar 02 '20 at 20:56
  • The answer is the same, since $X_{n+1}$ is independent of $Y_1,\ldots,Y_n$. – Mick Mar 02 '20 at 21:02