
Let $X$ be a random variable in $\mathscr L^1(\Omega, \mathscr F, \mathbb P)$.

Let $\{J_n\}_{n=1}^{\infty}$ be pairwise disjoint events, each with positive probability. Let $\mathscr J = \sigma(\{J_n\}_{n=1}^{\infty})$.

Q1 Is $\sum_{J \in \mathscr J} E[X|J]1_J$ a version of $E[X|\mathscr J]$?

Q2 Is $\sum_{n=1}^{\infty} E[X|J_n]1_{J_n}$ a version of $E[X|\mathscr J]$?

For both questions: it can be shown that the candidate random variable is integrable, and it seems clear that it is $\mathscr J$-measurable.

Q1 It remains to show that $E\left[\sum_{J \in \mathscr J} E[X|J]1_J1_{J^*}\right] = E[X1_{J^*}]$ for all $J^* \in \mathscr J$.

$$\text{LHS} = E\Big[\sum_{J \in \mathscr J} E[X|J]1_J1_{J^*}\Big]$$

$$ = E[E[X|J^*]1_{J^*}1_{J^*}]$$

$$ = E[E[X|J^*]1_{J^*}] = \text{RHS}$$

Q2 It remains to show that $E\left[\sum_{n=1}^{\infty} E[X|J_n]1_{J_n}1_{J^*}\right] = E[X1_{J^*}]$ for all $J^* \in \mathscr J$.

Does it suffice to show the equality for $J^* \in \sigma(J_1), \sigma(J_2), \ldots$, and then conclude by Dynkin's lemma (or the uniqueness lemma) that the equality holds for all $J^* \in \sigma(\sigma(J_1), \sigma(J_2), \ldots) \stackrel{?}{=} \mathscr J$?

BCLC
    Do you assume that the sets $J_n$ are pairwise disjoint...? Re Q1: I don't see why the sum should give rise to something measurable... in general, it's going to be a sum over uncountably many events $J$. – saz May 26 '18 at 19:23
  • @saz Q2 yes thanks edited! Q1 oh right yeah. Forgot what I was initially thinking. – BCLC May 26 '18 at 21:42

1 Answer


Q1: As I already pointed out in my comment, there is a measurability issue (because of the summation over possibly uncountably many events $J$) and therefore the mapping is, in general, not a version of $\mathbb{E}(X \mid \mathcal{J})$.


Q2: If the sets $(J_n)_{n \in \mathbb{N}}$ cover the whole space, i.e. $\bigcup_{n \in \mathbb{N}} J_n = \Omega$, then the assertion follows from the following statement, which is a direct consequence of the well-known uniqueness theorem for measures.

Proposition Let $(\Omega,\mathcal{F},\mathbb{P})$ be a probability space, and let $\mathcal{J} \subseteq \mathcal{F}$ be a sub-$\sigma$-algebra. Let $\mathcal{G}$ be a generator of $\mathcal{J}$, i.e. $\sigma(\mathcal{G}) = \mathcal{J}$, such that $\mathcal{G}$ is $\cap$-stable and there exists $(G_n)_{n \in \mathbb{N}} \subseteq \mathcal{G}$ such that $G_n \uparrow \Omega$. If $X \in L^1(\mathcal{F})$ and $Y \in L^1(\mathcal{J})$ are such that $$\forall G \in \mathcal{G}: \quad \int_G X \, d\mathbb{P} = \int_G Y \, d\mathbb{P},$$ then $Y$ is a version of $\mathbb{E}(X \mid \mathcal{J})$.
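
For completeness, here is a sketch of how the proposition follows from the uniqueness theorem (a standard argument, filling in the details): the set functions

$$\mu(G) := \int_G (X^+ + Y^-) \, d\mathbb{P}, \qquad \nu(G) := \int_G (X^- + Y^+) \, d\mathbb{P}, \qquad G \in \mathcal{J},$$

are finite measures (since $X, Y \in L^1$) which, by assumption, agree on the $\cap$-stable generator $\mathcal{G}$; moreover $G_n \uparrow \Omega$ with $\mu(G_n) = \nu(G_n) < \infty$. The uniqueness theorem therefore gives $\mu = \nu$ on $\sigma(\mathcal{G}) = \mathcal{J}$, i.e. $\int_G X \, d\mathbb{P} = \int_G Y \, d\mathbb{P}$ for all $G \in \mathcal{J}$, which is exactly the defining property of $\mathbb{E}(X \mid \mathcal{J})$ (recall that $Y$ is $\mathcal{J}$-measurable and integrable by assumption).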

You will have to apply this proposition with the generator

$$\mathcal{G} := \left\{ \bigcup_{i=1}^k J_{n_i}; k \in \mathbb{N}, n_i \in \mathbb{N} \right\}$$

which consists of all sets that can be obtained as finite unions of the sets $(J_n)_{n \in \mathbb{N}}$.
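
Let me sketch the verification (adjoining $\emptyset$ to $\mathcal{G}$, i.e. allowing $k=0$, so that the statements below hold literally): since the $J_n$ are pairwise disjoint, the intersection of two finite unions of $J_n$'s is again a finite union of $J_n$'s (possibly empty), so $\mathcal{G}$ is $\cap$-stable, and $G_n := \bigcup_{i=1}^n J_i \in \mathcal{G}$ satisfies $G_n \uparrow \Omega$ by the covering assumption. For $G = \biguplus_{i=1}^k J_{n_i} \in \mathcal{G}$ and $Y := \sum_{n=1}^{\infty} \mathbb{E}(X \mid J_n) 1_{J_n}$ we then have

$$\int_G Y \, d\mathbb{P} = \sum_{i=1}^k \mathbb{E}(X \mid J_{n_i}) \, \mathbb{P}(J_{n_i}) = \sum_{i=1}^k \mathbb{E}(X 1_{J_{n_i}}) = \int_G X \, d\mathbb{P},$$

using $\mathbb{E}(X \mid J) = \mathbb{E}(X 1_J)/\mathbb{P}(J)$ in the second step; for $G = \emptyset$ the identity is trivial. The proposition now applies and yields Q2.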

If $\bigcup_{n \in \mathbb{N}} J_n = \Omega$ does not hold true, then the assertion is, in general, wrong; it is not difficult to construct counterexamples. (Consider for instance $\mathcal{J} = \sigma(A) = \{\emptyset, A, A^c, \Omega\}$ with the single set $J_1 := A$, $0 < \mathbb{P}(A) < 1$: the candidate $\mathbb{E}(X \mid A) 1_A$ vanishes on $A^c$, whereas $\mathbb{E}(X \mid \mathcal{J})$ in general does not.)
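
As a quick numerical sanity check of the Q2 formula, here is a minimal sketch on a made-up finite probability space (the arrays, the partition, and all names below are hypothetical, chosen only for illustration):

```python
import numpy as np

# Finite probability space Omega = {0,...,5}; everything here is made up
# purely for illustration.
p = np.array([0.1, 0.2, 0.15, 0.25, 0.2, 0.1])   # P({omega}), sums to 1
X = np.array([1.0, -2.0, 3.0, 0.5, 4.0, -1.0])   # an integrable random variable

# Pairwise disjoint events J_1, J_2, J_3 covering Omega.
partition = [np.array([0, 1]), np.array([2, 3]), np.array([4, 5])]

# Candidate Y = sum_n E[X | J_n] 1_{J_n}, evaluated pointwise on Omega.
Y = np.zeros_like(X)
for J in partition:
    Y[J] = (X[J] * p[J]).sum() / p[J].sum()      # E[X | J] = E[X 1_J] / P(J)

# Defining property: E[Y 1_G] = E[X 1_G] for every G in the generator;
# by disjointness and linearity it suffices to check each J_n.
for J in partition:
    assert np.isclose((Y[J] * p[J]).sum(), (X[J] * p[J]).sum())
print("Y passes the defining-property check for E[X | sigma(J_1, J_2, J_3)]")
```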

saz
  • Thanks saz! ^-^ – BCLC May 31 '18 at 09:20
  • Saz, based on this proposition, does the claim here then require further justification? – BCLC May 31 '18 at 10:27
  • @BCLC If you want to apply the proposition you have to check that its assumptions are satisfied. Not more and not less. – saz May 31 '18 at 11:12
  • Thanks, saz. Exactly! The assumptions were not checked, hence the claim requires further justification, namely checking the assumptions? For example, the assumption that $G_n \uparrow \Omega$ is certainly stronger than $\bigcup G_n = \Omega$... – BCLC May 31 '18 at 11:49
  • Maybe it's worth mentioning that if $J_\infty:=\bigcup_{n\in\mathbb N}J_n\ne\Omega$, then we still can conclude that $$\operatorname E\left[X\mid\mathscr J\right]=\sum_{n\in\mathbb N}1_{J_n}\operatorname E\left[X\mid J_n\right]+1_{\Omega\setminus J_\infty}\operatorname E\left[X\mid\Omega\setminus J_\infty\right].$$ At least if I'm not missing anything @saz? – 0xbadf00d May 12 '19 at 12:57
  • @saz Oh, and if the system in question is even finite, i.e. we're given $n\in\mathbb N$ and disjoint $J_1,\ldots,J_n$ and $\mathscr J=\sigma(J_1,\ldots,J_n)$, is it sufficient to prove that $$\forall i\in\left\{1,\ldots,n\right\}:\text E\left[1_{J_i}X\right]=\text E\left[1_{J_i}Y\right],$$ where $$Y:=\sum_{i=1}^n1_{J_i}\operatorname E\left[X\mid J_i\right]+1_{\Omega\setminus\biguplus_{i=1}^nJ_i}\text E\left[X\mid\Omega\setminus\biguplus_{i=1}^nJ_i\right],$$ or do we need to prove this on $\mathcal G:=\left\{\biguplus_{j=1}^kJ_{i_j}:k\in\{1,\ldots,n\}\text{ and }1\le i_1<\cdots<i_k\le n\right\}$? – 0xbadf00d May 12 '19 at 19:06
  • @0xbadf00d Re your first comment. This is an immediate consequence of what I wrote (just consider the partition $\tilde{J}_n := J_n$, $n \geq 1$, $\tilde{J}_0 := \Omega \backslash \bigcup_{n \geq 1} J_n$.) Re your 2nd comment: Not sure what you mean by your strange notation with the plus inside the union. Moreover, if you define $Y$ by the latter expression, it is trivial that the first identity holds (by the very definition of the conditional expectation). – saz May 13 '19 at 04:49
  • Anyway, if $Y \in L^1(\mathcal{J})$ is some random variable, then we need $E(1_J X) = E(Y 1_J)$ for $J \in \{J_1,\ldots,J_n,\Omega \backslash \bigcup_{j=1}^n J_j\}$; afterwards we can use the linearity of the expectation and the fact that the sets are disjoint to get the identity for a $\cap$-stable generator of $\mathcal{J}$, and hence we get $Y= E(X \mid \mathcal{J})$. – saz May 13 '19 at 04:49
  • @saz Thank you for your comments. $\biguplus$ is a (rather common) notation indicating that the union is disjoint. I define $Y$ to be $Y:=\sum_{i=1}^n1_{J_i}\operatorname E\left[X\mid J_i\right]+1_{\Omega\setminus\biguplus_{i=1}^nJ_i}\text E\left[X\mid\Omega\setminus\biguplus_{i=1}^nJ_i\right]$, if you meant that. I don't know why you say it's trivial then, since it's basically the same thing asked in the question, but there you wrote that we need to show it on a $\pi$-system ($\cap$-stable system). – 0xbadf00d May 13 '19 at 15:43
  • @0xbadf00d It follows straight-forward from the definition of conditional expectation that $$\mathbb{E}(\mathbb{E}(X \mid J_i) 1_{J_i}) = \mathbb{E}(X 1_{J_i}).$$ Since the random variable $Y$ (as defined in your previous comments) clearly satisfies $$\mathbb{E}(X \mid J_i) 1_{J_i} = Y 1_{J_i}$$ it is immediate that $\mathbb{E}(Y 1_{J_i}) = \mathbb{E}(X 1_{J_i})$ for all $i$. (... and just as a side remark: This might not be "obvious" and "trivial" for beginners ... sure.) – saz May 13 '19 at 16:38