
Let $\Omega = \{ \omega_i \}$ be a countable set, and consider a probability space $(\Omega, \mathcal F, P)$ with $p_i := P(\{ \omega_i \})$. Let $X : \Omega \to \mathbb R$ be a random variable; then its expected value is $$ E(X) := \sum_i X(\omega_i) \cdot p_i. $$ Now it is possible to define the conditional expectation $E(X | Y)$ of $X$ depending on a second random variable $Y$. On Wikipedia it is written that

[...] the conditional expectation of $X$ given the event $Y = y$ is a function of $y$ over the range of $Y$ [...]
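To make sure I read this correctly, here is a toy example of my own: let $\Omega = \{1,2,3,4\}$ with the uniform measure $p_i = \tfrac14$, let $X(\omega) = \omega$ and let $Y(1) = Y(2) = 0$, $Y(3) = Y(4) = 1$. Then $$ E(X | Y = 0) = \frac{1\cdot\tfrac14 + 2\cdot\tfrac14}{\tfrac12} = \frac32, \qquad E(X | Y = 1) = \frac{3\cdot\tfrac14 + 4\cdot\tfrac14}{\tfrac12} = \frac72, $$ so $y \mapsto E(X | Y = y)$ is a function on the range $\{0,1\}$ of $Y$.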

This is the conditional expectation with respect to (i.e. conditioned on) another random variable $Y$. But, as is also written on Wikipedia, we can instead condition on a sub-$\sigma$-algebra.

The conditional expectation w.r.t. a random variable is a function on the range of $Y$, i.e. $E(X | Y) : Y(\Omega) \to \mathbb R$, whereas the conditional expectation w.r.t. a sub-$\sigma$-algebra $\mathcal H \subseteq \mathcal F$ is a function $E(X | \mathcal H) : \Omega \to \mathbb R$, so it has a different domain.
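In the toy example above this means (if I am applying the definitions correctly): $E(X | Y)$ is the map $0 \mapsto \tfrac32$, $1 \mapsto \tfrac72$ on $Y(\Omega) = \{0,1\}$, whereas for $\mathcal H = \sigma(Y) = \{\emptyset, \{1,2\}, \{3,4\}, \Omega\}$ the conditional expectation $E(X | \mathcal H)$ is the map on $\Omega$ that equals $\tfrac32$ on $\{1,2\}$ and $\tfrac72$ on $\{3,4\}$.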

Now my question: as they both capture in some sense the same concept, how does one convert between them? On the German Wikipedia I found an explanation; there it is said (in a very free translation)

... to get from $E(X|Y)$ to $E(X|\mathcal H)$ set $\mathcal H := \sigma(Y)$; then $E(X|\sigma(Y))(\omega) = E(X | \{ \omega' : Y(\omega') = Y(\omega) \}) = E(X|Y=y)$ with $y = Y(\omega)$, and $E(X|Y=y) = E(X|\sigma(Y))(\omega)$ for some $\omega$ with $y = Y(\omega)$. For the other conversion, given $E(X|\mathcal H)$, **let $Y$ be the family $(1_{B})_{B\in \mathcal H}$**.
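The first direction is clear to me in the toy example: for $\omega = 1$ we have $Y(\omega) = 0$, and indeed $E(X|\sigma(Y))(1) = \tfrac32 = E(X|Y=0)$, and similarly for the other points of $\Omega$.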

The conversion from $E(X|\mathcal H)$ to the form $E(X|Y)$ for some random variable $Y$ is what I do not understand: how could $Y$ be a family of indicator functions as said above (I made the part I do not understand bold)? Can someone please explain, thank you!

StefanH
  • "The conditional expectation w.r.t. a random variable is a function on the range of $Y$, i.e. $E(X | Y) : Y(\Omega) \to \mathbb R$" Not at all (what is your source?). Actually, if $X$ and $Y$ are random variables defined on $\Omega$ then $E(X|Y)$ is also a (class of) random variable(s) defined on $\Omega$. The function $y\mapsto E(X|Y=y)$, defined on $Y(\Omega)$, is a different beast, which is best defined (up to null sets) using the random variable $E(X|Y)$. – Did Jan 25 '15 at 13:35
  • @Did: This is written on Wikipedia: "Let $X$ and $Y$ be discrete random variables, then the conditional expectation of $X$ given the event $Y=y$ is a function of $y$ over the range of $Y$" (see http://en.wikipedia.org/wiki/Conditional_expectation) and it is also written here: http://en.wikipedia.org/wiki/Expected_value#Iterated_expectation. But according to the general definition this is just one version, equal a.s. to any other version. – StefanH Jan 25 '15 at 14:16
  • Exactly what my previous comment explains. You seem to be confusing the random variable $\omega\mapsto E(X|Y)(\omega)$ and the function $y\mapsto E(X|Y=y)$. – Did Jan 25 '15 at 15:24
  • The function $y \mapsto E(X|Y = y)$ could be made to a random variable by setting $\omega \mapsto E(X|Y = Y(\omega))$, and for this modified version it has to hold that $E(X|Y=Y(\omega)) = E(X|\mathcal H)$ almost everywhere. – StefanH Jan 25 '15 at 16:59
  • Except that the valid construction goes in the other way: one deduces the function $y\mapsto E(X|Y=y)$ from the random variable $E(X|Y)$. May I suggest to get a reliable textbook? – Did Jan 25 '15 at 17:01
  • What textbook do you think of? – StefanH Jan 25 '15 at 17:10
  • This very much depends on your background, about which you say nothing, hence the question in your comment is absurd. Nevertheless I might mention that the book Probability with martingales by David Williams is excellent and congenial. – Did Jan 25 '15 at 17:16

1 Answer


If a probability space is given, then the conditional expectation does not depend on the values taken by the random variable in the condition; it depends only on the sets on which that random variable is constant. You can see this by working through a very simple example in which the random variables involved are discrete.
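For instance (a minimal illustration with made-up numbers): on $\Omega = \{1,2,3,4\}$ with the uniform measure, let $X(\omega) = \omega$ and let $Y$ take some value $a$ on $\{1,2\}$ and a different value $b$ on $\{3,4\}$. Then $E(X|Y)$ equals $\tfrac32$ on $\{1,2\}$ and $\tfrac72$ on $\{3,4\}$, whatever the values $a \neq b$ were; only the sets $\{1,2\}$ and $\{3,4\}$ matter.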

Since only the sets count, one can say that only the indicator functions count.

So, if you have a $(\sigma$-)algebra in the condition, then you can replace this $(\sigma$-)algebra by any random variable which generates the same $(\sigma$-)algebra. Or the other way around: if you have a random variable in the condition, then you can replace it by the $(\sigma$-)algebra generated by that random variable.

To answer your question briefly: If a random variable is in the condition of a conditional expectation then it can be considered as a system of indicator functions of sets on which the random variable is constant.
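In the illustration above, for example, conditioning on $Y$ gives the same result as conditioning on the pair of indicator functions $(1_{\{1,2\}}, 1_{\{3,4\}})$, because both generate the same $(\sigma$-)algebra $\{\emptyset, \{1,2\}, \{3,4\}, \Omega\}$.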

zoli
  • "if you have a $\sigma$-algebra [...] then you can replace this $\sigma$-algebra by any random variable which generates [it]". But what if the $\sigma$-algebra ist not generated by any random variable: http://math.stackexchange.com/questions/267584/is-every-sigma-algebra-generated-by-some-random-variable? – StefanH Jan 22 '15 at 13:02
  • And also, how is a system of indicator functions related to a random variable? For example, consider $\{ \emptyset, \{1,2\}, \{3\}, \{4\}, \{3,4\}, \{1,2,3\}, \{1,2,4\}, \{1,2,3,4\} \}$, which is a ($\sigma$-)algebra over $Y = \{1,2,3,4\}$. Then for example $X(1)=X(2)=0, X(3)=1, X(4)=2$ generates this ($\sigma$-)algebra. But the system of indicator functions is $\{ I_{\emptyset}, I_{\{1,2\}}, I_{\{3\}}, I_{\{4\}}, I_{\{1,2,3\}}, I_{\{1,2,4\}}, I_Y \}$, whereas $X$ can be written in terms of indicators just as $X = 0\cdot I_{\{1,2\}} + I_{\{3\}} + 2\cdot I_{\{4\}}$; is there any construction to get there? – StefanH Jan 22 '15 at 13:08
  • But what is wrong? If $X$ is in the condition, then only the indicator functions count. Only those, of course, that play a role in the construction. – zoli Jan 22 '15 at 17:35
  • As for the first question: the conditional expectation and every random variable are defined only a.s., or as equivalence classes of measurable functions that are equal a.s. So it is meaningless to say that a random variable is only Borel measurable and not Lebesgue measurable. – zoli Jan 22 '15 at 17:39
  • If I give you some family of indicator functions $\{ I_B \}_{B \in \mathbb B}$, how do you construct a random variable out of it? On which sets (or indicator functions) do you choose your random variable to be constant? – StefanH Jan 22 '15 at 18:57
  • The indicators will have to describe disjoint sets whose union is $\Omega$. Then I will construct many r.v.'s. – zoli Jan 22 '15 at 21:44
  • Yes, then it is clear, but what if a $\sigma$-algebra cannot be decomposed in such a way? Or is it always possible? Sorry, maybe you already addressed this in your aforementioned comment, but I do not understand... – StefanH Jan 22 '15 at 23:03
  • If the union of a sequence of sets covers $\Omega$ and these sets belong to the actual $\sigma$-algebra, then you can always construct a partitioning of $\Omega$ such that the new covering sets belong to the same $\sigma$-algebra (a sketch of such a construction, for a finite example, is given below these comments). – zoli Jan 23 '15 at 09:11
  • How? Can you please explain? And by the way, the new partitioning must be finer I guess, this seems non-trivial to me... – StefanH Jan 23 '15 at 14:54
  • Dearest Stefan, please give a specific example: (1) a discrete random variable and a set of indicators for the condition. – zoli Jan 24 '15 at 18:36
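Below is a minimal sketch of the partitioning construction discussed in the comments above, for a finite $\Omega$ only (the function names are made up for illustration): two points are placed in the same atom exactly when no indicator in the given family separates them, and any random variable that takes distinct constant values on the atoms generates the same $\sigma$-algebra as the family of indicators.

```python
# A sketch only: recover the atoms of a finite sigma-algebra from a family of
# indicator functions, then build a random variable that is constant on each
# atom, so that it generates the same sigma-algebra as the given family.
# (The function names here are made up for illustration.)

def atoms_from_indicators(omega, indicators):
    """Two points lie in the same atom iff no indicator separates them."""
    atoms = {}
    for w in omega:
        key = tuple(ind(w) for ind in indicators)
        atoms.setdefault(key, set()).add(w)
    return list(atoms.values())

def random_variable_from_atoms(atoms):
    """Assign a distinct constant value to each atom."""
    return {w: value for value, atom in enumerate(atoms) for w in atom}

# The finite example from the comments: the algebra over {1,2,3,4}
# generated by the sets {1,2}, {3}, {4}.
omega = [1, 2, 3, 4]
sets = [{1, 2}, {3}, {4}]
indicators = [lambda w, B=B: 1 if w in B else 0 for B in sets]

atoms = atoms_from_indicators(omega, indicators)
Y = random_variable_from_atoms(atoms)
print(atoms)  # [{1, 2}, {3}, {4}]
print(Y)      # e.g. {1: 0, 2: 0, 3: 1, 4: 2}
```

This only covers the finite case discussed in the comments; whether an arbitrary $\sigma$-algebra is generated by a single random variable is the separate question linked in the first comment above.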