
I know from standard textbooks that, given measurable functions $X_i:(\Omega,\mathcal{F})\rightarrow(\Omega_i,\mathcal{A}_i)$, the $\sigma$-algebra generated by a family of random variables $(X_i;~i\in I)$ is \begin{equation} \sigma\big(X_i;~i\in I\big)=\sigma\left(\bigcup_{i\in I}\sigma(X_i)\right)=\sigma\left(\bigcup_{i\in I} X_i^{-1}(\mathcal{A}_i)\right). \end{equation} I understand this idea conceptually, but when it comes to concrete examples I am a bit confused. For example, suppose that $I$ is the discrete time set $I=\mathbb{N}$ and $X_i:\Omega\rightarrow\mathbb{R}$ denotes the number of heads in the $i$th draw of a coin. Now it seems clear that, e.g., \begin{align} \sigma(X_1)=\big\{\emptyset,\{T\},\{H\},\Omega\big\}. \end{align} My questions are as follows:

  1. What would then be the exact elements of $\sigma(X_1,X_2,...,X_k)=:\mathcal{F}_1^k$ for some $k\in\mathbb{N}$? (say when $k=2$?)
  2. Should I distinguish the events $H$ and $T$ in the $i$th draw from those in the $j$th draw ($i\ne j$)?
  3. Does this idea also hold for random vectors? That is, is $\sigma(X_1,X_2)=\sigma(Y)$, where $Y$ is the random vector $Y=(X_1,X_2)^{T}$ (with $T$ being transpose) and $I$ is now obviously $\{1,2\}$?

Many thanks in advance, John.

John
    I would not use $T$ for tails and for $\Bbb N$ at the same time. Also, $$ \sigma(X_i) = \{\emptyset, \{\omega:X_i(\omega) = T\},\{\omega:X_i(\omega) = H\},\Omega\} $$ which in particular for $i=1$ is not what you wrote. That answers your question 2. Also, think first of the case $I = \{1,2\}$ for your original problem of coin tossing, and imagine that $\Omega = \{HH,HT,TH,TT\}$. Can you write down now $\mathcal F_1$ and $\mathcal F_2$? What about $I = \{1,2,3\}$? Finally, your 3rd question does not seem to make sense, but first let's do these exercises. – SBF Nov 26 '14 at 18:41
    Many thanks for your comment and help Ilya. (I guess you meant $\sigma(X_i)=\{\emptyset, \{\omega:X_i(\omega)=0\}, \{\omega:X_i(\omega)=1\},\Omega\}$ right?) I see. So in this case $\sigma(X_1)=\{\emptyset, \{TT,TH\}, \{HH,HT\},\Omega\}$ and $\sigma(X_2)=\{\emptyset,\{HT,TT\},\{TH,HH\},\Omega\}$. So it follows that $\sigma(X_1,X_2)=\{\emptyset, \{TT,TH\}, \{HH,HT\},\{HT,TT\},\{TH,HH\},\Omega\}$ (which is itself a $\sigma$-algebra in this case). Similarly for the case $I=\{1,2,3\}$. Thanks very much. – John Nov 27 '14 at 16:45
    Indeed, that's the case - I guess now it shall be easier to approach the original problem. Just tell me if anything else is unclear – SBF Nov 27 '14 at 16:47
  • Just a quick question: do we then consider $\Omega=\mathbb{R}^k$ (where $k$ is the cardinality of $I$) in the continuous case? – John Nov 27 '14 at 16:49
  • (By the way, in the third question, I meant transpose; sorry for the typo; I hope the question now makes sense to you.) – John Nov 27 '14 at 16:50
  • The beauty of some/most parts of probability and measure theory is that they are completely dimension-independent. There is no difference in considering $X = X_1$ or $X = (X_1,X_2)$, especially in the discrete case. So yes, $\sigma(X_1,X_2) = \sigma(Y)$ almost by definition. You can easily check this just by recalling carefully how the two objects are defined. – SBF Nov 27 '14 at 16:53
  • related question https://math.stackexchange.com/questions/1005666/sigma-algebras-generated-by-random-variables – Arseny Nerinovsky Apr 24 '18 at 07:05
  • @John, you did not complete the calculation of $\sigma(X_1,X_2)$ in your 2nd comment. It is still not a $\sigma$-algebra, but it is very close: it still needs to be closed under union and complement. – lzstat Jul 18 '19 at 00:32
  • I don't quite get how the answer works in the comments. The $X_i$ are random variables living in the same probability space and hence the same $\Omega$. Each $\sigma(X_i)$ should take the same form as the OP wrote down. Then the union of such identical $\sigma$-algebras should give us exactly $\sigma(\{\emptyset, \{T\}, \{H\}, \Omega\})$ again, shouldn't it? The answer of @Ilya actually redefined the sample space. – chichi Feb 25 '23 at 08:06
  • In the second comment the OP wrote down different sigma algebras, so what do you mean when you mention identical sigma algebras? – SBF Mar 03 '23 at 19:01

1 Answer


I thought I would add some minor details to the great comment by SBF, since there seems to be some confusion among readers.

Let us consider the random experiment (RE) that John proposed: we toss a coin indefinitely and record the side that faces up. Then our sample space $\Omega$ should be the set of all sequences of $H$ and $T$. That is, $$\Omega = \left\{\omega \,\middle|\, \omega : \mathbb{N}\to\{H, T\} \right\}$$ For a given outcome of the RE (a sequence of $H$ and $T$), we define $X_i$ to be $H$ or $T$ according to whether the $i^\text{th}$ toss ($i\in\mathbb{N}$) lands heads or tails. That is, $$X_i : \Omega\to\{H,T\},\quad X_i(\omega)=\begin{cases} H, & \omega(i) = H \\ T, & \omega(i)= T \end{cases}$$ Given these definitions, $\sigma(X_1)=\{\emptyset, \Omega, X_1^{-1}(H), X_1^{-1}(T) \}$, where $X_1^{-1}(H)$ is the event containing all outcomes whose first toss landed on $H$; we get to place a condition on the first toss. That is, $$X_1^{-1}(H)=\left\{\omega : \mathbb{N}\to\{H, T\} \,\middle|\, \omega=H\dots \right\}=\left\{\omega : \mathbb{N}\to\{H, T\} \,\middle|\, \omega(1)=H\right\}$$ and similarly for $X_1^{-1}(T)$. This should clear up Q2, since for $i\ne j$ the four events $X^{-1}_i(H)$, $X^{-1}_j(H)$, $X^{-1}_i(T)$, $X^{-1}_j(T)$ are pairwise distinct; the events never coincide. As such, $$\bigcup_{i=1}^\infty \sigma(X_i)=\bigcup_{i=1}^\infty\{\emptyset, \Omega, X_i^{-1}(H), X_i^{-1}(T) \}\ne \{\emptyset,\Omega,\{H\},\{T\}\},$$ contrary to what one might suspect.
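To make these preimage events concrete, here is a small Python sketch restricted to the two-toss sample space $\Omega=\{HH,HT,TH,TT\}$ from the comments (the helper name `preimage` is purely illustrative, not standard notation):

```python
# Two-toss restriction of the experiment: each outcome is a string "ab",
# where a is the result of the first toss and b that of the second.
omega = {"HH", "HT", "TH", "TT"}

def preimage(i, side):
    """The event {omega in Omega : X_i(omega) = side}, for i = 1 or 2."""
    return frozenset(w for w in omega if w[i - 1] == side)

# sigma(X_1) = {emptyset, Omega, X_1^{-1}(H), X_1^{-1}(T)}
sigma_X1 = {frozenset(), frozenset(omega), preimage(1, "H"), preimage(1, "T")}
```

Here `preimage(1, "H")` is `{"HH", "HT"}` while `preimage(2, "H")` is `{"HH", "TH"}`, so for $i\ne j$ the events indeed never coincide.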

As for Q1, $\sigma(X_1, X_2)$ (denoted $\mathcal{F}_2$) is the $\sigma$-algebra generated from the events in $\sigma(X_1)$ and $\sigma(X_2)$ by taking complements, intersections, and unions. This means that $\mathcal{F}_2$ contains all events in which we get to place conditions on the first and the second toss. For example, it contains the event consisting of all outcomes whose first and second tosses are $HH$, represented by $X^{-1}_1(H)\cap X^{-1}_2(H)$. It also contains the event consisting of all outcomes with $T$ in the first toss or $H$ in the second, represented by $X^{-1}_1(T)\cup X^{-1}_2(H)$. And so on; one can go nuts with it. Similarly, $\mathcal{F}_k$ will be the $\sigma$-algebra containing the events in which one can place conditions on the first $k$ tosses, $k\in\mathbb{N}$.
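In the finite two-toss setting from the comments, one can even compute this generated $\sigma$-algebra by brute force. A hedged Python sketch (the closure loop is naive, but that is fine for a 4-point $\Omega$):

```python
from itertools import combinations

omega = frozenset({"HH", "HT", "TH", "TT"})

def preimage(i, side):
    # The event {omega : X_i(omega) = side}, for i = 1 or 2.
    return frozenset(w for w in omega if w[i - 1] == side)

# Generators: the events of sigma(X_1) together with those of sigma(X_2).
family = {frozenset(), omega,
          preimage(1, "H"), preimage(1, "T"),
          preimage(2, "H"), preimage(2, "T")}

# Close under complement and pairwise union/intersection; since Omega is
# finite, this stabilizes at the generated sigma-algebra.
while True:
    new = {omega - a for a in family}
    new |= {a | b for a, b in combinations(family, 2)}
    new |= {a & b for a, b in combinations(family, 2)}
    if new <= family:
        break
    family |= new
```

Since the pair $(X_1,X_2)$ separates the four outcomes, the resulting `family` is the full power set of $\Omega$ with $2^4=16$ elements, which also shows that the six sets listed in the comments were not yet closed under union.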

For Q3, one can see that $\sigma(X_1,X_2) = \sigma(Y)$ where $Y=(X_1, X_2)$, since, for example, $Y^{-1}(H,T)=X_1^{-1}(H)\cap X_2^{-1}(T)$. In general, the equality $\sigma(X_1,\dots,X_k) = \sigma(Y)$ with $Y=(X_1, \dots, X_k)$ still holds; it comes from the fact that a function $f(x_1, \dots, x_k)$ is really just the same function as $f((x_1, \dots, x_j), (x_{j+1}, \dots, x_k))$ for any $j=1, \dots, k-1$.
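Continuing the two-toss toy example, the identity $Y^{-1}(H,T)=X_1^{-1}(H)\cap X_2^{-1}(T)$ can be checked directly in Python (again with illustrative helper names of my own):

```python
omega = frozenset({"HH", "HT", "TH", "TT"})

def preimage_X(i, side):
    # The event {omega : X_i(omega) = side}, for i = 1 or 2.
    return frozenset(w for w in omega if w[i - 1] == side)

def preimage_Y(pair):
    # The event {omega : Y(omega) = pair} for the random vector Y = (X_1, X_2).
    return frozenset(w for w in omega if (w[0], w[1]) == pair)

lhs = preimage_Y(("H", "T"))
rhs = preimage_X(1, "H") & preimage_X(2, "T")
```

Both sides come out to the single outcome `{"HT"}`, and since every generator of $\sigma(Y)$ arises this way, the two $\sigma$-algebras agree.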