
As the title says, I have never had a clear intuition for this approach in measure theory. I am hoping for a more solid theoretical justification and a clearer statement of the general principle.

The definition of independence requires that the joint measure $P$ assign to every Cartesian product of measurable sets from the component spaces the product of the component measures of those sets:

The random variables $X_1, X_2, \ldots, X_n$ are independent iff, for arbitrary Borel measurable sets $A_1, A_2, \ldots, A_n$,

$$ P\left(\bigcap_{k=1}^n\left\{X_k \in A_k\right\}\right)=\prod_{k=1}^n P\left(X_k \in A_k\right) $$

That is: why do some properties defined on a $\sigma$-algebra only need to be verified on a generating set/family? Can every property be reduced in this way? What is the underlying logic?

1 Answer


It's the following fact:

Let $(\Omega, {\cal F}, P)$ be a probability triple. For $k=1,\ldots,n$ let ${\cal I}_k$ be a $\pi$-system on $\Omega$ with ${\cal I}_k\subset{\cal F}$ and $\Omega\in {\cal I}_k$. If $$P\left(\bigcap_{k=1}^n I_k\right)=\prod_{k=1}^nP(I_k)$$ whenever $I_k\in{\cal I}_k$ ($k=1,\ldots,n$), then $\sigma({\cal I}_1),\ldots,\sigma({\cal I}_n)$ are independent.

which in turn follows from the result:

If two probability measures agree on a $\pi$-system, then they agree on the $\sigma$-algebra generated by that $\pi$-system.
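A familiar instance of the Result, stated here just for orientation: a probability measure on $\mathbb{R}$ is determined by its distribution function. The half-lines $(-\infty,x]$ form a $\pi$-system that generates the Borel $\sigma$-algebra, so if two probability measures $\mu,\nu$ on $\mathbb{R}$ satisfy $\mu((-\infty,x])=\nu((-\infty,x])$ for every $x$, then $\mu=\nu$ on all of ${\cal B}(\mathbb{R})$.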

To apply the Fact to your context, take ${\cal I}_k$ to be the collection of events of the form $\{X_k\in B\}$ with $B$ a member of a $\pi$-system that generates the Borel sets. For example, take ${\cal I}_k$ to be the collection of all sets of the form $\{X_k\le x\}$.
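Spelled out, that choice yields the familiar criterion in terms of distribution functions (one can allow $x=+\infty$ so that $\Omega$ itself belongs to ${\cal I}_k$, as the Fact requires): $X_1,\ldots,X_n$ are independent iff

$$ P\left(\bigcap_{k=1}^n\{X_k\le x_k\}\right)=\prod_{k=1}^n P(X_k\le x_k) \quad\text{for all } x_1,\ldots,x_n. $$

Each ${\cal I}_k$ is indeed a $\pi$-system, since $\{X_k\le x\}\cap\{X_k\le y\}=\{X_k\le \min(x,y)\}$.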

This post outlines the proof of the Fact from the Result when $n=3$.

This post proves the Fact using Dynkin's $\pi$-$\lambda$ theorem.
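In case the linked posts are unavailable, here is a minimal sketch of the $n=2$ case of the Fact, derived from the Result. Fix $I_2\in{\cal I}_2$. If $P(I_2)=0$, the product formula $P(A\cap I_2)=P(A)P(I_2)$ holds trivially for every $A$. If $P(I_2)>0$, the two probability measures

$$ A\mapsto \frac{P(A\cap I_2)}{P(I_2)} \qquad\text{and}\qquad A\mapsto P(A) $$

agree on the $\pi$-system ${\cal I}_1$ by hypothesis, so by the Result they agree on $\sigma({\cal I}_1)$; that is, $P(A_1\cap I_2)=P(A_1)P(I_2)$ for all $A_1\in\sigma({\cal I}_1)$ and $I_2\in{\cal I}_2$. Now fix $A_1\in\sigma({\cal I}_1)$ and repeat the argument in the second coordinate (again normalizing, the case $P(A_1)=0$ being trivial): the probability measures $B\mapsto P(A_1\cap B)/P(A_1)$ and $B\mapsto P(B)$ agree on ${\cal I}_2$, hence on $\sigma({\cal I}_2)$. This is exactly the independence of $\sigma({\cal I}_1)$ and $\sigma({\cal I}_2)$. For $n\ge 3$ one fixes one coordinate at a time in the same way; the assumption $\Omega\in{\cal I}_k$ ensures the hypothesis already covers products over fewer than $n$ of the families.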
