A proof for the existence of a Wiener process is based on using Gaussian random variables to construct such a process with the properties we need from it.
That got me thinking that I never actually learned how a random variable with a certain distribution is constructed.
How does one prove that there exists $X:\Omega\rightarrow\mathbb{R}$ with $\mathbb{P}\{X\leq x\}=F(x)$ for a given $F$ ?
More importantly, what would $\Omega$ be in the case of a usual random variable (e.g. a normal one), and what would $X(\omega)$ represent?
If you start with a fixed probability space $(\Omega,\mathcal A,P)$ and a fixed CDF $F$, then it can be quite difficult or even impossible to prove the existence of a random variable $X:\Omega\to\mathbb R$ having $F$ as its CDF. That is simply the wrong way to start. Fortunately, it can be shown that for a fixed $F$ we can always construct plenty of probability spaces that do carry such a random variable. In fact we can start with: "let $(\Omega,\mathcal A,P)$ be a probability space with... (fill in whatever you want)". – drhab Feb 03 '18 at 11:25
2 Answers
If $F$ is a CDF then a unique probability measure $\mathsf P$ exists on measurable space $(\mathbb R,\mathcal B)$ with: $$\mathsf P((-\infty,x])=F(x)\text{ for every }x\in\mathbb R$$
Now we have probability space $(\mathbb R,\mathcal B,\mathsf P)$. Observe that here $\Omega=\mathbb R$.
Then the function $X:\mathbb R\to\mathbb R$ prescribed by $\omega\mapsto\omega$ is a random variable, and it satisfies: $$\mathsf P(X\leq x)=\mathsf P(\{X\leq x\})=\mathsf P(\{\omega\in\mathbb R\mid X(\omega)\leq x\})=\mathsf P((-\infty,x])=F(x)$$
So, starting with the CDF $F$, a random variable $X$ was constructed having $F$ as its CDF.
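This canonical construction can be checked numerically. The sketch below assumes a concrete choice of $F$ not fixed by the answer, namely the Exponential(1) CDF $F(x)=1-e^{-x}$: take $\Omega=\mathbb R$ with the measure $\mathsf P$ induced by $F$, let $X$ be the identity map, and compare the empirical CDF of $X$ with $F$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed concrete CDF for illustration: F(x) = 1 - exp(-x) for x >= 0.
def F(x):
    x = np.asarray(x, dtype=float)
    return np.where(x < 0, 0.0, 1.0 - np.exp(-np.clip(x, 0.0, None)))

# Drawing omega according to the measure P with P((-inf, x]) = F(x) ...
omega = rng.exponential(scale=1.0, size=100_000)
# ... and applying the identity map gives the random variable X.
X = omega  # X(omega) = omega

# Empirical check that P(X <= x) matches F(x) at a few points.
errors = [abs(np.mean(X <= x) - F(x)) for x in (0.5, 1.0, 2.0)]
```

With $10^5$ draws the empirical probabilities agree with $F$ to within about $10^{-2}$, as the law of large numbers predicts.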
A less trivial question is this: given a sequence of random variables $(X_j)$, possibly defined on different probability spaces, how can we construct a sequence of independent random variables $(X_j')$ (necessarily all defined on the same probability space) such that $X_j'\sim X_j$?
People assume we can do this all the time, when they say assume $(X_j)$ is an independent sequence such that $X_j$ has such and such a distribution.
The obvious, elegant and intuitively clear answer is this: If $(\Omega_j, P_j)$ is the probability space on which $X_j$ is defined let $$\Omega=\prod_{j=1}^\infty\Omega_j$$ and $$P=\prod_{j=1}^\infty P_j,$$etc.
But that runs into technicalities regarding the product of infinitely many measures. There's a cheap trick. First prove this:
Lemma Suppose $X$ is a (real-valued) random variable and $Y$ is a random variable uniformly distributed on $[0,1]$. There exists a function $f$ such that $X\sim f(Y)$.
(Proof: Exercise - you don't want me to have all the fun. Hint: If $F$ is a continuous, strictly increasing CDF there exists $G:[0,1]\to\Bbb R$ with $G(F(\lambda))=\lambda$; for a general $F$ use the generalized inverse $f(u)=\inf\{x:F(x)\ge u\}$.)
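A quick numerical sanity check of the lemma, again with an assumed concrete $F$ (the Exponential(1) CDF, for which the generalized inverse has the closed form $f(u)=-\log(1-u)$): feed a uniform $Y$ through $f$ and verify that $f(Y)$ has CDF $F$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Generalized inverse of the assumed F(x) = 1 - exp(-x):
def f(u):
    return -np.log1p(-u)

Y = rng.uniform(size=100_000)  # Y ~ Uniform[0, 1]
X = f(Y)                       # by the lemma, X has CDF F

# P(X <= 1) should be close to F(1) = 1 - e^{-1}.
p_hat = np.mean(X <= 1.0)
```

The same recipe works for any CDF once its generalized inverse is available; only the closed form for $f$ is specific to this example.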
Given that, we need only construct an iid sequence of random variables uniformly distributed on $[0,1]$.
Now say $b_1,b_2,\dots$ is an iid sequence with $P(b_j=1)=P(b_j=0)=1/2$. Note we can construct the $b_j$ explicitly using the Rademacher functions. Then the random variable $$X=\sum_{j=1}^\infty 2^{-j}b_j$$is uniformly distributed on $[0,1]$.
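The binary-expansion construction above can be sketched directly; the only liberty taken is truncating the series at $J$ terms, which changes $X$ by at most $2^{-J}$.

```python
import numpy as np

rng = np.random.default_rng(2)

# Truncate the series at J terms; the neglected tail is at most 2^{-J}.
J = 30
b = rng.integers(0, 2, size=(100_000, J))   # iid fair coin flips b_1, ..., b_J
weights = 2.0 ** -np.arange(1, J + 1)       # 2^{-1}, ..., 2^{-J}
X = b @ weights                             # X = sum_j 2^{-j} b_j

u_hat = np.mean(X <= 0.25)  # should be close to 0.25 if X ~ Uniform[0, 1]
```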
Now partition $\Bbb N$ into infinitely many infinite sets $S_n$. Use the $b_j$ for $j\in S_n$ to construct a random variable $X_n$ uniformly distributed on $[0,1]$, exactly as above except for notation. Then the $X_n$ are iid and uniformly distributed on $[0,1]$.
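The partition trick can also be illustrated numerically. The sketch below uses one assumed concrete partition, $S_n=\{n, n+N, n+2N, \dots\}$ (residues mod $N$), truncated to $J$ bits per variable and to finitely many variables; disjoint index sets are what make the resulting uniforms independent.

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed partition of the bit indices: S_n = {n, n + N, n + 2N, ...}.
N, J, M = 3, 30, 50_000
bits = rng.integers(0, 2, size=(M, N * J))  # one iid fair-coin bit stream per sample
weights = 2.0 ** -np.arange(1, J + 1)
# bits[:, n::N] selects exactly the bits with index in S_n
X = np.stack([bits[:, n::N] @ weights for n in range(N)], axis=1)  # shape (M, N)

# Marginals are uniform, and joint probabilities factor as independence requires.
marginal = np.mean(X[:, 2] <= 0.3)                       # ~ 0.3
p_joint = np.mean((X[:, 0] <= 0.5) & (X[:, 1] <= 0.5))   # ~ 0.25 = 0.5 * 0.5
```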