
I am working on finding a counterexample to some problem. For this I have set up $Z_i \sim U(-1,1)$ for all $i \geq 1$, and $Y$ such that $P(Y=1) = P(Y=-1) = \tfrac{1}{2}$. For my counterexample I would need $\{Y, Z_1, Z_2, Z_3, \dots\}$ to be mutually independent. Is this assumption allowed, i.e., can I simply declare that they are all mutually independent? The statement I am trying to show is false is of no importance here; I am just unsure whether such a setup is legitimate.
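For concreteness, here is a minimal simulation sketch of the setup (Python with numpy; the names are illustrative). The point of the construction is that $Y$ is a function of one uniform draw and each $Z_i$ of a different one, so mutual independence of the whole family is inherited from the independence of the underlying uniforms:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 5  # number of Z_i to sample

# Draw n + 1 independent U(0,1) variables; Y is a function of u[0] alone
# and Z_i a function of u[i] alone, so {Y, Z_1, ..., Z_n} are mutually
# independent by construction.
u = rng.uniform(0.0, 1.0, size=n + 1)

y = 1 if u[0] < 0.5 else -1   # P(Y = 1) = P(Y = -1) = 1/2
z = 2.0 * u[1:] - 1.0         # each Z_i ~ U(-1, 1)

print(y, z)
```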

  • You can do that. For a sequence of Borel probability measures $\mu_1, \mu_2, \dots$, there always exists a sequence of independent random variables $X_1, X_2, \dots$ (defined on some probability space) such that $X_i \sim \mu_i$. In fact, this also holds for uncountable collections. It is a nontrivial theorem, though, comparable to the existence of Lebesgue measure; a sketch of the statement appears after these comments. – Mark Apr 23 '24 at 10:43
  • @Mark Do you mind naming the theorem, please? I'm being lazy, as I don't want to search my notes! – demim00nde Apr 23 '24 at 10:44
  • @demim00nde Here it is discussed, for example, for i.i.d. random variables, but you can similarly build any infinite product space: https://math.stackexchange.com/questions/246445/existence-of-independent-and-identically-distributed-random-variables – Mark Apr 23 '24 at 10:50
  • I remember reading a relatively short proof somewhere on the internet, but I can't find it now. – Mark Apr 23 '24 at 10:51
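
For reference, here is one standard way to state the existence result described in the comments above (a sketch of the usual product-measure formulation, not a verbatim citation from any particular textbook):

```latex
% Standard existence statement via the product-measure construction;
% one common formulation, not a quotation from a specific source.
Let $(\mu_i)_{i \in I}$ be any family of Borel probability measures on
$\mathbb{R}$. Then there exists a probability space
$(\Omega, \mathcal{F}, P)$ carrying mutually independent random
variables $(X_i)_{i \in I}$ with $X_i \sim \mu_i$ for every $i \in I$.
Concretely, one may take $\Omega = \mathbb{R}^{I}$ with the product
measure $P = \bigotimes_{i \in I} \mu_i$, and let $X_i$ be the $i$-th
coordinate map.
```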

0 Answers