
Say I have a probability space $(\mathbb{R}, \Sigma, \mu)$ and a function $f$ of the form $f: \mathbb{R} \times \mathbb{R} \rightarrow \mathbb{R}$ such that for any distinct $x_1,x_2 \in \mathbb{R}$ the functions $f(x_1, -)$ and $f(x_2, -)$ are independent random variables over $(\mathbb{R}, \Sigma, \mu)$.

Now let's define the random variables $G$ and $H$ as \begin{align} G(y) &= f(f(x_1, y), y) \\ H(y) &= f(f(x_2, y), y) \end{align}

Are $G$ and $H$ independent?

gigalord
  • I think the first question to ask is: for $g(y) = f(x_1,y)$, what is the distribution of $g$? For only then can we define the distribution of $G(y) = f(f(x_1,y),y)$, and we are interested in whether $$\mathbb P(G(y)\in A, H(y)\in B) = \mathbb P(G(y)\in A)\mathbb P(H(y)\in B)$$ for Borel sets $A$ and $B$. – Math1000 Dec 28 '19 at 00:14
  • For example, if $g(y)=f(x_1,y)=x_1$ with probability one then clearly $G(y)=f(f(x_1,y),y) = f(x_1,y)$, so if $h(y)=f(x_2,y)=x_2$ with probability one then $G$ and $H$ are independent. But in the general case, I think not. – Math1000 Dec 28 '19 at 00:22

1 Answer


Not necessarily. Here is a counter-example:

Let $\mu((-\infty, y]) = \int_{-\infty}^y \frac{1}{\sqrt{2\pi}} e^{-\frac{t^2}{2}} dt$ for all $y \in \mathbb{R}$ (the standard Gaussian distribution).

Define $f:\mathbb{R}^2\rightarrow\mathbb{R}$ by $$f(x,y)= \left\{ \begin{array}{ll} y &\mbox{ if $x=0$} \\ 0 & \mbox{ if $x\neq 0$} \end{array} \right.$$ Then for any distinct real numbers $x_1,x_2$, at least one of them must be nonzero, so at least one of the random variables $f(x_1,y)$ and $f(x_2,y)$ is zero for all outcomes $y \in \mathbb{R}$. Since a constant random variable is independent of any other random variable, $f(x_1,y)$ and $f(x_2,y)$ are independent.
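As a quick numerical sanity check (a minimal NumPy sketch, not part of the original answer; the vectorized helper `f` is an assumed translation of the definition above), one can confirm that $f(x,\cdot)$ is the constant $0$ whenever $x \neq 0$:

```python
import numpy as np

# Assumed translation of the answer's f: f(x, y) = y if x == 0, else 0.
# np.where broadcasts, so x may be a scalar or an array matching y's shape.
def f(x, y):
    return np.where(np.asarray(x) == 0, y, 0.0)

rng = np.random.default_rng(0)
y = rng.standard_normal(5)  # outcomes drawn from the Gaussian measure mu

print(f(1.0, y))  # [0. 0. 0. 0. 0.] -- constant, hence independent of anything
print(f(0.0, y))  # equals y itself
```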

However, take $x_1=1$, $x_2=2$. Then for all outcomes $y \in \mathbb{R}$ we have \begin{align} G(y)&=f(f(1,y),y) = f(0,y) = y\\ H(y) &=f(f(2,y),y) = f(0,y) = y \end{align} So $G$ and $H$ are the same (standard Gaussian) random variable, and a non-degenerate random variable is never independent of itself. Concretely, $\mathbb{P}(G\le 0, H\le 0) = \mathbb{P}(y\le 0) = \frac{1}{2}$, while $\mathbb{P}(G\le 0)\,\mathbb{P}(H\le 0) = \frac{1}{4}$. Hence $G$ and $H$ are not independent.
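The failure of independence can also be checked by simulation (again a hedged sketch, repeating the assumed helper `f` from above): compare the empirical joint probability $\mathbb{P}(G \le 0, H \le 0)$ with the product of the marginals.

```python
import numpy as np

# Same assumed translation of f as in the previous sketch.
def f(x, y):
    return np.where(np.asarray(x) == 0, y, 0.0)

rng = np.random.default_rng(0)
y = rng.standard_normal(100_000)  # outcomes y ~ N(0, 1)

G = f(f(1.0, y), y)  # f(1, y) = 0 for all y, so G(y) = f(0, y) = y
H = f(f(2.0, y), y)  # f(2, y) = 0 for all y, so H(y) = f(0, y) = y

print(np.allclose(G, H))                     # True: G and H coincide
joint = np.mean((G <= 0) & (H <= 0))         # close to 0.5
product = np.mean(G <= 0) * np.mean(H <= 0)  # close to 0.25
print(joint, product)                        # joint != product, so not independent
```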

Michael
  • Thank you for the very helpful answer! My understanding is that this relies upon the phenomenon that a constant random variable is independent of itself. If we add the constraint that $f(x, -)$ must be non-constant, does this change the conclusion?

    I am happy to post this as a separate question if you prefer.

    – gigalord Jan 02 '20 at 15:04
  • Why are you interested in this question? I used constants because that was the only way I could think of an uncountably infinite number of random variables (all on the sample space $[0,1]$) that are pairwise independent. I do not think this is even possible if you impose the "non-constant" assumption (so, I do not think there even exists a system that satisfies your starting assumptions). See here for a related question: https://math.stackexchange.com/questions/877671/why-one-cannot-construct-more-than-countably-many-independent-random-variables – Michael Jan 02 '20 at 18:18
  • By "non-constant" I really mean "non-degenerate." I believe I have an argument that, assuming such a function $f(x,y)$ exists to satisfy your assumptions (including the new non-degenerate assumption), gives a new function $g(x,y)$ that satisfies your assumptions and is a counter-example to your conjecture. It is too long to write so perhaps you can ask it as a new question and I can try to type it up there. – Michael Jan 02 '20 at 19:03
  • Great, thanks! I have re-asked the question here https://math.stackexchange.com/questions/3495365/does-the-composition-of-a-non-degenerate-random-variable-valued-function-with-it. – gigalord Jan 02 '20 at 19:20