15

By considering the set $\{1,2,3,4\}$, one can easily come up with an example (attributed to S. Bernstein) of pairwise independent but not independent random variables.

Could anybody give an example with continuous random variables?
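For reference, the $\{1,2,3,4\}$ example can be verified exhaustively. Here is a short sketch in Python; the indicator variables $X,Y,Z$ below are one standard choice of Bernstein's events:

```python
from fractions import Fraction

# Bernstein's example: a fair draw w from {1,2,3,4} and three indicators
X = {1: 1, 2: 1, 3: 0, 4: 0}   # X = 1 iff w in {1,2}
Y = {1: 1, 2: 0, 3: 1, 4: 0}   # Y = 1 iff w in {1,3}
Z = {1: 1, 2: 0, 3: 0, 4: 1}   # Z = 1 iff w in {1,4}

def prob(event):
    """Probability of an event under the uniform measure on {1,2,3,4}."""
    return Fraction(sum(1 for w in (1, 2, 3, 4) if event(w)), 4)

# pairwise independent: each pair of indicators factorizes
assert prob(lambda w: X[w] and Y[w]) == prob(lambda w: X[w]) * prob(lambda w: Y[w])
assert prob(lambda w: X[w] and Z[w]) == prob(lambda w: X[w]) * prob(lambda w: Z[w])
assert prob(lambda w: Y[w] and Z[w]) == prob(lambda w: Y[w]) * prob(lambda w: Z[w])
# but not mutually independent: P(X=Y=Z=1) = 1/4, not 1/8
assert prob(lambda w: X[w] and Y[w] and Z[w]) == Fraction(1, 4)
```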

5 Answers

7

Let $x,y,z'$ be independent normally distributed random variables with mean $0$. Define $$z=\begin{cases} z' & xyz'\ge 0\\ -z' & xyz'<0\end{cases}$$ The resulting $x,y,z$ always satisfy $xyz\ge 0$, so they are not mutually independent, yet they are pairwise independent.
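A quick Monte Carlo sanity check of this construction (a sketch; the sample size, seed, and the particular sign event tested are arbitrary choices):

```python
import random

random.seed(0)
n = 200_000
samples = []
for _ in range(n):
    x, y = random.gauss(0, 1), random.gauss(0, 1)
    zp = random.gauss(0, 1)
    z = zp if x * y * zp >= 0 else -zp   # flip the sign so that x*y*z >= 0
    samples.append((x, y, z))

# the triple is constrained (hence dependent): xyz >= 0 always
assert all(x * y * z >= 0 for x, y, z in samples)

# pairwise independence check via signs: P(x>0, z>0) should be ~1/4,
# as for two independent standard normals
frac = sum(1 for x, y, z in samples if x > 0 and z > 0) / n
print(frac)
```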

vadim123
  • 83,937
  • Is there a more general class of these that follows this pattern? e.g. can I shove a $c$ in here somewhere and have the independence assertions still hold? – Him Apr 17 '18 at 23:03
  • 1
    @Scott, sure why not. Replace $xy$ with $xyc$ everywhere. By the way, you are commenting on a solution from over a year ago, which is usually not recommended. – vadim123 Apr 17 '18 at 23:29
  • replacing $xy$ by $xyc$ yields an identical joint distribution $p(x,y,z)$ to the given example – Him Apr 18 '18 at 03:22
  • 1
    @vadim123 why is that not recommended? I think it's perfectly fine – mathworker21 Nov 17 '20 at 18:31
6

An answer of mine on stats.SE gives essentially the same construction as the one given by vadim123.

Consider three standard normal random variables $X,Y,Z$ whose joint probability density function $f_{X,Y,Z}(x,y,z)$ is not $\phi(x)\phi(y)\phi(z)$ where $\phi(\cdot)$ is the standard normal density, but rather

$$f_{X,Y,Z}(x,y,z) = \begin{cases} 2\phi(x)\phi(y)\phi(z) & ~~~~\text{if}~ x \geq 0, y\geq 0, z \geq 0,\\ & \text{or if}~ x < 0, y < 0, z \geq 0,\\ & \text{or if}~ x < 0, y\geq 0, z < 0,\\ & \text{or if}~ x \geq 0, y< 0, z < 0,\\ 0 & \text{otherwise.} \end{cases}\tag{1}$$

We can calculate the joint density of any pair of the random variables, (say $X$ and $Z$) by integrating out the joint density with respect to the unwanted variable, that is, $$f_{X,Z}(x,z) = \int_{-\infty}^\infty f_{X,Y,Z}(x,y,z)\,\mathrm dy. \tag{2}$$

  • If $x \geq 0, z \geq 0$ or if $x < 0, z < 0$, then $f_{X,Y,Z}(x,y,z) = \begin{cases} 2\phi(x)\phi(y)\phi(z), & y \geq 0,\\ 0, & y < 0,\end{cases}$ and so $(2)$ reduces to $$f_{X,Z}(x,z) = \phi(x)\phi(z)\int_{0}^\infty 2\phi(y)\,\mathrm dy = \phi(x)\phi(z). \tag{3}$$

  • If $x \geq 0, z < 0$ or if $x < 0, z \geq 0$, then $f_{X,Y,Z}(x,y,z) = \begin{cases} 2\phi(x)\phi(y)\phi(z), & y < 0,\\ 0, & y \geq 0,\end{cases}$ and so $(2)$ reduces to $$f_{X,Z}(x,z) = \phi(x)\phi(z)\int_{-\infty}^0 2\phi(y)\,\mathrm dy = \phi(x)\phi(z). \tag{4}$$

In short, $(3)$ and $(4)$ show that $f_{X,Z}(x,z) = \phi(x)\phi(z)$ for all $x, z \in (-\infty,\infty)$ and so $X$ and $Z$ are (pairwise) independent standard normal random variables. Similar calculations (left as an exercise for the bemused reader) show that $X$ and $Y$ are (pairwise) independent standard normal random variables, and $Y$ and $Z$ also are (pairwise) independent standard normal random variables. But $X,Y,Z$ are not mutually independent normal random variables. Indeed, their joint density $f_{X,Y,Z}(x,y,z)$ does not equal the product $\phi(x)\phi(y)\phi(z)$ of their marginal densities for any choice of $x, y, z \in (-\infty,\infty)$.
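Since the support of $(1)$ is exactly the set $\{xyz \ge 0\}$, one way to sample from it is rejection: draw i.i.d. standard normals and keep the triples with nonnegative product (the acceptance probability is $1/2$, which accounts for the factor $2$). A sketch, with arbitrary sample size and seed:

```python
import random

random.seed(1)
target = 200_000
accepted = []
while len(accepted) < target:
    x, y, z = (random.gauss(0, 1) for _ in range(3))
    if x * y * z >= 0:   # the four octants where the density is 2*phi*phi*phi
        accepted.append((x, y, z))

n = len(accepted)
# pairwise independence: P(X>0, Z>0) should be ~1/4
p_xz = sum(1 for x, y, z in accepted if x > 0 and z > 0) / n
# no mutual independence: P(X>0, Y>0, Z>0) is ~1/4, not the 1/8
# that independence would require
p_xyz = sum(1 for x, y, z in accepted if x > 0 and y > 0 and z > 0) / n
print(p_xz, p_xyz)
```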

Dilip Sarwate
  • 26,411
5

The continuous analog of the Bernstein example: Divide up the unit cube into eight congruent subcubes of side length $1/2$. Select four of these cubes: Subcube #1 has one vertex at $(x,y,z)=(1,0,0)$, subcube #2 has one vertex at $(0,1,0)$, subcube #3 has one vertex at $(0,0,1)$, and subcube #4 has one vertex at $(1,1,1)$. (To visualize this, you have two layers of cubes: the bottom layer has two cubes in a diagonal formation, and the top layer has two cubes in the opposite diagonal formation.)

Now let $(X,Y,Z)$ be uniform over these four cubes. Clearly $X, Y, Z$ are not mutually independent, but every pair of variables $(X,Y)$, $(X,Z)$, $(Y,Z)$ is uniform over the unit square (and hence independent).
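The four selected subcubes are exactly those where an odd number of coordinates exceeds $1/2$, i.e. where $(x-\tfrac12)(y-\tfrac12)(z-\tfrac12)\ge 0$, so sampling is easy. A Monte Carlo sketch (sample size and the tested events are arbitrary):

```python
import random

random.seed(2)
target = 200_000
pts = []
while len(pts) < target:
    x, y, z = random.random(), random.random(), random.random()
    # keep the point iff it lies in one of the four chosen subcubes,
    # i.e. the signs of (x-1/2), (y-1/2), (z-1/2) multiply to >= 0
    if (x - 0.5) * (y - 0.5) * (z - 0.5) >= 0:
        pts.append((x, y, z))

n = len(pts)
# (X,Y) is uniform on the unit square: P(X<1/2, Y<1/2) ~ 1/4
p_xy = sum(1 for x, y, z in pts if x < 0.5 and y < 0.5) / n
# but the subcube at the origin was not selected, so this is exactly 0
p_xyz = sum(1 for x, y, z in pts if x < 0.5 and y < 0.5 and z < 0.5) / n
print(p_xy, p_xyz)
```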

grand_chat
  • 40,909
  • 1
    Nice! My example with normal random variables uses the same idea with the entire three-dimensional space being divided into eight octants over four of which the joint density has zero value. – Dilip Sarwate Nov 24 '16 at 21:49
2

Here's a potentially very simple construction of $k$-wise independent random variables uniformly distributed on $[0, 1]$ (though admittedly I didn't work it out very carefully, so hopefully the conditions check out). There is a standard way of constructing discrete $k$-wise independent random variables: let $X_1,\ldots, X_{k}$ be independent uniform random variables drawn from $F$, a finite field of order $q$ (with $q$ sufficiently larger than $k$, of course). Then, for $u\in F$, let $Y_u = X_1 + uX_2+\cdots + u^{k-1}X_k$. The random variables $\{Y_u : u\in F\}$ are then $k$-wise independent.

Now divide $[0, 1]$ into $q$ evenly spaced subintervals, and let $Z_u$ be uniform on the $Y_u$th subinterval. The random variables $\{Z_u : u\in F\}$ are $k$-wise independent (their joint CDF decomposes as the product of the individual CDFs, since the $\{Y_u\}$ are $k$-wise independent), and each is uniform over $[0, 1]$ since each $Y_u$ is uniform over $F$.
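The discrete step can be verified exhaustively for a small prime field. A sketch with the (arbitrary) choices $q=5$, $k=3$: each $Y_u$ is a polynomial in $u$ with the $X_i$ as coefficients, and any $k$ of the $Y_u$ are determined by an invertible Vandermonde system, hence jointly uniform.

```python
from itertools import product
from collections import Counter

q, k = 5, 3          # prime field F_q; aiming for k-wise independence
us = range(q)        # one variable Y_u per field element u

# enumerate all q**k equally likely seed vectors (X_1, ..., X_k)
joint = Counter()
for xs in product(range(q), repeat=k):
    ys = [sum(x * pow(u, i, q) for i, x in enumerate(xs)) % q for u in us]
    joint[(ys[0], ys[2], ys[4])] += 1   # joint law of k of the Y_u's

# k-wise independence: every triple of values occurs equally often
assert len(joint) == q ** 3 and set(joint.values()) == {1}
print("uniform over", len(joint), "value triples")
```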

tc1729
  • 3,217
0

Bernstein's example generalizes as follows: if $V$ is a vector space over a finite field $\mathbb{F}_q$, every linear map $\phi \colon V \to \mathbb{F}_q$ can be seen as a random variable with values in a discrete space carrying the uniform measure. Now, $\phi_1$, $\ldots$, $\phi_m$ are independent as random variables if and only if they are so as elements of $V^{*}$, so the examples write themselves.

Now, we cannot do this directly with $V$ an $n$-dimensional vector space over $\mathbb{R}$, because there is no uniform probability measure on $\mathbb{R}$. But we can still do something if we involve $\mathbb{Z}$: take $V/\mathbb{Z}^n$ as the probability space, and as random variables the maps

$$\bar \phi \colon V/\mathbb{Z}^n \to \mathbb{R}/\mathbb{Z}$$

given by linear maps $\phi \colon V \to \mathbb{R}$ with integer coefficients.
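On the $2$-torus this gives a clean continuous analog of Bernstein's example: with $U, V$ independent uniform on $[0,1)$, the variables $U$, $V$ and $W = (U+V) \bmod 1$ (the linear maps $(1,0)$, $(0,1)$, $(1,1)$) are pairwise independent, while $W$ is a deterministic function of $(U,V)$. A Monte Carlo sketch (sample size, seed, and the tested event are arbitrary):

```python
import random

random.seed(3)
n = 200_000
triples = []
for _ in range(n):
    u, v = random.random(), random.random()
    w = (u + v) % 1.0   # the linear map with coefficients (1,1), reduced mod Z
    triples.append((u, v, w))

# (U, W) looks uniform on the unit square: P(U < 1/2, W < 1/2) ~ 1/4,
# even though W is fully determined by (U, V)
p_uw = sum(1 for u, v, w in triples if u < 0.5 and w < 0.5) / n
print(p_uw)
```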

orangeskid
  • 56,630