
If $X$ and $Y$ are discrete random variables, each taking only two distinct values, prove that $X$ and $Y$ are independent if and only if $E(XY)=E(X)E(Y)$.
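(For reference, here is how I understand the two-value case; the letters $a,b,c,d$ are just my notation. A two-valued variable is an affine function of an indicator: if $X \in \{a,b\}$ and $Y \in \{c,d\}$ with $a \neq b$ and $c \neq d$, then $X = a + (b-a)\mathbf{1}_{\{X=b\}}$ and $Y = c + (d-c)\mathbf{1}_{\{Y=d\}}$, so by bilinearity of covariance $$\operatorname{Cov}(X,Y) = (b-a)(d-c)\bigl[P(X=b,\,Y=d) - P(X=b)P(Y=d)\bigr].$$ Hence $E(XY)=E(X)E(Y)$ forces $P(X=b,Y=d) = P(X=b)P(Y=d)$, and the other three cell probabilities then factor by complementation, which is exactly independence.)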

I was wondering if this exercise could be generalized to any number of distinct values, but was told it actually fails when $X$ and $Y$ take three values. I cannot seem to find an example in which $X$ and $Y$ each take three distinct values and are uncorrelated but not independent. All the examples I find have $X$ take three values and $Y$ only two.

One of the answers to this question mentions that three distinct values for the pair $(X,Y)$ give the simplest situation where discrete random variables can be uncorrelated but not independent, but its example has $(X,Y)$ take the values $(0,0)$, $(1,1)$, and $(2,0)$, which means $Y$ takes the value $0$ twice.
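As a quick sanity check (assuming the three points are equally likely, which that answer does not actually state; the script is mine), that example is indeed uncorrelated but dependent:

```python
from fractions import Fraction

# Joint pmf of the cited example: (X, Y) uniform on {(0,0), (1,1), (2,0)}.
# Equal weights are my assumption; the linked answer doesn't give them.
pmf = {(0, 0): Fraction(1, 3), (1, 1): Fraction(1, 3), (2, 0): Fraction(1, 3)}

ex = sum(p * x for (x, y), p in pmf.items())       # E[X] = 1
ey = sum(p * y for (x, y), p in pmf.items())       # E[Y] = 1/3
exy = sum(p * x * y for (x, y), p in pmf.items())  # E[XY] = 1/3
print(exy == ex * ey)  # True: uncorrelated

# Independence fails: P(X=0, Y=1) = 0, but P(X=0) * P(Y=1) = 1/9 > 0.
px0 = sum(p for (x, y), p in pmf.items() if x == 0)
py1 = sum(p for (x, y), p in pmf.items() if y == 1)
print(pmf.get((0, 1), Fraction(0)) == px0 * py1)  # False: dependent
```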

Is there an example in which $Y$ takes $3$ distinct values, or is that impossible? If so, why?

Shambhala
  • 991

1 Answer


Consider two random variables $\xi$ and $\eta$ with the following joint distribution $p_{i,j} = P(\xi = i, \eta = j)$: $$p_{0,0} = 0, \quad p_{1,1} = p_{-1,1} = p_{1,-1} = p_{-1,-1} = \frac{\epsilon}{4}, \quad p_{0,1} = p_{0,-1} = p_{1,0} = p_{-1,0} = \frac{1}{4} -\frac{\epsilon}{4}, $$ where $0< \epsilon < 1$. (These eight cells sum to $\epsilon + (1 - \epsilon) = 1$, so this is a valid distribution.)

It is easy to find the marginal distribution of each random variable: $$P(\xi = 0) = p_{0,0} + p_{0,-1} + p_{0,1} = 0 + \frac{1}{4} -\frac{\epsilon}{4} + \frac{1}{4} -\frac{\epsilon}{4} = \frac{1}{2} -\frac{\epsilon}{2}, $$ $$P(\xi = 1) = p_{1,0} + p_{1,-1} + p_{1,1} = \frac{1}{4} -\frac{\epsilon}{4} + \frac{\epsilon}{4} + \frac{\epsilon}{4} = \frac{1}{4} + \frac{\epsilon}{4}, $$ $$P(\xi = -1) = p_{-1,0} + p_{-1,-1} + p_{-1,1} = \frac{1}{4} -\frac{\epsilon}{4} + \frac{\epsilon}{4} + \frac{\epsilon}{4} = \frac{1}{4} + \frac{\epsilon}{4}, $$ and likewise for $\eta$: $$P(\eta = 0) = p_{0,0} + p_{-1,0} + p_{1,0} = 0 + \frac{1}{4} -\frac{\epsilon}{4} + \frac{1}{4} -\frac{\epsilon}{4} = \frac{1}{2} -\frac{\epsilon}{2}, $$ $$P(\eta = 1) = p_{0,1} + p_{-1,1} + p_{1,1} = \frac{1}{4} -\frac{\epsilon}{4} + \frac{\epsilon}{4} + \frac{\epsilon}{4} = \frac{1}{4} + \frac{\epsilon}{4}, $$ $$P(\eta = -1) = p_{0,-1} + p_{-1,-1} + p_{1,-1} = \frac{1}{4} -\frac{\epsilon}{4} + \frac{\epsilon}{4} + \frac{\epsilon}{4} = \frac{1}{4} + \frac{\epsilon}{4}. $$

Thus each of the random variables takes on three different values $\{ -1,0,1 \}$ with non-zero probabilities: $$\xi \sim \begin{cases} -1, \, \frac{1}{4} + \frac{\epsilon}{4}, \\ 0, \, \frac{1}{2} -\frac{\epsilon}{2}, \\ 1, \, \frac{1}{4} + \frac{\epsilon}{4} ,\end{cases} \quad \eta \sim \begin{cases} -1, \, \frac{1}{4} + \frac{\epsilon}{4}, \\ 0, \, \frac{1}{2} -\frac{\epsilon}{2}, \\ 1, \, \frac{1}{4} + \frac{\epsilon}{4} .\end{cases}$$

Since each marginal distribution is symmetric about $0$, $$E\xi = 0, \quad E\eta = 0.$$ Now compute $$E\xi \eta = \sum_{i,j} ij \cdot p_{i,j} = 1 \cdot \frac{\epsilon}{4} + (-1)\cdot \frac{\epsilon}{4} + (-1) \cdot \frac{\epsilon}{4} + 1 \cdot \frac{\epsilon}{4} = 0$$ (only the four corner cells contribute), so $E\xi\eta = E\xi \, E\eta$ and the variables are uncorrelated. But it is obvious that $$P(\xi = 0, \eta = 0) = 0 \neq \frac{1}{4} (1-\epsilon)^2 = P(\xi =0) P(\eta = 0).$$ Hence $\xi$ and $\eta$ are uncorrelated but dependent.
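A quick numerical check of this construction (taking $\epsilon = 1/2$ as an arbitrary concrete choice; the script and variable names are mine):

```python
from fractions import Fraction

eps = Fraction(1, 2)  # any 0 < eps < 1 works; 1/2 is an arbitrary choice

# Joint pmf p[(i, j)] = P(xi = i, eta = j) from the construction above.
pmf = {(0, 0): Fraction(0)}
for i, j in [(1, 1), (-1, 1), (1, -1), (-1, -1)]:
    pmf[(i, j)] = eps / 4
for i, j in [(0, 1), (0, -1), (1, 0), (-1, 0)]:
    pmf[(i, j)] = Fraction(1, 4) - eps / 4

assert sum(pmf.values()) == 1  # a valid distribution

e_xi = sum(p * i for (i, j), p in pmf.items())        # E[xi]
e_eta = sum(p * j for (i, j), p in pmf.items())       # E[eta]
e_prod = sum(p * i * j for (i, j), p in pmf.items())  # E[xi * eta]
print(e_prod == e_xi * e_eta)  # True: uncorrelated

p_xi0 = sum(p for (i, j), p in pmf.items() if i == 0)
p_eta0 = sum(p for (i, j), p in pmf.items() if j == 0)
print(pmf[(0, 0)] == p_xi0 * p_eta0)  # False: not independent
```

Exact rational arithmetic via `fractions` keeps the equality checks free of floating-point noise.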

greyls
  • 1,308