
Today at university we learned about the uncorrelatedness condition: if $r(X,Y)=0$ then the two random variables $X$ and $Y$ are independent. My teacher also asked whether we can prove the reverse: given two independent random variables $X$ and $Y$, it follows that they are uncorrelated. I have to prove it with a concrete and simple example (I have to give actual values to $X$ and $Y$ so that the result of the proof is visible). Thank you guys :)

MrNobody

1 Answer


You cannot prove in general that two uncorrelated random variables are independent because it is not true.
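To see why, here is a minimal numerical sketch (in Python, with assumed example values) of the standard counterexample: take $X$ uniform on $\{-1,0,1\}$ and $Y=X^2$. Then $\operatorname{Cov}(X,Y)=0$, yet $Y$ is a function of $X$, so the two are certainly not independent.

```python
import numpy as np

# Standard counterexample (assumed values): X uniform on {-1, 0, 1}, Y = X^2.
x_vals = np.array([-1, 0, 1])
p = np.array([1/3, 1/3, 1/3])

mu_x = np.sum(x_vals * p)             # E[X] = 0
y_vals = x_vals**2                    # Y = X^2 takes values 1, 0, 1
mu_y = np.sum(y_vals * p)             # E[Y] = 2/3

# Covariance: E[XY] - E[X]E[Y]; here XY = X^3 = X, so E[XY] = E[X] = 0.
cov = np.sum(x_vals * y_vals * p) - mu_x * mu_y
print(cov)                            # 0.0 -> uncorrelated

# Yet X and Y are not independent:
# P(X=1, Y=0) = 0, while P(X=1) * P(Y=0) = (1/3) * (1/3) = 1/9.
print(0.0 == (1/3) * (1/3))           # False -> joint != product of marginals
```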

You can prove that two independent random variables are uncorrelated (provided that they have finite positive variances). For example, consider two independent random variables $X$ and $Y$ with means $\mu_X$ and $\mu_Y$ and variances $\sigma^2_X$ and $\sigma^2_Y$. For discrete random variables, independence means $\mathbb P(X=x,\,Y=y)= \mathbb P(X=x)\,\mathbb P(Y=y)$ for all $x$ and $y$.

Their correlation is $$\begin{align} \rho_{X,Y} &= \dfrac{\mathbb E[(X-\mu_X)(Y-\mu_Y)]}{\sigma_X \sigma_Y} = \dfrac{\mathbb E[XY] -\mu_X\mathbb E[Y] -\mu_Y\mathbb E[X] + \mu_X\mu_Y }{\sigma_X \sigma_Y} \\ &= \dfrac{\left(\sum_x \sum_y x\, y\, \mathbb P(X=x,\,Y=y)\right) -\mu_X\mu_Y}{\sigma_X \sigma_Y} \\ &= \dfrac{\left(\sum_x \sum_y x\, y\, \mathbb P(X=x)\, \mathbb P(Y=y)\right) -\mu_X\mu_Y}{\sigma_X \sigma_Y} \\ &= \dfrac{\left(\sum_x x\, \mathbb P(X=x)\right) \left( \sum_y y\, \mathbb P(Y=y)\right) -\mu_X\mu_Y}{\sigma_X \sigma_Y} \\ &= \dfrac{\mu_X\mu_Y-\mu_X\mu_Y}{\sigma_X \sigma_Y} = 0 \end{align}$$
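If you want the concrete example with explicit values that your teacher asked for, here is one way to check the derivation numerically (a sketch in Python; the fair die and fair coin are just assumed example distributions):

```python
import numpy as np

# Concrete check of the derivation above (assumed example distributions):
# X a fair die (1..6) and Y a fair coin (0 or 1), taken to be independent,
# so the joint pmf is the outer product of the marginals.
x_vals = np.arange(1, 7)
px = np.full(6, 1/6)
y_vals = np.array([0, 1])
py = np.array([1/2, 1/2])

joint = np.outer(px, py)              # P(X=x, Y=y) = P(X=x) P(Y=y)

mu_x = np.sum(x_vals * px)            # E[X] = 3.5
mu_y = np.sum(y_vals * py)            # E[Y] = 0.5
e_xy = np.sum(np.outer(x_vals, y_vals) * joint)   # double sum of x*y*P(X=x, Y=y)

sigma_x = np.sqrt(np.sum((x_vals - mu_x)**2 * px))
sigma_y = np.sqrt(np.sum((y_vals - mu_y)**2 * py))

rho = (e_xy - mu_x * mu_y) / (sigma_x * sigma_y)
print(rho)                            # 0.0 (up to floating-point error)
```

Because the joint pmf is built as the product of the marginals, the double sum $\sum_x \sum_y x\,y\,\mathbb P(X=x)\,\mathbb P(Y=y)$ factors into $\mu_X\mu_Y$, which is exactly the cancellation step in the last line of the derivation.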

Henry