Today at university we learned about the condition of uncorrelation: if r(X,Y)=0, then the two random variables X and Y are independent. My teacher also asked whether we can prove the reversed condition: given two independent random variables X and Y, does it follow that they are uncorrelated? I have to prove it with a concrete and simple example (I have to give values to X and Y so as to have a visible result of the proof). Thank you guys :)
-
@JMoravitz I don't think so. This is the opposite of that question. – Matt Samuel Nov 14 '19 at 17:47
-
You seem to be right, I hadn't read closely enough. But then the linked answer disproves the claim made in the first sentence. – JMoravitz Nov 14 '19 at 17:50
-
You cannot prove in general that two uncorrelated random variables are independent because it is not true. You can prove that two independent random variables are uncorrelated (providing that they have finite positive variances) – Henry Nov 14 '19 at 17:50
-
Thank you for your fast responses. I would like to mark the question as answered, but tbh I don't see any option to do that. – MrNobody Nov 14 '19 at 18:05
-
1 Possible duplicate of Uncorrelated but not independent random variables – SlipEternal Nov 14 '19 at 21:57
1 Answer
You cannot prove in general that two uncorrelated random variables are independent because it is not true.
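A standard counterexample (not part of the original answer, but it makes the point concrete) is X uniform on {-1, 0, 1} with Y = X². The covariance is exactly zero, yet Y is a function of X, so the two are clearly dependent. A minimal sketch using exact arithmetic:

```python
from fractions import Fraction

# Joint distribution: X is uniform on {-1, 0, 1} and Y = X^2,
# so each pair (x, x^2) occurs with probability 1/3.
pairs = [(-1, 1), (0, 0), (1, 1)]
p = Fraction(1, 3)

E_X = sum(x * p for x, _ in pairs)        # E[X]  = 0
E_Y = sum(y * p for _, y in pairs)        # E[Y]  = 2/3
E_XY = sum(x * y * p for x, y in pairs)   # E[XY] = 0
cov = E_XY - E_X * E_Y                    # covariance is 0 -> uncorrelated

# But X and Y are NOT independent:
# P(X=1, Y=0) = 0, while P(X=1) * P(Y=0) = 1/3 * 1/3 = 1/9.
P_x1_y0 = sum(p for x, y in pairs if x == 1 and y == 0)
P_x1 = sum(p for x, y in pairs if x == 1)
P_y0 = sum(p for x, y in pairs if y == 0)

print(cov)              # 0
print(P_x1_y0)          # 0
print(P_x1 * P_y0)      # 1/9
```

Since the joint probability P(X=1, Y=0) differs from the product of the marginals, independence fails even though the correlation is zero.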
You can prove that two independent random variables are uncorrelated (provided that they have finite positive variances). For example, consider two independent random variables $X$ and $Y$ with means $\mu_X$ and $\mu_Y$ and variances $\sigma^2_X$ and $\sigma^2_Y$. For discrete random variables, independence means $\mathbb P(X=x,Y=y)= \mathbb P(X=x)\mathbb P(Y=y)$.
Their correlation is $$\rho_{X,Y} =\dfrac{\mathbb E[(X-\mu_X)(Y-\mu_Y)]}{\sigma_X^{\,}\sigma_Y^{\,}} = \dfrac{\mathbb E[XY] -\mu_X\mathbb E[Y] -\mu_Y\mathbb E[X] + \mu_X\mu_Y }{\sigma_X^{\,}\sigma_Y^{\,}} \\ = \dfrac{\left(\sum \sum x y \mathbb P(X=x,Y=y)\right) -\mu_X\mu_Y}{\sigma_X^{\,}\sigma_Y^{\,}} \\ = \dfrac{\left(\sum \sum x y \mathbb P(X=x) \mathbb P(Y=y)\right) -\mu_X\mu_Y}{\sigma_X^{\,}\sigma_Y^{\,}} \\= \dfrac{\left(\sum x \mathbb P(X=x)\right) \left( \sum y \mathbb P(Y=y)\right) -\mu_X\mu_Y}{\sigma_X^{\,}\sigma_Y^{\,}} \\ = \dfrac{\mu_X\mu_Y-\mu_X\mu_Y}{\sigma_X^{\,}\sigma_Y^{\,}} = 0$$
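To give the derivation above a "visible result" as the question asks, here is a sketch with concrete values (my own choice, not from the answer): $X$ and $Y$ are two independent fair dice. Computing the double sum $\sum\sum xy\,\mathbb P(X=x)\mathbb P(Y=y)$ exactly shows the covariance vanishes, just as the algebra predicts:

```python
from fractions import Fraction

# Two independent fair dice: each value 1..6 has probability 1/6.
values = range(1, 7)
p = Fraction(1, 6)

mu = sum(v * p for v in values)  # E[X] = E[Y] = 7/2

# By independence, P(X=x, Y=y) = P(X=x) * P(Y=y) = 1/36,
# so E[XY] is the double sum from the derivation above.
E_XY = sum(x * y * p * p for x in values for y in values)

cov = E_XY - mu * mu
print(E_XY)  # 49/4, which equals mu_X * mu_Y
print(cov)   # 0
```

The double sum factors into $\left(\sum x\,\mathbb P(X=x)\right)\left(\sum y\,\mathbb P(Y=y)\right) = \mu_X\mu_Y$, so the numerator of $\rho_{X,Y}$ is $\mu_X\mu_Y - \mu_X\mu_Y = 0$, exactly as in the proof.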