
I understand that two independent random variables are by definition uncorrelated, since their covariance equals $0$:

$Cov(x,y) = E(xy)- E(x)E(y)$

$E(xy) = E(x)E(y)$ when $x$ and $y$ are independent random variables.

Therefore, $Cov(x,y) = 0$.

However, I am having trouble understanding why, if two random variables $X$ and $Y$ are uncorrelated, it does not necessarily follow that they are independent.

Could someone also give me a real-world example of two random variables that are neither independent nor causally connected?

I believe it will help me understand this concept better.

7 Answers


Consider $X,Y$ i.i.d. Bernoulli with parameter $p$.

Consider now the rv's

$$U=X+Y$$

and

$$V=X-Y$$

It is easy to verify that $Cov(U,V)=0$: indeed, $Cov(X+Y,X-Y)=Var(X)-Var(Y)=0$ since $X$ and $Y$ are i.i.d. But $U$ and $V$ are clearly dependent. To prove that they actually are dependent, observe that, for example,

$$P(V=0)=(1-p)^2+p^2$$

but

$$P(V=0|U=0)=1$$
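This can also be checked numerically. Here is a minimal Python sketch (the parameter $p=0.3$ and the sample size are arbitrary choices for illustration, not from the answer):

```python
import random

random.seed(0)
p = 0.3        # arbitrary Bernoulli parameter for illustration
n = 100_000

# Draw i.i.d. Bernoulli(p) pairs and form U = X + Y, V = X - Y
xs = [1 if random.random() < p else 0 for _ in range(n)]
ys = [1 if random.random() < p else 0 for _ in range(n)]
us = [x + y for x, y in zip(xs, ys)]
vs = [x - y for x, y in zip(xs, ys)]

def mean(a):
    return sum(a) / len(a)

# Empirical covariance E[UV] - E[U]E[V]: close to 0, as predicted
cov_uv = mean([u * v for u, v in zip(us, vs)]) - mean(us) * mean(vs)
print(abs(cov_uv))

# Yet conditioning on U = 0 pins V down completely: P(V = 0 | U = 0) = 1
v_given_u0 = [v for u, v in zip(us, vs) if u == 0]
print(all(v == 0 for v in v_given_u0))
```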

tommik

It helps to see an example. Let $t$ be a real-valued random variable uniformly distributed on the interval $[0,2\pi]$. Next, let $Y$ be the random variable $Y = \sin(t)$ and let $X$ be the random variable $X = \cos(t)$.

Now $X$ and $Y$ are NOT independent; check this for yourself. Given $X$, you can narrow $Y$ down to at most two values: $Y$ must be $\pm\sqrt{1-X^2}$. What about $Cov(X,Y)$, though? Isn't it $0$? It is, in fact: $E[XY]=\frac{1}{2\pi}\int_0^{2\pi}\cos t \sin t \,dt=0$ and $E[X]=E[Y]=0$.
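A numerical sanity check of this example (a Python sketch; the sample size is an arbitrary choice):

```python
import math
import random

random.seed(1)
n = 200_000

# t uniform on [0, 2*pi]; X = cos(t), Y = sin(t)
ts = [random.uniform(0, 2 * math.pi) for _ in range(n)]
xs = [math.cos(t) for t in ts]
ys = [math.sin(t) for t in ts]

def mean(a):
    return sum(a) / len(a)

# Cov(X, Y) = E[XY] - E[X]E[Y] is 0 in theory; empirically it is small
cov = mean([x * y for x, y in zip(xs, ys)]) - mean(xs) * mean(ys)
print(abs(cov))

# Dependence: every sample satisfies X^2 + Y^2 = 1, so Y is pinned
# down to +/- sqrt(1 - X^2) once X is known
print(all(math.isclose(x * x + y * y, 1.0) for x, y in zip(xs, ys)))
```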

Mike

If two random variables are independent, then they are uncorrelated. This is because if $X$ and $Y$ are independent, then one property is $$ E(XY) = E(X)E(Y) $$ which you can prove pretty easily using the definition of expected value and independence.

Example from this source.

Suppose $X$ is uniformly distributed over $[-1, 1]$. Suppose $Y = |X|$ meaning that $Y = X$ when $X \geq 0$ and $Y = -X$ when $X < 0$. Then $Y$ is uniformly distributed over $[0, 1]$. $X$ and $Y$ are clearly dependent, since $Y$'s value is determined based on the magnitude of $X$.

But $$E(XY | X \geq 0) = \int_{0}^{1} x^2 dx = \frac{1}{3}$$ and $$E(XY | X <0) = \int_{-1}^{0} -x^2 dx = -\frac{1}{3}.$$

The law of iterated expectations then gives $$E(XY) = E(XY \mid X \geq 0)\,P(X \geq 0) + E(XY \mid X < 0)\,P(X < 0) = \frac{1}{3}\cdot\frac{1}{2} - \frac{1}{3}\cdot\frac{1}{2} = 0.$$

Thus these variables are uncorrelated but not independent.
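A quick simulation of this example (a sketch; the sample size is arbitrary):

```python
import random

random.seed(2)
n = 200_000

# X uniform on [-1, 1]; Y = |X| is a deterministic function of X
xs = [random.uniform(-1, 1) for _ in range(n)]
ys = [abs(x) for x in xs]

def mean(a):
    return sum(a) / len(a)

# Empirical Cov(X, Y): close to 0, matching the calculation above
cov = mean([x * y for x, y in zip(xs, ys)]) - mean(xs) * mean(ys)
print(abs(cov))
```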

Pavan C.

We can construct such distributions $X,Y$.

Let $X= \begin{cases}-1 &\text{w.p. } x\\0 &\text{w.p. } 1-2x\\1 &\text{w.p. } x\end{cases}$ for some $0<x<\frac{1}{2}$, $\qquad Y = \begin{cases}1 &\text{whenever } X=0\\0 &\text{otherwise.}\end{cases}$

We can calculate $\mathbb E [XY] = 0$ (the product $XY$ is always $0$, since $Y=0$ whenever $X\neq 0$) and $\mathbb E [X] = 0$, and then:

$$Cov(X,Y) = \mathbb E [XY]-\mathbb E[X]\mathbb E[Y] = 0$$

Yet the variables are dependent: $P(Y=1\mid X=0)=1$, while unconditionally $P(Y=1)=1-2x$.
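Because the distribution is finite, the covariance can be computed exactly by enumeration. A Python sketch (the value $x=\frac{1}{4}$ is a hypothetical choice; any $0<x<\frac{1}{2}$ works):

```python
from fractions import Fraction

x = Fraction(1, 4)  # hypothetical parameter choice

# Joint distribution of (X, Y): Y = 1 exactly when X = 0
dist = {(-1, 0): x, (0, 1): 1 - 2 * x, (1, 0): x}

def E(f):
    """Exact expectation of f(X, Y) under the joint distribution."""
    return sum(p * f(a, b) for (a, b), p in dist.items())

cov = E(lambda a, b: a * b) - E(lambda a, b: a) * E(lambda a, b: b)
print(cov)  # exactly 0

# Dependence: P(Y = 1 | X = 0) = 1, but unconditionally P(Y = 1) = 1 - 2x
p_y1 = sum(p for (a, b), p in dist.items() if b == 1)
print(p_y1)  # 1/2
```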

Rahul Madhavan

Take $X$ to be standard normal and $Y$ to be $X^2$. Then $X$ and $Y$ are by definition not independent, since $Y$ is defined in terms of $X$, but look at $E[XY] = E[X^3]$: that is $0$, because the odd moments of a standard normal vanish, and since $E[X]=0$ as well, they are uncorrelated.

Generally, independence is the more fundamental property. Its definition relies on the probability space on which the random variables are defined: independence means $P(A \cap B) = P(A)P(B)$, and $E[XY] = E[X]E[Y]$ is a derived result. Correlation is a still further derived notion, so it follows from this hierarchy that independence implies zero correlation and not the other way around.

One can also look at it geometrically. Zero correlation is a kind of orthogonality between (centered) random variables, and one can define two orthogonal random variables in terms of each other, whereas independence rules out any such mutual dependency.
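A simulation of this example (a sketch; the sample size is an arbitrary choice):

```python
import random

random.seed(3)
n = 300_000

# X standard normal; Y = X^2 is a deterministic function of X
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [x * x for x in xs]

def mean(a):
    return sum(a) / len(a)

# Cov(X, Y) = E[X^3] - E[X] E[X^2] = 0, since odd normal moments vanish;
# the empirical estimate is correspondingly small
cov = mean([x * y for x, y in zip(xs, ys)]) - mean(xs) * mean(ys)
print(abs(cov))
```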

rostader

We can also look at the problem from an intuitive point of view. Let's consider the influence of stress on exam scores. You can imagine that in small doses stress is beneficial because it enables you to concentrate. However, after exceeding some threshold it will cause worse performance during the exam. If you plotted the values of stress versus corresponding exam scores, it would be in the shape of a parabola. These two variables are uncorrelated, but definitely dependent.

treskov

Consider $X$ and $Y$, both taking values in $\{-1,1\}$, such that on the $n$-th draw $y_n=(-1)^n x_n$. They will be uncorrelated (by symmetry between the even and odd draws), but $y_n$ is perfectly predictable once $x_n$ and $n$ are known.
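An empirical check of this construction, treating the covariance along the sequence of draws (a Python sketch; the sample size is arbitrary):

```python
import random

random.seed(4)
n = 10_000

# x_n uniform on {-1, 1}; y_n = (-1)^n x_n flips the sign on odd draws
xs = [random.choice([-1, 1]) for _ in range(n)]
ys = [(-1) ** k * x for k, x in enumerate(xs)]

def mean(a):
    return sum(a) / len(a)

# Empirical covariance along the sequence: the even/odd signs cancel,
# yet each y_n is fully determined by x_n and n
cov = mean([x * y for x, y in zip(xs, ys)]) - mean(xs) * mean(ys)
print(abs(cov))
```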