5

As is well known, there are many examples of (pairs of) random variables which have covariance zero, but which are not independent.
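For concreteness, here is a minimal numerical sketch of one standard such pair ($X \sim N(0,1)$ and $Y = X^2$); the code is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)  # X ~ N(0, 1), symmetric about 0
y = x**2                          # Y is a deterministic function of X

# Cov(X, X^2) = E[X^3] - E[X] E[X^2] = 0 because X is symmetric,
# yet Y is completely determined by X, so the pair is far from independent.
print(np.cov(x, y)[0, 1])  # ~ 0 up to sampling noise
```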

However, I'm wondering whether there are any general theorems about what having covariance zero does imply. Are there theorems that say something like, "Covariance zero, under additional hypothesis H, implies the following relationship between $X$ and $Y$," or is there essentially no possible general relationship?

By the way, although for certain types of distributions/families the implication "Covariance zero implies independence" is valid, that is not the kind of theorem I am asking about. What relationship can there be that is short of full independence? Any reasonable hypothesis H would be interesting to me, as would references to online resources or standard references.

kcrisman
  • I suppose you're not looking for results of the form "$X$ uncorrelated with $Y$ $\implies$ $\operatorname{Var}(X\pm Y)=\operatorname{Var}(X)+\operatorname{Var}(Y)$" or "$X,Y$ uncorrelated with $Z$ $\implies$ $X\pm Y$ uncorrelated with $Z$"? (A quick numerical check of the first identity follows these comments.) –  Nov 07 '19 at 15:32
  • You are right that that is weaker than I was hoping for (it follows basically from the definition of variance), but then again you didn't add an additional hypothesis H, so I suppose that would be expected. Ideal would be a theorem characterizing how strong an additional hypothesis would have to be in order to say more - for instance, that essentially there is only independent and dependent, with no gradations in between. – kcrisman Nov 07 '19 at 16:00
  • @kcrisman do you know entropy? independence is equivalent to $H(X|Y) = H(X)$. – mathworker21 Nov 07 '19 at 20:06
  • @mathworker21 I know of entropy but haven't thought of it in a long time, nor really used it much. You could make a more detailed version of this an answer, but keep in mind it is answering a different question; I asked what covariance zero implies (since it doesn't imply independence), not what other things might imply independence. Unless there is some relationship between entropy and covariance not mentioned here? – kcrisman Nov 07 '19 at 23:21
    @kcrisman ha! I read your question as "but what does imply it". – mathworker21 Nov 07 '19 at 23:26
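A small numerical check of the identity in the first comment (a sketch; it reuses the $X$, $Y=X^2$ pair above, which has covariance zero without independence):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(100_000)
y = x**2  # Cov(X, Y) = 0, yet Y depends on X

# Var(X ± Y) = Var(X) + Var(Y) ± 2 Cov(X, Y) collapses to
# Var(X) + Var(Y) for both signs when the covariance vanishes.
print(np.var(x + y), np.var(x) + np.var(y))  # nearly equal
print(np.var(x - y), np.var(x) + np.var(y))  # nearly equal
```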

3 Answers

6

The covariance is just one number computed from two random variables. You cannot expect it to capture much information.

Saying that the covariance is zero is just one equation; by comparison, saying that two RVs are independent is a huge number of equations: one for each pair of real numbers $(x,y)$, namely $P(X\le x,\, Y\le y)=P(X\le x)\,P(Y\le y)$. It is a very strict condition.

All you can say when the covariance is zero is that a linear regression on a large enough sample will give you an essentially horizontal line, since the least-squares slope is $\operatorname{Cov}(X,Y)/\operatorname{Var}(X)=0$: you cannot predict the value of one variable from the other with a linear model.
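A quick empirical check of this (a sketch, reusing the $X$, $Y=X^2$ pair from the question):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(100_000)
y = x**2

slope, intercept = np.polyfit(x, y, 1)  # least-squares line for y on x
print(slope)                    # ~ 0: the best linear predictor is flat
print(np.corrcoef(x, y)[0, 1])  # Pearson correlation is also ~ 0
```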

I would be happy to learn, but I'm afraid that not much more can be said.

  • Well, that was why I asked for "given some hypothesis H". Clearly for certain types of random variables there is enough "extra" information - though see this post for a warning about what "enough" extra information is. So I figure maybe there is also some possible theoretical source, though I take the point that it would have to be a fairly strong condition to imply anything very close to independence. But maybe there is something else it could imply. – kcrisman Nov 04 '19 at 16:33
  • $H(X)-H(X|Y)$ is just one number, and it captures independence (a numerical sketch of this quantity follows these comments). – mathworker21 Nov 07 '19 at 20:06
  • It's not the answer I'm looking for so I'm not accepting, but you deserve the bounty for this for the effort :) – kcrisman Nov 15 '19 at 01:58
  • @kcrisman Well, thanks, and if you ever get a more satisfactory answer, let me know (again, I'll be happy to learn), and I'll transfer the bounty to that answer. – Arnaud Mortier Nov 15 '19 at 05:12
  • Wow, I didn't even know that was possible. I think you get to keep it even in that unlikely event, though I guess you don't need the rep :) – kcrisman Nov 15 '19 at 14:05
  • I added a similar answer haha, before reading yours. Good intuition! – Fourier_T Dec 14 '23 at 02:45
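Following up on the entropy comment above: the quantity $I(X;Y)=H(X)-H(X\mid Y)$ (the mutual information) is zero exactly when $X$ and $Y$ are independent, so unlike covariance it detects any kind of dependence. A crude histogram-based sketch (the binning and plug-in estimator here are ad hoc, purely for illustration):

```python
import numpy as np

def mutual_information(x, y, bins=30):
    """Crude plug-in estimate of I(X; Y) in nats from a 2D histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()            # empirical joint distribution
    px = pxy.sum(axis=1, keepdims=True)  # marginal of X (column vector)
    py = pxy.sum(axis=0, keepdims=True)  # marginal of Y (row vector)
    nz = pxy > 0                         # avoid log(0) on empty cells
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(3)
x = rng.standard_normal(100_000)
print(mutual_information(x, x**2))                         # clearly > 0
print(mutual_information(x, rng.standard_normal(x.size)))  # ~ 0
```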
0

Think about it in terms of points scattered on a plane. When the covariance is $0$, you are saying that no line (other than a horizontal one) fits the scatter, but you may still be able to construct some nonlinear "curve" joining the points; hence the variables need not be independent. Independence is the stronger statement that no curve of any kind relates the points, and in particular it forces the covariance to be zero.
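A numerical illustration of this picture (a sketch, again with the $X$, $Y=X^2$ pair from the question): a line explains none of the variation in $Y$, while a quadratic "curve" explains all of it.

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.standard_normal(100_000)
y = x**2

def r_squared(x, y, degree):
    """Fraction of Var(Y) explained by a polynomial fit of given degree."""
    fitted = np.polyval(np.polyfit(x, y, degree), x)
    return 1 - np.var(y - fitted) / np.var(y)

print(r_squared(x, y, 1))  # ~ 0: no line fits the scatter
print(r_squared(x, y, 2))  # ~ 1: a quadratic curve fits it exactly
```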

Fourier_T
-1

The opposite of a lack of correlation is full correlation. Typically, partial or full correlation between two events without a direct dependence between them occurs when there is a third, intervening variable. For example, suppose there is a partial correlation between my time of waking and high tide. This may be explained by the fact that I need to get to work early in the morning and, at my longitude, high tide tends to occur in the morning. In this case time of day is the intervening variable.

Prof. J