
Suppose you have two random variables $X, Y$ and you are given that, for any $m, n$,

$$E(X^n Y^m) = E(X^n)E(Y^m)$$

Does this imply that $X$ and $Y$ are independent? Can some condition on how fast the moments grow be added to make this work?

Attempt at solution: I know that if the characteristic function splits as $E(e^{i(X,Y)\cdot(s,t)}) = E(e^{iXs})E(e^{iYt})$ (where $\cdot$ is the usual dot product), then the random variables are independent. I would try to approximate this by the moments. However, I think you need some condition that the moments don't grow too fast to make this work.

3 Answers


The answer is yes if the series $$\sum_{m=1}^{+\infty}\frac{t^m}{m!}\mathbb E|X|^m\quad\mbox{ and }\quad\sum_{n=1}^{+\infty}\frac{t^n}{n!}\mathbb E|Y|^n$$ are convergent for each $t$ (this follows from the dominated convergence theorem, which gives the splitting equality mentioned in the OP). This is in particular the case when $X$ and $Y$ are bounded.
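
For concreteness, here is a sketch of the splitting step under that convergence assumption (the absolute convergence of both series is what justifies exchanging expectation and summation, and the hypothesis lets every mixed moment factor):

$$\mathbb E\!\left(e^{i(sX+tY)}\right)=\sum_{n,m\ge 0}\frac{(is)^n(it)^m}{n!\,m!}\,\mathbb E(X^nY^m)=\left(\sum_{n\ge 0}\frac{(is)^n}{n!}\mathbb E(X^n)\right)\!\left(\sum_{m\ge 0}\frac{(it)^m}{m!}\mathbb E(Y^m)\right)=\mathbb E(e^{isX})\,\mathbb E(e^{itY}),$$

so the joint characteristic function factors, which is exactly the splitting equality from the question.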

Davide Giraudo
  • I don't think your counterexample works. $X = 2$ a.s., so it's independent of itself. I think the claim should hold provided only that the moment generating functions exist in a neighborhood of $0$ (so the moments are bounded by an exponential times a factorial). – Qiaochu Yuan Aug 08 '13 at 21:56
  • You are right, I removed it. I would be surprised if the result holds, but I'm still looking for a counter-example. – Davide Giraudo Aug 08 '13 at 22:15
  • Can you show your work please? I may be able to duplicate it, but since you posted that as an answer there must be more than a vague reference. – mhp Mar 09 '16 at 13:26
  • I would like to mention that the case of bounded $X$ and $Y$ is known in Poland as the Kac theorem. – Santiago Aug 23 '16 at 18:21

I have asked myself the same question recently, and stumbled upon your question. If you still care, I managed to find an answer: in all generality, it does not.

There is an article by T. M. Bisgaard and Z. Sasvári masterfully titled When does $E[X^k\cdot Y^l]=E[X^k]\cdot E[Y^l]$ imply independence?, published in Statistics & Probability Letters (2006), which explicitly constructs examples where it fails and gives reasonable sufficient conditions for it to hold, e.g. the one described by Davide Giraudo. As suggested in a comment, they use tools involved in the study of the two-dimensional moment problem.

Pierre PC

The general answer to your question is no: "moment independence" does not imply independence of the random variables involved. Assume that $$ E(X^n Y^m) = E(X^n)E(Y^m) \Rightarrow \int_{S_x}\int_{S_y}x^ny^mf_{XY}(x,y)\,dy\,dx = \int_{S_x}x^nf_X(x)\,dx\int_{S_y}y^mf_Y(y)\,dy $$

$$\Rightarrow \int_{S_x}\int_{S_y}x^ny^m\left[f_{XY}(x,y)-f_X(x)f_Y(y)\right]dy\,dx =0$$

In principle, this iterated integral can be zero while $f_{XY}(x,y)-f_X(x)f_Y(y)$ is not identically zero.

The usual difficulty is to find specific examples of random variables that exhibit certain "independence" traits while not being independent. Still, this does not change the result.
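
As an illustration of what checking "moment independence" looks like in practice, here is a minimal Monte Carlo sketch (assuming NumPy; the particular dependent pair $X, Y$ below is purely hypothetical) that estimates $E(X^nY^m)-E(X^n)E(Y^m)$ for a few small $n, m$. A genuine counterexample would need all of these gaps to vanish even though the joint density does not factor; for this naive dependent construction they visibly do not, which hints at why explicit counterexamples are hard to build.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000

# Hypothetical dependent pair: X uniform on [-1, 1], Y inherits the sign of X,
# so X and Y are clearly not independent.
x = rng.uniform(-1.0, 1.0, size=N)
y = np.sign(x) * rng.uniform(0.0, 1.0, size=N)

# Monte Carlo estimates of E(X^n Y^m) - E(X^n) E(Y^m) for small n, m.
for n in range(1, 4):
    for m in range(1, 4):
        mixed = np.mean(x**n * y**m)           # estimate of E(X^n Y^m)
        split = np.mean(x**n) * np.mean(y**m)  # estimate of E(X^n) E(Y^m)
        print(f"n={n}, m={m}: gap ~ {mixed - split:+.4f}")
```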

  • I don't see how you've actually given an argument justifying what you are saying, considering that for a large class of random vectors the answer is "yes" rather than "no" (e.g. if the mgf of $X$ and $Y$ exist the answer is yes). – guy Aug 10 '13 at 02:52
  • @guy This is a constructivist criticism, which I perfectly respect. I will try then to find a concrete example! – Alecos Papadopoulos Aug 10 '13 at 14:25
  • I think part of the issue is as follows: for fixed integers $n,m$ I can see how it would be possible that $f_{XY} - f_X f_Y$ is not identically zero and the integral you wrote would work out to zero. However, we have that it is zero for every $n,m$, which is much stronger and I think should be enough to conclude that $f_{XY} - f_X f_Y$ IS identically zero... with perhaps some more technicalities needed. – Mihai Nica Sep 04 '13 at 23:07
  • The general issue is the following: independence between $X$ and $Y$ is defined as the case where $h(X)$ is independent of $g(Y)$, for every $h(\cdot)$ and $g(\cdot)$. So the really beautiful question (and result) is: does $E(X^n Y^m) = E(X^n)E(Y^m) \Rightarrow E(h(X) g(Y)) = E(h(X))E(g(Y))$? I don't really have a strong opinion against, I am just trying to understand it. – Alecos Papadopoulos Sep 05 '13 at 00:04
  • See this answer for an explicit counterexample. – Mike Earnest Apr 01 '22 at 19:50