
I had an exam this morning, and I had to prove that for $X$ and $Y$ bounded, if for all $n$ and all $m$, $$\mathbb E[X^nY^m]=\mathbb E[X^n]\mathbb E[Y^m],$$ then $X$ and $Y$ are independent. Using characteristic functions, I think I could justify it using the fact that $$\cos(tX)=\sum_{k=0}^\infty \frac{(-1)^kt^{2k}X^{2k}}{(2k)!}\quad \text{and}\quad \sin(tX)=\sum_{k=0}^\infty \frac{(-1)^kt^{2k+1}X^{2k+1}}{(2k+1)!},$$ but how can I do it without characteristic functions and without measure theory? (It's an introductory probability course, so we have neither measure-theoretic tools such as the DCT or MCT nor characteristic functions.)
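To spell out that idea: since $X$ and $Y$ are bounded, the series may be integrated term by term, and the hypothesis applied to each mixed moment gives, for instance, $$\mathbb E[\cos(tX)\cos(uY)] = \sum_{k,l\ge 0}\frac{(-1)^{k+l}t^{2k}u^{2l}}{(2k)!\,(2l)!}\,\mathbb E\!\left[X^{2k}Y^{2l}\right] = \mathbb E[\cos(tX)]\,\mathbb E[\cos(uY)],$$ and similarly for the other sine/cosine products, so that $\varphi_{X,Y}(t,u)=\varphi_X(t)\varphi_Y(u)$.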


My definition of the expectation is: for $X$ such that $\mathbb E|X|<\infty$, $$\mathbb E[X]=\lim_{n\to \infty }\mathbb E[X_n],\qquad \mathbb E[X_n]=\sum_{k\in\mathbb Z}\frac{k}{2^n}\,\mathbb P\left(X_n=\frac{k}{2^n}\right),$$ where $X_n=2^{-n}\lfloor 2^nX\rfloor$.
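For intuition, here is a minimal Python sketch of this dyadic approximation; taking $X$ uniform on $[0,1]$ (so $\mathbb E[X]=1/2$) and estimating by Monte Carlo are illustrative assumptions, not part of the definition.

```python
import numpy as np

rng = np.random.default_rng(0)

def dyadic(x, n):
    """X_n = 2^{-n} * floor(2^n X): X rounded down to a multiple of 2^{-n}."""
    return np.floor(x * 2.0**n) / 2.0**n

# X uniform on [0, 1], so E[X] = 1/2; E[X_n] increases to E[X] as n grows.
x = rng.uniform(size=1_000_000)
for n in [1, 2, 4, 8, 16]:
    print(n, dyadic(x, n).mean())   # roughly 0.25, 0.375, ... -> 0.5
```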

StubbornAtom

2 Answers


It would be interesting to know if there is a short proof of this.


Here is a (not so short) one... Let $X,Y$ be bounded random variables. We start with $$ \mathbb E[X^nY^m] = \mathbb E[X^n]\mathbb E[Y^m]\qquad \text{for all }n,m \in \mathbb N. \tag1$$ Using linear combinations, we deduce $$ \mathbb E[f(X)g(Y)] = \mathbb E[f(X)]\mathbb E[g(Y)]\qquad \text{for all polynomials }f,g. \tag2$$ Using Weierstrass approximation (uniform convergence), we deduce $$ \mathbb E[f(X)g(Y)] = \mathbb E[f(X)]\mathbb E[g(Y)]\qquad \text{for all continuous functions }f,g. \tag3$$ Using dominated convergence (pointwise a.e. convergence), we deduce $$ \mathbb E[f(X)g(Y)] = \mathbb E[f(X)]\mathbb E[g(Y)]\qquad \text{for all bounded measurable functions }f,g. \tag4$$ In particular, $(4)$ holds for indicator functions $f=\mathbf 1_A$ and $g=\mathbf 1_B$, i.e. $$\mathbb P(X\in A,\ Y\in B)=\mathbb P(X\in A)\,\mathbb P(Y\in B),$$ which is exactly the definition of "$X$ and $Y$ are independent".
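As a numerical sanity check of steps $(2)$ and $(3)$, here is a minimal Python sketch; the independent uniforms on $[0,1]$ (standing in for variables satisfying $(1)$), the test functions, and the choice of Bernstein polynomials as the Weierstrass approximants are all illustrative assumptions, not part of the answer.

```python
import math
import numpy as np

rng = np.random.default_rng(1)

def bernstein(f, n, x):
    """Degree-n Bernstein polynomial of f on [0, 1], evaluated at x.

    B_n f(x) = sum_k f(k/n) * C(n, k) * x^k * (1-x)^(n-k);
    B_n f -> f uniformly on [0, 1] for continuous f (Weierstrass).
    """
    x = np.asarray(x, dtype=float)
    total = np.zeros_like(x)
    for k in range(n + 1):
        total += f(k / n) * math.comb(n, k) * x**k * (1 - x)**(n - k)
    return total

# Independent uniforms on [0, 1] stand in for X, Y satisfying (1).
x = rng.uniform(size=500_000)
y = rng.uniform(size=500_000)
f, g = np.cos, np.sin

for n in [4, 16, 64]:
    pf, qg = bernstein(f, n, x), bernstein(g, n, y)
    # First gap: step (2), the polynomial expectations factorize (Monte Carlo error only).
    # Second gap: step (3), E[B_n f(X) B_n g(Y)] approaches E[f(X) g(Y)] as n grows.
    print(n, (pf * qg).mean() - pf.mean() * qg.mean(),
          (pf * qg).mean() - (f(x) * g(y)).mean())
```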

GEdgar
  • This looks like the "go-to proof" to me! Which theorem are you using in step $(3)$? I thought you'd apply the DCT and then I think we need some restriction on the kind of continuous functions approximated (e.g., compact support or boundedness?) – charlus Jan 11 '21 at 17:59
  • 2
    The random variables $X,Y$ are bounded. So there is a constsnt $M$ with $-M \le X \le M, -M \le Y \le M$. Then approximate any continuous function $f$ by polynomials $f_n$ that converge uniformly to $f$ on $[-M,M]$. Then $f_n(X)$ converges uniformly to $f(X)$. Similarly for $g$. – GEdgar Jan 11 '21 at 22:51
  • Thank you. My question is: how do you justify the exchange of the $\lim$ and $\mathbb{E}$ signs? I know that if the $f_n$ are continuous converge uniformly to $f$ on a compact interval $[a,b]$ then $\lim\int_a^bf_n(x)dx=\int_a^bf(x)dx$ but here I don't see the continuity. Sorry for being dense about this. – charlus Jan 12 '21 at 14:07
  • 1
    If $f_n \to f$ uniformly on $[-M,M]$, then $f_n(X) \to f(X)$ uniformly on $\Omega$ (the sample space, domain of $X$). Then $\mathbb E[f_n(X)] \to \mathbb E[f(X)]$. Similar for $\mathbb E[g_n(Y)] \to \mathbb E[g(Y)]$ and $\mathbb E[f_n(X)g_n(Y)] \to \mathbb E[f(X)g(Y)]$. – GEdgar Jan 12 '21 at 19:37
  • Thanks a lot, it's all very clear now. I did not know this result. – charlus Jan 13 '21 at 10:23
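To spell out the exchange of limit and expectation used in these comments: if $\varepsilon_n := \sup_{|x|\le M}|f_n(x)-f(x)| \to 0$, then monotonicity and linearity of $\mathbb E$ alone give $$\left|\mathbb E[f_n(X)]-\mathbb E[f(X)]\right| \le \mathbb E\left|f_n(X)-f(X)\right| \le \varepsilon_n \longrightarrow 0,$$ and for the product term $$\left|\mathbb E[f_n(X)g_n(Y)]-\mathbb E[f(X)g(Y)]\right| \le \mathbb E\Big[|f_n(X)|\,|g_n(Y)-g(Y)|\Big] + \mathbb E\Big[|g(Y)|\,|f_n(X)-f(X)|\Big] \longrightarrow 0,$$ since all the factors are uniformly bounded. No convergence theorem is needed.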

Are moment generating functions OK? Since $e^{sX}=\sum_{n=0}^\infty\frac{s^nX^n}{n!}$ and $e^{tY}=\sum_{m=0}^\infty\frac{t^m Y^m}{m!}$, and since these series converge uniformly on the bounded ranges of $X$ and $Y$ (so expectations may be taken term by term), linearity and the hypothesis give $$M_{X,Y}(s,t)=\mathbb E\left[e^{sX}\cdot e^{tY}\right]=\sum_{n,m\ge 0}\frac{s^n t^m}{n!\,m!}\,\mathbb E[X^nY^m]=\sum_{n,m\ge 0}\frac{s^n t^m}{n!\,m!}\,\mathbb E[X^n]\,\mathbb E[Y^m]=M_X(s)\cdot M_Y(t).$$ Since the joint MGF factors into the product of the marginal MGFs, $X$ and $Y$ are independent.

This is probably similar to whatever method you used with characteristic functions, but I think normally MGFs are regarded as more elementary.
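As a quick numerical sanity check of the factorization, here is a minimal Python sketch; taking $X,Y$ independent uniforms on $[-1,1]$ and the particular $(s,t)$ pairs are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# X, Y independent and bounded: uniform on [-1, 1].
x = rng.uniform(-1.0, 1.0, size=1_000_000)
y = rng.uniform(-1.0, 1.0, size=1_000_000)

for s, t in [(0.5, 0.5), (1.0, -2.0), (3.0, 0.1)]:
    joint = np.exp(s * x + t * y).mean()                  # M_{X,Y}(s, t)
    product = np.exp(s * x).mean() * np.exp(t * y).mean() # M_X(s) * M_Y(t)
    print(f"s={s}, t={t}: joint={joint:.5f}, product={product:.5f}")
```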

jlammy
  • How do you prove that $\mathbb E[e^{sX+tY}]=\mathbb E[e^{sX}]\mathbb E[e^{tY}]$ without the monotone convergence theorem? – user841366 Jan 11 '21 at 17:12