
Assume you have a sequence of independent Bernoulli random variables $X_i$, each with probability $p_i$. Let $c_i$ be a sequence of real numbers and let $m, M$ be real numbers such that $0 < m < c_i < M < \infty$. Define

$$Y_n=\frac{\sum\limits_{i=1}^n c_i X_i -\sum\limits_{i=1}^n c_i p_i}{\sum\limits_{i=1}^n c_i}.$$

Then $Y_n$ converges to zero in $L^2$, but does the same hold almost surely?

Proof that $Y_n$ converges to zero in $L^2$:

Since $E[Y_n]=0$, we have
$$E[Y_n^2]=E[Y_n^2]-E[Y_n]^2=V(Y_n)=\frac{\sum_{i=1}^n c_i^2 p_i(1-p_i)}{\left(\sum_{i=1}^n c_i \right)^2}\le \frac{nM^2}{n^2m^2}=\frac{M^2}{m^2n},$$

and this last quantity goes to zero as $n$ goes to infinity.
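As a numerical sanity check of this bound (an illustration only, not part of the proof), one can estimate $E[Y_n^2]$ by Monte Carlo; the sequences $p_i$ and $c_i$ below are arbitrary choices satisfying the hypotheses:

```python
import numpy as np

# Monte Carlo sanity check of the L^2 convergence (an illustration,
# not part of the proof). The sequences p_i and c_i are arbitrary
# choices satisfying 0 < m < c_i < M.
rng = np.random.default_rng(1)

trials, n = 1000, 5000
i = np.arange(1, n + 1)
p = 0.2 + 0.6 * (i % 5) / 5.0       # arbitrary probabilities in (0, 1)
c = 1.0 + (i % 3) / 4.0             # arbitrary weights in [1, 1.5]

X = rng.random((trials, n)) < p     # independent Bernoulli(p_i) draws
Y = (np.cumsum(c * X, axis=1) - np.cumsum(c * p)) / np.cumsum(c)

# Empirical E[Y_n^2] shrinks roughly like 1/n, matching the bound M^2/(m^2 n).
print((Y[:, 99] ** 2).mean(), (Y[:, -1] ** 2).mean())
```

The empirical second moment at $n=5000$ is roughly $1/50$ of its value at $n=100$, consistent with the $1/n$ rate.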

user394334

1 Answer


I found a theorem in "A Course in Real Analysis" by McDonald and Weiss that solves the problem:

Kolmogorov's Strong Law of Large Numbers:

Let $X_1,X_2,\ldots$ be mutually independent random variables with finite variances, and set $S_n=X_1+X_2+\cdots+X_n$. Suppose $\{b_n\}$ is an increasing sequence of positive real numbers satisfying $\lim_{n \rightarrow \infty }b_n=\infty$ and

$$\sum\limits_{n=1}^\infty \frac{Var(X_n)}{b_n^2}<\infty.$$

Then, with probability one

$$\lim_{n \rightarrow \infty}\frac{S_n-E[S_n]}{b_n}=0.$$

Now, in our notation, set $X_n'= c_nX_n$ and $b_n=\sum_{i=1}^n c_i$. Then we get

$$\sum\limits_{n=1}^\infty \frac{Var(X_n')}{b_n^2}=\sum\limits_{n=1}^\infty \frac{c_n^2p_n(1-p_n)}{(\sum_{i=1}^n c_i)^2}\le \sum\limits_{n=1}^\infty \frac{M^2 }{m^2n^2}<\infty.$$

We also have $\lim_{n \rightarrow \infty }b_n =\lim_{n \rightarrow \infty} \sum\limits_{i=1}^n c_i\ge \lim_{n \rightarrow \infty}mn=\infty$, so we can apply the theorem to $X_n'$. We then get

$$\lim_{n \rightarrow \infty}\frac{\sum\limits_{i=1}^n c_i X_i -\sum\limits_{i=1}^n c_i p_i}{\sum\limits_{i=1}^n c_i}=\lim_{n \rightarrow \infty}\frac{S_n-E[S_n]}{b_n}=0,$$ with probability one.
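The almost-sure convergence is also easy to see along a simulated sample path (an illustration only; the sequences $p_i$ and $c_i$ below are again arbitrary choices satisfying the hypotheses, here with $m=1$ and $M=1.5$):

```python
import numpy as np

# Simulate one sample path of Y_n and watch it approach 0
# (an illustration of the a.s. convergence, not a proof).
rng = np.random.default_rng(0)

n = 100_000
i = np.arange(1, n + 1)
p = 0.3 + 0.4 * np.sin(i) ** 2      # arbitrary probabilities in [0.3, 0.7]
c = 1.0 + 0.5 * np.cos(i) ** 2      # arbitrary weights in [1, 1.5]

X = rng.random(n) < p               # independent Bernoulli(p_i) draws
num = np.cumsum(c * X) - np.cumsum(c * p)
den = np.cumsum(c)
Y = num / den                       # the path Y_1, ..., Y_n

print(abs(Y[99]), abs(Y[-1]))       # |Y_100| versus |Y_100000|
```

By the variance bound above, the typical size of $Y_n$ at $n=10^5$ is on the order of $10^{-3}$, which is what the simulated path shows.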

user394334