
$(X_i)_{i \geq 2}$ is a sequence of independent random variables whose distribution is given by

$P(X_i=i)=\frac{1}{i\log i}$ and $P(X_i=0)=1-\frac{1}{i \log i}$.

How can we show that $\frac{1}{n}\sum\limits_{i=2}^n(X_i -E(X_i))$ converges to zero in probability but not almost surely?

To prove the weak law of large numbers, I can just use Chebyshev's inequality. But I don't know how to show that the average does not converge to zero almost surely.

1 Answer


To prove convergence in probability, we use Chebyshev's inequality:

$P\left(\left|\frac{1}{n}\sum_{j=2}^n(X_j-E(X_j))\right|>\epsilon\right)\leq\frac{\operatorname{Var}\left[\frac{1}{n}\sum_{j=2}^n(X_j-E[X_j])\right]}{\epsilon^2}$.

$Var[\frac{1}{n}\sum_{j=2}^n(X_j-E(X_j))]$ is nothing but $E[(\frac{1}{n}\sum_{j=2}^n(X_j-E[X_j]))^2]$.

By independence, the cross terms vanish, so

$\operatorname{Var}\left[\frac{1}{n}\sum_{j=2}^n(X_j-E[X_j])\right]=\frac{1}{n^2}\sum_{j=2}^n\operatorname{Var}(X_j)=\frac{1}{n^2}\sum_{j=2}^n\left(\frac{j^2}{j\log j}-\frac{1}{(\log j)^2}\right)=\frac{1}{n^2}\sum_{j=2}^n\left(\frac{j}{\log j}-\frac{1}{(\log j)^2}\right)$.

Since $j\mapsto \frac{j}{\log j}$ attains its maximum over $\{2,\dots,n\}$ at $j=n$ (for $n\geq 4$), this is bounded by $\frac{n-1}{n\log n}$, and $\lim\limits_{n\rightarrow\infty} \frac{n-1}{n \log n} = 0$.
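As a numerical sanity check (not part of the proof), one can compute the exact variance of the average and compare it with the bound $\frac{n-1}{n\log n}$; the helper names below are ad hoc:

```python
import math

def avg_variance(n):
    """Exact variance of (1/n) * sum_{j=2}^n (X_j - E[X_j]),
    using Var(X_j) = j/log j - 1/(log j)^2 and independence."""
    total = sum(j / math.log(j) - 1.0 / math.log(j) ** 2 for j in range(2, n + 1))
    return total / n ** 2

def bound(n):
    """Upper bound (n-1)/(n log n), valid for n >= 4."""
    return (n - 1) / (n * math.log(n))

for n in (10, 1_000, 100_000):
    # Both the exact variance and the bound shrink to 0 (slowly, like 1/log n).
    print(n, avg_variance(n), bound(n))
```

Note that the decay is only logarithmic in $n$, which is why convergence in probability here is very slow.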

We conclude that $P(|\frac{1}{n}\sum_{j=2}^n(X_j-E(X_j))|>\epsilon)\rightarrow 0$ as $n\rightarrow\infty$ for every $\epsilon>0$, which is exactly convergence in probability.

To show that the convergence is not almost sure, we need the second Borel–Cantelli lemma. A similar proof can be found in Sequence satisfies weak law of large numbers but doesn't satisfy strong law of large numbers.

Here we define $S_n= \sum_{i=2}^n(X_i-E[X_i])$. On the event $\{X_n=n\}$ we have $S_n-S_{n-1}=n-\frac{1}{\log n}\geq\frac{2n}{3}$ for $n\geq 3$, so $\{X_n=n\}\subset \{S_n\geq \frac{n}{3}\} \cup \{S_{n-1}\leq-\frac{n}{3}\}$.
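The inclusion rests on the elementary inequality $n-\frac{1}{\log n}\geq\frac{2n}{3}$, i.e. $\frac{n}{3}\geq\frac{1}{\log n}$, for $n\geq 3$; a brute-force check over a finite range (for larger $n$ the left side grows while the right side shrinks):

```python
import math

# Verify n - 1/log(n) >= 2n/3, equivalently n/3 >= 1/log(n), for 3 <= n < 10**6.
ok = all(n - 1.0 / math.log(n) >= 2.0 * n / 3.0 for n in range(3, 10**6))
print(ok)
```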

Note that $\sum_{n=2}^{\infty}P(X_n=n)=\sum_{n=2}^{\infty}\frac{1}{n\log n}$ diverges (e.g. by the integral test). Since the $X_n$ are independent, the second Borel–Cantelli lemma gives $P(X_n=n \text{ infinitely often})=1$.
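The divergence can also be seen numerically: the partial sums $\sum_{n=2}^{N}\frac{1}{n\log n}$ track $\log\log N$, which grows without bound (a quick illustration, not a proof):

```python
import math

def partial_sum(N):
    """Partial sum of the series sum_{n=2}^{N} 1/(n log n)."""
    return sum(1.0 / (n * math.log(n)) for n in range(2, N + 1))

for N in (100, 10_000, 1_000_000):
    # The partial sums keep growing, roughly like log(log(N)) plus a constant.
    print(N, round(partial_sum(N), 3), round(math.log(math.log(N)), 3))
```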

From what we get above, almost surely $S_n-S_{n-1}=n-\frac{1}{\log n}$ for infinitely many $n$. If $\frac{S_n}{n}\rightarrow 0$ held on such a path, then $\frac{S_n-S_{n-1}}{n}\rightarrow 0$ as well, contradicting increments of order $n$ occurring infinitely often. Hence $P[\lim\limits_{n \rightarrow \infty}\frac{S_n}{n}=0]=0$, so the almost sure convergence does not hold.
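A hedged simulation can illustrate that the event $\{X_n=n\}$ keeps recurring along typical paths; this is only a numerical sanity check, and the seed, horizon, and number of paths are arbitrary choices:

```python
import math
import random

# Simulate independent paths of (X_n) up to horizon N and count how often
# the rare event {X_n = n} occurs.  Since sum_n 1/(n log n) diverges, the
# expected count per path grows without bound (slowly, like log log N).
rng = random.Random(42)  # arbitrary seed for reproducibility
N = 10_000               # horizon (arbitrary choice)
paths = 200              # number of independent paths (arbitrary)

total_hits = 0
for _ in range(paths):
    for n in range(2, N + 1):
        if rng.random() < 1.0 / (n * math.log(n)):
            total_hits += 1

print(total_hits / paths)  # average number of occurrences per path
```

Each occurrence of $\{X_n=n\}$ forces a jump of size $n$ in $S_n$, which is what destroys almost sure convergence of $S_n/n$.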

Clement C.