12

Let $\{X_n\}_{n=1}^{\infty}$ be a sequence of independent random variables such that $$P(X_n=n+1)=P(X_n=-(n+1))=\frac{1}{2(n+1)\log(n+1)},$$ $$P(X_n=0)=1-\frac{1}{(n+1)\log(n+1)}.$$ Prove that $\{X_n\}$ satisfies the weak law of large numbers but does not satisfy the strong law of large numbers.

I thought that I could show that $\mathbb EX_n=0$ and then I tried to show that $\frac{ \mathrm{Var}X_n}{n^2} \rightarrow 0$. But I don't think this is the right way; I think I need to use $S_n$ somehow. How should it be done?

Davide Giraudo
  • 181,608
nilcorc
  • 1,102
  • 2
    You definitely have to use $S_n$, because the statement to be proved is that $S_n/n$ converges in probability to a constant, not $X_n/n$. – Nate Eldredge May 18 '15 at 21:10
  • Could you tell me what $S_n$ looks like? I don't know how to do it. – nilcorc May 18 '15 at 21:12
  • 4
    $S_n = X_1 + \dots + X_n$ – Nate Eldredge May 18 '15 at 21:13
    The weak law is to show $\Pr[|S_n/n|\geq \epsilon] \leq E[(S_n/n)^2]/\epsilon^2\rightarrow 0$. To show the strong law fails, you need to consider what happens to the average $S_n/n$ if we have $X_n\neq 0$ for some particular $n$, and then find out how often this happens. – Michael May 18 '15 at 21:49

2 Answers

11

To show convergence in probability, first compute $$ \mathbb E[X_n] = \frac{n+1}{2(n+1)\log(n+1)} - \frac{n+1}{2(n+1)\log(n+1)} = 0$$ and $$ \mathrm{Var}(X_n)=\mathbb E[X_n^2] = \frac{2(n+1)^2}{2(n+1)\log(n+1)} = \frac{n+1}{\log(n+1)}. $$ Hence $\mathbb E[S_n]=0$, and for $n\geqslant 3$ each summand below satisfies $\frac{i+1}{\log(i+1)}\leqslant\frac{n+1}{\log(n+1)}$ (the map $x\mapsto x/\log x$ is increasing on $[e,\infty)$ and $4/\log 4 = 2/\log 2$), so $$\begin{align*} \frac1{\varepsilon^2}\mathbb E\left[\left(\frac{S_n}n\right)^2\right] &= \frac1{n^2\varepsilon^2} \mathbb E[S_n^2]\\ &= \frac1{n^2\varepsilon^2} \mathrm{Var}\left(\sum_{i=1}^n X_i\right)\\ &= \frac1{n^2\varepsilon^2} \sum_{i=1}^n \mathrm{Var}(X_i)\\ &= \frac1{n^2\varepsilon^2} \sum_{i=1}^n \frac{i+1}{\log(i+1)}\\ &\leqslant \frac1{n^2\varepsilon^2}\left(\frac{n(n+1)}{\log(n+1)}\right)\\ &= \frac{n+1}{n\log(n+1)\varepsilon^2}\\ &= \frac1{\log(n+1)\varepsilon^2} + \frac1{n\log(n+1)\varepsilon^2}\stackrel{n\to\infty}{\longrightarrow}0. \end{align*}$$ By Markov's inequality applied to $(S_n/n)^2$, $$ \mathbb P\left(\left|\frac{S_n}n\right| \geqslant\varepsilon \right)\leqslant \frac{\mathbb E\left[\left(\frac{S_n}n\right)^2 \right]}{\varepsilon^2}\stackrel{n\to\infty}{\longrightarrow}0.$$
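As a quick sanity check, here is a minimal Monte Carlo sketch in Python (the helper `estimate_tail` and all parameter values are illustrative choices, not part of the argument) that estimates $\mathbb P(|S_n/n|\geqslant \varepsilon)$ directly from the distribution of the $X_k$. Since the bound above decays only like $1/\log(n+1)$, expect the estimates to shrink quite slowly:

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_tail(n, eps=0.5, trials=2000):
    """Monte Carlo estimate of P(|S_n / n| >= eps)."""
    s = np.zeros(trials)
    for k in range(1, n + 1):
        p = 1.0 / ((k + 1) * np.log(k + 1))          # P(X_k != 0)
        nonzero = rng.random(trials) < p
        sign = np.where(rng.random(trials) < 0.5, 1.0, -1.0)
        s += np.where(nonzero, sign * (k + 1), 0.0)  # X_k = +-(k+1) or 0
    return np.mean(np.abs(s / n) >= eps)

for n in [10, 100, 1000, 10000]:
    print(n, estimate_tail(n))
```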

To show that the convergence is not almost sure: as pointed out by @Frank, for each $n$ we have $$ \{X_n=n+1\} \subset \left\{|S_n|\geqslant \frac n2\right\} \cup \left\{|S_{n-1}|\geqslant \frac n2\right\}$$ (indeed, if $X_n=n+1$ and $|S_{n-1}|<\frac n2$, then $|S_n|\geqslant (n+1)-|S_{n-1}|>\frac n2$). The events $\{X_n=n+1\}$ are independent and $$\sum_{n=1}^\infty \mathbb P(X_n=n+1) = \sum_{n=1}^\infty\frac1{2(n+1)\log(n+1)}=+\infty,$$ so the second Borel–Cantelli lemma gives $$\mathbb P\left(\frac{|S_n|}n\geqslant \frac12 \text{ infinitely often}\right)=1,$$ and hence $$\mathbb P\left(\lim_{n\to\infty} \frac{S_n}n = 0\right)<1.$$
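The divergence feeding Borel–Cantelli is very slow: the partial sums of $\sum\frac1{2(n+1)\log(n+1)}$ grow like $\frac12\log\log N$ plus a constant. A throwaway Python check (values of $N$ chosen purely for illustration):

```python
import numpy as np

# Partial sums of sum_{n <= N} 1/(2(n+1)log(n+1)): they grow like
# (1/2) log log N plus a constant, hence diverge -- exactly what the
# second Borel-Cantelli lemma needs.
for N in [10**2, 10**4, 10**6]:
    n = np.arange(1, N + 1)
    partial = np.sum(1.0 / (2 * (n + 1) * np.log(n + 1)))
    print(N, round(partial, 3), round(0.5 * np.log(np.log(N)), 3))
```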

Math1000
  • 38,041
  • Is it also true that $\mathbb{P}(\lim_{n\rightarrow\infty}\frac{S_n}{n}=0) = 0$? (instead of just $<1$) Thanks. – orangecat Oct 28 '22 at 19:55
2

To show the convergence to $0$ is not almost sure, note $$\{X_n = n+1\} \subset \{|S_n| \geq n/2\} \cup \{|S_{n-1}| \geq n/2\}.$$ The probabilities of the events on the left have infinite sum, and these events are independent. By the second Borel–Cantelli lemma, it therefore occurs with probability $1$ that $|S_n|/n \geq 1/2$ infinitely often.

For the convergence in probability I quote the following result from Chung's "A Course in Probability Theory":

Suppose $(X_n)$ is a sequence of independent random variables with distribution functions $F_n$. Let $(b_n)$ be a sequence of real numbers increasing to $\infty$. Suppose that

$$\sum_{j=1}^n \int_{|x|>b_n} dF_j(x) = o(1) $$

and that

$$\frac{1}{b_n^2} \sum_{j=1}^n \int_{|x| \leq b_n} x^2 dF_j(x) = o(1) .$$

Define some new constants

$$a_n := \sum_{j=1}^n \int_{|x|\leq b_n} x dF_j(x) .$$

Then the following convergence holds in probability:

$$\frac{1}{b_n} (S_n - a_n) \rightarrow 0. $$

(This is easy to show using truncated random variables.) In your case, take $b_n = n+1$. Then the first sum above vanishes identically, because $|X_j| \leq j+1 \leq n+1 = b_n$ for every $j \leq n$; so it is certainly $o(1)$. The quantity on the LHS of the second condition is $\frac{1}{(n+1)^2}\sum_{j=1}^n \frac{j+1}{\log(j+1)}$, which is of order

$$ \frac{\operatorname{li}(n)}{n+1}, $$

where $\operatorname{li}$ is the logarithmic integral function (see Wikipedia). According to Wikipedia,

$$\operatorname{li}(n) = O(n/\log n) .$$

So

$$\frac{\operatorname{li}(n)}{n+1} = O(1/\log n) = o(1). $$ (A quick numerical check of this decay is sketched at the end of this answer.)

By the symmetry of your variables, the $a_n$ defined above equal zero. This gives

$$\frac{S_n}{n+1} \rightarrow 0 $$

in probability, and since $(n+1)/n \rightarrow 1$, certainly $S_n/n \rightarrow 0$ in probability as well.

Note: you could also just use $b_n=n$ to avoid this last step in the argument, but then there will be a non-zero term in the first sum (coming from $j=n$, since $|X_n|=n+1>n$ on the event $\{X_n\neq 0\}$).
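To see the second condition concretely with $b_n=n+1$: the truncated second moment of $X_j$ is exactly $\frac{j+1}{\log(j+1)}$, and a throwaway Python check (illustrative only) confirms that the left-hand side decays like a constant times $1/\log n$:

```python
import numpy as np

# Second Chung condition with b_n = n + 1: the truncated second moment of
# X_j is (j+1)/log(j+1), so the condition reads
#   (1/(n+1)^2) * sum_{j=1}^n (j+1)/log(j+1)  ->  0.
for n in [10**2, 10**4, 10**6]:
    j = np.arange(1, n + 1)
    lhs = np.sum((j + 1) / np.log(j + 1)) / (n + 1) ** 2
    print(n, lhs, lhs * np.log(n))   # lhs * log(n) stays bounded (-> 1/2)
```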

Frank
  • 3,974