
Let ${E_1,E_2,\dots}$ be a sequence of jointly independent events. If ${\sum_{n=1}^\infty {\bf P}(E_n) = \infty}$, show that almost surely an infinite number of the ${E_n}$ hold simultaneously. (Hint: compute the mean and variance of ${S_n= \sum_{i=1}^n 1_{E_i}}$. One can also compute the fourth moment if desired, but it is not necessary to do so for this result.)

Question: From the hint, we want ${\bf P}(\lim_n S_n = \infty) = 1$, i.e., $S_n$ diverges almost surely. However, I don't immediately see how the mean ${\bf E}(S_n) = \sum_{i=1}^n {\bf P}(E_i)$ and the variance ${\bf Var}(S_n) = \sum_{i=1}^n {\bf P}(E_i)(1 - {\bf P}(E_i))$ are applicable here; Chebyshev's inequality does not seem to convey much information.

shark
  • 1,551

2 Answers


By Chebyshev's inequality,
$$ P\left(\vert S_n - E S_n \vert > \tfrac{1}{2}E S_n\right) \le \frac{4\operatorname{Var}(S_n)}{(E S_n)^2} = \frac{4\sum_{k = 1}^n P(E_k)(1 - P(E_k))}{\left(\sum_{k = 1}^n P(E_k)\right)^2} \le \frac{4}{\sum_{k = 1}^n P(E_k)} \rightarrow 0, $$
or, equivalently,
$$ P\left(\vert S_n - E S_n \vert \le \tfrac{1}{2}E S_n\right) \rightarrow 1. $$
But
$$ \left\{\vert S_n - E S_n \vert \le \tfrac{1}{2}E S_n\right\} \subseteq \left\{S_n \ge \tfrac{1}{2}E S_n\right\}, $$
so
$$ P\left(S_n \ge \tfrac{1}{2}E S_n\right) \rightarrow 1. $$
Now, for any $M > 0$, since $E S_n \to \infty$ we can choose $n_0$ such that $\tfrac{1}{2} E S_n \ge M$ for all $n \ge n_0$, and thus
$$ P(S_n \ge M) \ge P\left(S_n \ge \tfrac{1}{2}E S_n\right) \quad \forall n \ge n_0, $$
so $P(S_n \ge M) \to 1$. Since $S_n$ is non-decreasing, $\bigcup_{n \in \mathbb{N}} \{S_n \ge M\}$ has probability at least $\lim_n P(S_n \ge M) = 1$ for each $M$, hence the countable intersection $\bigcap_{M \in \mathbb{N}} \bigcup_{n \in \mathbb{N}} \{S_n \ge M\}$ also has probability $1$. Finally, since
$$ \bigcap_{M \in \mathbb{N}} \bigcup_{n \in \mathbb{N}} \{S_n \ge M\} \subseteq \{\lim_n S_n = \infty\}, $$
we can conclude that $\mathbb{P}(\lim_n S_n = \infty) = 1$.
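The concentration step above can be checked numerically. Below is a rough Monte Carlo sketch (my own illustration, not part of the answer): it takes independent events with ${\bf P}(E_k) = 1/k$, an arbitrary choice whose probabilities have divergent sum, and estimates $P(S_n \ge \tfrac{1}{2}E S_n)$ for growing $n$.

```python
import random

# Monte Carlo sketch: independent events E_k with P(E_k) = 1/k
# (an illustrative choice; any p_k with divergent sum works).
# We estimate P(S_n >= E[S_n]/2), which the Chebyshev argument
# shows tends to 1 as n grows.

def estimate(n, trials=2000, seed=0):
    rng = random.Random(seed)
    mean = sum(1.0 / k for k in range(1, n + 1))  # E[S_n] = sum of p_k
    hits = 0
    for _ in range(trials):
        # simulate S_n = number of events among E_1, ..., E_n that occur
        s = sum(1 for k in range(1, n + 1) if rng.random() < 1.0 / k)
        if s >= mean / 2:
            hits += 1
    return hits / trials

for n in (10, 100, 1000):
    print(n, estimate(n))  # estimates increase toward 1 as n grows
```

Since $E S_n$ grows only like $\log n$ for this choice of $p_k$, the convergence is visible but slow.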


One can use the Paley-Zygmund inequality (the second moment method). As the sequence $S_n$ is non-decreasing, $\lim_n S_n = \lim \sup_{n \to \infty} S_n$, and it suffices to show that ${\bf P}(\lim \sup_{n \to \infty} S_n = \infty) = 1$.

Fix some large $M > 0$ and some small $0 < \theta < 1$, and let $n' = n'(\theta)$ be large enough that $\theta{\bf E}S_{n'} > M$ (possible since ${\bf E}S_n \to \infty$). For all $n \geq n'$, the Paley-Zygmund inequality gives ${\bf P}(S_n > M) \geq {\bf P}(S_n > \theta {\bf E}S_n) \geq (1 - \theta)^2\frac{({\bf E}S_n)^2}{{\bf E}|S_n|^2}$. Now ${\bf E}|S_n|^2 = ({\bf E}S_n)^2 + {\bf Var}(S_n)$, and by the independence hypothesis ${\bf Var}(S_n) = \sum_{i=1}^n {\bf P}(E_i)(1 - {\bf P}(E_i)) \leq {\bf E}S_n$, so $\frac{({\bf E}S_n)^2}{{\bf E}|S_n|^2} \geq \frac{({\bf E}S_n)^2}{({\bf E}S_n)^2 + {\bf E}S_n} \to 1$. Hence $\liminf_n {\bf P}(S_n > M) \geq (1 - \theta)^2$, and since $\theta$ was arbitrary, ${\bf P}(S_n > M) \to 1$.
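As a sanity check (my own addition, again with the illustrative choice $p_k = 1/k$; any $p_k$ with divergent sum behaves similarly), the second-moment ratio $({\bf E}S_n)^2/{\bf E}|S_n|^2 = ({\bf E}S_n)^2/\big(({\bf E}S_n)^2 + {\bf Var}(S_n)\big)$ can be computed exactly and watched as it tends to $1$:

```python
# Exact computation of the ratio (E S_n)^2 / E[S_n^2] from the
# Paley-Zygmund bound, for the illustrative choice p_k = 1/k.
# E[S_n^2] = (E S_n)^2 + Var(S_n), with Var(S_n) = sum p_k (1 - p_k).

def pz_ratio(n):
    mean = sum(1.0 / k for k in range(1, n + 1))
    var = sum((1.0 / k) * (1.0 - 1.0 / k) for k in range(1, n + 1))
    return mean * mean / (mean * mean + var)

for n in (10, 100, 10000):
    print(n, pz_ratio(n))  # ratio increases toward 1 as n grows
```

Because ${\bf Var}(S_n) \leq {\bf E}S_n$, the ratio is at least $({\bf E}S_n)^2/(({\bf E}S_n)^2 + {\bf E}S_n)$, which goes to $1$ whenever $\sum_k p_k$ diverges.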

We thus obtain ${\bf P}(\lim \sup_{n \to \infty} S_n > M) = \lim_{N \to \infty}{\bf P}\left(\bigvee_{n \geq N}(S_n > M)\right) \geq \lim_{N \to \infty} {\bf P}(S_N > M) = 1$ by continuity from above. Since $M$ is arbitrary, another application of continuity from above gives ${\bf P}(\lim \sup_{n \to \infty} S_n = \infty) = 1$, and the claim follows.

shark
  • 1,551