
Hi, I am writing because I have been trying to understand the difference between the two definitions for some time. According to Wikipedia, the definition of convergence in probability is:

A sequence $\{X_n\}$ of random variables converges in probability towards the random variable $X$ if for all $\varepsilon > 0$

$$\lim_{n\to\infty} \operatorname{Pr}\big(\lvert X_n - X \rvert > \varepsilon\big) = 0.$$

And for almost sure convergence:

To say that the sequence $X_n$ converges almost surely (or almost everywhere, or with probability 1, or strongly) towards $X$ means that

$$\operatorname{Pr}\!\left(\lim_{n\to\infty} X_n = X\right) = 1$$

or

$$\operatorname{Pr}\Big(\limsup_{n\to\infty} \big\{\omega \in \Omega : \lvert X_n(\omega) - X(\omega) \rvert > \varepsilon\big\}\Big) = 0 \quad \text{for all } \varepsilon > 0.$$

To me the two definitions look practically identical, so I can't understand what the difference between the two notions is.

Can someone help me?

MarcoDJ01
  • The difference is technical, and you can only understand it through measure theory. Unfortunately, probability theory cannot be understood with intuition alone. – Kavi Rama Murthy Sep 30 '22 at 11:25
  • @geetha290krm The difference is not only technical but also practical, and an intuition for the difference can readily be built using simple examples of sequences that converge in probability but do not converge almost surely. I believe it is never good in mathematics to try to work with 'technicalities' in the absence of a fairly clear intuition behind the 'technicalities'. – Julian Newman Jul 15 '24 at 17:06
  • Convergence in probability means for all $\epsilon>0$ the chance of the event $|X_n-X|>\epsilon$ tends to zero as $n\to\infty$. Almost sure convergence means for almost every $\omega\in \Omega$, $\lim_{n\to\infty} X_n(\omega) =X(\omega)$. You said the definitions are practically identical, but you're glossing over the fact that one of them has $\lim_n \mathbb{P}(...)$ and the other has $\mathbb{P}(\lim_n...)$; the interchange of limits is not always valid! – Nap D. Lover Dec 04 '24 at 20:08

2 Answers


Almost sure convergence is strictly stronger than convergence in probability. Perhaps an example of a sequence converging in probability but not almost surely might help you? See the first answer to this question, for example. Heuristically, convergence in probability does not rule out that, for almost every outcome $\omega\in\Omega$, a large deviation of $X_n(\omega)$ from the limit $X(\omega)$ may be observed for arbitrarily large $n$; a small simulation of one standard such example is sketched below.
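
For intuition, here is a minimal Python sketch of one standard such example, the so-called "typewriter" sequence (an illustration I am adding, not necessarily the example at the link; the helper name `typewriter_hits` is mine). On $\Omega = [0, 1)$ with Lebesgue measure, block $k$ consists of the indicators of the intervals $[(j-1)/k,\, j/k)$ for $j = 1, \dots, k$, enumerated one after another.

```python
import random

# "Typewriter" sequence on Omega = [0, 1) with Lebesgue measure:
# block k lists the indicators of [(j-1)/k, j/k) for j = 1, ..., k.
def typewriter_hits(omega, max_block):
    """Return the indices n with X_n(omega) = 1, and P(X_n = 1) for each n."""
    hits, probs, n = [], [], 0
    for k in range(1, max_block + 1):
        for j in range(1, k + 1):
            n += 1
            probs.append(1.0 / k)             # P(X_n = 1) = interval length
            if (j - 1) / k <= omega < j / k:  # omega lies in the n-th interval?
                hits.append(n)
    return hits, probs

omega = random.random()                        # one fixed sample point
hits, probs = typewriter_hits(omega, 300)
print(f"P(X_n = 1) at the last index: {probs[-1]:.4f}")   # 1/300, tending to 0
print(f"X_n(omega) = 1 at {len(hits)} indices, the last being n = {hits[-1]}")
```

Here $\Pr(X_n = 1) = 1/k \to 0$, so $X_n \to 0$ in probability; but every fixed $\omega$ is swept over exactly once per block, so $X_n(\omega) = 1$ infinitely often and the sample path does not converge.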


We consider a probability space $(\Omega, \mathcal A, \text{Pr})$, and for concision we define the preimage

$$S_n(\varepsilon) := \{ \omega \in \Omega : \lvert X_n(\omega) - X(\omega) \rvert \geq \varepsilon \} \in \mathcal{A}.$$

With this notation, by definition we have that $$\begin{align}X_n \to X \text{ almost surely} &\iff \operatorname{Pr}( \limsup_{n \to \infty} S_n(\varepsilon)) = 0 \quad \forall \varepsilon > 0, \\ X_n \to X \text{ in probability} &\iff \lim_{n \to \infty} \operatorname{Pr}(S_n (\varepsilon)) = 0 \quad \forall \varepsilon > 0.\end{align}$$

Now, by the reverse Fatou lemma $(*)$, it follows that for all $\varepsilon > 0$

$$0 = \operatorname{Pr}\big( \limsup_{n \to \infty} S_n(\varepsilon)\big) \overset{(*)}{\geq} \limsup_{n \to \infty} \operatorname{Pr}(S_n(\varepsilon)). \tag{1}$$

Naturally $\operatorname{Pr}(S_n(\varepsilon)) \geq 0$ for every $n$, as for any probability, so $(1)$ squeezes the $\limsup$ to $0$; the limit therefore exists and vanishes as well. This finally implies that

$$\operatorname{Pr}( \limsup_{n \to \infty} S_n(\varepsilon)) = 0 \quad \forall \varepsilon > 0 \Longrightarrow \lim_{n \to \infty} \operatorname{Pr}(S_n(\varepsilon)) = 0 \quad \forall \varepsilon > 0. \tag 2$$

In other words, almost sure convergence implies convergence in probability. In turn, the counterexample in Davide Giraudo's answer to this post shows that the converse of $(2)$ does not hold. I transcribe and expand his answer here. Let $(X_n)$ be a sequence of real, independent random variables such that for all $n \in \mathbb N^*$

$$ \text{Pr}(X_n = x) = \begin{cases} 1 - \frac 1n & \text{for $x = 0$}, \\ \frac 1n & \text{for $x = 1$}, \\ 0 & \text{otherwise}. \end{cases} \tag 3$$

We immediately see from $(3)$ that for all $\varepsilon > 0$ we have

$$\lim_{n\to\infty} \text{Pr}(\lvert X_n \rvert \geq \varepsilon) = \begin{cases} \lim_{n\to\infty} \text{Pr}(X_n = 1) = \lim_{n\to\infty} \frac 1n = 0 & \text{for $\varepsilon \leq 1$}, \\ \lim_{n\to\infty} 0 = 0 & \text{for $\varepsilon > 1$}, \end{cases}$$

which implies that $X_n \to 0$ in probability. Furthermore, all the events $\{X_n = 1\}$ are mutually independent, and

$$\sum_{n\geq 1} \text{Pr}(\lvert X_n \rvert = 1) = \sum_{n\geq 1} \frac 1n = \infty,$$

so by the second Borel-Cantelli lemma we can conclude that

$$ \text{Pr}\bigl(\limsup_{n\to\infty} \{\lvert X_n \rvert = 1 \}\bigr) = 1,$$

which is to say that

$$\text{Pr}\bigl(\limsup_{n\to\infty} \{\lvert X_n \rvert \geq \varepsilon \}\bigr) \neq 0 \quad \forall\, \varepsilon \in (0, 1].$$

Consequently, $X_n \to 0$ in probability but not almost surely. That is, convergence in probability does not imply almost sure convergence, so the latter is a strictly stronger condition than the former.
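
As a sanity check, here is a minimal Monte Carlo sketch of this counterexample in Python (the helper name `sample_path` is mine, purely for illustration). The marginal probabilities $1/n$ shrink, yet along each simulated path ones keep appearing at ever larger indices, exactly as the second Borel-Cantelli lemma predicts.

```python
import random

# Independent X_n with P(X_n = 1) = 1/n and P(X_n = 0) = 1 - 1/n.
def sample_path(n_max, rng):
    """Return the indices n <= n_max at which X_n = 1 along one sample path."""
    return [n for n in range(1, n_max + 1) if rng.random() < 1.0 / n]

rng = random.Random(0)
for trial in range(3):
    hits = sample_path(10**6, rng)
    print(f"path {trial}: {len(hits)} ones, last one at n = {hits[-1]}")
# Typical output: roughly log(10^6), i.e. about 14, ones per path, with the
# last one landing near n_max, consistent with X_n = 1 infinitely often a.s.
```

Pushing `n_max` higher never exhausts the ones on a given path, whereas the fraction of paths showing $X_n = 1$ at any fixed large $n$ is tiny; that gap is precisely the difference between the two modes of convergence.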

Albert