
I recently asked a question and got a great answer that involved proving that "X is almost surely one of the roots of P".

I know (now) that "almost surely" means "with probability 1", but I've never understood why that phrase exists.

When something has a probability of 1, it's going to happen, no almost about it. What's the story there? I've tried looking this up online, but I just got more definitions, not meaningful explanations.

Jerry Guern
    Suppose you pick a number at random between $0$ and $1$. The probability is $1$ that you will not pick $.5$. Yet, you can't rule out the possibility of picking $.5$. – littleO Sep 20 '15 at 03:37
    Probability-0 events can happen: what's the probability that a uniformly distributed random variable $U[0,1]$ takes the value $a$, for a given $a \in [0,1]$? – Chappers Sep 20 '15 at 03:39

3 Answers


In terms of the sample space $Ω$, an event $E$ happens almost surely if $P(E) = 1$, whereas it happens surely if $E = Ω$.

An example: suppose we are (independently) flipping a (fair) coin infinitely many times. The event

$$\{ \text{I will get heads infinitely often}\}$$ is an almost sure event: it is possible to get only a finite number of heads, but the probability of that is $0$. (A rigorous proof uses the second Borel–Cantelli lemma, if you are interested.)

In contrast, $$\{\text{ I will get heads or tails on my 16th flip} \}$$ must happen. This is a sure event.
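The distinction can be illustrated with a small simulation (a sketch only: a finite run cannot demonstrate an almost-sure statement about infinitely many flips, but it shows the two kinds of event side by side; all names here are illustrative):

```python
import random

random.seed(0)  # reproducible demo

# Simulate a long (finite) prefix of the infinite sequence of fair coin flips.
flips = [random.choice("HT") for _ in range(10_000)]

# The almost-sure event: heads keeps occurring as we flip more coins.
heads_positions = [i for i, f in enumerate(flips) if f == "H"]
print(len(heads_positions))    # roughly half the flips are heads

# The sure event: the 16th flip (index 15) is heads or tails -- by construction,
# no outcome in the sample space can violate this.
print(flips[15] in ("H", "T"))
```

The second check is true for every possible outcome, which is what "surely" means; the first only fails on outcomes that together have probability $0$.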

Calvin Khor
    I think you meant that the negation of the first event is almost sure. – yoann Sep 20 '15 at 09:19
  • As in? $P(\text{heads i.o}) = 0$? – Calvin Khor Sep 20 '15 at 09:20
@yoann: the second Borel–Cantelli lemma (https://en.wikipedia.org/wiki/Borel–Cantelli_lemma) states that if $A_n$ are independent events with $∑P(A_n) = ∞$, then $A_n$ occurs infinitely often almost surely (the event "$A_n$ i.o." is precisely $\limsup_{n→∞} A_n$). Here $A_n = \{\text{$n$th flip is heads}\}$ and $P(A_n) = 1/2$, so we do indeed have $P(A_n\ \text{i.o.}) = 1$. – Calvin Khor Sep 20 '15 at 09:38
    @yoann I also misread that; you need to read it as "I will get an infinite number of heads, ignoring any tails that I get" rather than "I will only get heads" – Dave Sep 20 '15 at 09:58
  • @Dave Thanks, I indeed read "I will only get tails". Yes of course, we will have an infinite number of heads with probability 1. – yoann Sep 20 '15 at 10:04
@Dave Ah, I see the confusion. Sorry :) In my defense, 'infinitely often' is reasonably standard notation. The situation is not unlike the fact that as I move along the positive integers $n=1,2,3,…$, $n$ will be even infinitely often, but I make no claims that it happens all the time (or even at what frequency!). – Calvin Khor Sep 20 '15 at 10:07
  • @yoann I think I can only tag one person per comment so please see the above. Sorry for the confusion :) – Calvin Khor Sep 20 '15 at 10:09
  • I don't intend to bump this question but I suppose I should have said "This is what might be called a sure event." – Calvin Khor Mar 02 '20 at 13:02

There is a difference between "almost surely" and "surely."

Consider choosing a real number uniformly at random from the interval $[0,1]$. The event "$1/2$ will not be chosen" has probability $1$, but is not impossible.
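A quick numerical illustration of this example (a sketch, not a proof: `random.random()` draws from a finite set of doubles, so it only approximates the continuous uniform distribution; hitting exactly $0.5$ is possible in principle but has probability about $2^{-53}$ per draw):

```python
import random

random.seed(1)  # reproducible demo

# Draw a million "uniform" samples from [0, 1) and check whether any
# of them is exactly 0.5.
hit = any(random.random() == 0.5 for _ in range(1_000_000))
print(hit)
```

With overwhelming probability this prints `False`, which is the point: the event "$1/2$ is chosen" is not impossible, yet it essentially never happens.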

I recommend reading the relevant Wikipedia article, which I found very clarifying when I was learning probability.

Potato
    To clarify the difference between 'surely' and 'almost surely': if you pick a real number uniformly at random from $[0,1]$, you will almost surely pick a number larger than zero, and surely pick a nonnegative number. – Hugh Sep 20 '15 at 12:43

There is a subtle difference.

A sure event, one that must occur, has probability measure $1$. But an event with probability $1$ need not be sure: it may fail to occur, and the event that it does not occur has probability measure $0$ without being empty.

Note that probability is not about what has already happened; if it were, we would not need probability theory. Probability is about what may happen.

So it is not that a random walk must return to the origin, but that it returns to the origin with probability $1$.
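The random-walk claim can be probed numerically (a sketch under assumptions: this simulates the simple $\pm 1$ walk on the integers, and a finite simulation can only show the fraction of returning walks creeping toward $1$, never reaching the almost-sure limit):

```python
import random

random.seed(2)  # reproducible demo

def returns_to_origin(max_steps: int) -> bool:
    """Run a simple +/-1 random walk from 0; report whether it revisits 0."""
    pos = 0
    for _ in range(max_steps):
        pos += random.choice((-1, 1))
        if pos == 0:
            return True
    return False

trials = 2_000
frac = sum(returns_to_origin(10_000) for _ in range(trials)) / trials
print(frac)  # close to 1, but a few walks have not returned yet
```

Increasing `max_steps` pushes the fraction closer to $1$, yet for any finite horizon some walks remain out, which is exactly the gap between "with probability $1$" and "must".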

For more, see the original axiomatic treatment of probability by A. Kolmogorov.

Yes