Let $S=\{-1,0,1\}$ be the sample space and let $X$ be uniformly distributed on $S$; then $P(X > 0)=\dfrac{1}{3}$.
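Spelled out as a ratio of counting measures $\#$, that is
$$P(X>0)=\frac{\#\{s \in S \mid s>0\}}{\#S}=\frac{\#\{1\}}{\#\{-1,0,1\}}=\frac{1}{3},$$
and it is this kind of ratio I would like to imitate in the continuous case.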
Now here is the problem: let $S=\mathbb{R}$ and let $X$ be a random variable drawn uniformly from $S$. What is the probability $P(X > 0)$? Intuition says it is $\dfrac{1}{2}$, but the Lebesgue measures of $\mathbb{R}$ and $\mathbb{R}_+$ are both $\infty$, so how can we define a probability of the form $\dfrac{\infty}{\infty}$?
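(Presumably the intuition of $\frac{1}{2}$ comes from truncating: the uniform distribution on $[-n,n]$ gives
$$P_n(X>0)=\frac{m\bigl((0,n]\bigr)}{m\bigl([-n,n]\bigr)}=\frac{n}{2n}=\frac{1}{2}$$
for every $n$, but I don't see how to turn that limit into an honest probability measure on all of $\mathbb{R}$.)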
Real analysis tells us that if $S_0, S_1, S_2, \dots \in A$ ($A$ a $\sigma$-algebra) is a decreasing sequence, meaning $S_0 \supset S_1 \supset S_2 \supset \cdots$, and $\textbf{at least one of the sets has finite measure}$, then
$$m\Bigl(\lim_{n \to \infty} S_n\Bigr)=m\Bigl(\bigcap_{n=0}^{\infty} S_n\Bigr)=\lim_{n \to \infty}m(S_n).$$
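As a sanity check, when the finiteness hypothesis does hold this behaves as expected: taking $S_n=\left(0,\frac{1}{n}\right]$ for $n \ge 1$, every $S_n$ has finite measure, and
$$m\Bigl(\bigcap_{n=1}^{\infty} S_n\Bigr)=m(\emptyset)=0=\lim_{n \to \infty}\frac{1}{n}.$$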
Take $S_i = \{x \in \mathbb{R} \mid x>i\}$ for $i=0,1,2,\dots$; then $\{S_i\}$ forms a descending chain, but every set in the chain has infinite measure, so the theorem does not apply. How, then, can we define the probability?
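And indeed the conclusion of the theorem genuinely fails for this chain:
$$m\Bigl(\bigcap_{i=0}^{\infty} S_i\Bigr)=m(\emptyset)=0\neq\infty=\lim_{i \to \infty}m(S_i).$$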
And what if $S = \mathbb{Q}$? Both $\mathbb{Q}$ and $\mathbb{Q}_+$ have Lebesgue measure $0$, so there the candidate ratio is the equally undefined $\dfrac{0}{0}$...
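(Here I am using that $\mathbb{Q}$ is countable and singletons are null, so by countable additivity
$$\lambda(\mathbb{Q})=\lambda\Bigl(\bigcup_{q \in \mathbb{Q}}\{q\}\Bigr)=\sum_{q \in \mathbb{Q}}\lambda(\{q\})=0,$$
and likewise for $\mathbb{Q}_+$.)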
Any help would be appreciated.