Let $(a_n)_n$ be a decreasing convergent sequence with $\lim_n a_n = a$ and $(b_n)_n$ a convergent sequence with $\lim_n b_n = b$. Let $(\Omega, \mathcal{F}, \mathbb{P})$ be a probability space and let $X$, $Y$ be random variables on it.
Is there a way to show
$$a_n + b_n Y \ge X \ \ [\mathbb{P}]\ \ \text{for all}\ \ n \quad \implies\quad a+bY \ge X \ \ [\mathbb{P}].$$
We know that for every $n$ there exists a set $\Omega_n \in \mathcal{F}$ with $\mathbb{P}(\Omega_n)=1$ such that $a_n + b_n Y(\omega) \ge X(\omega)$ for every $\omega \in \Omega_n$. I was thinking of using the $\sigma$-continuity of the probability measure; however, I don't know how to show the required inclusions $\Omega_{n+1} \subset \Omega_n$. With those inclusions I would have:
$$\begin{aligned} 1 &= \mathbb{P}\left(\bigcap_n \left\{a_n + b_n Y \ge X \right\}\right) \\ &= \lim_n \ \mathbb{P}\big(a_n + b_n Y \ge X \big) \\ &= \mathbb{P}\big(a + b Y \ge X \big).\end{aligned}$$
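(Perhaps the monotone inclusions can be avoided altogether? A sketch of an alternative I considered, using only countable subadditivity; the set $\Omega_\infty$ below is my own notation:)

```latex
Let $\Omega_\infty := \bigcap_n \Omega_n$. Since each complement $\Omega_n^c$ is a null set,
$$\mathbb{P}(\Omega_\infty^c) = \mathbb{P}\Big(\bigcup_n \Omega_n^c\Big)
  \le \sum_n \mathbb{P}(\Omega_n^c) = 0,$$
so $\mathbb{P}(\Omega_\infty) = 1$. For every $\omega \in \Omega_\infty$ the inequality
$a_n + b_n Y(\omega) \ge X(\omega)$ holds for all $n$ simultaneously, so letting
$n \to \infty$ gives $a + b\,Y(\omega) \ge X(\omega)$, i.e. $a + bY \ge X$ $[\mathbb{P}]$.
```

I am not sure whether this is rigorous, or whether the continuity-of-measure route is still needed.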
Thanks for reading, and for any ideas.