
Considering:

Almost sure convergence as $X_n \xrightarrow[]{\text{a.s.}} X \Leftrightarrow P(\lim_{n \rightarrow \infty}X_n=X)=1$ and

Mean $L_p$-convergence as $X_n \xrightarrow[]{\text{mean}} X \Leftrightarrow \mathbb{E}|X_n-X|^p \rightarrow 0$

Are there examples of random variables that converge in mean but do not converge almost surely, and vice versa: converging a.s. (in the sense of the definition above) but not converging in mean?

Could you help me with this? (I'm doing this as part of building an implication tree for the types of convergence of random variables. I've used the well-known Riesz example and similar constructions in my proofs, but I have no ideas for these two: I want to show that mean (so-called $L_p$) convergence does not necessarily imply a.s. convergence, and vice versa.)

StubbornAtom
  • Surely your definition of almost sure convergence should have the probability equalling $1$? – jlammy Jan 28 '21 at 21:32
  • https://math.stackexchange.com/questions/3959062/probability-density-functions-that-demonstrate-the-difference-between-the-weak-a/3959071#3959071 – Ian Jan 28 '21 at 21:33
  • @jlammy oddly yes.. – 9cloudalpha Jan 28 '21 at 21:49
  • What do you mean by "oddly yes"? The thing that you wrote, with $0$ on the RHS, is literally the complete opposite of convergence. I'm sure it's just a typo in your text -- it should be $1$ on the RHS. – jlammy Jan 28 '21 at 21:52
  • @jlammy maybe it is not exactly a.s convergence, I'm living not in the USA/England/etc and we call it here a.s convergence) – 9cloudalpha Jan 28 '21 at 21:56
  • Like I said, I think that it's just a typo -- the RHS must be $1$. Think about it this way -- say that $X_n=1$ identically. Then under your definition, the $X_n$ don't converge to the rv $X=1$! In other words, even the constant sequence doesn't converge under your definition! If there are mathematically literate aliens out there, I'm sure that they'd like their constant sequences to converge too. – jlammy Jan 28 '21 at 21:58
  • @jlammy, oh, sorry, sleepy now, of course it is 1 there. However during my attempts I was assuming 1 ofc. – 9cloudalpha Jan 28 '21 at 22:00

1 Answer


Consider $$X_n=\begin{cases}n^3 & \text{with probability }n^{-2} \\ 0 & \text{with probability }1-n^{-2}.\end{cases}$$ Then $X_n\to0$ almost surely: indeed, $\sum\mathbb P(\lvert X_n\rvert>\varepsilon)\leq\sum n^{-2}$ which converges, so we can apply Borel-Cantelli. But $\mathbb E\lvert X_n-0\rvert=\mathbb E[X_n]=n^3\cdot n^{-2}=n\to\infty$, so $X_n$ does not converge to $0$ in mean.
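As a quick sanity check (not part of the proof), here is a small Python sketch of this example: it samples one trajectory of $X_n$ and lists the indices where $X_n\neq0$, which by Borel-Cantelli should be finite in number, while the exact expectation $\mathbb E[X_n]=n$ grows without bound.

```python
import random

random.seed(0)

# One sample path of X_n for n = 1..N:
# X_n = n^3 with probability n^{-2}, and 0 otherwise.
N = 100_000
nonzero_indices = [n for n in range(1, N + 1) if random.random() < n ** -2]

# sum n^{-2} converges (to pi^2/6 < 1.645), so by Borel-Cantelli only
# finitely many X_n are nonzero a.s. -- the tail of this path is all zeros.
print("indices with X_n != 0:", nonzero_indices)

# Exact expectation: E[X_n] = n^3 * n^{-2} = n, which diverges,
# so there is no L^1 (mean) convergence.
expectations = [n ** 3 // n ** 2 for n in (1, 10, 100, 1000)]
print("E[X_n] for n = 1, 10, 100, 1000:", expectations)
```

The point of the simulation is only to make the two behaviors visible side by side: the sampled path settles at $0$ after finitely many spikes, while the (deterministic) expectations diverge.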


Now consider independent random variables $$Y_n=\begin{cases}1 & \text{with probability }n^{-1} \\ 0 & \text{with probability }1-n^{-1}.\end{cases}$$ Then $\mathbb E\lvert Y_n-0\rvert=\mathbb E[Y_n]=1/n\to0$, but $Y_n$ doesn't converge a.s. Indeed, as $\sum\mathbb P(Y_n=1)=+\infty$ and the events $\{Y_n=1\}$ are independent, the second Borel-Cantelli lemma gives that $$\mathbb P\left(\limsup_{n\to\infty}\{Y_n=1\}\right)=1,$$ so we can't have $Y_n\to0$ almost surely.
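The second example can be sketched the same way: on a sampled path the ones keep recurring (their count up to $N$ grows roughly like the harmonic sum $H_N\approx\log N$), even though $\mathbb E[Y_n]=1/n\to0$. This is an illustration, not a proof, since any finite simulation only shows a prefix of the path.

```python
import math
import random

random.seed(1)

# One sample path of independent Y_n: Y_n = 1 with probability 1/n, else 0.
N = 200_000
ones = [n for n in range(1, N + 1) if random.random() < 1 / n]

# Second Borel-Cantelli: sum 1/n diverges and the events {Y_n = 1} are
# independent, so Y_n = 1 happens infinitely often a.s.  Empirically,
# the count of ones up to N tracks the harmonic sum H_N ~ log N.
print("number of ones up to N:", len(ones), " log N ~", math.log(N))

# But E|Y_n - 0| = 1/n -> 0, so Y_n -> 0 in mean (L^1).
print("E[Y_N] = 1/N =", 1 / N)
```

Note that `Y_1 = 1` always (probability $1/1$), and the late ones in the list are exactly what blocks almost sure convergence: no matter how far out you look, another $1$ eventually appears.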

jlammy