
In a post I found, it says:

Whenever ${\rm E}(X)$ exists (finite or infinite), the strong law of large numbers holds. That is, if $X_1,X_2,\ldots$ is a sequence of i.i.d. random variables with finite or infinite expectation, letting $S_n = X_1+\cdots + X_n$, it holds $n^{-1}S_n \to {\rm E}(X_1)$ almost surely. The infinite expectation case follows from the finite case by the monotone convergence theorem.

Can someone give a reference or an answer to this question? I want to prove that:

If $EX_{k}^{+}=\infty$ and $EX_{k}^{-}<\infty$, then $n^{-1}S_{n}\to\infty$ a.s.

JRC
Lambda87
  • If $\mathbb{E} X_k^+ = \infty$, then $\mathbb{E}X_k = \infty$ or it does not exist. – Hetebrij Feb 07 '16 at 08:56
  • Note that $S_n\geqslant S_n^x$ for every $x$, where $S_n^x=X_1^x+\cdots+X_n^x$ and $X_n^x=\inf(X_n,x)$ for every $n$, thus $\liminf S_n/n\geqslant\lim S_n^x/n=E(X_1^x)$ almost surely, then the limit $E(X_1^x)\to+\infty$ when $x\to\infty$ yields the conclusion. – Did Feb 07 '16 at 08:58
  • Thanks. But where are we using monotone convergence here? You took some shortcuts, so I can't follow the answer. Can you write out the arguments you used? – Lambda87 Feb 07 '16 at 09:10
  • $X_1^n \le X_1^{n+1}$ for all $n \in \mathbb{N}$, so $\lim_{n \to \infty} \mathbb{E} X_1^n = \mathbb{E} \lim_{n \to \infty} X_1^n = \mathbb{E} X_1 = \infty$. – Hetebrij Feb 07 '16 at 09:14
  • So, if I take all the comments together, I get the following: $n^{-1}S_{n}=\lim_{k\to\infty}n^{-1}S_{n}^{k}\to\lim_{k\to\infty}E(X_{1}^{k})=E(X_{1}) = \infty$. Am I right? – Lambda87 Feb 07 '16 at 09:23
  • More precisely, for all $k \in \mathbb{R}$: $n^{-1}S_{n} \ge n^{-1}S_{n}^{k}\to E(X_{1}^{k})$. So in particular you can consider the $\limsup$, which is the limit if it exists, to get $n^{-1}S_{n}=\limsup_{k\to\infty}n^{-1}S_{n}^{k}\to\limsup_{k\to\infty}E(X_{1}^{k}) = \lim_{k \to \infty} E(X_1^k) =E(X_{1}) = \infty$. – Hetebrij Feb 07 '16 at 10:12
  • Thanks @Hetebrij. This step, $\lim_{k\to\infty}E(X_1^k)=E(X_1)$, is from monotone convergence, right? – Lambda87 Feb 07 '16 at 10:20
  • Yes, that is monotone convergence. – Hetebrij Feb 07 '16 at 13:14
  • $EX_k^-<\infty$ excludes $EX_k^-=-\infty$ or not? And something else: Were you asking how to prove the second highlighted statement, or were you asking about the use of the monotone convergence theorem in the first statement? – Jimmy R. Feb 07 '16 at 17:38
  • @Hetebrij Don't we need non-negativity for MCT? – user428487 May 31 '18 at 13:36
  • @user428487 We do, but we can consider the R.V. $Y^k = \begin{cases} X^k_1 & X_1 > 0 \\ 0 & X_1 < 0 \end{cases}$. Then we have $X^k_1 = Y^k - X_1^-$, thus we can use linearity of the expectation to split the expectation into $Y^k$ and $X_1^-$, where the latter is independent of $k$, so we can take the limit inside the expectation. For the former, $Y^k$, we can use MCT to switch limit and expectation. Then take the two expectations together again, and we have switched the expectation and limit. – Hetebrij Jun 01 '18 at 14:00
  • I have a full proof that can be found here: https://math.stackexchange.com/questions/3441217/lower-and-upper-bounds-for-expected-value/3441913#3441913 – Xiaohai Zhang Nov 19 '19 at 09:51
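
Assembling the comments above into a single sketch (notation as in Did's comment: $X_n^x=\inf(X_n,x)$, $S_n^x=X_1^x+\cdots+X_n^x$; note the splitting $X_1^x=\inf(X_1^+,x)-X_1^-$ is valid for $x\geq 0$): since $S_n\geq S_n^x$ and the $X_n^x$ are i.i.d. and integrable ($E|X_1^x|\leq x+E(X_1^-)<\infty$), the usual SLLN gives, for every fixed $x\geq 0$,

$$\liminf_{n\to\infty}\frac{S_n}{n}\;\geq\;\lim_{n\to\infty}\frac{S_n^x}{n}\;=\;E(X_1^x)\quad\text{a.s.}$$

By monotone convergence applied to $\inf(X_1^+,x)\uparrow X_1^+$ (the negative part does not depend on $x$ and is integrable),

$$E(X_1^x)=E\big(\inf(X_1^+,x)\big)-E(X_1^-)\;\xrightarrow[x\to\infty]{}\;E(X_1^+)-E(X_1^-)=+\infty,$$

so letting $x\to\infty$ along integers in the first display gives $S_n/n\to\infty$ almost surely.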

1 Answer


The answer is positive. Reference: the first page of K. B. Erickson, "The Strong Law of Large Numbers When the Mean Is Undefined," Transactions of the American Mathematical Society, Vol. 185, November 1973.

Also Theorem 2.4.5 in R. Durrett, Probability: Theory and Examples (there is a proof there, which coincides with Did's proof in the comments above).
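
Not a substitute for the references, but here is a quick numerical sanity check of the statement (a minimal sketch; the particular heavy-tailed distribution, seed, and sample size are arbitrary choices of mine): take $X_k = P_k - E_k$ with $P_k$ Pareto-type with tail index $1/2$ (so $EX_k^+ = \infty$) and $E_k$ exponential (so $EX_k^- \leq E(E_k) = 1 < \infty$); the running average $S_n/n$ should keep growing.

```python
import numpy as np

# Quick numerical sanity check (not a proof) of: E[X^+] = inf, E[X^-] < inf  =>  S_n/n -> inf a.s.
# Here X_k = P_k - E_k where
#   P_k has tail P(P > t) = t^(-1/2) for t >= 1   => E[X_k^+] = infinity,
#   E_k is Exponential(1)                         => E[X_k^-] <= E[E_k] = 1 < infinity.
# Distribution, seed and sample size are arbitrary choices for illustration.

rng = np.random.default_rng(0)
n = 10**6

u = 1.0 - rng.random(n)                    # uniform on (0, 1]
pareto = u ** (-2.0)                       # inverse-CDF sampling: P(pareto > t) = t^(-1/2), t >= 1
expo = rng.exponential(scale=1.0, size=n)
x = pareto - expo

running_mean = np.cumsum(x) / np.arange(1, n + 1)

# S_n / n along a geometric grid of n; it should keep increasing.
for k in range(2, 7):
    idx = 10**k - 1
    print(f"n = 10^{k}:  S_n/n = {running_mean[idx]:.1f}")
```

With tail index $1/2$ the partial sums grow roughly like $n^2$, so $S_n/n$ grows roughly linearly, which makes the divergence visible even for moderate $n$.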

Botnakov N.