
I would like to prove the following result: let $(X_n)_n$ be a sequence in $L^{1}(\Omega,\mathcal{F}, \mathbb{P})$ that converges almost surely to $X\in L^1$. If $(X_n)_n$ is uniformly integrable, then $X_n$ converges to $X$ in $L^1$.
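
(To convince myself that the uniform integrability hypothesis cannot be dropped, I looked at the classical counterexample $X_n = n\,1_{[0,1/n]}$ on $([0,1],\text{Leb})$: it converges to $0$ almost surely, yet $\mathbb{E}\lvert X_n\rvert = 1$ for every $n$, so there is no $L^1$ convergence. Below is a minimal numerical sketch in plain Python; the sample size and seed are arbitrary choices.)

```python
import random

# Classical counterexample on ([0,1], Lebesgue): X_n = n * 1_{[0, 1/n]}.
# X_n(omega) -> 0 for every omega > 0 (hence almost surely), yet
# E|X_n| = 1 for all n, so X_n does not converge to 0 in L^1:
# the family (X_n) is not uniformly integrable.

random.seed(0)
omegas = [random.random() for _ in range(200_000)]  # omega ~ Uniform[0, 1]

def X(n, w):
    return n if w <= 1.0 / n else 0.0

for n in (1, 10, 100, 1000):
    mean_abs = sum(abs(X(n, w)) for w in omegas) / len(omegas)
    # E|X_n| stays near 1 even though X_n -> 0 almost surely.
    print(f"n={n:5d}  E|X_n| ~ {mean_abs:.3f}")
```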

Here is my attempt at the first implication:

By almost sure convergence we have

$$ \mathbb{P}\Big(\big\{\omega : \forall k\in\mathbb{N}^{*}, \exists N\in\mathbb{N}, \forall n\geq N, \lvert X_n(\omega) - X(\omega)\rvert\leq\tfrac{1}{k}\big\}\Big) = \mathbb{P}\Big(\bigcap_{k\geq 1}\big\{\omega : \exists N\in\mathbb{N}, \forall n\geq N, \lvert X_n(\omega) - X(\omega)\rvert\leq\tfrac{1}{k}\big\}\Big) = 1 $$

If we denote $A_k = \{\omega : \exists N\in\mathbb{N}, \forall n\geq N, \lvert X_n(\omega) - X(\omega)\rvert\leq\frac{1}{k}\}$, we see that $\mathbb{P}(A_k)=1$ and $\mathbb{P}(A_{k}^{c})=0$ for all $k\geq 1$.

Moreover

$$ \lvert X_n - X\rvert \leq \lvert X_n\rvert1_{A_{k}^{c}} + \lvert X\rvert1_{A_{k}^{c}} + \lvert X_n - X\rvert1_{A_{k}} $$

Now take $\epsilon>0$. By uniform integrability of $(X_n)_n$ (together with the integrability of $X$) we can find a $\delta>0$ such that $\mathbb{P}(B)\leq\delta$ implies that $\mathbb{E}(\lvert X_n\rvert 1_{B})$ and $\mathbb{E}(\lvert X \rvert 1_{B})$ are both less than $\frac{\epsilon}{3}$. Clearly $A_{k}^{c}$, having probability $0$, always satisfies this condition. On the other hand, for $k$ large enough that $\frac{1}{k}\leq\frac{\epsilon}{3}$, on $A_k$ there exists $N$ such that for all $n\geq N$ we have $\lvert X_n(\omega) - X(\omega)\rvert\leq\frac{1}{k}\leq\frac{\epsilon}{3}$.

Thus we have

$$ \mathbb{E}(\lvert X_n - X\rvert)\leq 3 \cdot \frac{\epsilon}{3} = \epsilon \quad\text{for all } n\geq N $$

I would like to know whether what I did is correct, and if not, to get some hints so I can keep working on this.

Thank you very much!

G2MWF
  • Your $N$ depends on $\omega$, so you cannot go from the second-to-last inequality to the last one, as the bound does not hold for all $\omega$ simultaneously. And since $1_{A_k} = 1$ almost surely, it is useless to use the $A_k$'s in your proof. – Mason Oct 11 '23 at 01:07
  • @Mason Clearly, you are right, thank you. – G2MWF Oct 11 '23 at 06:16

2 Answers

3

This is the cleanest proof I know of; I wrote it up during my graduate coursework and have used it ever since.

The first reduction, as also done by Kakashi, is to pass to $Y_{n} = X_{n}-X$, which is u.i. (uniformly integrable); since almost sure convergence implies convergence in probability, it then suffices to show that if $Y_{n}\xrightarrow{P} 0$ and $(Y_{n})$ is u.i., then $Y_{n}\xrightarrow{L^{1}}0$.

First fix $\epsilon>0$ and, using uniform integrability, find $M>0$ such that $\sup_{n}E(|Y_{n}|\mathbf{1}_{|Y_{n}|>M})<\epsilon$

Now, $E(|Y_{n}|)=E(|Y_{n}|\mathbf{1}_{|Y_{n}|>M})+E(|Y_{n}|\mathbf{1}_{|Y_{n}|\leq M})\leq \epsilon+E(|Y_{n}|\mathbf{1}_{|Y_{n}|\leq M})$

Now $E(|Y_{n}|\mathbf{1}_{|Y_{n}|\leq M})=E(|Y_{n}|\mathbf{1}_{\{|Y_{n}|\leq M,|Y_{n}|\leq \epsilon\}}) +E(|Y_{n}|\mathbf{1}_{\{|Y_{n}|\leq M,|Y_{n}|> \epsilon\}})$

Now, as $Y_{n}\xrightarrow{P}0$, we have $P(|Y_{n}|>\epsilon)<\delta$ for all $n>N(\delta,\epsilon)$. Set $\delta=\frac{\epsilon}{M}$

Hence, we have $E(|Y_{n}|\mathbf{1}_{|Y_{n}|\leq M})\leq \epsilon\cdot P(\{|Y_{n}|\leq M,|Y_{n}|\leq \epsilon\})+ M\cdot P(\{|Y_{n}|\leq M,|Y_{n}|> \epsilon\})$

The above is less than $\epsilon\cdot 1 +M\cdot\delta=\epsilon+M\cdot\frac{\epsilon}{M}=2\epsilon$ for all $n\geq N(\epsilon,\frac{\epsilon}{M})$ (which depends only on $\epsilon$)

Thus $E(|Y_{n}|)<3\epsilon$ for all $n\geq N(\epsilon)$

The other direction is also true: if $(X_{n})$ is $L^{1}$-Cauchy, then it is uniformly integrable, and of course being $L^{1}$-Cauchy implies that it converges in probability to some random variable.
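
As a numerical sanity check of the statement just proved, here is a minimal sketch in plain Python. It assumes an example family of my own choosing, $Y_n = \sqrt{n}\,\mathbf{1}_{[0,1/n]}$ on $([0,1],\text{Leb})$, which is u.i. because $\sup_n E(Y_n^2) = 1 < \infty$ and which converges to $0$ in probability; the two printed terms are exactly the tail/truncation split used above (the truncation level $M$, sample size, and seed are arbitrary).

```python
import random

# Monte Carlo sketch of E|Y_n| = E(|Y_n| 1_{|Y_n|>M}) + E(|Y_n| 1_{|Y_n|<=M})
# for Y_n = sqrt(n) * 1_{[0,1/n]} on ([0,1], Lebesgue). The family is u.i.
# (sup_n E(Y_n^2) = 1) and Y_n -> 0 in probability, so the result proved
# above gives E|Y_n| -> 0; here one can even compute E|Y_n| = 1/sqrt(n).

random.seed(1)
omegas = [random.random() for _ in range(500_000)]  # omega ~ Uniform[0, 1]
M = 5.0  # truncation level from the uniform-integrability step (arbitrary)

def Y(n, w):
    return n ** 0.5 if w <= 1.0 / n else 0.0

for n in (4, 100, 10_000):
    vals = [Y(n, w) for w in omegas]
    tail = sum(v for v in vals if v > M) / len(vals)    # E(|Y_n| 1_{|Y_n|>M})
    trunc = sum(v for v in vals if v <= M) / len(vals)  # E(|Y_n| 1_{|Y_n|<=M})
    print(f"n={n:6d}  tail ~ {tail:.4f}  truncated ~ {trunc:.4f}  "
          f"total ~ {tail + trunc:.4f}  (exact E|Y_n| = {n ** -0.5:.4f})")
```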

1

I would try a more brute-force approach. First of all, since $(X_n - X)$ is uniformly integrable, we can reduce to the case $X = 0$. Now we want to show that $\limsup_{n}E(|X_n|) = 0$. If we had an $L^1$ dominating function, this would be a consequence of the DCT, but instead we have uniform integrability, which is a weaker condition.

Let $0 \leq Y \in L^1$ be arbitrary. We have $$E(|X_n|) = E(|X_n| \land Y) + E((|X_n| - Y)^{+}).$$ By the DCT (note $|X_n| \land Y \leq Y \in L^1$ and $|X_n| \land Y \to 0$ a.s.), $E(|X_n| \land Y) \to 0$. So taking the $\limsup$ of both sides yields $$\limsup_{n}E(|X_n|) = \limsup_{n}E((|X_n| - Y)^{+}).$$

Since this holds for all $0 \leq Y \in L^1$, we can take the infimum over $0 \leq Y \in L^1$ to get $$\limsup_{n}E(|X_n|) = \inf_{0 \leq Y \in L^1}\limsup_{n}E((|X_n| - Y)^{+}).$$ Conveniently, one characterization of uniform integrability is precisely that the right-hand side is $0$.
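
For intuition about the final infimum characterization, here is a minimal numerical sketch in plain Python. It restricts the competitors to constants $Y \equiv c$ (constants are integrable on a probability space) and uses two example families of my own choosing on $([0,1],\text{Leb})$: the non-u.i. family $n\,1_{[0,1/n]}$ and the u.i. family $\sqrt{n}\,1_{[0,1/n]}$. A supremum over a finite range of $n$ stands in for the $\limsup$ (it is an upper bound for it).

```python
# Sketch of the infimum characterization above, with the competitors Y
# restricted to constants Y = c (constants are in L^1 on a probability space).
# Two example families on ([0,1], Lebesgue):
#   not u.i.: X_n = n       * 1_{[0,1/n]},  E((X_n - c)^+) = (n - c)^+ / n
#   u.i.:     X_n = sqrt(n) * 1_{[0,1/n]},  E((X_n - c)^+) = (sqrt(n) - c)^+ / n
# We print sup_n over a finite range, an upper bound for limsup_n: for the
# u.i. family it shrinks as c grows; for the non-u.i. family it stays near 1.

N_RANGE = range(1, 200_001)

for c in (1.0, 10.0, 100.0):
    sup_not_ui = max(max(n - c, 0.0) / n for n in N_RANGE)
    sup_ui = max(max(n ** 0.5 - c, 0.0) / n for n in N_RANGE)
    print(f"c={c:6.0f}  sup_n E((X_n - c)^+): not u.i. ~ {sup_not_ui:.4f}, "
          f"u.i. ~ {sup_ui:.4f}")
```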

Mason
  • Thank you for your answer! I have some questions: why can we reduce to the case $X=0$? And I am not sure I understand why, starting from "so for all non negative...", we use the $\limsup$ — I think it is to use uniform integrability, but $\lim\neq\limsup$ unless we already have convergence, which is what we want to prove, no? Also, I do not see how we get the inequality $\limsup_{n}E(|X_n|) \leq \inf_{Y}\limsup_{n}E((|X_n| - Y)I(|X_n| \geq Y))$. – G2MWF Oct 11 '23 at 09:23
  • @coboy I made the answer clearer. We actually have equality throughout. Note that for a nonnegative sequence of numbers $(a_n)$, $\limsup_{n}a_n \geq \liminf_{n}a_n \geq 0$, so if we show $\limsup_{n}a_n = 0$, then we obtain $\liminf_{n}a_n = \limsup_{n}a_n = 0$, i.e. $\lim_{n}a_n = 0$. – Mason Oct 11 '23 at 18:57