
I ask this because I'm trying to understand a proof that the expected value of a non-negative random variable equals $\int_{0}^{\infty}(1-F(x))\,dx$ (where $F$ is the distribution function of $X$) when $E[X]$ exists. At some step the proof uses $\lim\limits_{x\to\infty} x\,P[X>x]=0$, and I don't know why that holds. The problem I see is the indeterminate form $\infty\cdot 0$ in the limit.
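For example (just a concrete case to illustrate the indeterminate form): if $X\sim\operatorname{Exp}(1)$, then $P[X>x]=e^{-x}$ and $x\,P[X>x]=xe^{-x}\to 0$, so the limit does resolve to $0$ there, but I don't see how to argue it in general.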

1 Answer


If $EX<\infty$, then you can conclude $\lim_{x\to\infty}xP(X>x)=0$. In order for this proof to work in full generality, you need to be comfortable with the Lebesgue-Stieltjes integral. It simultaneously generalizes the formulas $EX=\sum_{x\in \mathcal X}x\,p(x)$ and $EX=\int_{0}^\infty x f(x)\,dx$ for the expectation of discrete and (non-negative) continuous random variables.
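As an aside, here is a minimal numerical sketch of that point, assuming SciPy is available; the Poisson(3) and Gamma(2) distributions are arbitrary choices used purely for illustration. It computes the same object $\int t\,dF(t)$ once via the discrete sum and once via the density integral.

```python
import numpy as np
from scipy import stats, integrate

# Discrete case: X ~ Poisson(3), EX = sum over x of x * p(x)
pois = stats.poisson(3)
xs = np.arange(0, 100)                         # truncate the infinite sum; the tail is negligible
print(np.sum(xs * pois.pmf(xs)), pois.mean())  # both approximately 3.0

# Continuous case: X ~ Gamma(2), EX = \int_0^\infty t f(t) dt
gam = stats.gamma(2)
val, _ = integrate.quad(lambda t: t * gam.pdf(t), 0, np.inf)
print(val, gam.mean())                         # both approximately 2.0
```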

Assume$^1$ that $X\ge 0$ always. Then $$ EX=\int_0^\infty t\,dF(t) = \int_0^x t\,dF(t)+\int_x^\infty t\,dF(t)\tag{see 2} $$ Also, by monotone convergence, you have that $$ EX=\lim_{x\to\infty} \int_0^x t\,dF(t) $$ Therefore, it follows that $$ \lim_{x\to\infty} \int_x^\infty t\,dF(t)=0 $$ You can then conclude by noting that, since $t\ge x$ on the region of integration, $$ \int_x^\infty t\,dF(t)\ge \int_x^\infty x\,dF(t)=x\int_x^\infty dF(t)=x[(\lim_{N\to\infty}F(N))-F(x)]=xP(X>x)\ge 0 $$ and using the squeeze theorem.

  1. If you want to allow $X$ to take negative values, apply the result to $Y=\max(X,0)$ and note that $P(Y>x)=P(X>x)$ for $x>0$.

  2. If you are only interested in the case when $X$ has a continuous distribution, then you can replace $\int t\,dF(t)$ with $\int t f(t)\,dt$ throughout, where $f(t)$ is the pdf.
    It is also possible to give a proof which works when $X$ is discrete, by replacing the integrals in question with sums.
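As a numerical illustration of the argument above (not a proof), here is a small sketch assuming SciPy is available; $X\sim\operatorname{Exp}(1)$ is an arbitrary choice with $EX=1<\infty$. It checks both the tail-integral formula $EX=\int_0^\infty(1-F(x))\,dx$ and that $x\,P(X>x)$ shrinks to $0$.

```python
import numpy as np
from scipy import stats, integrate

X = stats.expon()  # X ~ Exp(1), so EX = 1

# Tail-integral formula: E[X] = \int_0^\infty (1 - F(x)) dx,
# where sf(x) = 1 - F(x) = P(X > x)
tail_integral, _ = integrate.quad(lambda x: X.sf(x), 0, np.inf)
print(tail_integral, X.mean())   # both approximately 1.0

# x * P(X > x) -> 0 as x -> infinity
for x in [1, 10, 50, 100]:
    print(x, x * X.sf(x))        # terms shrink rapidly toward 0
```

The loop prints $x\,e^{-x}$, which decays to $0$ far faster than $x$ grows, matching the squeeze-theorem conclusion.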

Mike Earnest