
I found the following theorem in Allan Gut, Probability theory (without proof): Let $X$ be a nonnegative random variable and $g$ a nonnegative differentiable strictly increasing function. Then $$Eg(X)=g(0)+\int_{(0,\infty)}g'(x) \mathbb{P}(X>x)\,dx,$$ and $$Eg(X)<\infty\Longleftrightarrow \sum_{n=1}^\infty g'(n)\mathbb{P}(X>n)<\infty.$$

The first statement I could prove (let $X\sim F$):

\begin{align*} Eg(X)=\int_{(0,\infty)} g(x)\,dF(x) &= \int_{(0,\infty)} \left( g(0)+\int_{(0,x)}g'(t)\,dt \right) dF(x)\\[8pt] &= g(0)+\int_{(0,\infty)}g'(t)\, \mathbb{P}(X>t)\,dt, \end{align*} by Fubini. But I am struggling with the second statement. My idea is the following: $$Eg(X)=\int_{(0,\infty)} g(x)\,dF(x)=\sum_{k=1}^\infty \int_{[n_k,n_{k+1})} g(x) \, dF(x)$$

for some increasing sequence satisfying $\bigcup_{k=1}^\infty [n_k,n_{k+1}] = \mathbb{R}^+$. But I cannot find an appropriate sequence. Any suggestions? Thank you.
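Edit: for what it's worth, here is a quick numerical sanity check of the first identity. The choices $g(x)=x^2+1$ and $X\sim\mathrm{Exp}(1)$ are just hypothetical examples (not from the book), for which $Eg(X)=E[X^2]+1=3$:

```python
import numpy as np
from scipy.integrate import quad

# Sanity check of E g(X) = g(0) + int_0^inf g'(x) P(X > x) dx
# for the hypothetical choice g(x) = x^2 + 1 and X ~ Exp(1),
# where P(X > x) = exp(-x) and E g(X) = E[X^2] + 1 = 3.
g = lambda x: x**2 + 1
g_prime = lambda x: 2 * x
survival = lambda x: np.exp(-x)  # P(X > x) for Exp(1)

rhs = g(0) + quad(lambda x: g_prime(x) * survival(x), 0, np.inf)[0]

rng = np.random.default_rng(0)
lhs = g(rng.exponential(size=10**6)).mean()  # Monte Carlo estimate of E g(X)

print(lhs, rhs)  # both approximately 3
```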

Seneca
  • I suspect the second statement should be $\displaystyle Eg(X)<\infty\Longleftrightarrow \sum_{n=1}^\infty g'(n)\mathbb{P}(X>n)<\infty$ replacing the $k$ with $n$ – Henry Apr 25 '14 at 08:10
  • Yes, you are right. I corrected it. – Seneca Apr 25 '14 at 08:55
  • What if you apply your idea of decomposition directly to the previous expression $\int_{(0,\infty)}g'(t)\mathbb{P}(X > t)\,dt$, and then use the monotonicity and positivity of your functions/distributions to compare it to $\sum g'(n) \mathbb{P}(X > n)$? – yago Apr 25 '14 at 09:02
  • The problem is the missing monotonicity of $g'$. – Seneca Apr 25 '14 at 12:53
  • Indeed: the derivative of Volterra's function is bounded but not Riemann integrable, so one can construct strictly increasing differentiable functions whose derivative is not Riemann integrable, and hence we cannot use the identity $g(x)=g(0)+\int_{(0,x)}g'(t)\,dt$ freely –  Jan 30 '23 at 19:23

2 Answers


Without further assumptions on the monotonicity of the derivative, the claim is in general not correct.

Example Let

$$h(x) := \begin{cases} 2 \left(n^2-\frac{1}{n^2} \right) \cdot (x-n)+\frac{1}{n^2} & x \in \left[n,n+\frac{1}{2} \right] \\ n^2- 2 \left(n^2-\frac{1}{(n+1)^2} \right) \left(x-n-\frac{1}{2} \right) & x \in \left[n+ \frac{1}{2},n+1 \right] \end{cases}$$

for $n \in \mathbb{N}$.

[Figure: plot of the sawtooth function $h$, rising from $\frac{1}{n^2}$ at $x=n$ to $n^2$ at $x=n+\frac{1}{2}$ and falling back to $\frac{1}{(n+1)^2}$ at $x=n+1$.]

Then $h$ is a (strictly) positive continuous function, and therefore

$$g(x) := \int_0^x h(y) \, dy$$

defines a strictly increasing differentiable positive function. In particular, $g'(n)= \frac{1}{n^2}$ and $$g' (x) \geq \frac{n^2}{2}, \qquad x \in \left[n+\frac{1}{4},n+\frac{3}{4} \right]. \tag{1}$$

Now if we consider a random variable $X$ such that $\mathbb{P}(X > x) = \frac{1}{x}$ for $x$ sufficiently large, then $(1)$ shows that

$$\int_{(0,\infty)} g'(x) \mathbb{P}(X>x) \, dx = \infty.$$

On the other hand,

$$\sum_{n} g'(n) \mathbb{P}(X >n) \leq \sum_n g'(n) = \sum_n \frac{1}{n^2} < \infty.$$

Hence $Eg(X)=\infty$ by the first identity, while the series converges, so the claimed equivalence fails.
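For readers who want to see numbers, here is a short numerical sketch of this counterexample. It is a sketch only; it assumes $n$ ranges over $1,2,\dots$ and takes the concrete tail $\mathbb{P}(X>x)=1/x$ for $x\geq 1$ (e.g. $X=1/U$ with $U$ uniform on $(0,1)$):

```python
import numpy as np
from scipy.integrate import quad

# The sawtooth h above (for x >= 1): on [n, n+1/2] it climbs linearly
# from 1/n^2 to n^2, on [n+1/2, n+1] it drops back down to 1/(n+1)^2.
def h(x):
    n = int(np.floor(x))
    if x <= n + 0.5:
        return 2 * (n**2 - 1/n**2) * (x - n) + 1/n**2
    return n**2 - 2 * (n**2 - 1/(n+1)**2) * (x - n - 0.5)

# g'(n) = h(n) = 1/n^2 at the integers:
print([round(h(n), 4) for n in range(1, 6)])  # [1.0, 0.25, 0.1111, 0.0625, 0.04]

# With P(X > x) = 1/x, the partial integrals of g'(x) P(X > x) blow up
# (each peak contributes roughly n/4 by (1)), while the partial sums of
# g'(n) P(X > n) = 1/n^3 converge (to zeta(3) ~ 1.2021).
for N in (10, 50, 200):
    integral = sum(quad(lambda x: h(x) / x, n, n + 1)[0] for n in range(1, N))
    series = sum(1 / n**3 for n in range(1, N + 1))
    print(N, round(integral, 1), round(series, 4))
```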

saz
  • Thank you very very much saz! Nice work! I just read your answer and it is a nicely constructed counterexample. I normally believe books, and it takes some time before I start questioning their correctness. So this theorem in the book of Allan Gut is indeed wrong. When $g'$ is also monotone it is not so difficult to prove, and all the examples of functions $g$ in this book are monotone. (The proof of the theorem was left as an exercise.) – Seneca May 12 '14 at 16:42
  • @Seneca You are welcome. By the way, you can accept (and upvote) helpful answers by clicking on the check mark next to the answer. – saz May 12 '14 at 21:32
  • Ah I did not know. Of course, I just did it :-) – Seneca May 13 '14 at 08:39

Too long for a comment: there is a slight problem with using Tonelli's theorem in the derivation given by the OP. The problem is that it seems possible to build strictly increasing non-negative functions whose derivative is not Lebesgue integrable on any interval of the form $[0,x]$. However, this potential problem can be avoided with the following alternative derivation:

$$ \begin{align*} \mathrm{E}[g(X)]&=\int_{(0,\infty )}\Pr [g(X)>y]\mathop{}\!d y\\ &=\int_{(0,g(0)]}\overbrace{\Pr [g(X)>y]}^{=1}\mathop{}\!d y+\int_{(g(0),\infty )}\Pr [g(X)>y]\mathop{}\!d y\\ &=g(0)+\int_{(g(0),\infty )}\Pr [g(X)>y]\mathop{}\!d y\\ &=g(0)+\int_{(0,\infty )}\Pr [X>x]\mathop{}\!d g(x),\quad \text{ using the change of variable }y=g(x)\\ &=g(0)+\int_{(0,\infty )}g'(x)\Pr [X>x]\mathop{}\!d x \end{align*} $$
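A quick numerical sanity check of the layer-cake starting point, again with the hypothetical choice $g(x)=x^2+1$ and $X\sim\mathrm{Exp}(1)$, so that $g(0)=1$, $\Pr[g(X)>y]=e^{-\sqrt{y-1}}$ for $y>1$, and $\mathrm{E}[g(X)]=3$:

```python
import numpy as np
from scipy.integrate import quad

# Layer cake: E[g(X)] = int_0^inf P(g(X) > y) dy.  With the hypothetical
# g(x) = x^2 + 1 and X ~ Exp(1): P(g(X) > y) = 1 for y <= g(0) = 1,
# and P(g(X) > y) = P(X > sqrt(y - 1)) = exp(-sqrt(y - 1)) for y > 1.
tail = lambda y: np.exp(-np.sqrt(y - 1)) if y > 1 else 1.0

# Split at y = g(0) = 1, exactly as in the derivation above.
total = quad(tail, 0, 1)[0] + quad(tail, 1, np.inf)[0]
print(total)  # approximately 3 = g(0) + 2
```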