8

To my surprise, I was able to evaluate the following expression in Mathematica:

$$E\left[e^X \left(1-(1-e^{-X})^n\right) \right] = \frac{y}{y-1} \left(1-\frac{1}{\binom{n+y-1}{y-1}}\right),\quad X\sim\text{Exp}(y)$$ with the right-hand side tending to $\text{HarmonicNumber}(n)$ in the limit $y\to 1$, and to $n$ in the limit $y\to 0$.

If we define $\binom{n}{k} = \frac{\Gamma(n+1)}{\Gamma(n-k+1)\Gamma(k+1)}$, the result seems to hold for all $n,y\in\mathbb R$, though I mostly care about $n$ and $y$ as positive integers.

I have no idea how to prove this by hand. I tried a series expansion of the exponentials without getting anywhere. I also tried rewriting in terms of the uniform distribution, since $e^{-X}$ has the same distribution as $U^{1/y}$ for $U\sim\text{Uniform}(0,1)$.
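For reference, here is a quick numerical sanity check of the identity (a Python/SciPy sketch rather than Mathematica; it takes $\text{Exp}(y)$ to mean rate $y$, i.e. mean $1/y$, and uses $y\neq 1$ to avoid the removable singularity on the right-hand side):

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gammaln

def lhs(n, y):
    # E[e^X (1 - (1 - e^{-X})^n)] for X ~ Exp(rate = y), written out as an integral
    f = lambda x: np.exp(x) * (1 - (1 - np.exp(-x)) ** n) * y * np.exp(-y * x)
    val, _ = quad(f, 0, np.inf)
    return val

def rhs(n, y):
    # y/(y-1) * (1 - 1/C(n+y-1, y-1)), with the binomial computed via log-gammas
    log_binom = gammaln(n + y) - gammaln(n + 1) - gammaln(y)
    return y / (y - 1) * (1 - np.exp(-log_binom))

for n, y in [(3, 2), (5, 4), (7, 2.5)]:
    print(n, y, lhs(n, y), rhs(n, y))  # the two values agree to quad's tolerance
```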

Are there any tricks or properties I'm missing?

Thomas Ahle
  • 5,629
  • 1
    This isn't very insightful, but for a strict proof (not very motivating as to why), you could try induction. +1 because this looks interesting and I'll have to try it myself later. – Clarinetist Feb 08 '21 at 15:40
  • Also, to clarify, when you write $\text{Exp}(y)$, are you referring to the exponential distribution whose mean is $y$ or $1/y$? – Clarinetist Feb 08 '21 at 15:41
  • @Clarinetist Mean $1/y$. – Thomas Ahle Feb 08 '21 at 15:41
  • Maybe try rearranging it so that $(1-e^{-X})^n$ is handled via a finite geometric progression - I'll try to see where it goes in the evening. Presumably, this would lead to some nice (or not) series of moments of exponential random variable... – defenestrator Feb 08 '21 at 16:08
  • In particular, if $q = 1-e^{-X}$, then $(1-q)\sum_{k=1}^n q^{k-1} = 1-q^n$, so $E[ e^X (1-(1-e^{-X})^n)] = E[ e^X e^{-X} \sum_{k=1}^n (1-e^{-X})^{k-1}] = \sum_{k=1}^n E[ (1-e^{-X})^{k-1}]$, from where we should be able to somehow conclude by computing the moments of $1-e^{-X}$... – defenestrator Feb 08 '21 at 16:10
  • I wonder if there is a combinatorial proof. We can think of $1-e^{-X}$ as the minimum of $y$ uniform rvs in $[0,1]$. Then $E[(1 - e^{-X})^n]$ is the probability that $n$ new rvs are all smaller than the first $y$. This happens with probability $1/\binom{n+y}{y}$. – Thomas Ahle Feb 08 '21 at 16:46
  • Can anyone see the answer here by user @NN? It was nearly correct, reducing the problem to the sum $\sum_{k=0}^{n-1}1/\binom{k+y}{k}$, but now it has apparently been deleted? – Thomas Ahle Feb 08 '21 at 17:13

2 Answers

4

This expectation is an integral,$$\begin{align}\int_0^\infty ye^{-(y-1)x}(1-(1-e^{-x})^n)dx&=\frac{y}{y-1}-\int_0^\infty ye^{-(y-1)x}(1-e^{-x})^ndx\\&\stackrel{z:=e^{-x}}{=}\frac{y}{y-1}-\int_0^1yz^{y-2}(1-z)^ndz\\&\stackrel{\star}{=}\frac{y}{y-1}-y\operatorname{B}(y-1,\,n+1)\\&=\frac{y}{y-1}\left(1-\frac{\Gamma(y)\Gamma(n+1)}{\Gamma(n+y)}\right)\\&=\frac{y}{y-1}\left(1-\frac{1}{\binom{n+y-1}{y-1}}\right),\end{align}$$with $\star$ using the Beta function.
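As a numerical cross-check of the $\star$ step and the final closed form, here is a minimal SciPy sketch (the test values $n=5$, $y=3$ are arbitrary, and $y>1$ is assumed so every integral above converges):

```python
from scipy.integrate import quad
from scipy.special import beta, comb

def check(n, y):
    # the z-integral obtained after the substitution z = e^{-x} ...
    integral, _ = quad(lambda z: y * z ** (y - 2) * (1 - z) ** n, 0, 1)
    # ... which the starred step identifies with y * B(y-1, n+1)
    via_beta = y * beta(y - 1, n + 1)
    # closed form from the last line of the computation
    closed = y / (y - 1) * (1 - 1 / comb(n + y - 1, y - 1))
    return integral, via_beta, y / (y - 1) - integral, closed

print(check(5, 3))  # first two entries agree; so do the last two
```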

J.G.
  • 118,053
2

Denote $$I = E\left[e^X \left(1-(1-e^{-X})^n\right) \right] =\int_0^{+\infty}e^x(1-(1-e^{-x})^n)ye^{-yx}dx$$ Let's make a change of variables $z = 1-e^{-x}$; then $x = -\ln(1-z)$ and $dx = \frac{1}{1-z}dz$. The integral is from $z = 0$ to $z=1$. We have:

\begin{align} I & = \int_0^{+\infty}y(1-(1-e^{-x})^n)e^{-(y-1)x}dx \\ & = \int_0^{1}y(1-z^n)(1-z)^{y-1}\frac{1}{1-z}dz \\ & = \int_0^{1}y(1-z)^{y-1}\left(\sum_{k=0}^{n-1} z^k\right)dz \\ & = \sum_{k=0}^{n-1} \left( \int_0^{1}y(1-z)^{y-1}z^kdz \right) \end{align}

By using Mathematica, we have $$\int_0^{1}y(1-z)^{y-1}z^kdz = \frac{\Gamma(1+k)\Gamma(y+1)}{\Gamma(1+k+y)}$$ Hence, $$I = \sum_{k=0}^{n-1} \left( \frac{\Gamma(1+k)\Gamma(y+1)}{\Gamma(1+k+y)} \right)$$
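This integral is again a Beta function, $\int_0^1 y(1-z)^{y-1}z^k\,dz = y\operatorname{B}(k+1,y)$, so the Gamma ratio can also be checked without Mathematica; a quick SciPy spot check (the $(k,y)$ pairs below are arbitrary):

```python
from scipy.integrate import quad
from scipy.special import gamma

def integral(k, y):
    val, _ = quad(lambda z: y * (1 - z) ** (y - 1) * z ** k, 0, 1)
    return val

def gamma_ratio(k, y):
    return gamma(1 + k) * gamma(y + 1) / gamma(1 + k + y)

for k, y in [(0, 2), (3, 4), (2, 2.5)]:
    print(k, y, integral(k, y), gamma_ratio(k, y))  # the two columns agree
```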

For the case you asked about, where $y\in \mathbb{N}^*$, we have

\begin{align} I &= \sum_{k=0}^{n-1} \frac{y!k!}{(k+y)!} \\ &= y!\sum_{k=0}^{n-1} \frac{1}{(k+1)...(k+y)} \\ &= \frac{y!}{y-1}\sum_{k=0}^{n-1} \left( \frac{1}{(k+1)...(k+y-1)} - \frac{1}{(k+2)...(k+y)} \right) \\ &= \frac{y!}{y-1} \left( \frac{1}{(y-1)!} - \frac{n!}{(n+y-1)!} \right) \\ &=\frac{y}{y-1}\left(1-\frac{1}{\binom{n+y-1}{y-1}}\right) \end{align}
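For small integer $n$ and $y\ge 2$, the telescoped sum and the closed form can also be compared in exact rational arithmetic (a small Python sketch, independent of the derivation above):

```python
from fractions import Fraction
from math import comb, factorial

def direct_sum(n, y):
    # sum_{k=0}^{n-1} y! k! / (k+y)!  in exact arithmetic
    return sum(Fraction(factorial(y) * factorial(k), factorial(k + y)) for k in range(n))

def closed_form(n, y):
    # (y/(y-1)) * (1 - 1/C(n+y-1, y-1))
    return Fraction(y, y - 1) * (1 - Fraction(1, comb(n + y - 1, y - 1)))

assert all(direct_sum(n, y) == closed_form(n, y)
           for n in range(1, 8) for y in range(2, 8))
print("direct sum and closed form agree for all tested n, y")
```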

NN2
  • 20,162