
Given $n$ independent random variables $X_1, X_2,\dots, X_n$, each exponentially distributed with rate parameter $\lambda$, I was able to prove that

$$\mathbb E[\max\{X_1, X_2,\dots, X_n\}] = \frac{1}{\lambda}\sum_{k=1}^n(-1)^{k-1}\frac{\binom{n}{k}}{k}$$

However, I learnt from a different source that the expected value is the more elegant sum

$$\mathbb E[\max\{X_1, X_2,\dots, X_n\}] = \frac{1}{\lambda}\sum_{k=1}^n\frac{1}{k}$$

Numerically, both sums are identical. How do I go about proving this analytically, though? That is, how do I prove that

$$\sum_{k=1}^n(-1)^{k-1}\frac{\binom{n}{k}}{k} = \sum_{k=1}^n\frac{1}{k}$$
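
For example, an exact-arithmetic check (a minimal sketch in Python using `fractions` and `math.comb`, just to rule out a floating-point coincidence) confirms the agreement for small $n$:

```python
from fractions import Fraction
from math import comb

def alternating_sum(n):
    # sum_{k=1}^{n} (-1)^(k-1) * C(n, k) / k, in exact rational arithmetic
    return sum(Fraction((-1) ** (k - 1) * comb(n, k), k) for k in range(1, n + 1))

def harmonic_sum(n):
    # H_n = sum_{k=1}^{n} 1/k
    return sum(Fraction(1, k) for k in range(1, n + 1))

for n in range(1, 11):
    assert alternating_sum(n) == harmonic_sum(n)
print("both sums agree exactly for n = 1..10")
```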

I have tried a few techniques to no avail, e.g. using induction and trying to prove that

$$\sum_{k=0}^n(-1)^k\frac{\binom{n}{k}}{k + 1} = \frac{1}{n + 1}$$

(The context is the mean time to failure of a RAID 1 array with $n$ identical disks).

Thanks!


3 Answers


To show $$\sum_{k=0}^n(-1)^k\frac{\binom{n}{k}}{k + 1} = \frac{1}{n + 1},$$ multiply both sides by $n+1$ to obtain the equivalent identity $$\sum_{k=0}^n(-1)^k\frac{n+1}{k + 1}\binom{n}{k} = 1. \tag1\label1$$ Now use the "absorption" identity $$\frac{n+1}{k+1} \binom{n}{k} = \binom{n+1}{k+1}$$ and apply the binomial theorem: \begin{align} \sum_{k=0}^n (-1)^k \binom{n+1}{k+1} &= \sum_{k=1}^{n+1} (-1)^{k-1} \binom{n+1}{k} \\ &= \sum_{k=0}^{n+1} (-1)^{k-1} \binom{n+1}{k} - (-1)^{0-1} \binom{n+1}{0} \\ &= -(1-1)^{n+1} + 1 \\ &= 1, \end{align} which establishes \eqref{1}.
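
Not needed for the proof, but here is a quick exact-arithmetic check of both ingredients above (a minimal sketch in Python; the range of $n$ tested is an arbitrary choice):

```python
from fractions import Fraction
from math import comb

# Check the absorption identity (n+1)/(k+1) * C(n, k) == C(n+1, k+1)
# and identity (1): sum_{k=0}^{n} (-1)^k * C(n, k) / (k+1) == 1/(n+1)
for n in range(0, 13):
    for k in range(0, n + 1):
        assert Fraction(n + 1, k + 1) * comb(n, k) == comb(n + 1, k + 1)
    s = sum(Fraction((-1) ** k * comb(n, k), k + 1) for k in range(0, n + 1))
    assert s == Fraction(1, n + 1)
print("absorption identity and (1) hold for n = 0..12")
```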

RobPratt

If some elementary calculus is permitted, we can proceed as follows:

\begin{align} s_{1}(n) &= \sum_{k=1}^{n}\frac{1}{k} = 1+\frac{1}{2}+\frac{1}{3}+\dots+\frac{1}{n}\\ &=\int_{0}^1\left(1+x+x^2+\dots+x^{n-1}\right)dx\\ &\overset{\text{geometric sum}}{=}\int_{0}^1\frac{1-x^n}{1-x}\,dx\\ &\overset{x\,\to\, 1-y}{=}\int_{0}^1\frac{1-(1-y)^n}{y}\,dy\\ &\overset{\text{binomial expansion}}{=}\int_{0}^1 \sum_{k=1}^{n} (-1)^{k+1}\binom{n}{k}\, y^{k-1}\,dy\\ &\overset{\text{integration}}{=}\sum_{k=1}^{n} (-1)^{k+1}\binom{n}{k}\frac{1}{k} = s_{2}(n), \end{align} where $s_{2}(n)$ denotes the alternating sum from the question.

Q.E.D.
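
As a numerical cross-check of the key integral step, here is a sketch assuming `scipy` is available; it plays no role in the proof itself:

```python
from math import comb
from scipy.integrate import quad

for n in (1, 2, 5, 10):
    # The integrand (1 - (1 - y)^n) / y is finite at y = 0 (its limit there is n)
    integrand = lambda y, n=n: (1 - (1 - y) ** n) / y if y > 0 else float(n)
    integral, _ = quad(integrand, 0, 1)
    harmonic = sum(1 / k for k in range(1, n + 1))
    alternating = sum((-1) ** (k + 1) * comb(n, k) / k for k in range(1, n + 1))
    print(n, round(integral, 10), round(harmonic, 10), round(alternating, 10))
```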

Dr. Wolfgang Hintze

You can compute the expectation in two different ways. Taking $Y_i = \lambda X_i$, we can assume WLOG that the $X_i$ are i.i.d. exponential random variables with rate $\lambda = 1$. For the alternating sum, notice that the density of the maximum of the $X_i$ is given by $$ f_{X_{(n)}}(u) = n(1-e^{-u})^{n-1}e^{-u} $$ for $u>0$, so that $$ E(X_{(n)}) = \int_0^\infty nu(1-e^{-u})^{n-1}e^{-u}\,du. $$ Expanding $(1-e^{-u})^{n-1}$ with the binomial theorem and using the facts that $\int_0^\infty ue^{-ku}\,du = 1/k^2$ for $k>0$ and that $n\binom{n-1}{j} = (j+1)\binom{n}{j+1}$ yields the expression with the alternating sum.
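
A quick numerical check of this integral against the harmonic sum (a sketch assuming `scipy`; purely illustrative):

```python
import math
from scipy.integrate import quad

for n in (1, 2, 5, 10):
    # E[X_(n)] = int_0^inf n * u * (1 - e^{-u})^{n-1} * e^{-u} du  (rate lambda = 1)
    expectation, _ = quad(
        lambda u, n=n: n * u * (1 - math.exp(-u)) ** (n - 1) * math.exp(-u),
        0, math.inf,
    )
    harmonic = sum(1 / k for k in range(1, n + 1))
    print(n, round(expectation, 8), round(harmonic, 8))  # the two columns agree
```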

The other expression follows from the fact that the spacing variables $Z_j$, defined by $Z_1 = X_{(1)}$ and $Z_j = X_{(j)} - X_{(j-1)}$ for $2\leq j\leq n$, are independent exponential random variables, with $Z_j$ having rate $n-j+1$. Since $X_{(n)} = \sum_{j=1}^n Z_j$, this gives $E(X_{(n)}) = \sum_{j=1}^n \frac{1}{n-j+1} = \sum_{k=1}^n \frac{1}{k}$. For details, see the answer here.
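
A small simulation illustrating the spacings claim (a sketch using `numpy`; the sample size and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, samples = 5, 200_000

# n i.i.d. Exp(1) variables per row; sorting each row gives the order statistics
x = np.sort(rng.exponential(scale=1.0, size=(samples, n)), axis=1)

# Spacings: Z_1 = X_(1), Z_j = X_(j) - X_(j-1) for j = 2, ..., n
z = np.diff(x, axis=1, prepend=0.0)

for j in range(1, n + 1):
    print(f"mean of Z_{j}: {z[:, j - 1].mean():.4f}  (should be ~ 1/{n - j + 1} = {1 / (n - j + 1):.4f})")

# The spacings sum to X_(n), so the simulated E[max] should be close to H_n
print("simulated E[max]:", round(x[:, -1].mean(), 4),
      "  H_n:", round(sum(1 / k for k in range(1, n + 1)), 4))
```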