
Assume $X_i\overset{i.i.d}{\sim} U[0,1]$. Let $S_n=\sum_{i=1}^nX_i$ and $T=\min\{n:S_n\geq 1\}$.

Find:

(1) $E[T]$, (2) $E[S_T]$, (3) $E[X_T]$

For the first one, $E[T]=e$ (see e.g. this question). For the second one, Wald's equation gives $E[S_T]=E[T]\,E[X_1]=\frac e2$. What about the third one?
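For a quick numerical sanity check of all three quantities, here is a minimal Monte Carlo sketch (the helper name `simulate_once` is just for illustration):

```python
import random

def simulate_once():
    """Draw U[0,1] variables until the running sum reaches 1; return (T, S_T, X_T)."""
    s, n = 0.0, 0
    while s < 1.0:
        x = random.random()   # X_i ~ U[0,1]
        s += x
        n += 1
    return n, s, x

random.seed(0)
N = 200_000
samples = [simulate_once() for _ in range(N)]
print("E[T]   ~", sum(t for t, _, _ in samples) / N)   # about e   = 2.718...
print("E[S_T] ~", sum(s for _, s, _ in samples) / N)   # about e/2 = 1.359...
print("E[X_T] ~", sum(x for _, _, x in samples) / N)   # about 0.64
```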

  • You could say $X_T=S_T-S_{T-1}$. Take expectation of both sides and get $E(X_T)=1/2$. –  Jul 05 '21 at 00:03
  • How do you get $E[S_{T-1}]$? We cannot use Wald's identity, because $T-1$ is not a valid stopping time (see the sketch after these comments). In fact, numerical experiments show that $E[X_T]$ is around 0.64. – True Light Jul 05 '21 at 00:09
  • Wald's equation computes the expected value of a random sum of iid random variables using the law of total expectation. The length of the random sum need not be a stopping time of some random process. But if the answer of $1/2$ doesn't match numerical experiments, maybe it's not correct. –  Jul 05 '21 at 00:19
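Here is a small simulation sketch comparing $E[S_{T-1}]$ with $E[T-1]\,E[X_1]$; the two do not agree, which is exactly why Wald's equation cannot be applied to $T-1$:

```python
import random

# Illustrative check (not part of the original thread): estimate E[S_{T-1}] and compare
# with E[T-1] * E[X_1], i.e. what a naive use of Wald's equation on T-1 would give.
random.seed(1)
N = 200_000
sum_prev, sum_T = 0.0, 0
for _ in range(N):
    s, n = 0.0, 0
    while s < 1.0:
        prev = s              # partial sum before the last draw, i.e. S_{T-1} at exit
        s += random.random()
        n += 1
    sum_prev += prev
    sum_T += n

print("E[S_{T-1}]    ~", sum_prev / N)              # about e - 2   = 0.718...
print("E[T-1]*E[X_1] ~", (sum_T / N - 1) * 0.5)     # about (e-1)/2 = 0.859..., different
```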

2 Answers


I assume you already know that the volume of the simplex $\{x\in[0,1]^{t-1}:x_1+\ldots+x_{t-1}\le 1\}$ is $\frac{1}{(t-1)!}$; this is used twice below. For all $t\ge 1$, by Fubini's theorem,

$$\mathsf{E}\left[X_t \mathbf{1}_{T\ge t}\right]=\mathsf{E}\left[X_t \mathbf{1}_{X_1+\ldots+X_{t-1}\le 1}\right]=\int_{[0,1]^t} x_t \mathbf{1}_{x_1+\ldots+x_{t-1}\le 1}\, \,dx_1\ldots dx_{t}=\frac{1}{2(t-1)!}$$

and

\begin{align*} \mathsf{E}\left[X_t \mathbf{1}_{T>t}\right]&=\mathsf{E}\left[X_t \mathbf{1}_{X_1+\ldots+X_t\le 1}\right]\\ &=\int_0^1 x_t\left(\int_{[0,1]^{t-1}} \mathbf{1}_{x_1+\ldots+x_{t-1}\le 1-x_t} \,dx_1\ldots dx_{t-1}\right) \,dx_t\\ &=\int_0^1 x_t(1-x_t)^{t-1}\left(\int_{[0,1]^{t-1}} \mathbf{1}_{x_1+\ldots+x_{t-1}\le 1} \,dx_1\ldots dx_{t-1}\right) \,dx_t\\ &= \frac{1}{(t-1)!} \int_0^1 x_t(1-x_t)^{t-1} \, dx_t\\ &= \frac{1}{(t+1)!}. \end{align*}
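Both identities can be checked numerically for small $t$; a quick Monte Carlo sketch (illustrative only):

```python
import math, random

# Estimate E[X_t 1_{T >= t}] and E[X_t 1_{T > t}] for small t and compare with
# 1/(2(t-1)!) and 1/(t+1)! respectively.
random.seed(2)
N = 200_000
for t in (1, 2, 3, 4):
    ge, gt = 0.0, 0.0
    for _ in range(N):
        xs = [random.random() for _ in range(t)]
        if sum(xs[:-1]) < 1.0:    # {T >= t} = {X_1 + ... + X_{t-1} < 1}
            ge += xs[-1]
        if sum(xs) < 1.0:         # {T > t}  = {X_1 + ... + X_t < 1}
            gt += xs[-1]
    print(t, ge / N, 1 / (2 * math.factorial(t - 1)),
             gt / N, 1 / math.factorial(t + 1))
```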

Therefore,

$$\mathsf E X_T=\sum_{t\ge 1} \left(\mathsf{E}\left[X_t\mathbf{1}_{T\ge t}\right]-\mathsf{E}\left[X_t\mathbf{1}_{T> t}\right]\right)=\sum_{t\ge 1}\left(\frac{1}{2(t-1)!}-\frac{1}{(t+1)!}\right)=2-\frac{e}{2}\approx 0.64.$$
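A short numerical check of this series:

```python
import math

# Partial sum of 1/(2(t-1)!) - 1/(t+1)! over t >= 1, compared with the closed form.
total = sum(1 / (2 * math.factorial(t - 1)) - 1 / math.factorial(t + 1) for t in range(1, 30))
print(total, 2 - math.e / 2)   # both about 0.6409
```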

md5

In case you didn't know the volume of a simplex, here is an alternative approach that uses joint densities, the law of total expectation, and the Irwin–Hall distribution of a sum of iid $\mathcal{U}[0,1]$ random variables.

Fix $n\geq 2$ and let $f_{S_{n-1}S_{n}}$ denote the joint density of the random vector $(S_{n-1},S_n)$. Since, given $S_{n-1}=x$, the sum $S_n=x+X_n$ is uniform on $[x,x+1]$, the Irwin–Hall density of $S_{n-1}$ gives $$f_{S_{n-1}S_n}(x,y)=f_{S_n|S_{n-1}}(y|x)\,f_{S_{n-1}}(x)=\frac{\mathbf{1}_{A_n}(x,y)}{(n-2)!}\sum_{k=0}^{\lfloor x \rfloor}(-1)^k{ n-1 \choose k}(x-k)^{n-2},$$ where $A_n$ is the set $$A_n=\{(x,y):x\leq y \leq x+1,\;0\leq x\leq n-1\}.$$
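As a quick check of the Irwin–Hall formula (illustrative only; here $n=3$, so $S_{n-1}$ is a sum of two uniforms and its density is triangular), compare it with a Monte Carlo density estimate. The helper name `irwin_hall_pdf` is mine:

```python
import math, random

def irwin_hall_pdf(x, m):
    """Density at x of a sum of m iid U[0,1] variables (the formula above with m = n-1)."""
    return sum((-1) ** k * math.comb(m, k) * (x - k) ** (m - 1)
               for k in range(math.floor(x) + 1)) / math.factorial(m - 1)

random.seed(3)
N, width = 400_000, 0.05
for x in (0.3, 0.9, 1.5):
    # crude density estimate: fraction of samples of X_1 + X_2 in a small window around x
    hits = sum(abs(random.random() + random.random() - x) < width / 2 for _ in range(N))
    print(x, hits / (N * width), irwin_hall_pdf(x, 2))   # the two columns should roughly agree
```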

We can use the joint pdf to compute $p_T(n)=P(T=n)$:

$$\begin{eqnarray*} p_T(n)&=&P(S_n \geq 1,S_{n-1}<1) \\ &=& \int_0^1 \int _1^{x+1}f_{S_{n-1}S_n}(x,y)\,\mathrm{d}y\,\mathrm{d}x \\&=& \frac{1}{(n-2)!}\int_0^1\sum_{k=0}^{\lfloor x \rfloor}(-1)^k {n-1 \choose k}x(x-k)^{n-2}\,\mathrm{d}x\\&=& \frac{1}{(n-2)!}\int_0^1 x^{n-1}\,\mathrm{d}x \\ &=& \frac{1}{n(n-2)!}, \end{eqnarray*}$$

where we used that $\lfloor x\rfloor=0$ for $0\le x<1$, so only the $k=0$ term contributes. We can also use the joint pdf to calculate the conditional expectation of $X_T$ given that $T=n$: $$\begin{eqnarray*} E(X_T|T=n) &=& E(X_n|S_n\geq 1,S_{n-1}<1) \\&=& E(S_n-S_{n-1}|S_n \geq 1,S_{n-1}<1) \\ &=& \frac{\int _0 ^1 \int_{1}^{x+1}(y-x)f_{S_{n-1}S_n}(x,y)\,\mathrm{d}y\,\mathrm{d}x}{\int _0 ^1 \int_{1}^{x+1}f_{S_{n-1}S_n}(x,y)\,\mathrm{d}y\,\mathrm{d}x} \\ &=& n \int_0^1 \int_1^{x+1}(y-x)x^{n-2}\,\mathrm{d}y\,\mathrm{d}x \\ &=& \frac{1}{2}+\frac{1}{2(n+1)}. \end{eqnarray*}$$

By the law of total expectation, and using the fact that $p_T(1)=P(X_1=1)=0$, we get $$\begin{eqnarray*}E(X_T)&=&\sum_{n=1}^{\infty}E(X_T|T=n)\,p_T(n) \\ &=& \sum_{n=2}^{\infty}\Bigg[\frac{1}{2}+\frac{1}{2(n+1)}\Bigg] \cdot \frac{1}{n(n-2)!} \\ &=& \sum_{n=2}^{\infty}\Bigg[\frac{1}{2(n-1)!}-\frac{1}{(n+1)!}\Bigg] \\ &=& \frac{e-1}{2}-\Big(e-\frac{5}{2}\Big) \\ &=& 2-\frac{e}{2}. \end{eqnarray*}$$
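Finally, a simulation sketch (illustrative check only) that estimates $p_T(n)$ and $E(X_T\mid T=n)$ and evaluates the final series numerically:

```python
import math, random

random.seed(4)
N = 300_000
counts, last_sums = {}, {}
for _ in range(N):
    s, n = 0.0, 0
    while s < 1.0:
        x = random.random()
        s += x
        n += 1
    counts[n] = counts.get(n, 0) + 1
    last_sums[n] = last_sums.get(n, 0.0) + x          # accumulate X_T on {T = n}

for n in (2, 3, 4):
    print(n,
          counts[n] / N, 1 / (n * math.factorial(n - 2)),        # p_T(n)
          last_sums[n] / counts[n], 0.5 + 1 / (2 * (n + 1)))     # E[X_T | T = n]

series = sum((0.5 + 1 / (2 * (n + 1))) / (n * math.factorial(n - 2)) for n in range(2, 30))
print(series, 2 - math.e / 2)   # both about 0.6409
```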