4

There is a random number generator that follows the standard normal distribution, $X \sim N(\mu,\sigma^2)$, and we sum the numbers it generates until the sum is greater than $r$.

Specifically: generate a random number and stop if it exceeds $r$; otherwise generate another random number and add it to the running sum. Stop as soon as the sum of all generated numbers exceeds $r$; otherwise keep generating.

How can I find the expectation of the stopping time, $\mathbb{E}_r[X]$?

Similar to this question
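For concreteness, here is a minimal Monte Carlo sketch (my own Python illustration, not part of the original question) of the process described above. The parameter values `mu=1.0`, `sigma=1.0`, `r=5.0` and the trial count are arbitrary choices, and a positive mean is assumed so that each run terminates quickly.

```python
# Monte Carlo sketch of the process in the question.
# Assumptions (mine): mu > 0 so the running sum crosses r almost surely,
# and the parameter values below are only illustrative.
import random


def stopping_time(mu, sigma, r, rng=random):
    """Number of N(mu, sigma^2) draws needed until their running sum exceeds r."""
    total, n = 0.0, 0
    while True:
        total += rng.gauss(mu, sigma)   # always draw at least one number
        n += 1
        if total > r:
            return n


def estimate_expected_stopping_time(mu, sigma, r, trials=100_000):
    """Monte Carlo estimate of E_r[N] over `trials` independent runs."""
    return sum(stopping_time(mu, sigma, r) for _ in range(trials)) / trials


if __name__ == "__main__":
    print(estimate_expected_stopping_time(mu=1.0, sigma=1.0, r=5.0))
```

This gives an empirical estimate of the expected stopping time against which the analytical approaches below can be checked.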

Aster
  • 1,315
  • Maybe related: https://math.stackexchange.com/questions/111314/choose-a-random-number-between-0-and-1-and-record-its-value-keep-doing-it-u – Aster Feb 12 '20 at 15:23
  • The standard normal distribution is $\ N(0,1)\ $. Do you want the answer only for that distribution, or for the normal distribution $\ N(\mu,\sigma^2 )\ $ with arbitrary mean $\ \mu\ $ and variance $\ \sigma^2\ $? – lonza leggiera Apr 02 '20 at 06:40
  • By stop time, do you mean the number of numbers generated? – Victor Gustavo May Apr 02 '20 at 19:10
  • Let $g(r)=\mathbb{E}_r[X]$ and let $f(x)$ be the density of $X$. Then $$g(r)=1+\int_{-\infty}^r g(r-t)\,f(t)\,dt$$ – leonbloy Apr 08 '20 at 12:51
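The integral equation in the last comment can also be attacked numerically. The sketch below (mine, not from the thread) uses the substitution $u=r-t$ to rewrite it as $g(r)=1+\int_0^\infty g(u)\,f(r-u)\,du$, discretizes $g$ on a grid, and runs a fixed-point iteration; the grid bounds, step size, iteration count, and the example parameters $\mu=1$, $\sigma=1$, $r=5$ are my own choices, and $\mu>0$, $r\ge 0$ are assumed.

```python
# Numerical solution of the renewal-type equation from the comment above,
#     g(r) = 1 + \int_{-\infty}^{r} g(r - t) f(t) dt,
# rewritten via u = r - t as g(r) = 1 + \int_0^\infty g(u) f(r - u) du.
# Assumptions (mine): mu > 0, r >= 0, and the grid bounds, step size and
# iteration count below are arbitrary choices for a rough sketch.
import numpy as np
from scipy.stats import norm


def solve_g(mu, sigma, r_max, du=0.01, iters=500):
    """Fixed-point iteration for g on a grid covering [0, r_max + 10*sigma]."""
    u = np.arange(0.0, r_max + 10.0 * sigma, du)   # grid for g(u), u >= 0
    f = norm(loc=mu, scale=sigma).pdf              # density of N(mu, sigma^2)
    kernel = f(u[:, None] - u[None, :])            # kernel[i, j] = f(u_i - u_j)
    g = np.ones_like(u)                            # initial guess g = 1
    for _ in range(iters):
        g = 1.0 + du * (kernel @ g)                # rectangle-rule quadrature
    return u, g


if __name__ == "__main__":
    u, g = solve_g(mu=1.0, sigma=1.0, r_max=5.0)
    print(np.interp(5.0, u, g))                    # approximate E_5[N]
```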

2 Answers

2

I have an iterative solution:

If $Z=X_1+X_2$, where $X_1,X_2$ are independent copies of $X$, then $Z\sim N(\mu+\mu,\ \sigma^2+\sigma^2)$. For $Z_n=\sum_{k=1}^n X_k$ we have $Z_n\sim N(n\times \mu,\ n\times \sigma^2)$.

$P_1=P(Z_1>r)=P(X>r)= 1-\Phi\left({{r-\mu}\over{\sigma}}\right)=Q\left({{r-\mu}\over{\sigma}}\right)$, and for general $n$:

$$P_n(Z_n>r) = Q\left({{r-n\times\mu}\over{\sqrt n\times\sigma}}\right)\times\left(1-\sum_{k=1}^{n-1}P_{k}\right)$$ or $$P_n(Z_n>r) = \left(1-\Phi\left({{r-n\times\mu}\over{\sqrt n\times\sigma}}\right)\right)\times\prod_{k=1}^{n-1}\Phi\left({r-k\times\mu}\over{\sqrt k\times\sigma}\right)$$

$E_r[X]=\sum_{n=1}^\infty n\times P_n$
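For readers who want to evaluate this numerically (as the last comment below attempts), here is a direct transcription of the second formula for $P_n$, truncated at a finite number of terms. The transcription, the truncation level `n_max`, and the use of `scipy.stats.norm` are my additions; whether the product form is exact is discussed in the comments.

```python
# Direct transcription of the second formula above,
#     P_n = (1 - Phi((r - n*mu)/(sqrt(n)*sigma))) * prod_{k=1}^{n-1} Phi((r - k*mu)/(sqrt(k)*sigma)),
# summed as E_r[X] = sum_n n * P_n and truncated at n_max terms.
# Assumptions (mine): the truncation level n_max and the scipy dependency.
from math import sqrt
from scipy.stats import norm


def expected_stopping_time(mu, sigma, r, n_max=200):
    """Evaluate sum_{n=1}^{n_max} n * P_n with P_n as written in the answer."""
    total = 0.0
    prod = 1.0                                   # running product of Phi terms, empty for n = 1
    for n in range(1, n_max + 1):
        phi_n = norm.cdf((r - n * mu) / (sqrt(n) * sigma))
        p_n = (1.0 - phi_n) * prod               # P_n as stated above
        total += n * p_n
        prod *= phi_n                            # extend the product up to k = n
    return total


if __name__ == "__main__":
    print(expected_stopping_time(mu=1.0, sigma=1.0, r=5.0))
```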

Ymh
  • 314
  • I don't understand the $P_n(Z_n>r)$ equation. What is $P_k$? – leonbloy Apr 08 '20 at 15:22
  • @leonbloy $P_n$ is supposed to be the probability that the algorithm stops after generating $n$ numbers. I was unsure whether I had to note the conditionality explicitly, because the $n$th number is not generated at all if the sum exceeded $r$ previously. I didn't want the formula to take up the whole screen. I also noticed square roots are needed in the divisor. – Ymh Apr 08 '20 at 17:58
  • I am trying to simulate this with some normal distributions with $\mu = 0$ and I am not getting the correct results. Is $$\Phi\left( \frac{r}{\sigma}\right) = \frac{1}{2}\left(1+\operatorname{erf} \left( \frac{r}{\sqrt{2}\sigma}\right)\right)$$? – user3141592 May 07 '23 at 14:47
0

Let $N$ be the number of generations needed; we want $E_r[N]$. Let $X_i$ be the output of the $i$th generation and $S_i=X_1+\cdots+X_i$. It is natural to assume that the $X_i$ are independent, since they represent separate random number generations. Then $X_i\sim N(\mu,\sigma^2)$ and $S_i\sim N(i\mu,i\sigma^2)$. Define the indicator random variable

$$Y_i=\begin{cases}1 & \text{if } S_1\leq r,\ \ldots,\ S_i\leq r,\\ 0 & \text{otherwise.}\end{cases}$$

Then $N=1+\sum_{i=1}^{\infty}Y_i$. Since the $Y_i$ are nonnegative, expectation and the infinite sum can be interchanged (monotone convergence), so

$E[N]=1+\sum_{i=1}^{\infty}E[Y_i]=1+\sum_{i=1}^{\infty}P\{Y_i=1\}=1+\sum_{i=1}^{\infty}P\{S_1\leq r,\ldots,S_i\leq r\}$

$=1+\sum_{i=1}^{\infty}\int_{-\infty}^r\int_{-\infty}^{r-x_1}\cdots\int_{-\infty}^{r-x_1-\cdots-x_{i-1}}f(x_1)f(x_2)\cdots f(x_i)dx_idx_{i-1}\cdots dx_1$

where $f(x)$ is the probability density function of $N(\mu,\sigma^2)$.
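To illustrate this decomposition, here is a Monte Carlo sketch (my own, not part of the answer) that estimates $E[Y_i]=P\{S_1\leq r,\ldots,S_i\leq r\}$ by simulating paths and then forms $1+\sum_i E[Y_i]$. The truncation at `i_max`, the trial count, and the example parameters are my choices, and $\mu>0$ is assumed so the series converges (cf. the comment below).

```python
# Monte Carlo check of E[N] = 1 + sum_i E[Y_i] with
# Y_i = 1{S_1 <= r, ..., S_i <= r}, by simulating whole paths.
# Assumptions (mine): mu > 0 so the series converges, and the truncation
# i_max, trial count and parameter values are only illustrative.
import numpy as np


def estimate_E_Y(mu, sigma, r, i_max=50, trials=50_000, seed=0):
    """Estimate E[Y_i] = P{S_1 <= r, ..., S_i <= r} for i = 1, ..., i_max."""
    rng = np.random.default_rng(seed)
    x = rng.normal(mu, sigma, size=(trials, i_max))  # X_1, ..., X_{i_max} per path
    s = np.cumsum(x, axis=1)                         # prefix sums S_1, ..., S_{i_max}
    y = np.cumprod(s <= r, axis=1)                   # Y_i = 1 iff all of S_1..S_i are <= r
    return y.mean(axis=0)                            # average over paths


if __name__ == "__main__":
    e_y = estimate_E_Y(mu=1.0, sigma=1.0, r=5.0)
    print(1.0 + e_y.sum())                           # estimate of E[N] = E_r[N]
```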

bc78
  • 496
  • This seems wrong: it seems to assume that if $Y_i=0$ then $Y_j =0$ for $j>i$, but that's not true. In general, $\sum_{i=1}^{\infty}Y_i$ will diverge. – leonbloy Apr 08 '20 at 15:16
  • You are right @leonbloy, I have missed the point that the normal r.v. can take negative values :). I will try to update my solution. – bc78 Apr 08 '20 at 16:10