
Consider the expected number of Bernoulli trials needed to get $n$ successes, where each trial succeeds independently with probability $p$ ($0 < p < 1$). This number should be $\frac{n}{p}$.

I tried to derive this case by case and got the expression
$$ \sum_{k=n}^{\infty}\frac{(k-1)!}{(n-1)!(k-n)!}\cdot p^{n}\cdot(1-p)^{k-n}\cdot k $$
The problem is: how does this sum reduce to $\frac{n}{p}$? I can't seem to figure out how to derive that from the expression above.
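Not a derivation, but as a quick numerical sanity check, the series can be summed up to a large cutoff and compared against $n/p$; the values of $n$, $p$, and the cutoff below are arbitrary illustrative choices.

```python
from math import comb

# Sum the series for illustrative values of n and p, truncating at a large K;
# the tail is negligible once (1-p)^(k-n) is tiny.
n, p = 5, 0.3
total = sum(comb(k - 1, n - 1) * p**n * (1 - p)**(k - n) * k
            for k in range(n, 2000))
print(total, n / p)  # both print 16.666..., i.e. n/p
```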

Apologies if this question has appeared before. I tried searching, but all I could find was the expected value of the binomial distribution, which is similar but not what I am asking here.

On a side note, suppose instead that for each success I put a marble into a bag, but there are $b$ bags in total. What then is the expected number of trials until all of them contain $n$ marbles? Surely this should not be $b\cdot\frac{n}{p}$. How would I approach this?
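The side question doesn't say how a marble's bag is chosen, so here is a Monte Carlo sketch under one possible reading: each success drops its marble into a bag picked uniformly at random, and we count trials until every bag holds at least $n$ marbles. The function name and parameter values are just illustrative.

```python
import random

def trials_until_full(n, p, b):
    """One simulated run: count Bernoulli(p) trials until every one of the
    b bags holds at least n marbles. ASSUMPTION: each success drops its
    marble into a uniformly random bag (the question leaves this open)."""
    counts = [0] * b
    trials = 0
    while min(counts) < n:
        trials += 1
        if random.random() < p:
            counts[random.randrange(b)] += 1
    return trials

# Average over many runs for illustrative parameters n = 3, p = 0.5, b = 4.
random.seed(0)
runs = 20000
avg = sum(trials_until_full(3, 0.5, 4) for _ in range(runs)) / runs
print(avg, "vs. b*n/p =", 4 * 3 / 0.5)
```

Under this reading the estimate comes out noticeably larger than $b\cdot\frac{n}{p}$, consistent with the suspicion above.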


1 Answer


I think it is easier than that.

Recall that the expected number of trials to get a single success (a geometric random variable) is $$ \sum_{k=1}^\infty k (1-p)^{k-1} p={1\over p}. $$

Now let $N_i$ be the random variable that counts the number of trials to get the $i$th success after the $(i-1)$th success (or after the start of the game if $i=1$), so that the total number of trials to get $n$ successes is the random variable $$ N=\sum_{i=1}^n N_i. $$ Each $N_i$ has the same geometric distribution as above, so $E(N_i)={1\over p}$. Since expectation is linear, $$ E(N)=\sum_{i=1}^n E(N_i)= \sum_{i=1}^n {1\over p}={n\over p}. $$
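For what it's worth, here is a short simulation sketch of this decomposition (with arbitrary choices of $n$ and $p$); the sample mean of the total trial count comes out close to $n/p$.

```python
import random

def trials_for_n_successes(n, p):
    """Count Bernoulli(p) trials until the n-th success; this is the
    sum N = N_1 + ... + N_n of the geometric waiting times above."""
    trials = successes = 0
    while successes < n:
        trials += 1
        if random.random() < p:
            successes += 1
    return trials

random.seed(0)
n, p, runs = 5, 0.3, 50000
avg = sum(trials_for_n_successes(n, p) for _ in range(runs)) / runs
print(avg, "vs. n/p =", n / p)  # sample mean is close to 16.67
```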
