
To win a game, you need to hit the bullseye $n$ times in a row (if easier, let $n$ be $3$). The probability of hitting the bullseye is $p$ on each throw, independently of all other throws (if easier, let $p$ be $0.6$). What is the expected number of throws it takes you to win?

The difficulty I couldn't get past is the fact that the hits have to be in a row, so you could for example hit, hit, miss and then still have to hit three times in a row again; every miss effectively resets you. If it helps, I wrote a short program which estimates the answer by simulating the game $100000$ times, and the expected number of throws for $n=3$ and $p=0.6$ turned out to be about $9.07$, but I need the exact value for any probability $p$, not just an estimate.
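
For reference, a minimal sketch of such a simulation might look like the following (the function name and default arguments are illustrative, not the original program):

```python
import random

def simulate(n=3, p=0.6, trials=100_000, seed=0):
    """Estimate the expected number of throws to get n bullseyes in a row."""
    rng = random.Random(seed)
    total_throws = 0
    for _ in range(trials):
        throws, streak = 0, 0
        while streak < n:
            throws += 1
            if rng.random() < p:
                streak += 1
            else:
                streak = 0  # any miss resets the streak
        total_throws += throws
    return total_throws / trials

print(simulate())  # prints a value close to 9.07
```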

  • You have a Markov chain with four states corresponding to the last $0,1,2,3$ shots having been hits. The last state is absorbing. The transition probabilities are $p$ to advance one state and $1-p$ to go back to $0$. Define variables for the expected time to absorption from each state. You should get a set of three simultaneous equations. – Ross Millikan Jan 05 '21 at 02:45
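
To make that comment concrete, here is a small sketch (not from the thread) that sets up and solves those simultaneous equations numerically with NumPy for $n=3$, $p=0.6$; the variable names are mine:

```python
import numpy as np

n, p = 3, 0.6

# E[i] = expected throws remaining when the last i throws were all hits (E[n] = 0).
# Each non-absorbing state satisfies  E[i] = 1 + p*E[i+1] + (1-p)*E[0].
A = np.zeros((n, n))
b = np.ones(n)
for i in range(n):
    A[i, i] += 1.0
    A[i, 0] -= 1 - p       # a miss sends you back to state 0
    if i + 1 < n:
        A[i, i + 1] -= p   # a hit advances the streak (E[n] = 0 contributes nothing)

E = np.linalg.solve(A, b)
print(E[0])  # 9.0740..., which matches (1 - p**3) / (p**3 * (1 - p))
```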

1 Answer


Let $k_n$ be the expected number of throws to get $n$ bullseyes in a row. Then

  • With probability $1-p$ we miss, so waste the first throw.

  • With probability $p(1-p)$ we score then miss, so waste the first two throws.

    $\vdots$

  • With probability $p^{n-1}(1-p)$ we score $n-1$ then miss the $n$-th, so waste the first $n$ throws.

  • With probability $p^n$ we score $n$ in a row.

So putting that all together, we get $$k_n=np^n+\sum_{i=1}^np^{i-1}(1-p)(k_n+i)=\left(1-p^n\right)k_n+\sum_{i=1}^{n}p^{i-1}\implies\boxed{k_n=\frac{1-p^n}{p^n(1-p)}}.$$ (For the middle equality, the constant terms telescope: $np^n+\sum_{i=1}^n i\,p^{i-1}(1-p)=\sum_{i=1}^np^{i-1}$.)
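
As a quick sanity check against the simulation in the question, plugging in $n=3$ and $p=0.6$ gives $$k_3=\frac{1-0.6^3}{0.6^3(1-0.6)}=\frac{0.784}{0.0864}\approx 9.074,$$ which agrees with the estimate of about $9.07$.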


Alternatively, note that it takes $k_{n-1}$ throws in expectation to get $n-1$ in a row. Once we get there, either we toss a bullseye and win (with probability $p$), or we miss and start over (with probability $1-p$). So $$k_n=k_{n-1}+p+(1-p)(k_n+1)\implies k_n=\frac{k_{n-1}+1}{p},$$ and we are done by induction, as $k_1=\frac{1}{p}$.
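
Unrolling that recursion makes the induction explicit: $$k_n=\frac{k_{n-1}+1}{p}=\frac1p+\frac1{p^2}+\dots+\frac1{p^n}=\frac{1+p+\dots+p^{n-1}}{p^n}=\frac{1-p^n}{p^n(1-p)},$$ the same closed form as above.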


Just for the record, here's a fun martingale solution.

Suppose at time $k$, a new gambler $g_k$ enters the casino and bets £$1$ that throw $k$ will hit the bullseye. If you miss, then $g_k$ loses their stake and leaves the game; otherwise $g_k$ wins £$p^{-1}$. In that case, $g_k$ bets all their money that throw $k+1$ will hit the bullseye, and so on. This continues until either you miss, or you hit $n$ bullseyes in a row and the game ends, at which point the gambler who entered at the start of the winning streak holds £$p^{-n}$, the next one £$p^{-(n-1)}$, and so on down to £$p^{-1}$ for the last gambler.

If $M_k$ is the total winnings for the casino at time $k$, then it's clear that $M_k$ is a martingale. Let $\tau$ be the time at which you hit $n$ consecutive bullseyes. The increments $\lvert M_{k+1}-M_k\rvert$ are bounded and $\mathbb E[\tau]<\infty$, so we can apply the optional stopping theorem: $$0=\mathbb E[M_0]=\mathbb E[M_\tau]=\mathbb E\left[\tau-p^{-1}-p^{-2}-\dots-p^{-n}\right]\implies\mathbb E[\tau]=\sum_{j=1}^np^{-j}=\frac{1-p^n}{p^n(1-p)}.$$

jlammy
  • 9,424