
... or, more appropriately, with the tail twisted off. This is an extension of this other question.

I understand that the expected value of a binomial distribution equals $np$.

How would I get the expected value excluding the extreme outcome (e.g. the expected number of tails in a sequence of coin tosses with at least one head)? Or, as pointed out by @RobertIsrael, the conditional expectation of $X$ given $X < n$.

In the expression below,

$$\mathrm{E}(X \mid X < n) = np \left(\sum_{r=0}^{n-1} \dbinom{n-1}{r} p^r (1-p)^{n-1-r} \right) - \text{[...something...]}$$

does "something" equal $np^n$ or just $p^n$? The reason for my confusion is that the former appears correct conceptually; however, the results of a simulation agree with the latter.
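A minimal sketch of the kind of simulation meant here (the values of $n$, $p$ and the sample size are arbitrary illustrative choices, assuming $X \sim \text{Binomial}(n, p)$ and conditioning on $X < n$):

```python
import numpy as np

# Illustrative parameters (arbitrary choices, not the ones from the post)
n, p = 10, 0.5
trials = 2_000_000

# Draw X ~ Binomial(n, p) and keep only the outcomes with X < n
x = np.random.binomial(n, p, size=trials)
sim = x[x < n].mean()

print("simulated E(X | X < n):", sim)
print("np - n*p^n            :", n * p - n * p**n)
print("np - p^n              :", n * p - p**n)
```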

MP Droid
  • You really need to be more precise about what you mean. Do you mean the conditional expectation of $X$ given $X > 0$? or given $X < n$? – Robert Israel May 25 '16 at 22:38
  • I would have thought $\left(\sum_{r=0}^{n-1} \dbinom{n-1}{r} p^r (1-p)^{n-1-r}\right) = 1$ – Henry May 25 '16 at 22:47

1 Answer


So the extreme case has value $n$ with probability $p^n$.

That makes $E[X \mid X \not=n] = \dfrac{np-np^n}{1-p^n} = np - \dfrac{np^n}{1+p+p^2+\cdots +p^{n-1}}$
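A quick numerical check (with arbitrary example values of $n$ and $p$, not taken from the question) that the two closed forms above agree:

```python
# Arbitrary illustrative values
n, p = 10, 0.5

lhs = (n * p - n * p**n) / (1 - p**n)
rhs = n * p - n * p**n / sum(p**k for k in range(n))  # denominator is 1 + p + ... + p^(n-1)

print(lhs, rhs)  # both ≈ 4.9951 for these values
```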

Henry
  • Beginning to understand: as the probability mass has been reduced by $p^n$, we have to reduce the denominator as well to compute the average. Aha, thanks! – MP Droid May 25 '16 at 22:48
  • @MPDroid Pretty much; by the Law of Total Probability: $$\begin{align}\mathsf E(X) =&~ \mathsf E(X\mid X<n)~\mathsf P(X<n)+\mathsf E(X\mid X=n)~\mathsf P(X=n)\\[2ex]\mathsf E(X\mid X<n) =&~ \dfrac{\mathsf E(X)-\mathsf E(X\mid X=n)~\mathsf P(X=n)}{1-\mathsf P(X=n)} \\[1ex] =&~ \dfrac{np-np^n}{1-p^n} \\[1ex] =&~ \dfrac{np(1-p^{n-1})}{1-p^n}\end{align}$$ – Graham Kemp May 26 '16 at 02:29