
EDITED version of my original question...

For the coin toss problem the probability of getting exactly $k$ successes in $n$ trials is $$ f(k;n,p) = \Pr(X = k) = {n\choose k}p^k(1-p)^{n-k} $$

Here $p$ is fixed for all the trials. How can I modify this expression so that it allows a different value of $p$ for every trial? Eventually I'd like to arrive at an expression for $\Pr(X < k)$ that uses a time-varying $p$. I've been looking at permutation matrices, the multinomial distribution, and so on, but I'm really not sure how best to approach this problem.
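For concreteness, here is a minimal Python sketch of the fixed-$p$ starting point (the function names are only illustrative, not part of the question):

```python
from math import comb

def binomial_pmf(k, n, p):
    """Pr(X = k): exactly k successes in n trials, each with fixed success probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def binomial_less_than(k, n, p):
    """Pr(X < k) as a partial sum of the pmf."""
    return sum(binomial_pmf(i, n, p) for i in range(k))

print(binomial_pmf(2, 5, 0.5))        # Pr(X = 2) for 5 fair tosses: 0.3125
print(binomial_less_than(3, 5, 0.5))  # Pr(X < 3) for 5 fair tosses: 0.5
```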

ORIGINAL question was...

I'm trying to derive an expression for the "dynamic" binomial theorem. The "normal" binomial theorem is

$$ (x+y)^n = \sum_{k=0}^n \binom{n}{k} x^k y^{n-k} $$

How do I go about deriving an equivalent expression where $x$ and $y$ are varying for each $n$?

Piwi
  • Do you want $\sum_{i=0}^m (x+y)^{a_i}$? And you are not given that the $a_i$'s are ordered natural numbers from $1$ to $m$ or so, right? –  Mar 26 '13 at 14:45
  • is $\sum_{n=0}^m\sum_{k=0}^{a_n} \binom{a_n}{k} x^k y^{a_n-k}$ what you have in mind? –  Mar 26 '13 at 14:47
  • Hi, thanks for your reply. I mean it in the context of tossing a coin, but on each toss, I bias the coin (hence alter the probability of heads/tails on each toss.) The above formula extends to the scenario where the probability is fixed at all time steps. How about if the probability is changing at every time step? Thanks in advance! – Piwi Mar 26 '13 at 20:14
  • This is nowhere near, but it may be distantly related: without-replacement cases. –  Mar 26 '13 at 20:27
  • $n$ is a single number: the total number of tosses. So in your question, could you clear up what you mean by "$x$ and $y$ are varying for each $n$"? Do you want to replace $(x+y)^n$ with $(x_1+y_1)(x_2+y_2)\cdots(x_n+y_n)$? – 2'5 9'2 Mar 26 '13 at 20:35
  • Thanks for your reply. I've clarified the intention behind the problem. Hope it makes more sense. – Piwi Mar 26 '13 at 23:17

2 Answers


The distribution of the r.v. $X$ that you're describing is the (well-established) Poisson binomial (PB) distribution, whose pmf is given by

$$ P\left[X = x\right] = \frac{1}{t + 1}\sum_{i = 0}^t \left\{\exp\left(\frac{-j2\pi i x}{t + 1}\right) \prod_{k = 1}^t \left\{p_k\left(\exp\left(\frac{j2\pi i}{t + 1}\right) - 1\right) + 1\right\}\right\} $$

where $j = \sqrt{-1}$, $t$ is the number of trials, $x$ is the number of favorable outcomes, and $p_k$ is the probability of success on the $k$th trial (the $p_k$ are collected in the probability vector $p = \left[p_1\;p_2\;\ldots\;p_t\right]^T$).

You can refer to one of my questions for a solved example.
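A minimal numeric sketch of the formula above (not part of the original answer; the function name `poisson_binomial_pmf` is just illustrative):

```python
import cmath

def poisson_binomial_pmf(x, p):
    """P[X = x] via the DFT-based formula above, where p = [p_1, ..., p_t]
    holds the success probability of each trial."""
    t = len(p)
    total = 0j
    for i in range(t + 1):
        # exp(-j*2*pi*i*x/(t+1)) times the product over the t trials
        term = cmath.exp(-2j * cmath.pi * i * x / (t + 1))
        omega = cmath.exp(2j * cmath.pi * i / (t + 1))
        for p_k in p:
            term *= p_k * (omega - 1) + 1
        total += term
    return (total / (t + 1)).real  # imaginary parts cancel up to rounding

# Three tosses with different biases:
p = [0.2, 0.5, 0.8]
pmf = [poisson_binomial_pmf(x, p) for x in range(len(p) + 1)]
print(pmf)           # Pr(X = 0), Pr(X = 1), Pr(X = 2), Pr(X = 3)
print(sum(pmf[:2]))  # Pr(X < 2), the quantity asked about in the question
```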

Landon

Something like $(p_1x_1+q_1y_1)(p_2x_2+q_2y_2)\cdots(p_nx_n+q_ny_n)$ will give you the generating function of tossing a coin $n$ times with probability of heads $p_i$ at time $i$, where $q_i:=1-p_i$. If you set $x_i=x$ and $y_i=y$, then you'll be counting the probability of getting a specific number of heads (and therefore tails). The multi-binomial theorem will give you an expansion:

$$(a_1+b_1)^{n_1}\cdots (a_d+b_d)^{n_d}=\sum_{k_1=0}^{n_1}\cdots\sum_{k_d=0}^{n_d}\binom{n_1}{k_1}a_1^{k_1}b_1^{n_1-k_1}\cdots\binom{n_d}{k_d}a_d^{k_d}b_d^{n_d-k_d}$$

which will simplify if you set $x_i=x$ and $y_i=y$ and $n_i=1$.
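As a sanity check, here is a small sketch (not from the answer itself) that sets $x_i = z$, $y_i = 1$ and multiplies the factors out one at a time, recovering $\Pr(X = k)$ for every $k$ and hence $\Pr(X < k)$; the function name is illustrative:

```python
def heads_distribution(p):
    """Coefficients of the product (q_1 + p_1 z)(q_2 + p_2 z)...(q_n + p_n z),
    i.e. dist[k] = Pr(X = k) heads in len(p) tosses, where toss i lands heads
    with probability p[i]."""
    dist = [1.0]                       # zero tosses: certainly zero heads
    for p_i in p:
        q_i = 1.0 - p_i
        new = [0.0] * (len(dist) + 1)
        for k, prob in enumerate(dist):
            new[k] += q_i * prob       # this toss is tails: count stays at k
            new[k + 1] += p_i * prob   # this toss is heads: count becomes k+1
        dist = new
    return dist

p = [0.2, 0.5, 0.8]
dist = heads_distribution(p)
print(dist)            # matches the PB pmf from the other answer
print(sum(dist[:2]))   # Pr(X < 2)
```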

Alex R.