
Let an urn contain $w$ white and $b$ black balls. Draw a ball randomly from the urn and return it together with another ball of the same color. Let $b_n$ be the number of black balls and $w_n$ the number of white balls after the $n$-th draw-and-replacement. Let $X_n$ be the relative proportion of white balls after the $n$-th draw-and-replacement.

I start with $b=w=1$, so the total number of balls after the $n$-th draw-and-replacement is $n+2$. Now I want to find the limit distribution of $X_n$; I have already shown that $X_n$ is a martingale and that it converges a.s. Explicitly,

$$X_n = \dfrac{w_n}{n+2} \quad\text{for}\quad n \in \mathbb{N}_0. $$

I've read that the limit distribution is a beta distribution, but I don't know how to get there.
I could write $w_n$ as a sum of indicator variables $Y_i$, where $Y_i$ is $0$ if the $i$-th drawn ball is black and $1$ if it is white. Then I'd have

$$ w_n = 1+\sum_{i=1}^{n} Y_i. $$

Does this help? How can I proceed?

Thanks! :)
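(Not part of the question as asked, but a quick Monte Carlo sketch in Python, with my own function names, illustrates what the limit should look like: for $w=b=1$ the empirical mean and variance of $X_n$ match those of $\operatorname{Unif}(0,1)$.)

```python
import random

def polya_fraction(w, b, n, rng):
    """Fraction of white balls after n draw-and-replace steps."""
    white, black = w, b
    for _ in range(n):
        # Draw a ball uniformly; return it with one more of the same color.
        if rng.random() * (white + black) < white:
            white += 1
        else:
            black += 1
    return white / (white + black)

rng = random.Random(0)
samples = [polya_fraction(1, 1, 1000, rng) for _ in range(4000)]
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
# Unif(0,1) has mean 1/2 and variance 1/12.
print(mean, var)
```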

2 Answers


Suppose that we have started with $w$ white balls and $b$ black balls. Then

\begin{align*} \mathbb{P}(w_n = w+k) &= \binom{n}{k} \frac{w(w+1)\dots(w+k-1)b(b+1)\dots(b+n-k-1)}{(w+b)(w+b+1)\dots(w+b+n-1)} \\ &= \frac{1}{B(w, b)} \binom{n}{k} \frac{\Gamma(w+k)\Gamma(b+n-k)}{\Gamma(w+b+n)} \\ &= \frac{1}{B(w, b)} \frac{k^{w-1} (n-k)^{b-1}}{n^{w+b-1}} \frac{E_k(w)E_{n-k}(b)}{E_n(b+w)} , \end{align*}

where $\Gamma(\cdot)$ is the gamma function, $B(\alpha, \beta) = \frac{\Gamma(\alpha)\Gamma(\beta)}{\Gamma(\alpha+\beta)}$ is the beta function, and

$$ E_n(z) := \frac{\Gamma(n+z)}{n!n^{z-1}}. $$
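(As a numerical sanity check of this closed form, not from the original answer: a Python sketch, with my own helper names, compares it against the exact distribution obtained by stepping the urn directly.)

```python
from math import comb, gamma

def closed_form(w, b, n, k):
    """P(w_n = w + k) via the Gamma-function formula above."""
    B = gamma(w) * gamma(b) / gamma(w + b)  # Beta(w, b)
    return comb(n, k) * gamma(w + k) * gamma(b + n - k) / (B * gamma(w + b + n))

def exact_recursion(w, b, n):
    """Exact law of the number of white balls added, by dynamic programming."""
    probs = [1.0]  # zero white balls added with probability 1
    for step in range(n):
        new = [0.0] * (len(probs) + 1)
        total = w + b + step  # balls in the urn before this draw
        for k, p in enumerate(probs):
            new[k + 1] += p * (w + k) / total      # drew a white ball
            new[k] += p * (b + step - k) / total   # drew a black ball
        probs = new
    return probs

w, b, n = 2, 3, 10
rec = exact_recursion(w, b, n)
cf = [closed_form(w, b, n, k) for k in range(n + 1)]
print(max(abs(x - y) for x, y in zip(rec, cf)))
```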

Note that $E_n(z) \to 1$ as $n\to\infty$ by Stirling's formula. So, if we write $p_k = k/n$, then the m.g.f. of $X_n$ is explicitly given by

\begin{align*} \mathbb{E}[e^{\lambda X_n}] = \frac{1}{B(w, b)} \sum_{k=0}^{n} \exp\biggl( \lambda \frac{p_k + w/n}{1 + (w+b)/n} \biggr) p_k^{w-1}(1 - p_k)^{b-1} \frac{1}{n} \cdot \frac{E_k(w)E_{n-k}(b)}{E_n(b+w)}. \end{align*}

Letting $n \to \infty$, this converges to

\begin{align*} \mathbb{E}[e^{\lambda X_{\infty}}] = \frac{1}{B(w, b)} \int_{0}^{1} e^{\lambda p} p^{w-1}(1 - p)^{b-1} \, \mathrm{d}p. \end{align*}

From this, we read off that the distribution of $X_{\infty}$ has the density

$$ f(p) = \frac{1}{B(w, b)} p^{w-1}(1 - p)^{b-1} \mathbf{1}_{(0,1)}(p), $$

proving that the limit distribution is $\operatorname{Beta}(w, b)$.
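(The convergence of the m.g.f. can also be checked numerically; the Python sketch below uses my own helper names, `lgamma` to avoid overflow for large $n$, and the moment series $\mathbb{E}[e^{\lambda X}] = \sum_k \frac{\lambda^k}{k!}\mathbb{E}[X^k]$ for the Beta side.)

```python
from math import lgamma, exp, factorial

def log_comb(n, k):
    # log of the binomial coefficient C(n, k)
    return lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)

def mgf_Xn(w, b, n, lam):
    """E[exp(lam * X_n)] summed from the exact law P(w_n = w + k)."""
    logB = lgamma(w) + lgamma(b) - lgamma(w + b)  # log Beta(w, b)
    total = 0.0
    for k in range(n + 1):
        logp = (log_comb(n, k) + lgamma(w + k) + lgamma(b + n - k)
                - lgamma(w + b + n) - logB)
        total += exp(lam * (w + k) / (w + b + n) + logp)
    return total

def mgf_beta(w, b, lam, terms=60):
    """E[exp(lam * X)] for X ~ Beta(w, b) via the moment series,
    using E[X^k] = prod_{j<k} (w + j) / (w + b + j)."""
    s, moment = 0.0, 1.0
    for k in range(terms):
        s += lam ** k / factorial(k) * moment
        moment *= (w + k) / (w + b + k)
    return s

print(mgf_Xn(2, 3, 4000, 1.5), mgf_beta(2, 3, 1.5))
```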

  • could you justify the step where you let $n \to \infty$? Since $k$ is fixed, $p_k \to 0$ as $n\to\infty$. – s l Feb 28 '23 at 04:15
  • @sl, Think of the expansion of $\mathbb{E}[e^{\lambda X_n}]$ as a Riemann sum of the form $$\sum_{k=0}^{n} f\left(\frac{k}{n}\right)\frac{1}{n} \cdot a_{n,k},$$ where $f$ is a continuous function on $[0, 1]$ and $(a_{n,k}:0\leq k\leq n<\infty)$ is a bounded double-sequence such that $a_{n,k} \to 1$ whenever $n,k\to\infty$ (hence can be considered as a 'perturbation'). Then we can prove that this indeed converges to $$\int_{0}^{1} f(x) \, \mathrm{d}x.$$ – Sangchul Lee Feb 28 '23 at 04:49
  • @Sangchul Lee: Great answer! I am just a bit confused ... the other user showed that the distribution is a Uniform Distribution but you showed that this is a Beta Distribution. What is the difference? Thanks! – konofoso Jul 26 '24 at 04:23
  • @konofoso, The limit distribution is the beta distribution, as you can also see from the link in the other answer. When $w=b=1$, the limiting distribution $\text{Beta}(1,1)$ is the same as the uniform distribution, and I guess this is what BCLC did. – Sangchul Lee Jul 26 '24 at 04:52
  • Thank you! I think I understand now! I posted two questions about Polya Urn. Can you please look if you have time? – konofoso Jul 26 '24 at 05:07
  • https://math.stackexchange.com/questions/4950619/probability-of-seeing-all-original-balls-in-polyas-urn – konofoso Jul 26 '24 at 05:07
  • https://math.stackexchange.com/questions/4950647/how-long-will-it-take-for-polyas-urn-to-have-the-same-ratio-as-it-had-in-the-be – konofoso Jul 26 '24 at 05:07

Refer to this?


Assuming $B_n$ is uniform on $\{0,1,...,n\}$ (proven by induction):
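(Not in the original answer: the claimed uniformity of $B_n$ can be verified mechanically for small $n$ with exact rational arithmetic; a Python sketch, names mine.)

```python
from fractions import Fraction

def black_added_dist(n):
    """Exact distribution of B_n, the number of black balls added after
    n steps, starting from one white and one black ball."""
    probs = [Fraction(1)]
    for step in range(n):
        new = [Fraction(0)] * (len(probs) + 1)
        total = 2 + step  # balls in the urn before this draw
        for k, p in enumerate(probs):
            new[k + 1] += p * Fraction(1 + k, total)     # drew a black ball
            new[k] += p * Fraction(1 + step - k, total)  # drew a white ball
        probs = new
    return probs

# B_n should be uniform on {0, 1, ..., n}:
for n in (1, 2, 5, 10):
    assert black_added_dist(n) == [Fraction(1, n + 1)] * (n + 1)
print("uniform for all tested n")
```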

$$M_{\Theta}(t) = E[\exp(t\Theta)]$$

$$= E[\exp(t\lim \frac{B_n + 1}{n+2})]$$

$$= E[\lim\exp(t \frac{B_n + 1}{n+2})]$$

$$= \lim E[\exp(t \frac{B_n + 1}{n+2})] \qquad \text{(bounded convergence, since } 0 \le \tfrac{B_n+1}{n+2} \le 1\text{)}$$

$$= \lim \frac{1}{n+1}[\exp(t \frac{1}{n+2}) + \exp(t \frac{2}{n+2}) + ... + \exp(t \frac{n+1}{n+2})]$$

Case 1: $t \ne 0$

$$= \lim \frac{a(n)}{(n+1)(1-a(n))} (1-a(n)^{n+1}), \ \text{where} \ a(n) := e^{\frac{t}{n+2}}$$

$$= \lim \frac{a(n)}{(n+1)(1-a(n))} \lim (1-a(n)^{n+1})$$

$$= \lim \frac{a(n)}{(n+1)(1-a(n))} (1-e^t)$$

$$= \frac{1-e^t}{-t}$$

$$= \frac{e^t-1}{t}$$

Case 2: $t = 0$

$$= \lim \frac{1}{n+1}[\exp((0) \frac{1}{n+2}) + \exp((0) \frac{2}{n+2}) + ... + \exp((0) \frac{n+1}{n+2})]$$

$$= \lim \frac{1}{n+1} (1)(n+1) = 1$$

This is the mgf of $\operatorname{Unif}(0,1)$.
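(A quick numerical check of the Case 1 limit, not part of the original answer; Python sketch with my own helper name.)

```python
from math import exp

def mgf_finite(n, t):
    """(1/(n+1)) * (exp(t*1/(n+2)) + ... + exp(t*(n+1)/(n+2)))."""
    return sum(exp(t * j / (n + 2)) for j in range(1, n + 2)) / (n + 1)

t = 2.0
limit = (exp(t) - 1) / t  # mgf of Unif(0,1) at t
for n in (10, 100, 10000):
    print(n, mgf_finite(n, t), limit)
```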

  • @Did Edited. I accounted for $t=0$ this time – BCLC Feb 12 '16 at 22:59
  • This answer takes as granted the fact that $B_n$ has a uniform distribution on the set $\{0, \ldots, n\}$, from which the fact that the limit in law (in fact, an a.s. limit by martingale convergence) $\Theta$ of $(B_n+1)/(n+2)$ is uniform on $[0,1]$ follows immediately. – Olivier May 27 '20 at 15:15
  • In other words this is not an answer. See Durrett, "Probability: Theory and Examples", Section 4.3.2 in the last edition (available online) for the classical method relying on the combinatorics of the problem plus exchangeability. – Olivier May 27 '20 at 15:17
  • @Olivier which part is wrong exactly please? like $B_n$ isn't uniform? – BCLC Dec 14 '20 at 17:41
  • Please read my comment carefully: the answer assumes $B_n$ is uniform on $\{0,\dots,n\}$ to conclude that $(B_n+1)/(n+2)$ converges to the uniform distribution (this time on the interval): this is the trivial part of the analysis of the Polya urn; the "difficulty" here is to prove that $B_n$ is uniform on $\{0,\dots,n\}$ – Olivier Dec 15 '20 at 18:16
  • That $B_n$ is uniform on $\{0,\dots,n\}$ is typically done by induction on $n$; see any textbook with the Polya urn (Durrett's book, for instance). – Olivier Dec 15 '20 at 18:17
  • @Olivier oh ok so at least i did 1 of the 2 things required? (2 things required: 1 - prove $B_n$ discrete uniform on 0 to n 2 - prove continuous uniform on $(0,1)$) – BCLC Dec 16 '20 at 07:28
  • @Olivier Edited answer. thanks! – BCLC Dec 16 '20 at 07:40
  • @BCLC : Great answer! I am just a bit confused ... the other user showed that the distribution is a Beta Distribution but you showed that this is a Uniform Distribution. What is the difference? Thanks! – konofoso Jul 26 '24 at 04:22