
Problem. Let $X_1, X_2, \ldots$ be a sequence of independent Bernoulli random variables with success probability $0.5$, i.e., $P(X_i = 0) = P(X_i = 1) = \frac{1}{2}$.

Define $Y_n = \sum_{k=1}^n X_k \left(\frac{1}{2}\right)^k$ for $n = 1, 2, \ldots$ Show that $Y_n$ converges in distribution to $U(0,1)$.

Attempt. I'm thinking of proving this using the MGF (as the question hinted). I have found that $E(e^{-sY}) = \frac{1-e^{-s}}{s}$ for $s > 0$, where $Y \sim U(0,1)$.
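
For reference, that comes from a direct integral:

$$E(e^{-sY}) = \int_0^1 e^{-sy}\,dy = \frac{1 - e^{-s}}{s}, \qquad s > 0.$$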

Now I try to show that $\lim_{n\to\infty} E(e^{-sY_n}) = E(e^{-sY}) = \frac{1-e^{-s}}{s}$ for every $s > 0$. So far I've got

$$E(e^{-sY_n}) = E\left(e^{-s\sum_{k=1}^n X_k \left(\frac{1}{2}\right)^k}\right) = \prod_{k=1}^n E\left(e^{-sX_k \left(\frac{1}{2}\right)^k}\right)$$ by independence, and $$E\left(e^{-sX_k \left(\frac{1}{2}\right)^k}\right) = \frac{1}{2}e^{-s \left(\frac{1}{2}\right)^k}$$ for all $k$. So I'm getting $E(e^{-sY_n}) = \left(\frac{1}{2}\right)^n e^{-s\left(1 - \left(\frac{1}{2}\right)^n\right)}$, which goes to $0$ as $n$ approaches infinity.
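
As a sanity check on what the limit should be (not part of a proof), here is a small Monte Carlo sketch; it assumes numpy, and the choices of $n$, the sample size, and the seed are arbitrary:

```python
import numpy as np

# Estimate E[exp(-s * Y_n)] for a large n by Monte Carlo and compare it
# to the claimed U(0,1) limit (1 - e^{-s}) / s.
rng = np.random.default_rng(0)
n, trials = 30, 100_000

# Each row is one realization of (X_1, ..., X_n) with X_k ~ Bernoulli(1/2).
X = rng.integers(0, 2, size=(trials, n))
weights = 0.5 ** np.arange(1, n + 1)   # (1/2)^k for k = 1, ..., n
Y = X @ weights                        # Y_n = sum_k X_k (1/2)^k

for s in [0.5, 1.0, 2.0]:
    estimate = np.exp(-s * Y).mean()
    target = (1 - np.exp(-s)) / s
    print(f"s = {s}: estimate {estimate:.4f} vs (1 - e^-s)/s = {target:.4f}")
```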

I don't know if I made an error in the calculations somewhere. I would appreciate it if someone could point me in the right direction.

  • $$E(e^{-sX_k (\frac{1}{2})^k})=\frac{1}{2} + \frac{1}{2}e^{-s (\frac{1}{2})^k}$$ – Brian Moehring Aug 25 '19 at 07:58
  • These pages ask about this problem too: https://math.stackexchange.com/questions/3332510/simplifying-mgf-of-sum-of-bernoulli, https://math.stackexchange.com/questions/1268881/limit-distribution-of-infinite-sum-of-bernoulli-random-variables. – Minus One-Twelfth Aug 25 '19 at 08:08
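
With the corrected per-term factor from the first comment, the product telescopes. A sketch of where this leads: writing $x_k = e^{-s/2^k}$, note that $x_k^2 = x_{k-1}$, so $1 + x_k = \frac{1 - x_k^2}{1 - x_k} = \frac{1 - x_{k-1}}{1 - x_k}$ for $s > 0$, and hence

$$E(e^{-sY_n}) = \prod_{k=1}^n \frac{1 + x_k}{2} = \frac{1}{2^n}\prod_{k=1}^n \frac{1 - x_{k-1}}{1 - x_k} = \frac{1 - e^{-s}}{2^n\left(1 - e^{-s/2^n}\right)} \longrightarrow \frac{1 - e^{-s}}{s},$$

since $2^n\left(1 - e^{-s/2^n}\right) \to s$ as $n \to \infty$.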

0 Answers