Let $T$ be a non-negative, integer-valued random variable, and let $X_1, X_2, \dots$ be independent Bernoulli random variables such that $P(X_i=0)=p$. Of $T$, I know only that it has mean $\mu$ with $1 \lt \mu \lt \infty$. How do I find the distribution of the random sum $\sum_{n=1}^T X_n$? I know that conditionally on $T$ the sum is binomial, but how explicit can I be when I don't know the distribution of $T$?
- Bernoulli distributions more usually have $P(X_i=1)=p$ and $P(X_i=0)=1-p$. If you know the mean and variance of $T$, you can find the mean and variance of $X_1+\cdots+X_T$ – Henry Dec 06 '20 at 13:07
- @Karl : it is not clear if you are looking at the distribution of $Z=T+X$ or $Z=X_1+X_2+\cdots+X_T$ – tommik Dec 06 '20 at 13:17
- Sorry. I am looking at the distribution of $X_1+\cdots+X_T$ – Karl Dec 06 '20 at 13:40
- @Henry. I only know the mean. Is it enough to find the mean of the sum $X_1+\cdots+X_T$? Should it be just the product of the means of $T$ and $X_i$? – Karl Dec 06 '20 at 14:28
- @Karl - yes: the law of total expectation – Henry Dec 06 '20 at 15:27
1 Answer
Set
$$Y=\sum_{i=1}^T X_i$$
The conditional distribution of $Y$ given $T=t$ is binomial, $Y \mid T=t \sim \mathrm{Bin}(t, q)$, where $q=1-p$.
Thus the marginal pmf of $Y$ is
$$P(Y=y)=\sum_t p_T(t)\,p_{Y\mid T}(y\mid t)$$
If one wants $E(Y)$, one can use the law of total expectation:
$$E(Y)=E[E(Y|T)]=E[T(1-p)]=(1-p)E(T)=\mu(1-p)$$
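A quick Monte Carlo sanity check of $E(Y)=\mu(1-p)$. The question does not fix the distribution of $T$, so the two-point distribution chosen for $T$ below (mean $\mu=4$) is purely illustrative:

```python
import random

random.seed(0)

p = 0.3            # P(X_i = 0) = p, so q = P(X_i = 1) = 1 - p
q = 1 - p
trials = 200_000

# The question fixes only E[T] = mu. As an illustration, give T a
# simple two-point distribution with mean mu = 4: T = 2 or 6, each
# with probability 1/2.
mu = 4.0

total = 0.0
for _ in range(trials):
    t = random.choice([2, 6])                          # draw T
    y = sum(1 for _ in range(t) if random.random() < q)  # Y | T=t ~ Bin(t, q)
    total += y

est = total / trials
print(est)   # should be close to mu * (1 - p) = 4 * 0.7 = 2.8
```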
tommik
- And if one wanted the expectation, would one get the product of the expectations of $T$ and $X_i$? – Karl Dec 06 '20 at 14:35
- @Karl : edited with answer... As an exercise you can set $T\sim \mathrm{Pois}(\theta)$ and calculate $P(Y=y)$ explicitly – tommik Dec 06 '20 at 14:47