The Coupon Collector problem, as stated on Wikipedia:
Suppose that there is an urn of $n$ different coupons, from which coupons are being collected, equally likely, with replacement. How many coupons do you expect you need to draw with replacement before having drawn each coupon at least once?
The standard method of solution decomposes the time taken to collect all coupons, a random variable $T$, into the times taken to collect each new coupon, random variables $T_i$. My question is: why do we immediately assume that we get all $n$ coupons in each outcome?
The issue is that the random variable $T$ is not well-defined on sequences of draws in which we never collect every coupon, so the step $T = T_1 + T_2 + \dots + T_n$ is not well-defined for such outcomes in the sample space.
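For concreteness, here is a minimal simulation sketch in Python (the function and variable names are mine, purely for illustration) that records $T$ and the individual $T_i$ for a single run, so the decomposition $T = T_1 + \dots + T_n$ can be seen concretely on a finite outcome:

```python
import random

def collect_all(n, rng=None):
    """Draw coupons uniformly with replacement until all n have appeared.
    Returns (T, times) where times[i-1] is T_i, the number of draws spent
    collecting the i-th new coupon, so that T == sum(times)."""
    rng = rng or random.Random()
    seen = set()
    times = []                         # times[i-1] = T_i
    draws_since_new = 0
    total = 0
    while len(seen) < n:
        coupon = rng.randrange(n)      # uniform draw with replacement
        total += 1
        draws_since_new += 1
        if coupon not in seen:         # i-th new coupon found: T_i is complete
            seen.add(coupon)
            times.append(draws_since_new)
            draws_since_new = 0
    return total, times

T, times = collect_all(10)
assert T == sum(times)                 # the decomposition T = T_1 + ... + T_n
print(T, times)
```

Of course every simulated run happens to terminate; the question is precisely what the decomposition means on outcomes that never do.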
I'm thinking about the possibility of defining $T(\omega) = \infty$ for outcomes $\omega$ in which we never collect every coupon, and likewise $T_i(\omega) = \infty$ for outcomes $\omega$ containing fewer than $i$ distinct coupons. Is this definition legitimate? One problem I see is in computing the expectation, $E(T_i) = \sum_{u=1}^\infty P(T_i = u)\,u + P(T_i = \infty)\cdot\infty$: the last term seems ill-defined when $P(T_i = \infty) = 0$, since it becomes $0 \cdot \infty$.
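For comparison, if one grants that $T_i$ is geometric with success probability $\frac{n-i+1}{n}$ (the standard claim, taken up in the next paragraph), then the finite part of that sum on its own evaluates to
$$\sum_{u=1}^\infty u\left(\frac{i-1}{n}\right)^{u-1}\frac{n-i+1}{n} \;=\; \frac{n}{n-i+1},$$
so the whole difficulty is what to make of the remaining term $P(T_i = \infty)\cdot\infty$.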
The next part is calculating $P(T_i = k)$. How do we prove, rigorously, that this is indeed $\left(\frac{i-1}{n}\right)^{k-1}\frac{n-i+1}{n}$, in accordance with the geometric distribution with success probability $\frac{n-i+1}{n}$? By "prove rigorously" I mean a proof based on a formally defined probability space (e.g. one whose sample space consists of infinite sequences of coupon draws).
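As a purely numerical (and admittedly non-rigorous) sanity check of that geometric form, here is a small Python sketch; it simulates draws after a fixed set of $i-1$ coupons has already been collected, which of course already assumes away part of what I want to see proved:

```python
import random
from collections import Counter

def sample_T_i(n, i, rng):
    """Number of draws needed to see a new coupon, given that
    i-1 distinct coupons have already been collected."""
    collected = set(range(i - 1))   # any fixed set of i-1 coupons (by symmetry)
    draws = 0
    while True:
        draws += 1
        if rng.randrange(n) not in collected:
            return draws

def empirical_vs_geometric(n=10, i=7, trials=200_000, max_k=6):
    rng = random.Random(1)
    counts = Counter(sample_T_i(n, i, rng) for _ in range(trials))
    p = (n - i + 1) / n             # claimed success probability for T_i
    for k in range(1, max_k + 1):
        empirical = counts[k] / trials
        geometric = (1 - p) ** (k - 1) * p
        print(f"k={k}: empirical={empirical:.4f}  geometric={geometric:.4f}")

empirical_vs_geometric()
```

The agreement is what I expect; what I am after is the formal argument on a properly defined probability space.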