Let $\{X_n\}_n$ be a sequence of i.i.d. (discrete) random variables with expected value $\mu$, and let $\{B_n\}_n$ be a sequence of bounded random variables. Assume that for every $n$, $B_n$ is independent of $X_n$, but not necessarily of $X_i$ for $i<n$. I am interested in whether the almost sure convergence $$\frac{B_1X_1+\cdots + B_nX_n}{B_1+\cdots + B_n}\to \mu$$ holds. Of course, when $B_n = 1$ for all $n$, the result holds by the strong law of large numbers, and when the $B_i$ are (possibly different) constants, there are some results in the literature on weighted sums.
I am interested in finding a reference for the more general case, in which the weights are themselves random variables. As a concrete example, think of $X_n$ as the result of a unit bet on red in roulette, so $X_n \in \{0, 2\}$, and of $B_n$ as the size of the bet at time $n$, chosen based on previous outcomes but independently of the current spin. When all the bets are equal to $1$ (or to some constant $b$), the weighted average clearly converges to $36/37$ (the probability of winning times the payoff). It seems reasonable that no bounded betting strategy depending only on the past should change this limit, but I do not currently see a clear proof.
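For what it is worth, here is a minimal Python sketch of the roulette example (a numerical sanity check only, not a proof). The specific betting rule (bet $2$ after a loss, $1$ after a win) is just an arbitrary bounded, past-dependent strategy chosen for illustration.

```python
# Numerical sanity check (not a proof): simulate the roulette example with a
# bounded betting strategy that depends only on past outcomes.
import numpy as np

rng = np.random.default_rng(0)

n = 200_000
p_win = 18 / 37            # probability that red comes up (European wheel)
mu = 2 * p_win             # expected return of a unit bet, 36/37

# Outcome of each spin for a unit bet on red: 2 with probability 18/37, else 0.
X = np.where(rng.random(n) < p_win, 2.0, 0.0)

# Illustrative bounded strategy: B_1 = 1, and for k >= 2 bet 2 after a loss
# and 1 after a win, so B_k is a function of X_1, ..., X_{k-1} only.
B = np.ones(n)
B[1:] = np.where(X[:-1] == 0.0, 2.0, 1.0)

# Weighted running average (B_1 X_1 + ... + B_k X_k) / (B_1 + ... + B_k).
weighted_avg = np.cumsum(B * X) / np.cumsum(B)
print(weighted_avg[-1], mu)
```

One would expect the weighted average to settle near $36/37 \approx 0.973$, consistent with the conjectured limit, though of course a simulation says nothing about almost sure convergence.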
Does anyone know of a reference for this kind of result, or have an idea of how to prove it?