Let $U$ be a Uniform$[0,1]$ variate, and, conditioned on $U$, let $\{X_{n}\}_{n\geq 1}$ be iid $\operatorname{Bern}(U)$ variates. Show that $E(U\mid\sigma(X_{1},\dots,X_{n}))\xrightarrow{a.s.} U$.
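(Not part of a proof, but as a numerical sanity check: by Beta–Bernoulli conjugacy the conditional expectation has the well-known closed form $E(U\mid X_{1},\dots,X_{n})=(S_{n}+1)/(n+2)$ where $S_{n}=X_{1}+\dots+X_{n}$ — Laplace's rule of succession — so the convergence can be simulated directly. A minimal sketch:)

```python
import random

def posterior_mean(n, u, rng):
    """Simulate X_1,...,X_n iid Bern(u) and return
    E(U | X_1,...,X_n) = (S_n + 1)/(n + 2)   (Uniform[0,1] = Beta(1,1) prior)."""
    s = sum(rng.random() < u for _ in range(n))  # S_n = number of successes
    return (s + 1) / (n + 2)

rng = random.Random(0)
u = rng.random()  # one realization of U ~ Uniform[0,1]
for n in (10, 1000, 100000):
    # the error |Y_n - U| should shrink as n grows (roughly like n^{-1/2})
    print(n, abs(posterior_mean(n, u, rng) - u))
```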
My attempt(s): First, I can immediately notice that $Y_{n}=E(U\mid\sigma(X_{1},\dots,X_{n}))$ is a Doob martingale that is uniformly bounded (by $1$) in $L^{\infty}$. So, by the martingale convergence theorem, $Y_{n}\xrightarrow{a.s.\,,\, L^{p}} X$ for some $X\in L^{p}$, for every $p<\infty$. But I don't know how to show that $X$ equals $U$.
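(The martingale property itself is just the tower rule; spelled out, with $\mathcal F_{n}=\sigma(X_{1},\dots,X_{n})$:)

```latex
% F_n \subseteq F_{n+1}, so by the tower property
\[
E(Y_{n+1}\mid\mathcal F_n)
  = E\big(E(U\mid\mathcal F_{n+1})\,\big|\,\mathcal F_n\big)
  = E(U\mid\mathcal F_n)
  = Y_n ,
\]
% and 0 <= Y_n <= 1 a.s. since 0 <= U <= 1,
% which gives the uniform L^infty bound.
```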
I also tried the method of moments, since $U$ is compactly supported. I wanted to show that $E(U^{m})=E\big(E(U^{m}\mid X_{1},\dots,X_{n})\big)\to E(X^{m})$, but the issue is that what I actually control is $\big(E(U\mid X_{1},\dots,X_{n})\big)^{m}$, not $E(U^{m}\mid X_{1},\dots,X_{n})$, and we only have $\big(E(U\mid X_{1},\dots,X_{n})\big)^{m}\xrightarrow{L^{m}} X^{m}$. This does not give that $X$ has the same moments as $U$.
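(To see concretely that the moments of $Y_{n}$ need not match those of $U$, here is a small check, using the standard Beta–Bernoulli posterior mean $E(U\mid X_{1})=(X_{1}+1)/3$, i.e. Laplace's rule of succession with $n=1$:)

```latex
% n = 1, m = 2. The posterior of U given X_1 is Beta(1 + X_1, 2 - X_1),
% so Y_1 = E(U | X_1) = (X_1 + 1)/3, and P(X_1 = 1) = E(U) = 1/2. Hence
\[
E(Y_1^2)
  = \tfrac{1}{2}\Big(\tfrac{2}{3}\Big)^{2} + \tfrac{1}{2}\Big(\tfrac{1}{3}\Big)^{2}
  = \tfrac{5}{18}
\quad\neq\quad
E(U^2) = \tfrac{1}{3}.
\]
```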
I also know from the convergence theory that $Y_{n}=E(X\mid X_{1},\dots,X_{n})$ (the martingale is closed by its limit), which would mean $E(X\mid X_{1},\dots,X_{n})=E(U\mid X_{1},\dots,X_{n})$. From this, can we conclude that $X=U$, since we have $\int_{A}(X-U)\,dP=0$ for every $A\in\sigma(X_{1},\dots,X_{n})$ and every $n$? I am very unsure about this, as I am not using any property of $X_{1},\dots,X_{n}$, let alone the fact that, conditioned on $U$, they are iid Bernoulli$(U)$ variates.
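(For what it's worth, here is how far that identity takes you — a sketch via the standard $\pi$–$\lambda$ argument, writing $\mathcal F_{n}=\sigma(X_{1},\dots,X_{n})$ and $\mathcal F_{\infty}=\sigma\big(\bigcup_{n}\mathcal F_{n}\big)$:)

```latex
% The union of the F_n is a pi-system generating F_infty, and
% A |-> \int_A (X - U) dP is a finite signed measure vanishing on it, so
\[
\int_A X\,dP = \int_A U\,dP \quad\text{for all } A\in\mathcal F_\infty,
\qquad\text{i.e.}\qquad X = E(U\mid\mathcal F_\infty)\ \text{a.s.}
\]
% (using that X, as an a.s. limit of the Y_n, is F_infty-measurable).
% So X = U a.s. exactly when U agrees a.s. with an F_infty-measurable
% variable -- that is the missing ingredient, and it is where the
% Bernoulli structure has to enter.
```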
Since, conditioned on $U$, the $X_{n}$ are iid (unconditionally they are only exchangeable, not independent), I tried to think of using Kolmogorov's zero–one law, but I couldn't see how that could help.
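(One related observation along this line of thought — a sketch, conditioning on $U$ and applying the ordinary SLLN to the conditionally iid sequence, with $S_{n}=X_{1}+\dots+X_{n}$:)

```latex
% Conditioned on U = u, the X_i are iid Bern(u), so the SLLN gives
% P(S_n/n -> u | U = u) = 1 for every u. Integrating out U:
\[
P\Big(\tfrac{S_n}{n}\to U\Big)
  = E\Big[\,P\Big(\tfrac{S_n}{n}\to U \,\Big|\, U\Big)\Big]
  = E[1] = 1 .
\]
% Hence U is a.s. the limit of F_infty-measurable variables, so U is
% (a.s. equal to) an F_infty-measurable variable.
```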
Can anyone provide a hint or help me with this?