I have come across the following problem. Consider a random variable $X_t$ which, at any time $t$, takes values in $(0,1)$. Its probability distribution is not known, but both the first and the second moments are finite. Moreover, the distribution is time invariant (it does not change across time), but the $X_t$'s are not independent.
My aim is to prove that $$\lim_{t\to\infty}\frac{1}{t}\sum_{\tau=1}^t\log(X_{\tau})$$ exists almost surely. So far I have only managed to show that the average does not diverge, not that the limit exists.
In Monte Carlo experiments (where I vary some parameters of the process, which I have not specified so as not to complicate the question, and which in turn change the distribution of $X$), I see that the average converges to $\mathbb{E}(\log X_t)$, but I am not sure whether this convergence holds $\mathbb{P}$-a.s.
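To make the kind of experiment I ran concrete, here is a minimal sketch (not my actual process, whose parameters I left unspecified): a stationary but dependent process on $(0,1)$ built by pushing a stationary Gaussian AR(1) through the logistic function, together with the running time average of $\log X_t$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed example process (NOT the original one): Z_t is a stationary
# Gaussian AR(1), so the X_t = logistic(Z_t) are dependent but their
# marginal distribution does not change over time.
phi = 0.8                      # AR(1) coefficient -> temporal dependence
sigma = np.sqrt(1 - phi**2)    # innovation sd so that Z_t ~ N(0,1) for all t
T = 200_000

z = np.empty(T)
z[0] = rng.normal()            # start in the stationary distribution N(0,1)
for t in range(1, T):
    z[t] = phi * z[t - 1] + sigma * rng.normal()

x = 1.0 / (1.0 + np.exp(-z))   # logistic map: X_t takes values in (0,1)

log_x = np.log(x)
running_avg = np.cumsum(log_x) / np.arange(1, T + 1)

# Numerically, the running average (1/t) * sum_{tau<=t} log X_tau settles
# down to a constant close to the sample mean of log X_t.
print(running_avg[-1], log_x.mean())
```

In this toy example the flattening of `running_avg` is what suggested to me the convergence to $\mathbb{E}(\log X_t)$; of course a plot of one sample path does not distinguish convergence in probability from a.s. convergence.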
Do you know how to sketch a proof?
Many thanks.