
I have come across the following problem. Consider a random variable $X$ which, at any time $t$, takes values in $(0,1)$. This variable does not have a known probability distribution, but both its first and second moments are finite. Moreover, the distribution is time-invariant (it does not change across time), but the $X_t$ are not independent.

My aim is to prove that $$\lim_{t\to\infty}\frac{1}{t}\sum_{\tau=1}^t\log(X_{\tau})$$ exists almost surely. So far I have only managed to show that the average does not diverge, not that the limit exists.

From Monte Carlo experiments (in which I vary some parameters of the process, left unspecified so as not to complicate the question, which in turn change the distribution of $X$), I see that the average converges to $\mathbb{E}(\log X_t)$, but I am not even sure that this convergence holds $\mathbb{P}$-a.s.
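
Since the process is left unspecified, the following is only a minimal sketch of the kind of Monte Carlo check described above: a stationary AR(1) chain pushed through the logistic function serves as a stand-in for the dependent $X_t$ in $(0,1)$. The process and its parameters are purely illustrative assumptions, not the actual model.

```python
# Illustrative stand-in for the unspecified dependent process X_t in (0,1):
# a stationary AR(1) Gaussian chain mapped through the logistic function.
import numpy as np

rng = np.random.default_rng(0)

def simulate_running_average(t_max=100_000, phi=0.8):
    """Return the running averages (1/t) * sum of log(X_tau)."""
    z = np.empty(t_max)
    z[0] = rng.normal(scale=1.0 / np.sqrt(1 - phi**2))  # stationary start
    eps = rng.normal(size=t_max)
    for t in range(1, t_max):
        z[t] = phi * z[t - 1] + eps[t]       # temporally dependent chain
    x = 1.0 / (1.0 + np.exp(-z))             # map into (0,1), stationary marginals
    return np.cumsum(np.log(x)) / np.arange(1, t_max + 1)

avg = simulate_running_average()
print(avg[-1])  # compare against a long-run estimate of E[log X_t]
```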

Do you know how to sketch a proof?

Many thanks.

  • The sum looks wrong to me. Anyhow, this looks like it should follow from the paper by Erdős, "On a theorem of Hsu and Robbins". – Andrew Jun 30 '23 at 00:50
  • Yes, there was a typo; I have corrected the subscript. Thanks. – DreDev Jun 30 '23 at 01:18
  • That theorem requires the variables to be i.i.d., but they are not in this case. Does there exist an extension to non-independent variables? – DreDev Jun 30 '23 at 01:19
  • Ah yes, you are correct. I think that without any further knowledge it will be quite difficult to get anything. Since you do not specify the interdependence between the $X_t$, any proof will necessarily need to be a Borel–Cantelli argument, and then you need some specific conditions on your RVs. I guess you can check the literature to see whether any of these conditions make sense for your problem. – Andrew Jun 30 '23 at 01:27
  • I would be curious to see how you managed to show that the average does not diverge. I am fairly certain that you can pick random variables $X_t$ that satisfy your criteria but for which the average diverges a.s. – Small Deviation Jun 30 '23 at 13:22

1 Answer


In such generality, the almost-sure convergence of the averages cannot be guaranteed.

Consider a sequence of i.i.d. random variables $(Z_n)_{n\in\mathbb N}$ with Pareto(1) distribution, i.e. their distribution function is given by $$F_Z(t)=1-\max(1,t)^{-1}.$$ Define the random variables $X_n:=e^{-Z_n}$; then clearly $$\frac{1}{t}\sum_{\tau=1}^t \log(X_\tau)=-\frac{1}{t}\sum_{\tau=1}^t Z_\tau.$$ By the strong Law of Large Numbers for nonnegative distributions with infinite mean, $$\frac{1}{t}\sum_{\tau=1}^t \log(X_\tau)\rightarrow -\infty \quad\text{a.s.}$$
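
A quick numerical illustration of this construction, sampling $Z\sim\text{Pareto}(1)$ by inverse transform ($Z = 1/U$ for $U$ uniform on $(0,1)$, which inverts the CDF above):

```python
# X = exp(-Z) lies in (0,1), yet E[-log X] = E[Z] is infinite, so the
# running average of log(X) drifts to -infinity almost surely.
import numpy as np

rng = np.random.default_rng(0)
t_max = 1_000_000
z = 1.0 / rng.uniform(size=t_max)                      # Pareto(1) via inverse CDF
running_avg = -np.cumsum(z) / np.arange(1, t_max + 1)  # (1/t) * sum of log(X_tau)
print(running_avg[[999, 99_999, 999_999]])             # keeps drifting downward
```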

Hence divergence cannot be ruled out.

Even if you assume that $\log(X_\tau)$ has finite mean, the average might still fail to converge. One can easily modify the example given in this answer to obtain a (dependent) sequence of random variables $X_\tau$ whose average oscillates, as sketched below.
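
The linked answer is not reproduced here, but here is a minimal sketch of one such dependent construction (my own illustration, not necessarily the one referenced): two i.i.d. Uniform$(0,1)$ draws are repeated over blocks of doubling length, so every $X_\tau$ is Uniform$(0,1)$ marginally (with $\mathbb{E}[\log X_\tau]=-1$ finite), yet the running average of $\log X_\tau$ oscillates almost surely.

```python
# Two i.i.d. Uniform(0,1) values v, w repeated over blocks of length 2^k.
# At block boundaries the running average of log(X) alternates between
# roughly (2/3)log v + (1/3)log w and (1/3)log v + (2/3)log w, so it
# oscillates a.s. whenever v != w.
import numpy as np

rng = np.random.default_rng(0)
v, w = rng.uniform(size=2)

n_blocks = 20
values = [np.full(2**k, v if k % 2 == 0 else w) for k in range(n_blocks)]
x = np.concatenate(values)

running_avg = np.cumsum(np.log(x)) / np.arange(1, len(x) + 1)
ends = np.cumsum([2**k for k in range(n_blocks)]) - 1  # last index of each block
print(running_avg[ends[-2:]])  # two distinct accumulation points
```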

Small Deviation
  • But I explicitly said that both the first and the second moments are finite, so why should your counterexample be valid? Anyway, I think I have found the solution, using Bernstein's theorem on the LLN. I am just left to show that the autocovariances tend to 0 as the distance tends to infinity. – DreDev Jul 03 '23 at 23:39
  • @DreDev You said the first and second moments of $X$ are finite. If I am reading your post correctly, you mentioned nothing about $\log(X)$. Even assuming finite moments, the thread I linked gives a counterexample showing that the average doesn't necessarily converge. – Small Deviation Jul 04 '23 at 04:52
  • Yes, you are right about $\log(X)$, but I still can't understand how the logarithm could have a Pareto(1) distribution if $X\in (0,1)$ (please note that it is $(0,1)$ and not $[0,1)$, if that may be the misunderstanding). – DreDev Jul 05 '23 at 01:37
  • @DreDev The logarithm maps the interval $(0,1)$ to the interval $(-\infty,0)$, so it is possible to get a very heavy-tailed (negative) random variable by taking $\log(X)$, even if $X$ is constrained to $(0,1)$; in the example above, $-\log(X)$ is exactly Pareto(1). – Small Deviation Jul 05 '23 at 07:09