
Let $X_{1}, X_{2}, X_{3},\dots$ be a sequence of independent and identically distributed random variables with common (finite) mean $\mu$. Prove that there exists an event $A$ such that $P(A) = 1$ and for all $\omega \in \Omega$, the quantity $\lim_{n \to \infty}(X_{1}(\omega)X_{2}(\omega)\cdots X_{n}(\omega))^{1/n}$ converges, and determine this limit.

I understand that, since the random variables are identically distributed, each term of the sequence also has expectation $\mu$. But how do I proceed further? Will the central limit theorem help here?

  • $n$-th roots of negative numbers don't exist in the real line for $n$ even. I think you have missed the hypothesis that the $X_n$'s are positive random variables. – Kavi Rama Murthy Jun 23 '22 at 05:42
  • @geetha290krm It was not mentioned in the question that the $X_n$'s are positive r.v.s. But let's assume they are; how do I prove it? – user1070420 Jun 23 '22 at 05:46
  • Let me also point out that the CLT is of no use in proving convergence with probability $1$, since this theorem only gives convergence in distribution. – Kavi Rama Murthy Jun 23 '22 at 06:07
  • Please check your source and see if there is any other hypothesis. The result in its present form cannot be proved. – Kavi Rama Murthy Jun 23 '22 at 07:37
  • Presumably you use the law of large numbers on $\log(X_i)$ to get convergence to the geometric expectation (lower than the arithmetic expectation) – Henry Jun 23 '22 at 07:52
  • @Henry That requires $X_i$ to be positive and $\ln X_1$ to have expectation. I doubt even the existence of the limit if $E\ln X_1$ does not exist. I deleted my answer because the result is false as stated. – Kavi Rama Murthy Jun 23 '22 at 07:58
  • Is it "for all $\omega\in \mathbf A$, the quantity $\lim$ [...]"? – P. Quinton Jun 23 '22 at 08:12
  • @geetha290krm - I recognise it requires $X_i$ to be non-negative (otherwise take $\mathbb P(X_i=-1)=1$ as a counterexample) and gives convergence to $e^{\mathbb E \log(X_i)} \in [0,\mu]$. But I think it will tolerate $\mathbb E \log(X_i)=-\infty$, in which case the limit will be $0$. – Henry Jun 23 '22 at 08:12
  • It might be enough to have $\mu>0$; here we don't ask that the limit exists for all $\omega$, only for those in $A$ (assuming the OP meant to use $A$ in the statement). – P. Quinton Jun 23 '22 at 08:22
  • Let $P(X_i=1)=P(X_i=0)=1/2$; then $\mu=1/2$. But you can check that $(X_1\cdots X_n)^{1/n}$ converges to $0$ in probability, which is a counterexample. – Kervyn Jun 23 '22 at 08:24
  • @Kervyn I don't understand how this is a counterexample; here you can take $A=\Omega$ and then the limit is $0$, right (except for the $\omega$ that yields all $1$'s, where the limit is $1$)? – P. Quinton Jun 23 '22 at 08:25
  • @Kervyn That looks like a case where $\mathbb E \log(X_i)=-\infty$ and so the limit is $0$ with probability $1$ – Henry Jun 23 '22 at 08:28
  • @geetha290krm If $\mathbb E X_i= \mu$ is finite then I do not think $\mathbb E(\log X_1)^{+}=\infty$ is possible. – Henry Jun 23 '22 at 08:29
  • @Henry Oh, I misunderstood the question... I'm showing the limit is not necessarily $\mu$. – Kervyn Jun 23 '22 at 08:33
  • @Kervyn not at all, sorry for expressing myself badly. What I am saying is that the OP is not talking about almost sure convergence nor convergence in probability. For instance, I don't think that the limit has to be the same for all $\omega\in A$, and in your example the limit always exists and is either $0$ or $1$; coincidentally, $1$ will happen with probability $0$, and so we could take $A$ to be an event where the limit is always the same (which corresponds to almost sure convergence). – P. Quinton Jun 23 '22 at 08:33
  • @P.Quinton I see. Maybe we can take logarithms and use the law of large numbers, since $(\log X)^+\leq X$, which implies $E(\log X_i)^+<\infty$. See https://math.stackexchange.com/questions/1644218/slln-when-the-expectation-in-infinite. – Kervyn Jun 23 '22 at 08:44

1 Answer


Assuming that the $X_i$'s are positive and that $E\ln X_1 > -\infty$: since $\ln x \leq x$ for $x > 0$, we have $E(\ln X_1)^+ \leq EX_1 = \mu < \infty$, and together these give $E|\ln X_1| < \infty$. Note that $\ln [(X_1X_2\cdots X_n)^{1/n}]=\frac 1 n \sum\limits_{k=1}^{n} \ln X_k \to E\ln X_1$ almost surely by the SLLN, since $(\ln X_k)$ is also an i.i.d. sequence. Taking exponentials, we get $(X_1X_2\cdots X_n)^{1/n} \to e^{E\ln X_1}$ almost surely.
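This is only a sanity check, not part of the proof, but the limit is easy to see numerically. A minimal Python sketch, assuming lognormal $X_i$ (my choice purely for illustration, since then $E\ln X_1$ is known exactly):

```python
import numpy as np

rng = np.random.default_rng(0)

# Lognormal example: ln X_i ~ N(0, 1), so E[ln X_1] = 0 and the claimed
# a.s. limit is e^0 = 1, while the arithmetic mean is E[X_1] = e^{1/2} ≈ 1.65.
n = 100_000
x = rng.lognormal(mean=0.0, sigma=1.0, size=n)

# Work in log space: (X_1 ... X_n)^(1/n) = exp((1/n) * sum of ln X_i),
# which avoids overflow/underflow in the raw product.
geo_mean = np.exp(np.log(x).cumsum() / np.arange(1, n + 1))

print(geo_mean[-1])  # ≈ 1, i.e. e^{E ln X_1}
print(x.mean())      # ≈ 1.65, i.e. mu -- NOT the limit
```

Note how the limit $e^{E\ln X_1}$ sits below $\mu$, as Jensen's inequality (or AM-GM) predicts and as was pointed out in the comments.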

Proof for the case $E \ln X_1=-\infty$:

Let $0<\epsilon <1$ and $Y_j=\max \{\epsilon, X_j\}$. Then $0 \leq (X_1X_2\cdots X_n)^{1/n} \leq (Y_1Y_2\cdots Y_n)^{1/n} \to e^{E\ln Y_1}$ by the previous case. I leave it to you to check that $E\ln Y_1 \to -\infty$ as $\epsilon \to 0$. It follows that $(X_1X_2\cdots X_n)^{1/n} \to 0$ almost surely.
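This case can also be checked numerically. A sketch under the assumption (my own example, not from the question) that $X_i = e^{-1/U_i}$ with $U_i$ uniform on $(0,1)$, so that $0 < X_i < e^{-1}$ makes $EX_1$ finite while $E\ln X_1 = -E[1/U_1] = -\infty$:

```python
import numpy as np

rng = np.random.default_rng(0)

# X_i = exp(-1/U_i), U_i ~ Uniform(0,1): finite mean, E[ln X_1] = -inf.
n = 100_000
u = rng.uniform(size=n)
log_x = -1.0 / u  # ln X_i

# Running average (1/n) * sum of ln X_i. By the SLLN for nonnegative
# variables with infinite mean, (1/n) * sum of 1/U_i -> +inf a.s.,
# so the running average drifts to -inf and the geometric mean to 0.
running = log_x.cumsum() / np.arange(1, n + 1)
print(running[-1])          # large and negative, growing in magnitude with n
print(np.exp(running[-1]))  # geometric mean after n terms: near 0
```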