
Let $X_n$ be a sequence of bounded random variables, i.e. there exists $C$ such that for all $n$, $|X_n|<C$ almost surely. Is there anything one can say about whether the average $$\lim_{n\to \infty}\frac{1}{n}\sum_{k=1}^n X_k $$ converges? Since each random variable is bounded, all moments exist; however, the variables may be neither independent nor identically distributed, so I am unsure whether there is a version of the LLN that I can use.

Tyler6
  • Even without this boundedness, all you need is that $E[X_i]=m$ for all $i$ and $\frac{1}{n^2}\sum_{i=1}^n\sum_{j=1}^n \operatorname{Cov}(X_i, X_j) = O(1/n^{\epsilon})$ for some $\epsilon>0$; then $\lim_{n\rightarrow\infty} \frac{1}{n}\sum_{i=1}^n X_i = m$ almost surely. – Michael Sep 02 '22 at 19:40

1 Answer


No, there's very little useful that you can say.

Not having identical distribution is an issue, but you could probably still make some rough statements about the sum. Losing independence, though, is the end of the line.

Example 1: Let $X_i$ be i.i.d. Bernoulli$(1/2)$ variables; then by the strong law of large numbers, the average converges almost surely to $1/2$.
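A quick simulation sketch of Example 1 (my own illustration, not part of the original answer) -- the running average of i.i.d. Bernoulli$(1/2)$ draws settles near $1/2$, as the strong law predicts:

```python
# i.i.d. Bernoulli(1/2) draws: the sample average should be near 0.5.
import random

random.seed(0)
n = 100_000
xs = [random.randint(0, 1) for _ in range(n)]
avg = sum(xs) / n
print(avg)  # close to 0.5
```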

Example 2: Let $X_1$ be a Bernoulli$(1/2)$ variable, and let $X_i = X_1$ for all $i > 1$. Then the average is either $0$ or $1$, each with probability $1/2$.
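Example 2 can be sketched in code as well (again my own illustration): since every $X_i$ is a copy of $X_1$, each running average equals $X_1$ exactly, so the "limit" is whichever of $0$ or $1$ the first coin flip produced.

```python
# Perfectly correlated copies of one Bernoulli(1/2) draw: every running
# average equals the first draw, so it is always exactly 0.0 or 1.0.
import random

def running_average(seed, n=1000):
    random.seed(seed)
    x1 = random.randint(0, 1)   # a single Bernoulli(1/2) draw
    xs = [x1] * n               # X_i = X_1 for all i
    return sum(xs) / n          # always exactly x1

print(sorted({running_average(s) for s in range(20)}))  # subset of {0.0, 1.0}
```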

The only guarantee left at this level of generality is trivial: if the average converges at all, its limit lies between $-C$ and $C$.

Example 3: Let $X_i$ be a family of constant random variables defined as:

  • $X_1 = 1$
  • $X_2 = -1$
  • $X_3, X_4 = 1$
  • $X_5, X_6 = -1$
  • $X_7, \dots, X_{12} = 1$
  • $X_{13}, \dots, X_{18} = -1$

The scheme for constructing these is:

  • start with a $1$ and a $-1$
  • thereafter, repeatedly append a block of $1$s whose length equals the current number of terms (this doubles the length and lifts the running average to exactly $1/2$), followed by a block of $-1$s of the same length (this brings the running average back to exactly $0$)

This produces blocks of lengths $1, 1, 2, 2, 6, 6, 18, 18, \dots$, matching the list above.

By construction, $\frac 1 n \sum X_i$ will oscillate between $1/2$ and $0$ and will fail to converge.
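The construction above is deterministic, so it's easy to check directly (a sketch of my own, following the block scheme in the answer):

```python
# Build the Example 3 sequence block by block. Each block of 1s doubles the
# sequence length, lifting the running average to exactly 1/2; the matching
# block of -1s then drags it back to exactly 0.
def example3(num_block_pairs=6):
    xs = [1, -1]
    peaks, troughs = [], []
    for _ in range(num_block_pairs):
        b = len(xs)                         # new block length = current length
        xs += [1] * b
        peaks.append(sum(xs) / len(xs))     # average right after the +1 block
        xs += [-1] * b
        troughs.append(sum(xs) / len(xs))   # average right after the -1 block
    return peaks, troughs

peaks, troughs = example3()
print(peaks)    # [0.5, 0.5, 0.5, 0.5, 0.5, 0.5]
print(troughs)  # [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
```

So the running average hits $1/2$ and $0$ infinitely often and cannot converge.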


Assuming independence on its own is still not quite enough to guarantee convergence; Example 3 illustrates what can go wrong, since all constant variables are independent of each other.

You could relax the identical-distribution assumption by requiring only that the $X_i$ are independent and share a common mean $\mu$; in this case, $\frac 1 n \sum X_i$ still converges to $\mu$ almost surely, because the uniform bound gives $\operatorname{Var}(X_n) \leq C^2$, so $\sum_n \operatorname{Var}(X_n)/n^2 < \infty$ and Kolmogorov's strong law applies. (Note that you still need independence here; see Example 2.)
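To illustrate (my own sketch; the particular mix of distributions is an arbitrary choice): independent variables that are bounded by $C = 1$ and share mean $0$, but are not identically distributed, still average out to $0$.

```python
# Alternate two different mean-zero, bounded distributions: a fair ±1 coin
# (Rademacher) and a uniform(-1, 1) draw. The variables are independent but
# not identically distributed; Kolmogorov's SLLN still gives average -> 0.
import random

random.seed(1)
n = 200_000
total = 0.0
for i in range(n):
    if i % 2 == 0:
        total += random.choice([-1.0, 1.0])   # Rademacher, mean 0
    else:
        total += random.uniform(-1.0, 1.0)    # uniform on [-1, 1], mean 0
mean_est = total / n
print(mean_est)  # close to 0
```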

I think you could even relax this further by requiring not that the means are equal, but only that the sequence $\frac 1 n \sum \mu_i$ converges -- but I'm not quite as sure on this one and would need to write down a proof before being sure. From this viewpoint, Example 3 fails precisely because its sequence of averaged means oscillates.
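A simulation consistent with that conjecture (my own sketch, not a proof; the alternating parameters are an arbitrary choice): independent Bernoulli$(p_i)$ variables with $p_i$ alternating between $1/4$ and $3/4$, so the means differ but $\frac 1 n \sum \mu_i \to 1/2$.

```python
# Independent Bernoulli(p_i) with alternating p_i = 1/4, 3/4: the individual
# means differ, but their Cesàro average tends to 1/2, and the sample
# average appears to settle near 1/2 as well.
import random

random.seed(2)
n = 200_000
total = sum(random.random() < (0.25 if i % 2 == 0 else 0.75) for i in range(n))
freq = total / n
print(freq)  # near 0.5
```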

I'm not sure what other conditions there could be to guarantee convergence, but I'm pretty convinced by Examples 2 and 3 that you need independence to get much at all (aside from trivial cases) and that you need to get a handle on the behavior of the means in some fashion.

  • This makes sense... do you know of any conditions on when the average will exist? – Tyler6 Sep 01 '22 at 19:17
  • Do you mean at this level of generality (no independence, no identical distribution), or can we impose some constraints on the variables? – Aaron Montgomery Sep 01 '22 at 19:22
  • Ideally at this level, but I'm still interested if anything can be said when we impose constraints if needed (I would rather assume independence before assuming identical distribution, if possible). – Tyler6 Sep 01 '22 at 19:24
  • I added a new example to show that the average need not exist at all at this level of generality. Without any further assumptions, the problems you encounter aren't even probability problems, but rather just good old-fashioned real analysis problems, like the example I listed. – Aaron Montgomery Sep 01 '22 at 19:32
  • For a version of the question more in the style of "what can we get away with when relaxing the assumptions," see this: https://math.stackexchange.com/questions/2823211/law-of-large-numbers-without-iid-assumption – Aaron Montgomery Sep 01 '22 at 19:34