
Let $(X_n)_n$ be i.i.d. random variables with mean $0$ and finite, positive variance $\sigma^2$, and let $S_n=X_1+\cdots+X_n$. I want to see a proof of the following statement without resorting to characteristic functions:

$$\lim_n\mathbb P(S_n>0,S_{2n}<0)=\frac18$$

I've already seen a proof using characteristic functions, but I would like one that avoids that theory entirely.


What I've tried is the following. Let $S'_{n}=X_{n+1}+\cdots+ X_{2n}$; then we want to find

$$\lim_n\mathbb P(S_n>0,S_n+S'_{n}<0)$$

Both $S_n/\sqrt n$ and $S'_n/\sqrt n$ converge in distribution to $\mathcal N(0,\sigma^2)$, and if $X,Y\sim\mathcal N(0,\sigma^2)$ are independent, a simple symmetry argument shows

$$\mathbb P(X>0,X+Y<0)=\frac18$$

So I would love to "interchange the limit with the probability". Of course this is mathematically nonsense, but I would like to turn this intuition into a proof.
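For what it's worth, the claimed limit is easy to check by simulation. Here is a quick sanity check (not a proof), assuming, arbitrarily, Uniform$(-1,1)$ increments, which have mean $0$ and finite variance:

```python
import numpy as np

# Monte Carlo sanity check (not a proof): estimate P(S_n > 0, S_{2n} < 0)
# for one large n, with i.i.d. Uniform(-1, 1) increments.
rng = np.random.default_rng(0)
n, trials = 100, 50_000
X = rng.uniform(-1.0, 1.0, (trials, 2 * n))
S_n = X[:, :n].sum(axis=1)           # S_n
S_2n = S_n + X[:, n:].sum(axis=1)    # S_{2n} = S_n + S'_n
est = np.mean((S_n > 0) & (S_2n < 0))
print(est)  # close to 1/8 = 0.125
```

The estimate hovers around $0.125$, consistent with the conjectured limit.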

  • You need to divide by $\sqrt{n}$ to say both $\frac{S_n}{\sqrt{n}}$ and $\frac{S'_n}{\sqrt{n}} \xrightarrow{d}\mathcal N(0,\sigma^2)$ and they are independent and identically distributed. So I do not see a problem with your hope to "interchange the limit with the probability". – Henry May 26 '25 at 16:21
  • @Henry about the $\sqrt n$ thing you're absolutely right, I forgot. I guess my main problem is that $S_n+S'_n$ is not independent of $S_n$. There's at least some argument needed, as this is a joint probability distribution. You would need to argue that $(S_n,S_{2n})\xrightarrow{d}(X,X+Y)$, which is not awfully obvious to me – Bruno Andrades May 26 '25 at 17:02
  • Yes, but your $X+Y$ is not independent of your $X$ either (if it were then the answer would be $\frac14$ rather than $\frac18$) so I do not see that as a particular problem – Henry May 26 '25 at 18:29
  • @Henry I still feel that you're somehow using $(S_n/\sqrt n,S_n/\sqrt n+S'_n/\sqrt n)\xrightarrow{d}(X,X+Y)$ and I still don't see why that would be true, in any case it's not trivial I don't think. If you're using something else please do correct me – Bruno Andrades May 26 '25 at 18:43
  • Yes - I am saying that the independence, combined with the convergence in distribution to a normal distribution, should be enough. – Henry May 26 '25 at 18:56

2 Answers


Here's a proof I find satisfactory, following the ideas laid out in the other answer. Notice that by independence and the central limit theorem, $(S_n/\sqrt n,S'_n/\sqrt n)\xrightarrow{d}(X,Y)$, where $X,Y\sim\mathcal N(0,\sigma^2)$ are independent. Now, by the continuous mapping theorem applied to $g(x,y)=(x,x+y)$, we get \begin{align}(S_n/\sqrt n, S_{2n}/\sqrt n)&=g(S_n/\sqrt n,S'_n/\sqrt n)\\&\xrightarrow{d}g(X,Y)\\&=(X,X+Y)\end{align} Since the boundary of the region $\{(x,y):x>0,\ x+y<0\}$ has Lebesgue, hence Gaussian, measure zero, the probabilities converge, and therefore \begin{align}\lim_n\mathbb P(S_n>0,S_n+S'_n<0)&=\lim_n\mathbb P\left(\frac{S_n}{\sqrt n}>0,\frac{S_n}{\sqrt n}+\frac{S'_n}{\sqrt n}<0\right)\\&=\mathbb P(X>0,X+Y<0)\\ &=\frac18\end{align}
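This convergence can also be illustrated numerically. A minimal sketch, assuming (arbitrarily) centered exponential increments, which are heavily skewed, so the agreement with $1/8$ is genuinely a CLT effect rather than a symmetry accident:

```python
import numpy as np

# Monte Carlo illustration (not a proof): even for skewed increments
# X_i = E_i - 1 with E_i ~ Exp(1), the probability P(S_n > 0, S_{2n} < 0)
# approaches 1/8 as n grows.
rng = np.random.default_rng(0)
n, trials = 400, 50_000
S_n = rng.exponential(1.0, (trials, n)).sum(axis=1) - n    # S_n, centered
S_pn = rng.exponential(1.0, (trials, n)).sum(axis=1) - n   # S'_n, centered
est = np.mean((S_n > 0) & (S_n + S_pn < 0))
print(est)  # approaches 1/8 = 0.125 as n grows
```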


This is not too difficult an integral to compute. Note that $$ \mathbb P(X +Y < 0, X > 0) = \mathbb P(Y < -X, X > 0) = \frac{1}{2 \pi \sigma ^2} \int_{0}^\infty \int_{-\infty}^{-x} \exp \left( \frac{-x^2 - y^2}{2 \sigma^2} \right) \, \mathrm d y \, \mathrm d x$$ Converting to polar coordinates with $(x, y) = (r \cos \theta, r \sin \theta)$ changes the bounds to $\theta \in (-\pi/2, -\pi/4)$, $r \in (0, \infty)$. Taking $\sigma = 1$ (the event is invariant under scaling both coordinates, so this loses no generality), we get $$ \mathbb P(X + Y < 0, X > 0) = \frac{1}{2 \pi} \int_{-\pi/2}^{- \pi/4} \int_0^\infty e^{-r^2/2}r \, \mathrm d r \, \mathrm d \theta = \frac{1}{2 \pi} \cdot \frac{\pi}{4} = \frac{1}{8}$$ The only difficult part is getting the bounds on $\theta$: $x > 0$ means $\theta \in (-\pi/2, \pi /2)$, and $y < -x$ means $\tan(\theta) < -1$, which on that interval gives $\theta \in (-\pi/2, -\pi/4)$.
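As a cross-check of this value by a different route (a sketch, with $\sigma = 1$): conditioning on $X$ gives $\mathbb P(X>0, X+Y<0)=\int_0^\infty \varphi(x)\,\Phi(-x)\,\mathrm dx$, which can be evaluated with a simple midpoint rule:

```python
import math

# Cross-check P(X > 0, X + Y < 0) = 1/8 for independent standard normals
# by conditioning on X: the probability equals int_0^inf phi(x) * Phi(-x) dx.
def phi(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def Phi(x):  # standard normal CDF via the error function
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

h, upper = 1e-4, 10.0  # truncate the tail; phi(10) is negligible
total = sum(phi((k + 0.5) * h) * Phi(-(k + 0.5) * h) for k in range(int(upper / h)))
value = total * h
print(value)  # ≈ 0.125
```

In fact the antiderivative is explicit here: $\int_0^\infty \varphi\,(1-\Phi)\,\mathrm dx = \tfrac12 - \big[\tfrac{\Phi^2}{2}\big]_0^\infty = \tfrac12 - \tfrac38 = \tfrac18$.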

The only part of this that relies on weak convergence is the fact that $(X_n, Y_n) \Rightarrow (X, Y)$, which is a very basic fact about weak convergence. Note that $$ \mathbb P((X_n, Y_n) \in A \times B) = \mathbb P(X_n \in A) \mathbb P(Y_n \in B) \to \mathbb P(X \in A) \mathbb P(Y \in B)$$ which is of course $\mathbb P((X, Y) \in A \times B)$.

Edit: Using the work done here, all you would need is: 1. $X_n \Rightarrow X$ and $Y_n \Rightarrow Y$ with $X_n$ and $Y_n$ independent implies $(X_n, Y_n) \Rightarrow (X, Y)$, and 2. by the continuous mapping theorem with $g:(x, y) \mapsto x + y$ we get $$g((X_n, Y_n)) = X_n + Y_n \Rightarrow X + Y$$

Robertmg
  • I didn't have a doubt about how to prove that probability is 1/8 for $X,Y$, as an even simpler symmetry argument works. Did you perhaps mean $g:(x,y)\mapsto (x+y,x)$? I think that if that's the case then that might solve my question – Bruno Andrades May 26 '25 at 19:29
  • And also, if I'm not mistaken, you do use independence to guarantee $(S'_n/\sqrt n,S_n/\sqrt n)\implies (X,Y)$. If you add and change the $g$ I'll accept the answer – Bruno Andrades May 26 '25 at 19:34
  • No, you want to use $(X,Y)$, as this is an independent bivariate normal. Then you integrate over the region $\{(x,y): x > 0,\ y < -x\}$. $(X + Y, X)$ is not an independent bivariate normal, so it's more difficult to integrate since you would need to derive the pdf. – Robertmg May 26 '25 at 19:34
  • I do not have any trouble calculating said probability, as I said in the body of the question. My main worry was showing that $(S'_n/\sqrt n+S_n/\sqrt n,S_n/\sqrt n)\implies (X+Y,X)$, and I think the proof of that is just the continuous mapping theorem. I don't understand the way you use the continuous mapping theorem: it doesn't say anything when considering the identity map; you would still have to prove $(S_n/\sqrt n,S'_n/\sqrt n)\implies (X,Y)$, and that does follow from independence. Or did I misunderstand something? – Bruno Andrades May 26 '25 at 19:51
  • How do you calculate the probability for $(X + Y, X)$? – Robertmg May 26 '25 at 20:59
  • Doing something like this https://math.stackexchange.com/a/3439651/578498 – Bruno Andrades May 26 '25 at 23:07
  • Again I feel like it's missing an argument, as $X+Y$ is not independent of $X$, so it doesn't just follow from your last line. You have to use the mapping $(x,y)\mapsto (x+y,x)$. I have already constructed a proof from your ideas, but I'll still accept your answer when it's complete (just for the record, it wasn't me who downvoted you) – Bruno Andrades May 26 '25 at 23:20
  • I edited to hopefully clear up the confusion: the convergence $(X_n, Y_n) \Rightarrow (X, Y)$ is from independence, and continuous mapping gives $X_n + Y_n \Rightarrow X + Y$ under the assumption that $(X_n, Y_n) \Rightarrow (X, Y)$. If you want $(X_n, X_n + Y_n) \Rightarrow (X, X + Y)$, you should use the function $g((x,y)) = (x, x + y)$, but note it relies on the convergence of the joint distribution, not each marginal distribution. – Robertmg May 26 '25 at 23:24