According to https://en.wikipedia.org/wiki/Convergence_of_random_variables#Properties_4

$X_n \rightsquigarrow X$ and $Y_n \rightsquigarrow Y$ do not guarantee that $X_n + Y_n \rightsquigarrow X + Y$, where $\rightsquigarrow$ denotes convergence in distribution.

What is a counterexample?

I thought for sure I'd be able to find this question already asked and answered, but all I found was this, which seems to say that convergence is guaranteed if we assume joint convergence of the pair $(X_n, Y_n)$ in the product space.

tarski
  • The whole point is that convergence in distribution is not unique with respect to random variables. That is, you can have $X_n \to X$ and $X_n \to X'$ (both in distribution) with $X \neq X'$ almost surely. In particular, $X+Y, X'+Y$ may have different distributions. – Dominik Kutek Feb 06 '24 at 21:07

1 Answer

Let $(X_n)$ be an iid sequence of $N(0,1)$ random variables, let $Y_n = -X_n$, and let $X, Y \sim N(0,1)$ be independent of each other and of the $X_n$.

Then $X_n \rightsquigarrow X$ and $Y_n \rightsquigarrow Y$, and $X_n + Y_n = 0 \rightsquigarrow 0$, but $X + Y \sim N(0,2)$.
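A quick simulation makes the counterexample concrete (a sketch using NumPy; the sample size and seed are arbitrary choices): the sum $X_n + Y_n$ is identically zero, while a sum of two independent standard normals has variance near 2.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# X_n iid N(0,1); define Y_n = -X_n, so X_n + Y_n is identically 0.
x_n = rng.standard_normal(n)
y_n = -x_n
var_degenerate = np.var(x_n + y_n)
print(var_degenerate)  # 0.0: the sum is the constant 0

# Independent X, Y ~ N(0,1): their sum is N(0,2).
x = rng.standard_normal(n)
y = rng.standard_normal(n)
var_independent = np.var(x + y)
print(var_independent)  # close to 2, not 0
```

Each of $X_n$, $Y_n$, $X$, $Y$ has the same $N(0,1)$ marginal distribution, yet the two sums have very different distributions, which is exactly the failure the question asks about.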

Jose Avilez
  • Sorry, why isn't $Y = -X$ here? Also what role does independence play? – tarski Feb 06 '24 at 21:05
  • @tarski $X$ and $Y$ are defined to be independent from one another, so that it's easy for me to write the distribution of $X+Y$. Remember that convergence in distribution is convergence of the CDFs. So in particular, $X_n, Y_n, X,Y$ all have the same CDF, but $X_n + Y_n$ does not have the same CDF as the other RVs do – Jose Avilez Feb 06 '24 at 21:19