I am trying to prove $$X_n \xrightarrow{d} X, Y_n \xrightarrow{d} a \implies Y_n X_n \xrightarrow{d} aX$$
where $a$ is a constant.
What I tried:
Let $g:\mathbb R\to \mathbb R$ be an arbitrary uniformly continuous, bounded function. It suffices to show $\mathbb E[g(Y_n X_n)] \to \mathbb E[g(aX)]$. We have
$$\left \lvert \int g(Y_nX_n) - g(aX) \,dP \right \rvert \leq \left \lvert \int g(Y_n X_n) - g(aX_n) \, dP \right \rvert +\left \lvert \int g(aX_n) - g(aX) \, dP \right \rvert,$$
where the right summand goes to $0$ because $x \mapsto g(ax)$ is bounded and continuous and $X_n \xrightarrow{d} X$. Now I want to use uniform continuity of $g$ to estimate the left summand: Choose $\delta > 0 $ such that $$\left \lvert g(Y_n X_n) - g(aX_n) \right \rvert < \epsilon,$$
whenever $|Y_n X_n - aX_n| < \delta.$ Since convergence in distribution to a constant implies convergence in probability, we can control $P(|Y_n - a | > \delta)$. But is it possible to get a bound on $|X_n|$? I assume that convergence in distribution does not imply some form of boundedness... Any help is appreciated.
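To make the role of $|X_n|$ explicit, I think the left summand should satisfy an estimate of the form $$\left \lvert \int g(Y_n X_n) - g(aX_n) \, dP \right \rvert \leq \epsilon + 2\sup_x |g(x)| \, P\bigl(|X_n|\,|Y_n - a| \geq \delta\bigr),$$ so the question is really how to handle the factor $|X_n|$ inside the probability.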
Duplicate of https://math.stackexchange.com/questions/2733508/x-n-rightarrow-x-and-y-n-rightarrow-c-c-a-constant-implies-x-ny-n-righ – Kavi Rama Murthy May 13 '18 at 12:00
1 Answer
First, you can choose $g$ to be bounded and Lipschitz, with $C$ its Lipschitz constant. It follows that for all $\epsilon > 0$ and $A > 0$: $$\begin{gather} |\mathbb{E}g(X_n Y_n) - \mathbb{E}g(X_n a)| \leq \mathbb{E}\,|g(X_n Y_n) - g(X_n a)|\,1_{|X_n Y_n - X_n a| \leq \epsilon} + \mathbb{E}\,|g(X_n Y_n) - g(X_n a)|\,1_{|X_n Y_n - X_n a| > \epsilon} \\[0.5em] \leq C \epsilon + 2\|g\|_\infty\, \mathbb{P} (|X_n Y_n - X_n a | > \epsilon) \\[0.5em] \leq C \epsilon + 2\|g\|_\infty \bigl[ \mathbb{P} (|X_n Y_n - X_n a | > \epsilon,\ |X_n|> A) + \mathbb{P} (|X_n Y_n - X_n a | > \epsilon,\ |X_n| \leq A) \bigr]\\[0.5em] \leq C \epsilon + 2\|g\|_\infty \bigl[ \mathbb{P} (|X_n| \geq A) + \mathbb{P} (|Y_n - a | > \epsilon / A) \bigr] \end{gather}$$
Now, $\lim_{n \rightarrow \infty} \mathbb{P} (|Y_n - a | > \epsilon / A) = 0,$ so $$\limsup_{n \rightarrow \infty} |\mathbb{E}g(X_n Y_n) - \mathbb{E}g(X_n a)| \leq C \epsilon + 2\|g\|_\infty\, \mathbb{P} (|X| \geq A)$$ I used here the fact that $X_n$ converges in law to $X$ and that $\{x : |x| \geq A\}$ is closed (see the Portmanteau lemma). You can now let $\epsilon$ go to zero and $A$ go to $\infty$.
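To spell out the last step: the left-hand side does not depend on $\epsilon$ or $A$, and $\mathbb{P}(|X| \geq A) \to 0$ as $A \to \infty$, so $$\limsup_{n \rightarrow \infty} |\mathbb{E}g(X_n Y_n) - \mathbb{E}g(X_n a)| \leq \inf_{\epsilon > 0,\, A > 0} \bigl( C \epsilon + 2\|g\|_\infty\, \mathbb{P}(|X| \geq A) \bigr) = 0.$$ Together with $\mathbb{E}g(aX_n) \to \mathbb{E}g(aX)$ from the question, this gives $\mathbb{E}g(X_n Y_n) \to \mathbb{E}g(aX)$, i.e. $X_n Y_n \xrightarrow{d} aX$.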
No, I think $g$ has to be Lipschitz. But it's not a problem, as bounded Lipschitz test functions characterize convergence in law – Quentin May 13 '18 at 12:31
However, if $g$ is uniformly continuous, it seems that one can use the same truncation method to control $\mathbb{P}(|X_n||Y_n - a | > \delta)$ by splitting between $|X_n| > A$ and $|X_n| \leq A$, then letting $A$ go to infinity. – Quentin May 13 '18 at 20:39
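Concretely, the truncation mentioned in the last comment would presumably run as follows: for any $A > 0$, $$\mathbb{P}(|X_n||Y_n - a| > \delta) \leq \mathbb{P}(|X_n| \geq A) + \mathbb{P}(|Y_n - a| > \delta / A),$$ so $\limsup_{n \rightarrow \infty} \mathbb{P}(|X_n||Y_n - a| > \delta) \leq \mathbb{P}(|X| \geq A)$ by the Portmanteau lemma, and this can be made arbitrarily small by letting $A \to \infty$.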