Suppose $\{X_n\}$ and $\{Y_n\}$ converge in probability to $X$ and $Y$, respectively. Will $X_n Y_n$ converge in probability to $X Y$?
I know the answer is yes: if we treat $(X_n, Y_n)$ as a random vector, it converges in probability to $(X, Y)$ by the assumption. Since $g(x,y) = xy$ is a continuous function, the continuous mapping theorem gives that $g(X_n, Y_n)$ converges in probability to $g(X, Y)$.
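(One small step in that argument worth spelling out: componentwise convergence in probability does give convergence of the vector, since with the max norm
$$P\bigl(\|(X_n, Y_n) - (X, Y)\|_\infty > \epsilon\bigr) \leq P(|X_n - X| > \epsilon) + P(|Y_n - Y| > \epsilon) \to 0.)$$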
My question is how to prove this directly from the definition, without using the continuous mapping theorem. My attempt is as follows.
$$\begin{aligned} P(|X_nY_n - XY| > \epsilon) &= P(|X_nY_n - X_nY + X_nY - XY| > \epsilon) \\ &\leq P(|X_n(Y_n - Y)| + |Y(X_n - X)| > \epsilon). \end{aligned}$$
It seems tempting to conclude that the last term goes to zero as $n$ goes to infinity, but I am not sure how to justify it. Am I on the right track, or did I miss something?
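The only further step I can see is a sketch using the union bound (splitting $\epsilon$ into halves, which I am assuming is the right move):
$$P\bigl(|X_n(Y_n - Y)| + |Y(X_n - X)| > \epsilon\bigr) \leq P\bigl(|X_n(Y_n - Y)| > \tfrac{\epsilon}{2}\bigr) + P\bigl(|Y(X_n - X)| > \tfrac{\epsilon}{2}\bigr),$$
but then I would still need to show that each of these terms goes to zero, and the factors $X_n$ and $Y$ are not necessarily bounded.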