
If $X, Y, Z$ are i.i.d. $N(0,1)$, what is $P(X>YZ)$?

I thought of two different ways, but both are lengthy and the integration is quite tough:

  1. $$\int_{-\infty}^\infty\int_{-\infty}^\infty\int_{yz}^\infty \frac{e^{-x^2/2}}{\sqrt{2\pi}}\cdot\frac{e^{-y^2/2}}{\sqrt{2\pi}}\cdot\frac{e^{-z^2/2}}{\sqrt{2\pi}}\,dx\,dy\,dz$$

  2. By using a transformation, i.e., finding the distribution of $YZ$.

Is there an easier way to solve this? If we take $\frac{X}{Y} \sim C(0,1)$, then how do we find the joint pdf, since $\frac{X}{Y}$ and $Z$ are not independent, right? ($\operatorname{Cov}(\frac{X}{Y},Z)$ is not defined.)
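For reference, the triple integral in approach 1 can at least be evaluated numerically. Here is a minimal sketch, assuming `scipy` is available; the infinite $y$- and $z$-ranges are truncated to $[-8, 8]$, which discards a negligible amount of probability mass:

```python
import numpy as np
from scipy.integrate import tplquad

# Standard normal density.
phi = lambda t: np.exp(-t**2 / 2) / np.sqrt(2 * np.pi)

# tplquad's integrand takes (innermost, middle, outermost) = (x, y, z);
# for each fixed (y, z), x runs from y*z to infinity.
val, err = tplquad(
    lambda x, y, z: phi(x) * phi(y) * phi(z),
    -8, 8,                      # z-range, truncated from (-inf, inf)
    lambda z: -8, lambda z: 8,  # y-range, truncated from (-inf, inf)
    lambda z, y: y * z,         # lower x-limit: x > yz
    lambda z, y: np.inf,        # upper x-limit
    epsabs=1e-6, epsrel=1e-6,   # loose tolerances to keep it fast
)
print(val)  # ~0.5
```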


2 Answers


Notice that for each $z \in \mathbb{R}$, the conditional distribution of $X-YZ$ given $Z=z$ is $N(0, 1+z^2)$, which has mean $0$, so $P(X-YZ > 0 \mid Z=z) = 1/2$.

Now, if we denote by $A$ the event that $X-YZ>0$, the law of total expectation gives $$P(X>YZ)=\mathbb{E}[\mathbb{1}_A] = \mathbb{E}[P(X-YZ>0 \mid Z)]=\mathbb{E}[1/2]=\frac{1}{2}.$$
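A quick Monte Carlo check agrees; a minimal sketch, assuming `numpy`:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
# Draw three independent standard normal samples of size n.
x, y, z = rng.standard_normal((3, n))
print((x > y * z).mean())  # ~0.500
```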


Let $F$ be the CDF of $YZ$, and let $\Phi$ and $\varphi$ be the CDF and PDF of the standard normal distribution. For any $u \in \mathbb{R}$, by the independence of $Y$ and $Z$, we have \begin{align} & F(u) = P[YZ \leq u] = \int_{-\infty}^\infty P[Yz \leq u]\varphi(z)dz \\ =& \int_{-\infty}^0 P[Y \geq uz^{-1}]\varphi(z)dz + \int_0^\infty P[Y \leq uz^{-1}]\varphi(z)dz \\ =& \int_{-\infty}^0 (1 - \Phi(uz^{-1}))\varphi(z)dz + \int_0^\infty \Phi(uz^{-1})\varphi(z)dz. \end{align}

It then follows, by substituting $t = -z$ and using the symmetry of $\varphi$, that \begin{align} & F(-u) = \int_{-\infty}^0 (1 - \Phi(-uz^{-1}))\varphi(z)dz + \int_0^\infty \Phi(-uz^{-1})\varphi(z)dz \\ =& \int_0^\infty (1 - \Phi(ut^{-1}))\varphi(-t)dt + \int_{-\infty}^0 \Phi(ut^{-1})\varphi(-t)dt \\ =& \int_0^\infty (1 - \Phi(ut^{-1}))\varphi(t)dt + \int_{-\infty}^0 \Phi(ut^{-1})\varphi(t)dt \\ =& \int_0^\infty\varphi(t)dt - \int_0^\infty\Phi(ut^{-1})\varphi(t)dt \\ & + \int_{-\infty}^0 (\Phi(ut^{-1}) - 1)\varphi(t)dt + \int_{-\infty}^0\varphi(t)dt \\ =& \int_{-\infty}^\infty\varphi(t)dt - \int_{-\infty}^0 (1 - \Phi(ut^{-1}))\varphi(t)dt - \int_0^\infty\Phi(ut^{-1})\varphi(t)dt \\ =& 1 - F(u). \end{align}
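This symmetry is easy to verify numerically from the integral representation above; a minimal sketch, assuming `scipy`:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def F(u):
    # F(u) = int_{-inf}^0 (1 - Phi(u/z)) phi(z) dz
    #      + int_0^inf Phi(u/z) phi(z) dz
    left, _ = quad(lambda z: (1 - norm.cdf(u / z)) * norm.pdf(z), -np.inf, 0)
    right, _ = quad(lambda z: norm.cdf(u / z) * norm.pdf(z), 0, np.inf)
    return left + right

for u in (0.3, 1.0, 2.5):
    print(F(u) + F(-u))  # each ~1.0, i.e. F(-u) = 1 - F(u)
```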

Now, noting that $X$ is independent of $YZ$, a similar argument gives: \begin{align} & I := P[X > YZ] \\ =& \int_{-\infty}^\infty P[YZ < x]\varphi(x)dx \\ =& \int_{-\infty}^\infty F(x)\varphi(x)dx \\ =& \int_{-\infty}^\infty (1 - F(-x))\varphi(x)dx \tag{$F(x) = 1 - F(-x)$} \\ =& 1 - \int_{-\infty}^\infty F(-x)\varphi(x)dx \\ =& 1 - \int_{-\infty}^\infty F(u)\varphi(-u)du \\ =& 1 - \int_{-\infty}^\infty F(u)\varphi(u)du \tag{$\varphi(-u) = \varphi(u)$}\\ =& 1 - I. \end{align} Hence $2I = 1$, i.e., $I = 1/2$.

The above argument clearly generalizes to show that $P[X_1 > X_2] = 1/2$ whenever $X_1$ and $X_2$ are independent, each symmetric about $0$, and $P[X_1 = X_2] = 0$ (e.g., when at least one of them is continuous).
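For instance, a quick simulation with two different symmetric continuous laws, say Cauchy and Laplace (a sketch, assuming `numpy`):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10**6
x1 = rng.standard_cauchy(n)  # symmetric about 0
x2 = rng.laplace(size=n)     # symmetric about 0, independent of x1
print((x1 > x2).mean())  # ~0.500
```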


A more direct proof (essentially the same as the first answer, but without invoking any conditioning argument):
\begin{align} & P[X > YZ] \\ =& \int_{-\infty}^\infty P[X > Yz]\varphi(z)dz \\ =& \int_{-\infty}^\infty P[X - zY > 0]\varphi(z)dz \\ =& \frac{1}{2}\int_{-\infty}^\infty \varphi(z)dz = \frac{1}{2}. \end{align}

The first equality follows from Theorem 20.3 in Probability and Measure by Patrick Billingsley. In the third equality we used the fact that, for any fixed $z \in \mathbb{R}$, $X - zY \sim N(0, 1 + z^2)$ when $X, Y$ are i.i.d. $N(0, 1)$, whence $P[X - zY > 0] = 1/2$ for every $z$.
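This key step is also easy to confirm empirically: for each fixed $z$, the simulated $X - zY$ has variance close to $1 + z^2$ and is positive about half the time (a minimal sketch, assuming `numpy`):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10**6
x, y = rng.standard_normal((2, n))
for z in (0.0, 0.7, 3.0):
    d = x - z * y
    # Sample variance ~ 1 + z**2; fraction positive ~ 0.5.
    print(z, d.var(), (d > 0).mean())
```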

Zhanxiong