
Let $X$ and $Z$ be two independent centered and reduced (standard) normal random variables. I want to calculate $ P(X+Z<0,Z>0) $, so I have written: $$ P(X+Z<0,Z>0)=P(|Z|<|X|,Z>0,X<0) $$ and I am stuck here.

But the solution given says only that it is equal to $ 1/8 $ because the r.v.s are independent and centered (with no more details).

My question is: can we split it like this, $$ P(|Z|<|X|,Z>0,X<0)=P(|Z|<|X|)\,P(Z>0)\,P(X<0), $$ and if yes, why?

Al Bundy
  • Any valid solution must use some specifics of the joint 2D standard normal distribution. The specific property that leads to the shortest and most illuminating solution is that this distribution is invariant under rotations of the plane. Hence, in the $(X,Z)$-plane, you are asking for the relative extent of the angular sector limited by the angles $\frac{3\pi}4$ and $\pi$, QED. – Did Jan 17 '17 at 17:05
  • Rotationally invariant = their distribution does not change under orthogonal transformations – Al Bundy Jan 17 '17 at 21:28
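
A quick numerical check of both the proposed factorization and the angular-sector argument from the comments (a minimal sketch in base R; the seed and sample size are arbitrary):

set.seed(2024)                                      # arbitrary seed, for reproducibility
m = 10^6;  x = rnorm(m);  z = rnorm(m)              # independent standard normals
mean(abs(z) < abs(x) & z > 0 & x < 0)               # joint probability, aprx 1/8
mean(abs(z) < abs(x)) * mean(z > 0) * mean(x < 0)   # product of the marginals, aprx (1/2)^3 = 1/8
theta = atan2(z, x)                                 # angle of (x, z), in (-pi, pi]
mean(theta > 3*pi/4 & theta < pi)                   # sector between 3*pi/4 and pi, aprx 1/8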

2 Answers


Maybe a simulation will help you visualize the relationships among variables. I simulated 100,000 realizations of $X \sim Norm(0,1)$ and independently the same number of realizations of $Z \sim Norm(0,1)$ in R statistical software.

Then I plotted the points with $X + Z < 0$ in orange. The points of interest to you are the orange ones above the x-axis. (Of course, you can draw a similar sketch without any simulation, if you understand the symmetry of the bivariate uncorrelated standard normal distribution.)

m = 10^5;  x = rnorm(m);  z = rnorm(m)
plot(x, z, pch=".")
  cond = (x + z < 0)
  points(x[cond], z[cond], pch=".", col="orange")
  abline(h = 0, col="green", lwd=2)
  abline(v = 0, col="green", lwd=2)
mean(z > 0);  mean(x + z < 0)
## 0.49889  # aprx P(Z > 0) = 1/2
## 0.49951  # aprx P(X + Z < 0) = 1/2
mean(x + z < 0 & z > 0); 1/8 
## 0.12254  # aprx P(X + Z < 0, Z > 0) = 1/8
## 0.125

[Figure: scatterplot of the simulated $(X, Z)$ points, with the points where $X + Z < 0$ shown in orange and the coordinate axes in green.]
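
For comparison with the simulation, the same probability can be computed by one-dimensional numerical integration: conditioning on $Z$ gives $P(X+Z<0, Z>0)=\int_0^\infty \varphi(z)\,\Phi(-z)\,dz$, which base R's integrate evaluates (a minimal check, independent of the simulation above):

integrate(function(z) dnorm(z) * pnorm(-z), 0, Inf)$value
## 0.125  # = 1/8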

BruceET
  • OK, I understand better with the graph: the event $\{|Z|<|X|\}\cap\{Z>0\}\cap\{X<0\}$ is the upper orange slice (between $\frac{3}{4}\pi$ and $\pi$, as Did mentioned above; http://math.stackexchange.com/questions/1074218/what-does-rotational-invariance-mean-in-statistics explains the rotational invariance of the joint distribution). – Al Bundy Jan 17 '17 at 21:08
  • ... and this slice corresponds to 1/8 of the "circle" – Al Bundy Jan 17 '17 at 21:17
  • Glad the graph helped you understand this---and that you took the time to understand @Did's Comment. – BruceET Jan 17 '17 at 22:00

$\newcommand{\bbx}[1]{\,\bbox[15px,border:1px groove navy]{\displaystyle{#1}}\,} \newcommand{\braces}[1]{\left\lbrace\,{#1}\,\right\rbrace} \newcommand{\bracks}[1]{\left\lbrack\,{#1}\,\right\rbrack} \newcommand{\dd}{\mathrm{d}} \newcommand{\ds}[1]{\displaystyle{#1}} \newcommand{\expo}[1]{\,\mathrm{e}^{#1}\,} \newcommand{\ic}{\mathrm{i}} \newcommand{\mc}[1]{\mathcal{#1}} \newcommand{\mrm}[1]{\mathrm{#1}} \newcommand{\pars}[1]{\left(\,{#1}\,\right)} \newcommand{\partiald}[3][]{\frac{\partial^{#1} #2}{\partial #3^{#1}}} \newcommand{\root}[2][]{\,\sqrt[#1]{\,{#2}\,}\,} \newcommand{\totald}[3][]{\frac{\mathrm{d}^{#1} #2}{\mathrm{d} #3^{#1}}} \newcommand{\verts}[1]{\left\vert\,{#1}\,\right\vert}$ With $\ds{\sigma > 0}$, the answer is given by the following expression: \begin{align} &\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \bracks{{1 \over \root{2\pi}\sigma}\, \exp\pars{-\,{x^{2} \over 2\sigma^{2}}}} \bracks{{1 \over \root{2\pi}\sigma}\, \exp\pars{-\,{z^{2} \over 2\sigma^{2}}}}\bracks{x + z < 0}\bracks{z > 0} \dd x\,\dd z \\[5mm] = &\ {1 \over 2\pi\sigma^{2}}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \exp\pars{-\,{{x^{2} + z^{2} \over 2\sigma^{2}}}} \bracks{0 < z < -x}\dd x\,\dd z \\[5mm] = &\ {1 \over \pi}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \expo{-x^{2}\ -\ z^{2}}\,\,\bracks{0 < \root{2}\sigma z < -\root{2}\sigma x} \dd x\,\dd z \\[5mm] = &\ {1 \over \pi}\int_{0}^{2\pi}\int_{0}^{\infty} \expo{-r^{2}}\,\,\bracks{0 < r\sin\pars{\theta} < -r\cos\pars{\theta}}r \,\dd r\,\dd\theta \\[5mm] = &\ {1 \over \pi}\int_{0}^{2\pi}\bracks{0 < \sin\pars{\theta} < -\cos\pars{\theta}}\ \underbrace{\int_{0}^{\infty}\expo{-r^{2}}r\,\dd r}_{\ds{1 \over 2}}\ \dd\theta = {1 \over 2\pi}\int_{0}^{\pi} \bracks{\sin\pars{\theta} < -\cos\pars{\theta}}\,\dd\theta \\[5mm] = &\ {1 \over 2\pi}\int_{3\pi/4}^{\pi}\,\dd\theta =\ \bbox[#ffe,5px,border:1px dotted navy]{\ds{1 \over 8}} \end{align}
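
The computation above is carried out for a general $\sigma > 0$, and the result does not depend on it. A quick numerical cross-check of this (a sketch in base R; the $\sigma$ values chosen are arbitrary), using $P(X+Z<0, Z>0)=\int_0^\infty f_Z(z)\,F_X(-z)\,\mathrm{d}z$ for independent $X, Z \sim N(0,\sigma^2)$:

p = function(sigma) integrate(function(z) dnorm(z, sd = sigma) * pnorm(-z, sd = sigma), 0, Inf)$value
sapply(c(0.5, 1, 3), p)   # arbitrary sigma values
## 0.125 0.125 0.125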

Felix Marin
  • Nice! (+1) I understood 'centered and reduced' to mean $\mu = 0$ and $\sigma = 1,$ so that's what I used in my figure. But the result does not depend on $\sigma.$ – BruceET Jan 17 '17 at 20:47
  • Let me add a detail: since $r\mapsto e^{-r^2}r$ is odd, $\int_{\mathbb R}e^{-r^2}r\,dr=0$, so $\int_{\mathbb R^+}e^{-r^2}r\,dr=-\int_{\mathbb R^-}e^{-r^2}r\,dr$; with the change of variables $r^2=x\in\mathbb R^+$ this gives $\int_{\mathbb R^+}e^{-r^2}r\,dr=\int_{\mathbb R^+}\frac{1}{2}e^{-x}\,dx=\frac{1}{2}$ (see the quick check after these comments). – Al Bundy Jan 18 '17 at 11:18
  • @BruceET Thanks. That's true: It's $\sigma$-independent. – Felix Marin Jan 18 '17 at 22:39
  • @the-owner Thanks for your detail. The main clue was to integrate first over $r$. – Felix Marin Jan 18 '17 at 22:41
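
And a one-line numerical check of the radial integral from the comment above, $\int_0^\infty e^{-r^2}r\,dr = \frac12$ (base R):

integrate(function(r) exp(-r^2) * r, 0, Inf)$value
## 0.5  # = 1/2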