
$X,Y \sim N(\mu,\sigma^2)$

Suppose $\sigma_X^2=\sigma_Y^2 \neq 0$; show that $X+Y$ and $X-Y$ are independent.

I know that $\sigma_{X+Y}^2=\sigma_X^2+\sigma_Y^2$ and $\sigma_{X-Y}^2=\sigma_X^2+\sigma_Y^2$.

If I show that $\phi_{X+Y}(t)\cdot\phi_{X-Y}(t)=\phi_{(X+Y)-(X-Y)}(t)$, is that enough?

Here $\phi$ denotes the characteristic function.

Is there another way?

Thanks!

Algo
  • It suffices to show the joint MGF $\phi_{X+Y, X-Y}(t_1, t_2)$ equals the product of the marginal MGFs $\phi_{X+Y}(t_1) \phi_{X-Y}(t_2)$. Alternatively, you can show that the joint density $f_{X+Y, X-Y}(u, v)$ equals the product of the marginal densities $f_{X+Y}(u) f_{X-Y}(v)$. – angryavian Jan 13 '22 at 16:23
  • @OliverDiaz If $\operatorname{Cov}(X+Y, X-Y)=0$, isn't that enough for the two to be independent, or am I wrong? – Algo Jan 13 '22 at 16:38
  • Roughly speaking, the approach that immediately came to mind for me is: the joint density of $X,Y$ is a radially symmetric function; and $(X,Y) \mapsto (X+Y, X-Y)$ is the composition of a rotation by $45^{\circ}$ and scaling by $\sqrt{2}$. So the joint density for $(X+Y, X-Y)$ is the same as the joint density for $(X\sqrt{2}, Y\sqrt{2})$. – Daniel Schepler Jan 13 '22 at 19:22

1 Answer


Without additional assumptions on the joint distribution of $(X,Y)$ (similar questions on MSE assume, for example, that $X$ and $Y$ are i.i.d.), the claim in the OP may not hold.

  • Consider for example the following situation: $X\sim N(0,1)$, $\epsilon$ taking the values $\pm1$ with probability $1/2$ each, $\epsilon$ and $X$ independent, and $Y=\epsilon X$. Since $$E[e^{itY}]=\frac12\big(E[e^{itX}]+E[e^{-itX}]\big)=e^{-t^2/2}$$ we have that $Y\sim N(0,1)$; note also that $\operatorname{cov}(X,Y)=E[\epsilon]E[X^2]=0$ and $\sigma_X=\sigma_Y=1$. On the other hand, \begin{align} E[e^{it(X+Y)}]&=\frac12E[e^{i2tX}]+\frac12=\frac12e^{-2t^2}+\frac12\\ E[e^{it(X-Y)}]&=\frac12E[e^{i2tX}]+\frac12=\frac12e^{-2t^2}+\frac12 \end{align} but \begin{align} E[e^{it(X+Y)+is(X-Y)}]&=\frac12\big(E[e^{2itX}]+E[e^{2isX}]\big)\\ &=\frac12\big(e^{-2t^2} + e^{-2s^2}\big)\\ &\neq E[e^{it(X+Y)}]E[e^{is(X-Y)}] \end{align} so $X+Y$ and $X-Y$ are not independent.

  • If the joint distribution of $X$ and $Y$ is binormal, then any linear combination $aX+bY$ is normally distributed; furthermore, for any $a,b,c,d$, $(aX+bY,cX+dY)$ is binormal (possibly degenerate). If in addition $E[X]=\mu=E[Y]$, $\sigma_X=\sigma=\sigma_Y$, and $\operatorname{cov}(X,Y)=0$, then $X+Y\sim N(2\mu,2\sigma^2)$, $X-Y\sim N(0,2\sigma^2)$, and so \begin{align} E[e^{it(X+Y)+is(X-Y)}]&=E[e^{i(t+s)X}e^{i(t-s)Y}]\\ &= e^{i\mu (t+s) -\sigma^2(t+s)^2/2}e^{i\mu (t-s) -\sigma^2(t-s)^2/2}\\ &=e^{2i\mu t -\sigma^2(t^2+s^2)}\\ &=E[e^{it(X+Y)}]E[e^{is(X-Y)}] \end{align}
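The first example above can also be checked numerically. Here is a minimal sketch (not part of the original answer; assumes NumPy, and the seed and sample size are arbitrary). In this example $(X+Y)(X-Y)=X^2-Y^2=0$ identically, so whenever one of the two is nonzero the other vanishes, which makes the dependence stark despite the zero covariance:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.standard_normal(n)             # X ~ N(0,1)
eps = rng.choice([-1.0, 1.0], size=n)  # epsilon = +/-1 with prob 1/2, independent of X
y = eps * x                            # Y = eps * X, also N(0,1)

s, d = x + y, x - y
print(np.cov(s, d)[0, 1])              # sample cov(X+Y, X-Y): near 0
# (X+Y)(X-Y) = X^2 - Y^2 = 0 identically in this construction
print(np.max(np.abs(s * d)))           # exactly 0
```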
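Likewise, the jointly normal case can be checked by simulation with i.i.d. $X,Y\sim N(\mu,\sigma^2)$; a sketch (the specific $\mu,\sigma$ below are arbitrary). Since zero covariance implies independence for a jointly normal pair, even nonlinear transforms of $X+Y$ and $X-Y$ should be uncorrelated:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000
mu, sigma = 1.0, 2.0                   # arbitrary example parameters
x = rng.normal(mu, sigma, n)           # X, Y i.i.d. N(mu, sigma^2):
y = rng.normal(mu, sigma, n)           # jointly normal with cov(X,Y) = 0

s, d = x + y, x - y
# All of these sample correlations should be near zero.
print(np.corrcoef(s, d)[0, 1])
print(np.corrcoef(s**2, d**2)[0, 1])
print(np.corrcoef(np.abs(s), np.abs(d))[0, 1])
```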

Mittens