In Section 11.3.1 of Introduction to Probability Models by Ross (10th edition), a very strange phenomenon is described: if you take two independent standard normal random variables and sum their squares, the result has an exponential distribution with rate $\frac{1}{2}$. This is proven mechanically, and I'm looking for some intuitive insight into it. The exponential distribution is known to be memoryless, and it seems this sum of squares of two i.i.d. Gaussians is also memoryless. For an exponential random variable $X$, this means (for $t > s$):
$$P(s < X < t) = P(X < t \mid X > s)\,P(X > s) = P(X > s)\,P(X < t - s)$$
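To spell out why the exponential has this property: writing the survival function as $P(X > x) = e^{-\lambda x}$,
$$P(X > s + u \mid X > s) = \frac{P(X > s + u)}{P(X > s)} = \frac{e^{-\lambda(s+u)}}{e^{-\lambda s}} = e^{-\lambda u} = P(X > u),$$
which is the same statement as above with $u = t - s$.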
Now for two Gaussians $Y$ and $Z$, we get:
$$P(Y^2+Z^2 < r^2+t \mid Y^2+Z^2 > r^2) = P(Y^2+Z^2 < t)$$
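As a sanity check (this is not from Ross), here is a quick Monte Carlo sketch that verifies both the Exp(1/2) claim and the conditional identity above; the thresholds $r^2 = 1$ and $t = 0.5$ are arbitrary values chosen only for illustration:

```python
import numpy as np

# Monte Carlo sketch: Y, Z i.i.d. standard normal, S = Y^2 + Z^2.
rng = np.random.default_rng(0)
n = 1_000_000
y = rng.standard_normal(n)
z = rng.standard_normal(n)
sumsq = y**2 + z**2

# Claim 1: S ~ Exp(rate 1/2), so P(S > x) should be close to exp(-x/2).
for x in (1.0, 2.0, 4.0):
    print(f"P(S > {x}): empirical {np.mean(sumsq > x):.4f} vs exp(-x/2) = {np.exp(-x / 2):.4f}")

# Claim 2 (memorylessness): P(S < r^2 + t | S > r^2) should match P(S < t).
r2, t = 1.0, 0.5  # arbitrary illustrative thresholds
lhs = np.mean((sumsq > r2) & (sumsq < r2 + t)) / np.mean(sumsq > r2)
rhs = np.mean(sumsq < t)
print(f"P(S < r^2 + t | S > r^2) = {lhs:.4f}, P(S < t) = {rhs:.4f}")
```

Both comparisons should agree up to sampling noise, consistent with the mechanical proof in the book.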
Is there some connection to a fundamental property of the Gaussian that gives an intuitive explanation for this memoryless behavior?