Suppose I have two (independent) continuous random variables $X$ and $Y$ with pdfs $f(x)$ and $g(x)$ respectively. It is well-known that $f(x)g(x)$ is not the pdf of $XY$; in fact, $f(x)g(x)$ may not be a pdf at all (see Appendix).
On the other hand, (assuming the supports of $X$ and $Y$ overlap, so that the integral below is positive; h/t Thomas Andrews) it's easy enough to make $f(x)g(x)$ a pdf: just rescale by $\left(\int{f(x)g(x)\,dx}\right)^{-1}$. Then we have the following interesting facts:
- If $f(x)=g(x)=1[0\leq x\leq 1]$ (uniform distribution), then $f(x)g(x)$ is also the pdf of the uniform distribution.
- If $f(x)=\lambda_fe^{-\lambda_fx}\cdot1[0\leq x]$, $g(x)=\lambda_ge^{-\lambda_gx}\cdot1[0\leq x]$ (exponential distribution), then $f(x)g(x)\propto(\lambda_f+\lambda_g)e^{-(\lambda_f+\lambda_g)x}\cdot1[0\leq x]$, also the pdf of the exponential distribution.
- Same for two normal distributions.
- Multiplying a normal and an exponential gives a truncated normal (a normal density restricted to $[0,\infty)$, with the mean shifted by $-\lambda\sigma^2$).
- If $X$ is as in the appendix ($2x\cdot1[0\leq x\leq 1]$) and $Y$ is supported on $[0,1]$ with mean $\mu$ and standard deviation $\sigma$, then $f(x)g(x)$ after rescaling is $xg(x)/\mu$, which has mean $\mathbb{E}[Y^2]/\mu=\mu+\sigma^2/\mu$.
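For what it's worth, the exponential bullet is easy to sanity-check numerically. Here is a small Python sketch (the grid, rates, and helper function are my own choices, not part of the question):

```python
import numpy as np

def trapezoid(y, x):
    """Plain trapezoidal rule, to avoid depending on numpy version specifics."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

# Rescaling the product of two exponential pdfs should recover Exp(lam_f + lam_g).
lam_f, lam_g = 1.5, 0.7
x = np.linspace(0.0, 40.0, 200_001)      # e^(-2.2 * 40) is negligible, so [0, 40] suffices

f = lam_f * np.exp(-lam_f * x)           # pdf of Exp(lam_f)
g = lam_g * np.exp(-lam_g * x)           # pdf of Exp(lam_g)

prod = f * g
prod /= trapezoid(prod, x)               # rescale so the product integrates to 1

target = (lam_f + lam_g) * np.exp(-(lam_f + lam_g) * x)
err = np.max(np.abs(prod - target))
print(err)                               # small: the rescaled product matches Exp(2.2)
```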
So clearly there's something going on here. Is there a probabilistic interpretation for $f(x)g(x)$?
Appendix
For example, let $$f(x)=g(x)=2x\cdot 1[0\leq x\leq1]$$ (where $1[A]$ is the indicator function of $A$). Then $$\int_{\mathbb{R}}{f(x)g(x)\,dx}=\int_0^1{4x^2\,dx}=\frac{4}{3}\neq1$$ Thus $f(x)g(x)$ isn't even a pdf.
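A quick numerical confirmation of this integral (my own sketch; the grid and helper are assumptions, not part of the question):

```python
import numpy as np

def trapezoid(y, x):
    """Plain trapezoidal rule, to avoid depending on numpy version specifics."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

# With f = g = 2x on [0, 1], the product f * g = 4x^2 integrates to 4/3, not 1.
x = np.linspace(0.0, 1.0, 100_001)
fg = (2 * x) ** 2
total = trapezoid(fg, x)
print(total)                              # close to 4/3
```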
For completeness, the law of $XY$ is as follows: \begin{align*} \mathbb{P}[XY\leq x]&=\int_{\mathbb{R}}{f(s)\mathbb{P}\left[Y\leq\frac{x}{s}\right]\,ds} \\ &=\int_0^1{2s\min{(1,(x/s)^2)}\,ds} \\ &=\int_0^x{2s\,ds}+\int_x^1{\frac{2x^2}{s}\,ds} \\ &=x^2-2x^2\ln{(x)} \end{align*} To get the pdf, differentiate; the result is precisely $-4x\ln{(x)}\cdot1[0\leq x\leq1]$.
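The first line of this computation can be cross-checked with a Monte Carlo simulation of $XY$ against a numerical evaluation of $\int_0^1 2s\min(1,(x/s)^2)\,ds$. A sketch of mine (sample size, seed, and test points are arbitrary choices):

```python
import numpy as np

def trapezoid(y, x):
    """Plain trapezoidal rule, to avoid depending on numpy version specifics."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

# X has cdf x^2 on [0, 1], so the square root of a Uniform(0, 1) sample has the right law.
rng = np.random.default_rng(0)
n = 1_000_000
XY = np.sqrt(rng.uniform(size=n)) * np.sqrt(rng.uniform(size=n))

s = np.linspace(1e-9, 1.0, 100_001)
for x0 in (0.1, 0.5, 0.9):
    integrand = 2 * s * np.minimum(1.0, (x0 / s) ** 2)
    analytic = trapezoid(integrand, s)    # P[XY <= x0] via the integral formula
    empirical = float(np.mean(XY <= x0))
    print(x0, analytic, empirical)        # the two columns agree to Monte Carlo accuracy
```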