
I am refreshing my knowledge of error analysis, and as a little exercise I was trying to derive the product rule of error propagation algebraically: \begin{equation}\tag{1}\label{error} \frac{\delta(x\cdot y)}{|x\cdot y|} = \sqrt{\left(\frac{\delta x}{x}\right)^{2}+\left(\frac{\delta y}{y}\right)^{2}} \end{equation}

To simplify the algebra, I assumed $x$ and $y$ to be distributed around zero. My approach was the following: because $x$ and $y$ are measured independently, the probability of obtaining any given $x$ and $y$ is the product (I have omitted the leading factors for simplicity) $$ P(x,y) \propto \exp\left(-\frac{1}{2} \frac{x^{2}}{\sigma_{x}^{2}}\right)\cdot \exp\left(-\frac{1}{2} \frac{y^{2}}{\sigma_{y}^{2}}\right) = \exp\left[-\frac{1}{2} \left(\frac{x^{2}}{\sigma_{x}^{2}} + \frac{y^{2}}{\sigma_{y}^{2}}\right)\right]. $$

My plan was then to rewrite the exponent in terms of $x\cdot y$ and read off the standard deviation. This gives $$ \frac{x^{2}}{\sigma_{x}^{2}} + \frac{y^{2}}{\sigma_{y}^{2}} = \frac{x^{2}\sigma_{y}^{2}+y^{2}\sigma_{x}^{2}}{\sigma_{x}^{2}\sigma_{y}^{2}} = \frac{x^{2}y^{2}\left(\frac{\sigma_{y}^{2}}{y^{2}} + \frac{\sigma_{x}^{2}}{x^{2}}\right)}{\sigma_{x}^{2}\sigma_{y}^{2}}. $$

I then assumed that the "new" $\sigma$ of the distribution describing $x\cdot y$ was therefore given by $$ \sigma^{2} = \frac{\sigma_{x}^{2}\sigma_{y}^{2}}{\frac{\sigma_{y}^{2}}{y^{2}} + \frac{\sigma_{x}^{2}}{x^{2}}} $$

However, this does not agree with formula \eqref{error}. Strangely, it is exactly the inverted fraction, as can be seen below: $$ \frac{\sigma^{2}}{x^{2}y^{2}} = \frac{\sigma_{x}^{2}\sigma_{y}^{2}}{\sigma_{y}^{2}x^{2} + \sigma_{x}^{2}y^{2}} = \frac{1}{\frac{\sigma_{y}^{2}}{y^{2}} + \frac{\sigma_{x}^{2}}{x^{2}}} $$

Now I am questioning whether this is even the right approach, or whether I just made a stupid algebraic mistake. Could you help me out? Thank you in advance!
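Edit: as a sanity check, I also ran a quick Monte Carlo simulation with made-up values $x = 10 \pm 0.1$ and $y = 20 \pm 0.3$ (nothing specific, just small relative errors); it reproduces \eqref{error} quite closely, so the problem seems to be in my derivation rather than in the formula itself:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# made-up example values: x = 10 +/- 0.1 and y = 20 +/- 0.3
x = rng.normal(10.0, 0.1, n)
y = rng.normal(20.0, 0.3, n)

rel_err_sampled = np.std(x * y) / abs(np.mean(x * y))
rel_err_formula = np.sqrt((0.1 / 10.0) ** 2 + (0.3 / 20.0) ** 2)

print(rel_err_sampled, rel_err_formula)  # both come out near 0.018
```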

RobPratt

1 Answer


Unfortunately, when working with continuous pdfs, the distribution of the product is not simply the product of the distributions. In fact, for independent variables the product of the two densities is the joint density, and that is almost how you get the distribution of the sum of the random variables; the missing step is integrating the joint density along the lines where $x + y$ is constant, i.e. performing a convolution.
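To make that missing step concrete, here is a rough numerical sketch (the grid and the two standard deviations are arbitrary choices): convolving the two marginal densities does reproduce the density of $X + Y$, a normal with variance $\sigma_X^2 + \sigma_Y^2$, whereas no such shortcut exists for the product.

```python
import numpy as np

# arbitrary standard deviations for the two independent zero-mean normals
sx, sy = 1.0, 2.0

# a fine grid over which both densities are effectively supported
t = np.linspace(-20.0, 20.0, 4001)
dt = t[1] - t[0]

def normal_pdf(z, s):
    return np.exp(-0.5 * (z / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

# density of X + Y: convolution of the two marginal densities
conv = np.convolve(normal_pdf(t, sx), normal_pdf(t, sy)) * dt
t_conv = np.linspace(2 * t[0], 2 * t[-1], len(conv))

# the known answer: another normal with variance sx^2 + sy^2
exact = normal_pdf(t_conv, np.sqrt(sx**2 + sy**2))
print(np.max(np.abs(conv - exact)))  # tiny, the two curves coincide
```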

As shown in this answer, the distribution of the product of two zero-mean normally distributed random variables is actually a (scaled) difference of two independent $\chi_1^2$ variables, which you can see by writing $XY = \tfrac{1}{4}\left[(X+Y)^2 - (X-Y)^2\right]$; in particular, it is not another normal distribution.
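If you want to convince yourself of that without carrying out the transformation by hand, a small simulation sketch (the two standard deviations below are arbitrary choices) shows that the product and the scaled $\chi_1^2$ difference have matching quantiles:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
sx, sy = 1.0, 2.0  # arbitrary standard deviations, both means zero

# the product of the two zero-mean normals directly
xy = rng.normal(0.0, sx, n) * rng.normal(0.0, sy, n)

# the same law written as a scaled difference of two chi^2_1 variables
a, b = rng.standard_normal(n), rng.standard_normal(n)
diff = 0.5 * sx * sy * (a**2 - b**2)

# the two samples should have (nearly) identical quantiles
q = np.linspace(0.05, 0.95, 7)
print(np.quantile(xy, q))
print(np.quantile(diff, q))
```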

To derive the formula you were trying to get to at the start, it is actually a bad idea to make your $X$ and $Y$ zero-mean, because then you are dividing by quantities that are zero in expectation, which risks making everything singular. Put another way: what is the relative error of a measurement whose true value is zero?

Instead, without assuming any particular distribution for $X$ and $Y$, you can derive an exact expression for $Var(XY)$. Assuming $X$ and $Y$ are independent, so that $E(XY) = E(X)E(Y)$ and $E(X^2 Y^2) = E(X^2)E(Y^2)$, we get the following:

$$\begin{eqnarray} Var(XY) & = & E((XY)^2) - [E(XY)]^2 \\ & = & E(X^2) E(Y^2) - (E(X))^2 (E(Y))^2 \\ & = & E(X^2) E(Y^2) - E(X^2) (E(Y))^2 + E(X^2) (E(Y))^2 - (E(X))^2 (E(Y))^2 \\ & = & E(X^2) \left[E(Y^2) - (E(Y))^2 \right] + (E(Y))^2 \left[ E(X^2) - (E(X))^2 \right] \\ & = & E(X^2) Var(Y) + (E(Y))^2 Var(X) \\ & = & \left[Var(X) + (E(X))^2\right] Var(Y) + Var(X) (E(Y))^2 \\ & = & Var(X) Var(Y) + Var(X) (E(Y))^2 + Var(Y) (E(X))^2 \\ \sigma^2_{XY} & = & \sigma^2_X \sigma^2_Y + \sigma^2_X \mu^2_Y + \sigma^2_Y \mu^2_X \\ \frac{\sigma_{XY}}{\mu_X \mu_Y} & = & \sqrt{\left(\frac{\sigma_X}{\mu_X}\right)^2 + \left(\frac{\sigma_Y}{\mu_Y}\right)^2 + \left(\frac{\sigma_X \sigma_Y}{\mu_X \mu_Y}\right)^2} \end{eqnarray}$$
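If you want to check this identity numerically before trusting the algebra, a minimal Monte Carlo sketch (means and standard deviations chosen arbitrarily) agrees with the closed form:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2_000_000

mx, sx = 3.0, 0.5  # arbitrary mean and standard deviation of X
my, sy = 2.0, 0.8  # arbitrary mean and standard deviation of Y

x = rng.normal(mx, sx, n)
y = rng.normal(my, sy, n)

var_mc = np.var(x * y)
var_formula = sx**2 * sy**2 + sx**2 * my**2 + sy**2 * mx**2

print(var_mc, var_formula)  # both come out near 6.9
```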

Notice that this is essentially your formula except for one additional term inside the square root. That last term is the product of the other two, so when the individual relative errors are small, as they are in most practical cases, it is negligible and is commonly dropped.
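For a sense of scale, here is a hypothetical example with relative errors of 1% and 2% (numbers chosen purely for illustration):

```python
import numpy as np

# hypothetical relative errors of 1% and 2%
rx, ry = 0.01, 0.02

exact = np.sqrt(rx**2 + ry**2 + (rx * ry) ** 2)
approx = np.sqrt(rx**2 + ry**2)

print(exact, approx)       # 0.0223616... vs 0.0223607...
print(exact / approx - 1)  # about 4e-5: the extra term shifts the result by ~0.004%
```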

ConMan