
Although I have been studying probability theory for a long time, the following problem is what made me notice the question of when it is appropriate to drop the condition in a conditional expectation.

The problem is: $X$ and $Y$ are independent exponential random variables with rates $\lambda$ and $\mu$ respectively, and $M = \min(X,Y)$. Find $E[MX\mid M=X]$.

The answer says that $E[MX|M=X] = E[M^2|M=X] = E[M^2] = \frac{2}{(\lambda+\mu)^2}$. I understand the first and the last equal signs, but the second one, which connects $E[M^2|M=X]$ and $E[M^2]$, I don't really get. In fact, I used to believe that the only way to eliminate a condition is when the condition is independent of the random variable you are interested in.

And in this exercise, it seems that once we have substituted the condition $M = X$ into the random variable $MX$, the condition is no longer useful. I wonder whether we may eliminate the condition of a conditional expectation once the condition has been used.
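As a sanity check on the claimed value, here is a quick Monte Carlo sketch (Python, with arbitrary example rates of my own choosing; not part of the textbook solution) that estimates $E[MX\mid M=X]$ and compares it with $\frac{2}{(\lambda+\mu)^2}$:

```python
# Monte Carlo check of the claim E[MX | M=X] = 2/(lambda+mu)^2.
# Illustrative sketch only; lam and mu are arbitrary example rates.
import numpy as np

rng = np.random.default_rng(0)
lam, mu, n = 1.5, 0.7, 1_000_000

X = rng.exponential(scale=1 / lam, size=n)   # rate lam => scale 1/lam
Y = rng.exponential(scale=1 / mu, size=n)    # rate mu  => scale 1/mu
M = np.minimum(X, Y)

mask = M == X                                 # the conditioning event {M = X}
estimate = np.mean((M * X)[mask])             # conditional average of MX given M = X
print(estimate, 2 / (lam + mu) ** 2)          # both roughly 0.413 for these rates
```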

1 Answer


Here we are, however, conditioning on $\{M=X\}$, which is an event rather than a random variable, so by definition $E[M^2\mid M=X]=\dfrac{E[\mathbf{1}_{\{M=X\}}M^2]}{P(M=X)}$. First note that this event has nonzero probability: $$P(M=X)=P(Y\geq X)=\int_{[0,\infty)} P(Y\geq x)\,\lambda e^{-\lambda x}\,dx=\lambda\int_{[0,\infty)}e^{-(\lambda +\mu)x}\,dx=\frac{\lambda}{\lambda +\mu}>0.$$ Next, $$E[\mathbf{1}_{\{M=X\}}M^2]=\int_{[0,\infty)}x^2\int_{[x,\infty)}\mu e^{-\mu y}\,\lambda e^{-\lambda x}\,dy\,dx=\lambda \int_{[0,\infty)}x^2e^{-(\mu+\lambda) x}\,dx=\frac{\lambda}{\lambda + \mu}\cdot\frac{2}{(\mu+\lambda)^2},$$ and by the same computation with the roles of $\lambda$ and $\mu$ exchanged, $$E[\mathbf{1}_{\{M=Y\}}M^2]=\frac{\mu}{\lambda + \mu}\cdot\frac{2}{(\mu+\lambda)^2}.$$ Since $P(X=Y)=0$, we can split $$E[M^2]=E[M^2\mathbf{1}_{\{M=X\}}]+E[M^2\mathbf{1}_{\{M=Y\}}]=\frac{2}{(\mu+\lambda)^2},$$ so $$E[M^2|M=X]=\frac{E[\mathbf{1}_{\{M=X\}}M^2]}{P(M=X)}=\frac{2}{(\mu+\lambda)^2}=E[M^2].$$
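The two integrals above can also be checked symbolically; here is a small sketch using sympy (not part of the original answer), with `lam` and `mu` standing for $\lambda$ and $\mu$:

```python
# Symbolic check of P(M=X) and E[M^2 | M=X] for independent exponentials.
import sympy as sp

x, y = sp.symbols('x y', positive=True)
lam, mu = sp.symbols('lam mu', positive=True)

# P(M = X) = P(Y >= X) = integral of e^{-mu x} * lam e^{-lam x} over [0, oo)
p_MX = sp.integrate(sp.exp(-mu * x) * lam * sp.exp(-lam * x), (x, 0, sp.oo))

# E[1_{M=X} M^2] = double integral of x^2 * mu e^{-mu y} * lam e^{-lam x}, y from x to oo
e_ind = sp.integrate(
    sp.integrate(x**2 * mu * sp.exp(-mu * y) * lam * sp.exp(-lam * x), (y, x, sp.oo)),
    (x, 0, sp.oo),
)

print(sp.simplify(p_MX))           # lam/(lam + mu)
print(sp.simplify(e_ind / p_MX))   # 2/(lam + mu)**2, i.e. E[M^2 | M=X] = E[M^2]
```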

Snoop