
A computer runs two processes starting at the same moment; $X_1$ and $X_2$ denote the running times of the two processes, respectively.

$X_1$ and $X_2$ are exponentially distributed with $$E(X_1) = E(X_2) = 60\,\text{s}.$$ Let $T$ denote the time at which the longer-running process finishes. Calculate $E(T)$.

user180834

2 Answers


Hint: $T=\max\{X_1,X_2\}$. So it would be wise to first deduce the distribution of $T$ and then obtain the expectation of $T$. Notice that $X_1,X_2$ are independent with exponential parameter $\dfrac{1}{60}$. Can you take it from here?
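Following the hint, the distribution of $T=\max\{X_1,X_2\}$ can be sanity-checked numerically. The sketch below (not part of the original answer) simulates two independent exponentials with mean 60 and compares the sample mean of the maximum against the closed form $E(T) = \frac{1}{\lambda} + \frac{1}{\lambda} - \frac{1}{2\lambda} = 60 + 60 - 30 = 90$ s:

```python
import random

# Monte Carlo estimate of E[max(X1, X2)] for independent
# exponentials with rate lambda = 1/60 (so mean 60 s each).
random.seed(0)
mean = 60.0
n = 200_000
total = 0.0
for _ in range(n):
    total += max(random.expovariate(1 / mean),
                 random.expovariate(1 / mean))
print(total / n)  # close to the closed-form value 90
```

The standard error with 200,000 samples is roughly 0.13 s, so the estimate should land well within 90 ± 1.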

Landon Carter

Approach to solution: First, determine the CDF of the exponential distributions involved:

$$ F(x) \equiv P(X_i < x), \qquad i = 1, 2 $$

You will need to figure out the actual formula for $F(x)$; the above is only its definition. Now determine $G(x) \equiv P(T < x)$ for any $x$, the CDF of the maximum of $X_1$ and $X_2$. Note that $G(x) = P(X_1 < x,\, X_2 < x)$, that $X_1$ and $X_2$ are independent, and that $P(X_i > x) = 1 - P(X_i < x) = 1 - F(x)$.
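As a concrete check of the independence step, here is a small sketch (assuming rate $\lambda = 1/60$, which is the exponential parameter for a mean of 60 s) that spells out $F(x) = 1 - e^{-\lambda x}$ and $G(x) = F(x)^2$:

```python
import math

RATE = 1 / 60.0  # lambda = 1/60, so E[X_i] = 60 s

def F(x):
    """CDF of a single exponential: P(X_i < x) = 1 - exp(-lambda*x)."""
    return 1 - math.exp(-RATE * x) if x > 0 else 0.0

def G(x):
    """CDF of T = max(X_1, X_2): by independence,
    P(X_1 < x, X_2 < x) = F(x) * F(x)."""
    return F(x) ** 2

print(G(60.0))  # (1 - e^{-1})^2, about 0.40
```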

Given any CDF $G(x)$ of a nonnegative random variable $T$, one can determine the expectation as

$$ E(T) = \int_{x=0}^\infty [1-G(x)] \, dx $$
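The tail-integral formula can be verified numerically. The sketch below (an illustration, not part of the original answer) approximates $\int_0^\infty [1-G(x)]\,dx$ with a Riemann sum, using $G(x) = (1 - e^{-x/60})^2$; the exact value is $\frac{2}{\lambda} - \frac{1}{2\lambda} = 120 - 30 = 90$:

```python
import math

RATE = 1 / 60.0  # lambda = 1/60

def G(x):
    """CDF of the maximum of two independent Exp(1/60) variables."""
    return (1 - math.exp(-RATE * x)) ** 2

# Riemann sum for E(T) = integral of [1 - G(x)] over [0, inf).
# The integrand decays exponentially, so truncating at 2000 s
# discards only a negligible tail.
dx = 0.01
et = sum((1 - G(k * dx)) * dx for k in range(int(2000 / dx)))
print(et)  # approximately 90
```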

Brian Tung