
Let $X,Y$ be independent continuous random variables with support on some neighbourhood of infinity. Let $a, b\geq 0$, and $t,u\in\mathbb{R}$ be constants. Then it holds that $$P(X+Y>t\mid aX+bY>u)\geq P(X+Y>t). $$
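As a sanity check, the inequality can be probed numerically. The sketch below is not part of the original question: it assumes, for illustration only, independent standard normal $X$ and $Y$ and arbitrary constants $a,b,t,u$ satisfying the hypotheses, and compares Monte Carlo estimates of the two sides.

```python
import random

random.seed(0)

# Monte Carlo sanity check of P(X+Y > t | aX+bY > u) >= P(X+Y > t).
# Illustrative (assumed) choices: X, Y ~ N(0,1) independent, a=2, b=0.5, t=1, u=0.
a, b, t, u = 2.0, 0.5, 1.0, 0.0
n = 200_000

both = cond = uncond = 0
for _ in range(n):
    x, y = random.gauss(0, 1), random.gauss(0, 1)
    if x + y > t:
        uncond += 1
    if a * x + b * y > u:
        cond += 1
        if x + y > t:
            both += 1

p_uncond = uncond / n       # estimate of P(X+Y > t)
p_cond = both / cond        # estimate of P(X+Y > t | aX+bY > u)
print(p_cond, p_uncond)
assert p_cond >= p_uncond   # the claimed inequality, empirically
```

Of course a simulation proves nothing, but it is a cheap way to rule out a misremembered direction of the inequality before investing in a proof.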

Do you know how to prove this? It sounds intuitive and easy, but everything I tried failed.

Thomas Andrews
Albert Paradek
  • Thanks for an interesting question. I am not sure my answer below is the quickest or most elegant, but it is an answer nonetheless. I hope it is useful. –  Feb 11 '21 at 03:04

2 Answers


For simplicity, we assume $a>0$ rather than only $a\geq0$.

By Bayes' formula, we may equivalently show that \begin{align*} \mathbb{P}(aX + bY > u \, | \, X + Y > t) \geq \mathbb{P}(aX + bY > u). \end{align*} Using the tower property in both the numerator and the denominator, \begin{align*} \mathbb{P}(aX + bY > u \, | \, X + Y > t) &= \dfrac{\mathbb{P}(aX + bY > u, X + Y > t)}{\mathbb{P}(X+Y>t)} \\ &= \dfrac{\mathbb{E}[\mathbb{P}(aX + bY > u, X + Y > t \, | \, Y)]}{\mathbb{E}[\mathbb{P}(X + Y > t \, | \, Y)]}. \end{align*} Similarly, \begin{align*} \mathbb{P}(aX + bY > u) = \mathbb{E}[\mathbb{P}(aX + bY > u \, | \, Y)]. \end{align*} Since $X$ and $Y$ are independent and $a>0$, \begin{align*} \mathbb{P}(aX + bY > u, X + Y > t \, | \, Y) &= S_X(Y_1 \vee Y_2), \\ \mathbb{P}(X + Y > t \, | \, Y) &= S_X(Y_2), \\ \mathbb{P}(aX + bY > u \, | \, Y) &= S_X(Y_1), \end{align*} where $S_X(x) := 1 - \mathbb{P}(X \leq x)$ and $x_1 \vee x_2 = \max\{x_1,x_2\}$, while \begin{align*} Y_1 = \dfrac{u - bY}{a}, \hspace{5mm} Y_2 = t-Y. \end{align*} Note that $Y_1$ and $Y_2$ are functions of $Y$, and thus $S_X(Y_1 \vee Y_2)$, $S_X(Y_2)$, and $S_X(Y_1)$ are also random variables (more precisely, transformations of $Y$).

Collecting terms, we want to show that \begin{align*} \dfrac{\mathbb{E}[S_X(Y_1 \vee Y_2)]}{\mathbb{E}[S_X(Y_1)]\mathbb{E}[S_X(Y_2)]} \geq 1. \end{align*} Now $S_X(Y_1 \vee Y_2) = \min\{S_X(Y_1), S_X(Y_2)\} \geq S_X(Y_1)S_X(Y_2)$, since both factors lie in $[0,1]$. So it suffices to show that \begin{align*} \text{Cov}[S_X(Y_1),S_X(Y_2)] = \mathbb{E}[S_X(Y_1)S_X(Y_2)] - \mathbb{E}[S_X(Y_1)]\mathbb{E}[S_X(Y_2)] \geq 0. \end{align*} Note that $S_X(Y_1)$ is a non-decreasing function of $Y$, since $Y_1$ is non-increasing as a function of $Y$ (recall $b\geq0$) and $S_X$ is a survival function and thus non-increasing. Similarly, $S_X(Y_2)$ is a non-decreasing function of $Y$. Consequently (see, e.g., this question), we must have $\text{Cov}[S_X(Y_1),S_X(Y_2)] \geq 0$, as desired.
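The key covariance step can also be checked numerically. The sketch below is not from the answer itself: it assumes independent standard normal $X$ and $Y$ (so $S_X$ is the standard normal survival function, computable via the complementary error function) and illustrative constants, then estimates $\text{Cov}[S_X(Y_1),S_X(Y_2)]$ by simulation.

```python
import math
import random

random.seed(1)

# Empirical check that Cov[S_X(Y1), S_X(Y2)] >= 0, the key step of the proof.
# Illustrative (assumed) choices: X, Y ~ N(0,1), a=1.5, b=0.8, t=0.5, u=0.2.
a, b, t, u = 1.5, 0.8, 0.5, 0.2

def S_X(x):
    # Survival function of N(0,1): P(X > x) = erfc(x / sqrt(2)) / 2.
    return 0.5 * math.erfc(x / math.sqrt(2))

n = 100_000
s1 = []  # samples of S_X(Y1), a non-decreasing function of Y
s2 = []  # samples of S_X(Y2), also non-decreasing in Y
for _ in range(n):
    y = random.gauss(0, 1)
    s1.append(S_X((u - b * y) / a))  # Y1 = (u - bY)/a is non-increasing in Y
    s2.append(S_X(t - y))            # Y2 = t - Y is non-increasing in Y

m1 = sum(s1) / n
m2 = sum(s2) / n
cov = sum(p * q for p, q in zip(s1, s2)) / n - m1 * m2
print(cov)
assert cov >= 0  # both are non-decreasing in Y, so the covariance is non-negative
```

Because both $S_X(Y_1)$ and $S_X(Y_2)$ are non-decreasing transformations of the same random variable $Y$, the sample covariance comes out clearly positive, in line with the correlation inequality invoked above.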

  • If $a=0$ then the inequality of interest reads $\frac{\mathbb{E}[\mathbb{1}_{\{bY > u\}}S_X(Y_2)]}{\mathbb{E}[\mathbb{1}_{\{bY > u\}}]\,\mathbb{E}[S_X(Y_2)]} \geq 1$, and the proof proceeds similarly by noting that $\mathbb{1}_{\{bY > u\}}$ is a non-decreasing function of $Y$ (recall $b\geq0$). –  Feb 11 '21 at 03:00
  • Thanks, that is a nice trick using the covariance, I like it. – Albert Paradek Feb 11 '21 at 11:33
  • Indeed; noticing that link was also the most challenging part for me. –  Feb 11 '21 at 11:56
  • Does this argument hold for more than two random variables? If I have $n$ random variables, we can simply use conditional expectation with $n-1$ of them in the condition. Then we still have the survival function with $Y_1, Y_2$, which will be functions of $n-1$ variables. Can we still say that the covariance will be positive? – Albert Paradek Feb 11 '21 at 12:42
  • I'm not sure... –  Feb 11 '21 at 17:44
  • If you create another question, I might give it a go. –  Feb 11 '21 at 17:50
  • 1
    I just found it here https://math.stackexchange.com/questions/3759838/when-does-mathrmcovgx-hx-ge-0-hold-for-all-nondecreasing-g-and-h?noredirect=1&lq=1

    It can actually be generalized to a countable number of variables (provided that the sum converges), which is pretty cool and exactly what I needed.

    – Albert Paradek Feb 11 '21 at 17:51
  • 1
    But thank you very much, it was really helpful – Albert Paradek Feb 11 '21 at 17:55

Is this a valid counterexample?

[Image: proposed counterexample]