
Let $U \in \mathbb{R}^k$ and $V\in \mathbb{R}^k$ be two independent standard normal vectors (i.e., $U \sim \mathcal{N}(0,I)$ and $V \sim \mathcal{N}(0,I)$). Define a set $S$ as \begin{align} S=\{ x \in \mathbb{R}^k: x_1 \le x_2 \le x_3 \le \dots \le x_k \}. \end{align}

We are interested in computing the following conditional expectation \begin{align} E\left[ \|U\|^2 \mid U+V \in S , V\in S \right]. \end{align}
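For small $k$ this quantity can at least be estimated numerically. Below is a minimal rejection-sampling sketch (assuming NumPy; the helper names are just illustrative). Since the acceptance probability is at most $1/k!$, this is only practical for small $k$.

```python
import numpy as np

# Illustrative Monte Carlo sketch (not part of the original derivation):
# estimate E[||U||^2 | U+V in S, V in S] and P[U+V in S, V in S] by
# rejection sampling.  Only practical for small k.
rng = np.random.default_rng(0)

def in_S(x):
    """True iff x_1 <= x_2 <= ... <= x_k."""
    return bool(np.all(np.diff(x) >= 0))

def estimate(k, n_samples=200_000):
    sq_norms = []
    for _ in range(n_samples):
        u = rng.standard_normal(k)
        v = rng.standard_normal(k)
        if in_S(v) and in_S(u + v):        # the event {U+V in S, V in S}
            sq_norms.append(float(u @ u))  # record ||U||^2 on that event
    p_hat = len(sq_norms) / n_samples      # estimate of P[U+V in S, V in S]
    cond_mean = float(np.mean(sq_norms)) if sq_norms else float("nan")
    return cond_mean, p_hat

for k in (2, 3, 4):
    m, p = estimate(k)
    print(f"k={k}: E[||U||^2 | .] ~= {m:.3f}, P[U+V in S, V in S] ~= {p:.5f}")
```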

My guess is that, most likely, there is no closed-form expression, so an upper bound would also be fine.

One upper bound that I tried is via Cauchy–Schwarz: \begin{align} E\left[ \|U\|^2 \mid U+V \in S , V\in S \right]&= \frac{E\left[ \|U\|^2 1_{ \{ U+V \in S , V\in S \}} \right] }{P [ U+V \in S , V\in S ]}\\ &\le \frac{ \sqrt{E\left[ \|U\|^4 \right]} \sqrt{ P [ U+V \in S , V\in S ]} }{P [ U+V \in S , V\in S ]}\\ &= \frac{ \sqrt{E\left[ \|U\|^4 \right]} }{\sqrt{ P [ U+V \in S , V\in S ]}}. \end{align}

Now computing $E\left[ \|U\|^4 \right]$ is simple. However, $P [ U+V \in S , V\in S ]$ is not so simple. I tried using the inclusion–exclusion principle \begin{align} P [ U+V \in S , V\in S ]&= P [ U+V \in S ]+ P [ V\in S ]- P [ U+V \in S \text{ or } V\in S ]\\ &= \frac{2}{k!}-P [ U+V \in S \text{ or } V\in S ], \end{align} where we used that $U+V\sim\mathcal{N}(0,2I)$ and $V$ both have exchangeable components, so $P [ U+V \in S ]= P [ V\in S ]=\frac{1}{k!}$.
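(For the record, $\|U\|^2\sim\chi^2_k$ with mean $k$ and variance $2k$, so \begin{align} E\left[ \|U\|^4 \right]=\operatorname{Var}\left(\|U\|^2\right)+\left(E\left[\|U\|^2\right]\right)^2=2k+k^2=k(k+2), \end{align} and the Cauchy–Schwarz bound becomes $\sqrt{k(k+2)}\big/\sqrt{P[U+V\in S, V\in S]}$.)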

Boby
  • Well, you can obtain a bound via $(U\in S, V\in S) \implies (U+V \in S, V\in S)$, s.t. $P(U+V\in S , V\in S) \ge P(U\in S, V\in S) = 1/k!^2$. But I would imagine the resulting bound can be pretty loose. – antkam Dec 26 '19 at 19:36
  • BTW when you said $U,V$ are standard normal vectors I assume this includes covariance matrix $\Sigma = I$, i.e. the components are independent, right? (You must have, or else you cannot really conclude $P(V\in S) = 1/k!$) – antkam Dec 26 '19 at 19:38
  • Thank you for the bound, even though it might be loose. This is already an approach, and we had zero before. I updated my answer. – Boby Dec 26 '19 at 20:19
  • For $k=2$ (BTW you still had the dimension as $n$ in the def of $S$ in the title and body text), this can be solved exactly, at least in non-closed form, because $V \in S$ is equiv. to the dot product $V \cdot (1,-1) \le 0$ and we can condition on $V \cdot (1,-1) = v \le 0$ to then find out the exact region of allowed $U$, namely $U \cdot (1,-1) \le -v$. However, this approach does not generalize to $k > 2$. – antkam Dec 27 '19 at 04:39
  • For another (loose? trivial?) upperbound, I think $E[||U||^2 \mid U+V \in S, V \in S] \le E[||U||^2]$, the unconditioned value. Would that be interesting to you? – antkam Dec 27 '19 at 13:59
  • @antkam I don't think so. Conditioning might increase the expectation. – Boby Dec 27 '19 at 14:05
  • oh! then you would find the bound interesting. I don't have a formal proof, but here's my gut feel: Obviously any $U\in S$ is always "allowed", regardless of choice of $V\in S$. For other $U \notin S$ though, smaller magnitude $U$ will be "allowed" by more choices of $V\in S$, while larger magnitude $U$ will be "allowed" by fewer choices of $V\in S$. This creates a bias on the conditional distribution of $U$ toward those of smaller magnitude. If $E[||U||^2] = k$ is a good enough bound for your "ulterior motive" (i.e. reason why you asked this question), then this may be a viable approach. – antkam Dec 27 '19 at 14:13
  • @antkam Sorry, I didn't mean to say I am not interested. I meant to say I don't think it holds. – Boby Dec 27 '19 at 17:40
  • My gut tells me it holds, so the question becomes: would an upperbound of $E[||U||^2] = k$ (if proven) be good enough for whatever underlying reason which made you ask this question in the first place? If it is, I will try to find some time to justify / prove my gut feel. I'm not sure I can do it, and even if I can it's sure to be tedious, so I'd rather not spend the time unless you tell me a bound of $k$ is interesting to you. – antkam Dec 27 '19 at 20:25
  • Dear @antkam, yes the bound $k$ would be very interesting to me. Not that this would greatly improve a previous bound which is of the order $\sqrt{k! E[|U|^4]}$. – Boby Dec 27 '19 at 21:17
  • You mean "Note that it would greatly improve..." right? :) – antkam Dec 27 '19 at 21:19
  • @antkam Yes, note. I am making a lot of typos today :) – Boby Dec 27 '19 at 21:20
  • @antkam I have uploaded a new question that is related to this one. You can find it here: https://math.stackexchange.com/questions/3495059/find-mathbbp-uv-in-s-v-in-s-where-u-v-are-standard-normal-vectors – Boby Jan 03 '20 at 02:03

1 Answer


This answer is just writing up the idea in @antkam's comment - I hope that's ok. I'll show:

$$\mathbb E\left[ \|U\|^2 \mid U+V \in S, V\in S\right]\leq k$$

The crucial point is that if we fix $V\in S$ and the direction $\widehat U:=U/\|U\|,$ then $\|U\|^2$ is increasing in $\|U\|,$ but the indicator function $1_{U+V\in S}$ is decreasing in $\|U\|,$ because $S$ is convex:

$$U+V,V\in S \implies \lambda U + V = \lambda(U+V)+(1-\lambda)V\in S\text{ for $0\leq \lambda\leq 1$}$$
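(Coordinate by coordinate, writing $W=\lambda(U+V)+(1-\lambda)V$: for each $i$, $$W_i=\lambda(U+V)_i+(1-\lambda)V_i\leq\lambda(U+V)_{i+1}+(1-\lambda)V_{i+1}=W_{i+1},$$ since $U+V\in S$, $V\in S$ and $0\leq\lambda\leq 1$.)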

So we can use the result that the covariance between a decreasing function and an increasing function is non-positive. You can find proofs on this site for example at Covariance of increasing functions of random variables. It is important that the direction $\widehat{U}$ and magnitude $\|U\|$ are independent - the pdf of $U$ factorizes as a (constant) function of direction multiplied by a function of magnitude. We get

$$V\in S\implies\operatorname{Cov}(\|U\|^2,1_{U+V\in S}\mid \widehat U, V)\leq 0\text{ a.e.}$$
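(Here $\mathbb E[\|U\|^2\mid\widehat U,V]=\mathbb E[\|U\|^2]=k$, because $\|U\|$ is independent of $\widehat U$ and of $V$ and $\|U\|^2\sim\chi^2_k$; this is the constant $k$ appearing on the right-hand side below.)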

More explicitly, $$V\in S\implies\mathbb E[\|U\|^21_{U+V\in S}\mid \widehat U, V]\leq k\mathbb P[U+V\in S\mid \widehat U, V]\text{ a.e.}$$ Both sides can then be integrated over the event $V\in S$ and divided by $\mathbb P[V\in S]$ to give $$\mathbb E[\|U\|^21_{U+V\in S}\mid V\in S]\leq k\mathbb P[U+V\in S\mid V\in S].$$

(Alternatively, use the law of total covariance conditioned on $V\in S,$ which happens to reduce here to the law of total expectation because $\|U\|^2,\widehat U,V$ are independent. This gives $\operatorname{Cov}(\|U\|^2,1_{U+V\in S}\mid V\in S)\leq 0,$ which is the same thing.)

This means $$\mathbb E[\|U\|^2 \mid U+V\in S, V\in S]=\frac{\mathbb E[\|U\|^21_{U+V\in S}\mid V\in S]}{\mathbb P[U+V\in S\mid V\in S]}\leq k.$$

Dap
  • +1 Not only is it OK, I really appreciate the answer!! I've been struggling to prove this on and off for a few days, and always got stuck with some messy Jacobian (which I'm not even sure I should be using). I'm beginning to suspect I'm missing some powerful theorem(s), and you provided them. THANKS A TON!! :) Separate question: this bound $k$, while proven, is sort of "ignorant". I have a vague geometric intuition (esp. based on $k=1,2$ cases) that the bound has to be quite a bit tighter (see my orig comments on "big $U$ being disallowed more often"). What's your gut feel on this? – antkam Dec 31 '19 at 23:22
  • Quick question. Can you explain a bit more why the indicator function is a decreasing function of $|U|$? – Boby Dec 31 '19 at 23:35
  • @Boby - Because $U + V \in S \implies \lambda U + V \in S$ for any $\lambda \in (0,1)$. So the indicator starts at $1$ (when $U=0$, and conditioned on $V \in S$) and at some point might (or might not) drop to $0$ and then must stay there. As to why $\lambda U + V \in S$, that can be proved easily if you consider each $x_j \le x_{j+1}$ requirement separately. – antkam Jan 01 '20 at 00:04
  • Hmm... Actually Boby has a point: the proof as written seems to say $1_{U+V\in S}$ is decreasing for any $V$, but I think that's false. I.e. I think it is only decreasing for any $V \in S$...? – antkam Jan 01 '20 at 00:11
  • @antkam: I've edited to add the important condition $V\in S$ - indeed $1_{U+V\in S}=0$ at $|U|=0$ if $V\not\in S.$ I don't have much intuition for how tight this bound is - the conditioning biases towards smaller values, but a very severe constraint could force behavior similar to $U\in S$ so that $U$ and $V$ are sort of independent - consider how $\mathbb E[|X|^2 \mid X<\epsilon]\approx \mathbb E[|X|^2]$ if $X\sim N(0,1)$ and $\epsilon\approx 0.$ Based on some dodgy simulations I wouldn't rule out $\mathbb E\left[ |U|^2 \mid U+V \in S, V\in S\right]\sim k.$ – Dap Jan 01 '20 at 14:05
  • Yeah, it would be great to also have a lower bound on this. However, I suspect it is a difficult one to find. – Boby Jan 02 '20 at 13:22
  • Can you also expand on the step with the law of total covariance? I tried to replicate it, but got lost in the conditioning. – Boby Jan 02 '20 at 14:26
  • @Boby: I've expanded a bit - hopefully it's easier to see what's going on – Dap Jan 03 '20 at 14:55
  • Dear Dap, can you take a look the following question that I recently asked: https://mathoverflow.net/questions/418204/show-that-mathbbp-a-v-le-z-vz-mathbbpav-ge-z-vz-text-a-s-i – Boby Mar 18 '22 at 13:58