
According to the answer by Will in "Conditional expectation, showing that the random variables are a.s. equal", we have:

If $\phi$ is a strictly convex function and $\phi(E[X|\mathcal{G}])=E[\phi(X)|\mathcal{G}]$, then $X$ must be $\mathcal{G}$-measurable.

I would like to show this.

According to these two posts:

When Jensen's inequality is equality

Convexity and equality in Jensen inequality

when $\phi$ is a strictly convex function, equality holds in the unconditional case precisely when $X$ is a.s. constant. But how do we show the result in the conditional case?

Note that one direction is easy: if $X$ is $\mathcal{G}$-measurable, then $E[X|\mathcal{G}]=X$, and so $\phi(E[X|\mathcal{G}])=E[\phi(X)|\mathcal{G}]$. But it is the converse result I am looking for.
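
To make the easy direction concrete, here is a quick numerical sketch (my own illustration, not part of any proof; the finite space, the partition generating $\mathcal{G}$, and the choice $\phi(x)=x^2$ are all arbitrary):

```python
import numpy as np

# Toy probability space: 6 equally likely outcomes.
# G is the sigma-algebra generated by the partition {0,1,2} | {3,4,5}.
blocks = [np.array([0, 1, 2]), np.array([3, 4, 5])]
p = np.full(6, 1 / 6)
phi = lambda x: x ** 2  # a strictly convex function

def cond_exp(Z):
    """E[Z | G]: on each block of the partition, replace Z by its average."""
    out = np.empty_like(Z, dtype=float)
    for b in blocks:
        out[b] = np.dot(p[b], Z[b]) / p[b].sum()
    return out

X_meas = np.array([1.0, 1.0, 1.0, 4.0, 4.0, 4.0])  # constant on blocks: G-measurable
X_free = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])  # not G-measurable

for X in (X_meas, X_free):
    print(np.allclose(phi(cond_exp(X)), cond_exp(phi(X))))
# prints True for X_meas (equality) and False for X_free (strict inequality)
```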

Attempt:

I was thinking of mimicking the proof for the unconditional case by Daniel Fischer here:

Convexity and equality in Jensen inequality

I was thinking of writing $\phi(X)\ge \phi(E(X|\mathcal{G}))+k(X-E[X|\mathcal{G}])$. But I am not sure if we can say this, because $E[X|\mathcal{G}]$ is of course not a constant but a random variable.

Do you see how to prove this result?

user394334

1 Answer


Do not hesitate to comment if you need more explanation :)

For the sake of accuracy, the conclusion is not that $X$ is $\mathcal G$-measurable, but that it is a.s. equal to a $\mathcal G$-measurable random variable. Note that this makes no difference if $\mathcal G$ is complete.

Let us go with Daniel Fischer's method. For a strictly convex function $\phi:\mathbb R\to\mathbb R$ we have $$ \forall c\in\mathbb R,\quad\exists\kappa\in\mathbb R,\quad\forall t\in\mathbb R,\quad\phi(t)\ge\phi(c)+\kappa\cdot(t-c), $$ with equality iff $t=c$.

As you wrote, we would like to apply the above inequality with $c=\mathbb E[X\vert\mathcal G]$. However, as you pointed out, this is a random variable, not a constant, so we face the following problems:

  1. $\kappa$ depends on $c$, so here $\kappa$ would depend on the value of $\mathbb E[X\vert\mathcal G]$; it should therefore be replaced with a function of $\mathbb E[X\vert\mathcal G]$, say $K(\mathbb E[X\vert\mathcal G])$.
  2. The function $K$ has no reason to be measurable if $\kappa$ is chosen without further specification, so $K(\mathbb E[X\vert\mathcal G])$ might not be a random variable, which means we cannot play with expected values.

Both problems can actually be solved at once: just choose, for example, $\kappa$ to be the right derivative of $\phi$ at $c$, denoted $\phi'_+(c)$: $$ \forall t,c\in\mathbb R,\quad\phi(t)\ge\phi(c)+\phi'_+(c)(t-c), $$ with equality iff $t=c$.
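
As a sanity check of this supporting-line inequality, here is a small numerical sketch (illustration only; $\phi(t)=t^2$, for which $\phi'_+(c)=2c$, and the grid are my choices):

```python
import numpy as np

phi = lambda t: t ** 2    # strictly convex
dphi = lambda c: 2.0 * c  # its right derivative phi'_+

c = 0.7
t = np.linspace(-3.0, 3.0, 2001)
gap = phi(t) - (phi(c) + dphi(c) * (t - c))  # phi(t) minus the supporting line at c

print(gap.min() >= -1e-12)  # True: the line stays below phi everywhere
print(t[np.argmin(gap)])    # ~0.7: the gap vanishes only at t = c
```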

We deduce that $$ \phi(X)\ge\phi(\mathbb E[X\vert\mathcal G])+\phi'_+(\mathbb E[X\vert\mathcal G])(X-\mathbb E[X\vert\mathcal G]), $$ with a.s. equality iff $X=\mathbb E[X\vert\mathcal G]$ a.s.

As $\phi$ and $\phi'_+$ are measurable, all terms in the inequality above are random variables. Therefore $$Y:=\phi(X)-\phi(\mathbb E[X\vert\mathcal G])-\phi'_+(\mathbb E[X\vert\mathcal G])(X-\mathbb E[X\vert\mathcal G])$$ is a random variable, which has nonnegative values.
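
In the toy model from the sketch in the question, $Y$ can be computed pointwise; for $\phi(t)=t^2$ it reduces to $(X-\mathbb E[X\vert\mathcal G])^2$ (again an illustration only, with the same arbitrary setup):

```python
import numpy as np

blocks = [np.array([0, 1, 2]), np.array([3, 4, 5])]
p = np.full(6, 1 / 6)
phi, dphi = (lambda t: t ** 2), (lambda t: 2.0 * t)

def cond_exp(Z):
    out = np.empty_like(Z, dtype=float)
    for b in blocks:
        out[b] = np.dot(p[b], Z[b]) / p[b].sum()
    return out

X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])  # not G-measurable
m = cond_exp(X)                               # E[X|G] = (2,2,2,5,5,5)
Y = phi(X) - phi(m) - dphi(m) * (X - m)       # here Y = (X - m)**2
print((Y >= -1e-12).all())                    # True: Y is nonnegative
print(Y)                                      # zero exactly where X(w) = m(w)
```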

For technical convenience, suppose first that $\phi(\mathbb E[X\vert\mathcal G])$ and $\phi'_+(\mathbb E[X\vert\mathcal G])$ are bounded. Both are $\mathcal G$-measurable, so they pull out of the conditional expectation, and $\mathbb E[X-\mathbb E[X\vert\mathcal G]\vert\mathcal G]=0$; hence $\mathbb E[Y\vert\mathcal G]=\mathbb E[\phi(X)\vert\mathcal G]-\phi(\mathbb E[X\vert\mathcal G])=0$ (this is our starting assumption), and therefore $\mathbb E[Y]=0$. As $Y$ is nonnegative, we get $Y=0$ a.s. So $X=\mathbb E[X\vert\mathcal G]$ a.s., hence $X$ is a.s. equal to a $\mathcal G$-measurable random variable.
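
A numerical check of the pull-out step in the same toy model as before (my illustration only):

```python
import numpy as np

blocks = [np.array([0, 1, 2]), np.array([3, 4, 5])]
p = np.full(6, 1 / 6)
dphi = lambda t: 2.0 * t  # right derivative of phi(t) = t**2

def cond_exp(Z):
    out = np.empty_like(Z, dtype=float)
    for b in blocks:
        out[b] = np.dot(p[b], Z[b]) / p[b].sum()
    return out

X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
m = cond_exp(X)
linear = dphi(m) * (X - m)  # linear term of the supporting line
print(np.allclose(cond_exp(linear), 0.0))
# True: dphi(m) is G-measurable, so E[dphi(m)(X - m)|G] = dphi(m) E[X - m|G] = 0
```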

If $\phi(\mathbb E[X\vert\mathcal G])$ or $\phi'_+(\mathbb E[X\vert\mathcal G])$ is not bounded, then define for instance the set $A_n:=\{\vert\phi(\mathbb E[X\vert\mathcal G])\vert\le n,\ \vert\phi'_+(\mathbb E[X\vert\mathcal G])\vert\le n\}$. Then with the same method as above you show that $Y\mathbf 1_{A_n}=0$ a.s. You then readily deduce $Y=0$ a.s. by letting $n$ go to $+\infty$.
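
To spell out "the same method as above": $A_n\in\mathcal G$, and on $A_n$ both $\phi(\mathbb E[X\vert\mathcal G])$ and $\phi'_+(\mathbb E[X\vert\mathcal G])$ are bounded, so $$\mathbb E[Y\mathbf 1_{A_n}\vert\mathcal G]=\mathbf 1_{A_n}\bigl(\mathbb E[\phi(X)\vert\mathcal G]-\phi(\mathbb E[X\vert\mathcal G])\bigr)-\mathbf 1_{A_n}\phi'_+(\mathbb E[X\vert\mathcal G])\,\mathbb E[X-\mathbb E[X\vert\mathcal G]\vert\mathcal G]=0.$$ Taking expectations gives $\mathbb E[Y\mathbf 1_{A_n}]=0$, and $Y\mathbf 1_{A_n}\ge0$ forces $Y\mathbf 1_{A_n}=0$ a.s. Since $\phi(\mathbb E[X\vert\mathcal G])$ and $\phi'_+(\mathbb E[X\vert\mathcal G])$ are real-valued, $\mathbb P\bigl(\bigcup_n A_n\bigr)=1$, hence $Y=0$ a.s.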

Will