
Let $f$ be a twice differentiable function on $\mathbb R$ such that $$f(tx+(1-t)y) \le tf(x)+(1-t)f(y)$$

for all $x,y \in \mathbb R$ and for all $t \in [0,1]$. Then show that $f''(x) \ge 0$ for all $x \in \mathbb R$.

I have tried to solve it but failed. Would anyone please give me a suggestion for solving this problem?

EDIT:

I have solved it by contradiction. Here is my argument:

If possible, let $f''(c)<0$ for some $c \in \mathbb R$. Then there exists a neighbourhood of $c$, say $(\alpha,\beta)$, such that for $\alpha<x<y<\beta$ we have $f'(x) > f'(y)$.

Now take $z=tx+(1-t)y$ for $t \in (0,1)$. Then clearly $x<z<y$. Therefore we have:

$$f(z)-f(x)=\int_{x}^{z} f'(s)\, ds>f'(z)(z-x)$$

and

$$f(y)-f(z)=\int_{z}^{y} f'(s)\, ds<f'(z)(y-z).$$

Hence we have $f(tx+(1-t)y)>tf(x)+(1-t)f(y)$, which contradicts the fact that $f$ is a convex function on $\mathbb R$.

Hence the result follows.
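For completeness, the omitted last step can be written out as follows (a sketch, assuming $t \in (0,1)$ so that both $z-x$ and $y-z$ are positive):

```latex
% From the two integral estimates,
%   f(z) - f(x) > f'(z)(z - x)  and  f(y) - f(z) < f'(z)(y - z),
% we get
\[
\frac{f(z)-f(x)}{z-x} \;>\; f'(z) \;>\; \frac{f(y)-f(z)}{y-z}.
\]
% Since z = tx + (1-t)y, we have z - x = (1-t)(y-x) and y - z = t(y-x).
% Multiplying through by the positive quantity t(1-t)(y-x) gives
\[
t\bigl(f(z)-f(x)\bigr) \;>\; (1-t)\bigl(f(y)-f(z)\bigr),
\]
% and rearranging yields
\[
f(z) \;>\; t f(x) + (1-t) f(y).
\]
```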

Is the above reasoning correct? Please verify it.

Thank you in advance.

  • https://math.stackexchange.com/questions/1224955/proving-that-the-second-derivative-of-a-convex-function-is-nonnegative – CY Aries Jun 13 '17 at 04:20
  • You want to show that $f'(t)$ is nondecreasing. So say $a < b$ and show $f'(a) \leq f'(b)$. If $a < s \leq t < b$, then you can prove that the slope of the secant line from $a$ to $s$ does not exceed the slope of the secant line from $t$ to $b$. Then the inequality $f'(a) \leq f'(b)$ follows by taking limits. – user49640 Jun 13 '17 at 04:30
  • @user49640 Can you please elaborate on your reasoning? It would help me a lot. – Arnab Chattopadhyay. Jun 13 '17 at 07:12
  • It was really just a hint. If it's not enough, then you can find the full details worked out in Spivak's Calculus, for example. – user49640 Jun 13 '17 at 07:17
  • @user49640 please verify my recent edit. – Arnab Chattopadhyay. Jun 13 '17 at 08:20
  • How do you deduce the inequality $f(tx + (1-t)y) > tf(x) + (1-t)f(y)$ from what comes before? I think it can be done, so that part ought to work. There is a mistake, though. When you say that there is an interval $(\alpha,\beta)$ on which $f'$ is a decreasing function, that assumes that $f''$ is continuous at $c$. The problem can be fixed, however, if you choose $y$ so that $f'(t) < f'(c)$ on the interval $(c,y]$, and you do something similar on the left with $x$. Then you take $z$ to be $c$. It's not very nice to use integrals here. You can achieve the same result by writing down a function.. – user49640 Jun 13 '17 at 08:37
  • with positive derivative. – user49640 Jun 13 '17 at 08:37
  • Bound $f'(z)$ by the above two inequalities and then you can find the last inequality pretty naturally. – Arnab Chattopadhyay. Jun 13 '17 at 09:00
  • When we say that $f''(c)<0$, then $f'$ is decreasing in some neighbourhood of $c$. How is continuity of $f''$ needed here? – Arnab Chattopadhyay. Jun 13 '17 at 09:04
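For reference, the secant-slope approach hinted at in the comments can be sketched as follows (just an outline, not the full argument worked out in Spivak):

```latex
% Three-chord lemma: for x < z < y with z = tx + (1-t)y, convexity gives
\[
\frac{f(z)-f(x)}{z-x} \;\le\; \frac{f(y)-f(x)}{y-x} \;\le\; \frac{f(y)-f(z)}{y-z}.
\]
% Applying this repeatedly, for a < s \le t < b one obtains
\[
\frac{f(s)-f(a)}{s-a} \;\le\; \frac{f(b)-f(t)}{b-t},
\]
% and letting s \to a^{+} and t \to b^{-} gives f'(a) \le f'(b).
% So f' is nondecreasing, and therefore
\[
f''(a) \;=\; \lim_{b \to a^{+}} \frac{f'(b)-f'(a)}{b-a} \;\ge\; 0.
\]
```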
