
Without using $f'(x) = A''(x)$ and the fact that $A''(x) > 0$ implies convexity:

Show $A(x) = \int_{0}^{x}f(t)dt$ is convex if $f(x)$ is increasing.

This is from Apostol's Calculus Vol. 1, Theorem 2.9, p. 122. My attempt is given below.

For a function to be convex on $[a,b]$ we need, for all $\alpha \in (0,1)$:

$$f(\alpha b + (1-\alpha) a) < (\alpha) f(b) + (1-\alpha) f(a)$$

Using this with $[a,b] = [0,x]$ and noting that $A(0) = 0$, we need to show $A(\alpha x) < \alpha A(x)$.

Now $A(\alpha x) = \int_0^{\alpha x} f(t) dt = \alpha \int _{0}^{x} f(\alpha t) dt < \alpha \int_{0}^{x} f(t) dt = \alpha A(x)$, because $t > \alpha t$ for $t > 0$, so $f(t) > f(\alpha t)$.
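
Here the second equality uses the substitution $t = \alpha u$, so that $dt = \alpha\, du$ and the limits $t \in [0, \alpha x]$ become $u \in [0, x]$:

$$\int_0^{\alpha x} f(t)\, dt = \int_0^{x} f(\alpha u)\, \alpha\, du = \alpha \int_0^{x} f(\alpha u)\, du.$$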

Is this proof alright, and how can I write it more properly? I am self-learning calculus and will then proceed to real analysis.

jeea

2 Answers


$A(\alpha x) < \alpha A(x)$ does not tell you that $A$ is concave upward. What you have to show is $$A(\alpha x +(1-\alpha) y) \leq \alpha A(x) +(1-\alpha) A(y)$$ for $x<y$ and $0 < \alpha <1$. To prove this, write the required inequality in the form $$\alpha [A(\alpha x +(1-\alpha) y) - A(x)] \leq (1-\alpha) [A(y)-A(\alpha x +(1-\alpha) y)].$$ This becomes $$\alpha \int_x^{\alpha x +(1-\alpha) y} f(t) \, dt \leq (1-\alpha) \int_{\alpha x +(1-\alpha) y}^{y} f(t)\, dt.$$ Make the change of variable $s=\alpha (t-x)$ on the left side to get $\int_0^{w} f(x+\frac s {\alpha}) \, ds$, where $w=\alpha (1-\alpha) (y-x)$. Similarly, the change of variable $s=(1-\alpha)(y-t)$ writes the right-hand side as $\int_0^{w} f(y-\frac s {1-\alpha})\, ds$. Finally, all that remains is to verify that $$f\left(x+\frac s {\alpha}\right)\leq f\left(y-\frac s {1-\alpha}\right)$$ whenever $s$ lies in the interval $(0,w)$. Since $f$ is increasing, we only have to show that $x+\frac s {\alpha}\leq y-\frac s {1-\alpha}$ whenever $s$ lies in the interval $(0,w)$. I leave it to you to verify this simple fact.
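
For reference, this last fact takes one line: the inequality rearranges as

$$x+\frac s {\alpha}\leq y-\frac s {1-\alpha} \iff s\left(\frac{1}{\alpha}+\frac{1}{1-\alpha}\right)\leq y-x \iff s\leq \alpha(1-\alpha)(y-x)=w,$$

which indeed holds for $s \in (0,w)$.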

  • So, for convexity on $[a,b]$, do we need to consider another interval $[l,m]$ inside $[a,b]$? – jeea May 29 '18 at 08:54
  • @jeea The idea is to re-write the definition of convexity (or upward concavity) in such a way that the increasing nature of $f$ can be used. – Kavi Rama Murthy May 29 '18 at 09:12
  • Thank you for the answer, but I still need to understand its starting point: if a function is convex on $[a,b]$, don't we just need $f((1-\alpha)a + \alpha b) \lt (1-\alpha) f(a) + \alpha f(b)$? In my attempt posted inside the question I used $[a,b] = [0,x]$. – jeea May 29 '18 at 09:36
  • You cannot take $[a,b]=[0,x]$. You have to prove the inequality for all intervals $[a,b]$. Please also pay attention to 'less than' and 'less than or equal to'. – Kavi Rama Murthy May 29 '18 at 10:21
  • I understand it now. Thanks a lot for this answer! :) – jeea May 29 '18 at 10:46

Your proof does not hold, for the same reasons Kavi Rama Murthy brought up.

Assuming $f$ is continuous, the fundamental theorem of calculus gives $A'(x) = f(x)$, so what you are asked to show is the backward implication of:

Let $f$ be a function differentiable on an interval $I\subset \mathbb{R}$; then $f$ is convex on $I$ iff $f'$ is increasing on $I$.

Proof: let $a < b$ in $I$ and $c\in (a,b)$. By the mean value theorem, there exist $a'\in (a,c)$ and $b'\in (c,b)$ such that $$ \frac{f(c)-f(a)}{c-a}=f'(a') \text{ and } \frac{f(b)-f(c)}{b-c}=f'(b'). $$ Because $f'$ is increasing, $f'(a')\leq f'(b')$, and so $$ \frac{f(c)-f(a)}{c-a}\leq \frac{f(b)-f(c)}{b-c}, $$ which is sometimes called the "slopes inequality" and is equivalent to $f$ being convex (see this post or this one).
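
To see the equivalence with the definition used in the question, write $c = (1-\alpha)a + \alpha b$ with $\alpha \in (0,1)$, so that $c-a = \alpha(b-a)$ and $b-c = (1-\alpha)(b-a)$. Multiplying the slopes inequality through by $\alpha(1-\alpha)(b-a) > 0$ gives

$$(1-\alpha)\bigl(f(c)-f(a)\bigr) \leq \alpha\bigl(f(b)-f(c)\bigr) \iff f\bigl((1-\alpha)a + \alpha b\bigr) \leq (1-\alpha) f(a) + \alpha f(b),$$

which is exactly the convexity inequality.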

Bill O'Haran
  • I am confused because Kavi Rama Murthy is giving a different point of view. – jeea May 29 '18 at 08:58
  • @jeea I tried to give the most elementary proof possible. No differentiability, no mean value theorem, just direct use of the fact that $f$ is increasing. – Kavi Rama Murthy May 29 '18 at 09:10
  • @Bill O'Haran I am afraid you are making slightly stronger assumptions than the ones given. The function $A$ can only be shown to be differentiable almost everywhere, so the Mean Value Theorem is not applicable. – Kavi Rama Murthy May 29 '18 at 09:14
  • @KaviRamaMurthy Very true. I wrongly assumed $f$ was continuous. – Bill O'Haran May 29 '18 at 09:15