
A question on my math homework asks us to show that if $0 < p < 1$ and $a, b > 0$, then $a^p + b^p > (a + b)^p$. I have no idea how to do this, any pointers?

4 Answers

11

First, the inequality is homogeneous of degree $p$ (scaling $a$ and $b$ by $\lambda>0$ scales both sides by $\lambda^p$), so you can divide both $a$ and $b$ by $a+b$. It then suffices to show that $$x^p+y^p>1$$ whenever $x+y=1$, $x,y>0$, and $0<p<1$. But now $0<x,y<1$, so $x^p>x$ and $y^p>y$ (raising a number in $(0,1)$ to a power less than $1$ increases it). Adding these gives $x^p+y^p>x+y=1$, and you are done.
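A quick numeric sanity check of the normalized claim (illustrative only, not part of the proof):

```python
import random

# Check that x**p + y**p > 1 whenever x + y = 1, x, y > 0, and 0 < p < 1.
random.seed(0)
for _ in range(1000):
    x = random.uniform(1e-6, 1 - 1e-6)
    y = 1 - x
    p = random.uniform(1e-6, 1 - 1e-6)
    assert x**p + y**p > 1
```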

wisefool
  • 4,303
6

Fix $a\gt 0$, and consider $f(x)=a^p+x^p-(a+x)^p$. Note that $f(0)=0$. We have $f'(x)=p\left(x^{p-1}-(a+x)^{p-1}\right)$. Since $p-1\lt 0$, the function $x\mapsto x^{p-1}$ is decreasing, so $x^{p-1}\gt (a+x)^{p-1}$ and hence $f'(x)\gt 0$ whenever $x\gt 0$. So $f$ is increasing, which gives $f(x)\gt f(0)=0$ for all $x\gt 0$, and the result follows.
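The monotonicity argument can be spot-checked numerically (illustrative only; the values of $a$ and $p$ are arbitrary choices):

```python
# With a fixed, f(x) = a**p + x**p - (a + x)**p should vanish at x = 0
# and be strictly increasing for x > 0.
a, p = 2.0, 0.5

def f(x):
    return a**p + x**p - (a + x)**p

assert f(0) == 0
xs = [i / 10 for i in range(1, 101)]   # grid on (0, 10]
vals = [f(x) for x in xs]
assert all(v > 0 for v in vals)                        # f(x) > 0 for x > 0
assert all(v2 > v1 for v1, v2 in zip(vals, vals[1:]))  # f is increasing
```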

André Nicolas
  • 514,336
2

Setting $$t=\frac a{a+b},$$ you can rewrite the inequality as $$t^p+(1-t)^p>1,$$ with $0<t<1$ and $0<1-t<1$.

For $0<p<1$,

$$t^p>t\text{, and }(1-t)^p>1-t.$$ Summing these gives $t^p+(1-t)^p>t+(1-t)=1$, which is the desired result.
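One can verify numerically that the substitution really is equivalent to the original inequality (illustrative only; the sample values of $a$, $b$, $p$ are arbitrary):

```python
# Dividing a**p + b**p > (a + b)**p through by (a + b)**p gives
# t**p + (1 - t)**p > 1 with t = a / (a + b).
a, b, p = 3.0, 5.0, 0.4
t = a / (a + b)

lhs = a**p + b**p
rhs = (a + b)**p
# The normalized form equals the ratio of the two sides:
assert abs(lhs / rhs - (t**p + (1 - t)**p)) < 1e-12
assert t**p + (1 - t)**p > 1
assert lhs > rhs
```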

0

If you were to expand, you would get $$ (a+b)^{p} = a^{p} + b^{p} + C, $$ where $C$ would be a positive amount that depends on $a$, $b$, and $p$.

(If you want to know what that would be, I suggest you look up the binomial theorem and Pascal's triangle.)

  • 3
    But $p\in(0,1)$, how do you use the binomial theorem when $p$ isn't a natural number? – Eff Oct 30 '14 at 22:58
  • At least, when $p$ is the inverse of an integer, you can expand $(a^p+b^p)^{1/p}$. I guess this could be extended to rationals. –  Oct 31 '14 at 07:26