A question on my math homework asks us to show that if $0 < p < 1$ and $a, b > 0$, then $a^p + b^p > (a + b)^p$. I have no idea how to approach this; any pointers?
-
See also: Prove that $(p+q)^m \leq p^m+q^m$, Prove variant of triangle inequality containing p-th power for 0 < p < 1, Does $|x|^p$ with $0<p<1$ satisfy the triangle inequality on $\mathbb{R}$? – Martin Sleziak Aug 29 '17 at 13:52
-
https://math.stackexchange.com/q/1707969/9464 – Nov 16 '17 at 15:04
-
How does a homework problem void of any trace whatsoever wrt effort, get three upvotes? I'm willing to bet they all come from answerers here. – amWhy Nov 16 '17 at 16:31
4 Answers
First, the inequality is homogeneous, so you may divide both $a$ and $b$ by $a+b$; it then suffices to show that $$x^p+y^p>1$$ whenever $x+y=1$, $x,y>0$ and $0<p<1$. But now $x,y<1$, so $x^p>x$ and $y^p>y$ for $0<p<1$; adding these gives $x^p+y^p>x+y=1$, and you are done.
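As a quick sanity check (the values here are chosen only for illustration), take $a=b=1$ and $p=\tfrac12$:
$$a^p+b^p=1+1=2>\sqrt{2}\approx 1.414=(a+b)^p.$$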
Fix $a\gt 0$, and consider $f(x)=a^p+x^p-(a+x)^p$. Note that $f(0)=0$. We have $f'(x)=p\,x^{p-1}-p\,(a+x)^{p-1}$. Since $p-1\lt 0$, the map $t\mapsto t^{p-1}$ is strictly decreasing on $(0,\infty)$, and $x\lt a+x$, so $f'(x)\gt 0$ whenever $x\gt 0$. So $f$ is strictly increasing, and taking $x=b$ gives $a^p+b^p-(a+b)^p=f(b)\gt f(0)=0$, which is the result.
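To see this concretely (the values are chosen only for illustration), take $a=1$ and $p=\tfrac12$:
$$f(x)=1+\sqrt{x}-\sqrt{1+x},\qquad f(3)=1+\sqrt{3}-2\approx 0.732>0,$$
which matches $1^{1/2}+3^{1/2}>(1+3)^{1/2}=2$.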
Setting $$t=\frac a{a+b}$$ (so that $1-t=\frac b{a+b}$) and dividing both sides by $(a+b)^p$, you can rewrite the inequality as $$t^p+(1-t)^p>1,$$ with $0<t,\,1-t<1$.
For $0<p<1$,
$$t^p>t\text{, and }(1-t)^p>1-t.$$ Summing these gives $t^p+(1-t)^p>t+(1-t)=1$, which is the desired result.
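The step $t^p>t$ can be spelled out via the exponential (just one way to see it): for $0<t<1$ we have $\ln t<0$, so $p\ln t>\ln t$ when $0<p<1$, and hence
$$t^p=e^{p\ln t}>e^{\ln t}=t.$$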
If you were to expand, you would get $$(a+b)^{p} = a^{p} + b^{p} + C,$$ where $C$ would be a positive amount that depends on $a$, $b$ and $p$.
(If you want to know what that $C$ would be, I suggest you look up the binomial theorem and Pascal's triangle.)
-
But $p\in(0,1)$, how do you use the binomial theorem when $p$ isn't a natural number? – Eff Oct 30 '14 at 22:58
-
At least, when $p$ is the inverse of an integer, you can expand $(a^p+b^p)^{1/p}$. I guess this could be extended to rationals. – Oct 31 '14 at 07:26
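For instance, to illustrate that comment, take $p=\tfrac12$, so $1/p=2$ and the expansion really is a binomial one:
$$\left(a^{1/2}+b^{1/2}\right)^{2}=a+2\sqrt{ab}+b>a+b,$$
and taking square roots gives $a^{1/2}+b^{1/2}>(a+b)^{1/2}$.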