Given real numbers $a>0$, $b>0$, and $0<\theta<1$, is the following inequality correct? $$ \frac{a^\theta\, b^{1-\theta}}{\theta a+(1-\theta) b}\leq 1 $$ And why? Thanks in advance.
- For a proof of this inequality and its generalization see http://math.stackexchange.com/a/1788730/72031 – Paramanand Singh May 29 '16 at 10:17
- @ParamanandSingh Thanks for your link. – user143763 May 29 '16 at 19:17
1 Answer
Since $\log x$ is concave (its second derivative is $-1/x^2<0$ for $x>0$), we have $$ \log(\theta a+(1-\theta)b)\geq \theta\log a+(1-\theta)\log b $$ for all $a,b>0$ and $\theta\in(0,1)$. Exponentiating both sides shows that $$ \theta a+(1-\theta)b\geq a^{\theta}b^{1-\theta}, $$ which is equivalent to the stated inequality, since the denominator $\theta a+(1-\theta)b$ is positive.
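As a quick empirical companion to the proof, here is a minimal sketch in Python (not part of the original thread; the sample count, sampling ranges, and tolerance are arbitrary choices) that draws random $a$, $b$, $\theta$ and checks that the weighted geometric mean never exceeds the weighted arithmetic mean:

```python
import random

# Sanity check: a^theta * b^(1-theta) <= theta*a + (1-theta)*b
# for a, b > 0 and 0 < theta < 1 (weighted AM-GM).
for _ in range(100_000):
    a = random.uniform(1e-6, 1e6)
    b = random.uniform(1e-6, 1e6)
    theta = random.uniform(1e-6, 1 - 1e-6)
    geometric = a ** theta * b ** (1 - theta)   # weighted geometric mean
    arithmetic = theta * a + (1 - theta) * b    # weighted arithmetic mean
    # Small relative tolerance guards against floating-point round-off.
    assert geometric <= arithmetic * (1 + 1e-12), (a, b, theta)

print("No counterexample found: the ratio stayed at or below 1.")
```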
carmichael561
- Thanks for your answer. BTW, can you give a reference for why the logarithm function is concave? – user143763 May 28 '16 at 18:32