9

Out of interest, I am trying to prove the QM-AM-GM-HM inequality. If you don't know it, it goes like this:
Let there be $n$ numbers $x_1, x_2, x_3, \ldots, x_n$, where $x_1, x_2, \ldots, x_n > 0$.
Prove that $$\sqrt{\frac{x_1^2+x_2^2+\dots+x_n^2}{n}}\geqslant{\frac{x_1+x_2+\dots+x_n}{n}}\geqslant{\sqrt[n]{x_1x_2\cdots x_n}}\geqslant{\frac{n}{\frac{1}{x_1}+\frac{1}{x_2}+\dots+\frac{1}{x_n}}}.$$ I thought of using induction on $n$. The base case took me about 20 minutes: I used $n=2$ ($n=1$ was trivial), but now I am stuck. Can anyone give me a hint on how to continue? To be exact, I need help applying the induction hypothesis in the induction step. The numbers/fractions are starting to get ... uh ... ugly...

Update 1: I don't want to see the answer. Just a hint...

3 Answers

3

If you're not restricted to proof by induction, you can try to show that $$ M(p; x_1,x_2,\dotsc,x_n) := \left(\frac{1}{n}\sum _{i=1} ^n x_i^p\right)^{1/p},$$ is an increasing function of $p\in\mathbb{R}$. You only need Jensen's inequality to prove this.
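As a quick numerical sanity check of the monotonicity claim (an illustration only, not part of the proof), one can evaluate $M(p)$ at a few exponents for random positive inputs; the helper names `power_mean` and `geometric_mean` below are just illustrative.

```python
# Sanity check (illustration, not a proof): for positive inputs the power mean
# M(p) should be nondecreasing in p, so in particular
# HM (p = -1) <= GM (the p -> 0 limit) <= AM (p = 1) <= QM (p = 2).
import random

def power_mean(xs, p):
    """M(p; x_1,...,x_n) = ((1/n) * sum_i x_i^p)^(1/p), for p != 0."""
    n = len(xs)
    return (sum(x ** p for x in xs) / n) ** (1.0 / p)

def geometric_mean(xs):
    """The p -> 0 limit of M(p): the n-th root of the product."""
    prod = 1.0
    for x in xs:
        prod *= x
    return prod ** (1.0 / len(xs))

random.seed(0)
xs = [random.uniform(0.1, 10.0) for _ in range(8)]

hm = power_mean(xs, -1)   # harmonic mean
gm = geometric_mean(xs)   # geometric mean
am = power_mean(xs, 1)    # arithmetic mean
qm = power_mean(xs, 2)    # quadratic (root-mean-square) mean
assert hm <= gm <= am <= qm, (hm, gm, am, qm)
print(hm, gm, am, qm)
```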

update: For proof without calculus, you only need to prove the AM-GM inequality (e.g., through the Cauchy induction as others suggested). QM-AM is a simple case of the Cauchy-Schwarz inequality (which has an elementary proof). Furthermore, GM-HM is the same as AM-GM for the numbers $y_i = 1/x_i$.
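To spell out the last remark: applying AM-GM to $y_i = 1/x_i$ gives
$$\sqrt[n]{\frac{1}{x_1}\cdots\frac{1}{x_n}} \;\leq\; \frac{\frac{1}{x_1}+\dots+\frac{1}{x_n}}{n},$$
and taking reciprocals of both (positive) sides reverses the inequality, yielding exactly
$$\sqrt[n]{x_1 x_2\cdots x_n} \;\geq\; \frac{n}{\frac{1}{x_1}+\frac{1}{x_2}+\dots+\frac{1}{x_n}},$$
i.e. GM $\geq$ HM.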

S.B.
  • 3,375
2

I would like to elaborate a bit on S.B.'s answer from 2014, relying on Jensen's inequality (i.e. convexity). A fully rigorous proof of the convexity (or, for $q<0$, concavity) of the functions $g(x):= |x|^{q/p}$ for $q\geq p$ on the domain $D = (0,\infty)$ may require calculus, but it is at least visually obvious (so hopefully this explanation is satisfactory from an intuitive point of view).

It is most natural to interpret the expressions (QM, AM, HM) from a probabilistic perspective: namely, let $U$ be a uniform random variable taking each of the values $x_1,\ldots, x_n$ with probability $\frac 1n$. Then QM, AM, HM are just $\mathbb E[U^p]^{1/p}$ for various values of $p$. (And as S.B. pointed out in a comment, GM $= \lim_{p\to 0} \mathbb E[U^p]^{1/p}$.)
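Concretely, for this $U$ we have
$$\mathbb E[U^p]^{1/p} = \left(\frac 1n \sum_{i=1}^n x_i^p\right)^{1/p},$$
so $p=2$, $p=1$, $p=-1$ recover QM, AM, HM respectively.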

Jensen's inequality in a probabilistic context says that for any $g : D \to \mathbb R$ (domain $D = [0,\infty)$ or $(0,\infty)$) satisfying the property [($\star$): for all $x_0$ in the domain $D$ of $g$, $g(x_0)$ equals the supremum of $L(x_0)$ over all linear functions $L$ below $g$ --- i.e. all $L:D\to \mathbb R$ s.t. $L \leq g$ on $D$], and any random variable $X$, $$\mathbb E[g(X)] \geq g(\mathbb E[X])$$ (which I remember with the silly mnemonic "eggs are better than geeks"). The proof is very easy: by linearity of expectation, for any linear function $L\leq g$ we get $L(\mathbb E[X]) = \mathbb E[L(X)] \leq \mathbb E[g(X)]$. Then on the LHS take the supremum over all lines $L \leq g$!

The fact that convex functions $g:D \to \mathbb R$ satisfy the property ($\star$) probably uses some derivatives, but should be intuitively obvious.

Obvious modifications lead to a version of Jensen's inequality for concave functions (and the resulting inequality goes in the other direction).


Finally, we apply Jensen's inequality. Let $q \geq p$ be (nonzero) real numbers, and let $U$ be the uniform random variable as above. To prove that S.B.'s function $M(p)$ is increasing in $p$, we want to show (in our probabilistic language) that $$\mathbb E[U^q]^{1/q} \geq \mathbb E[U^p]^{1/p}. \qquad (!)$$ For $\color{red}{q >0}$, raising both sides to the $q$ we see that (!) is equivalent to $\mathbb E[U^q]\geq \mathbb E[U^p]^{q/p}.$

But for $\color{red}{q <0}$, raising both sides to the $q$ we see that (!) is equivalent to $\mathbb E[U^q]\leq \mathbb E[U^p]^{q/p}.$

Let $X = U^p$ (also a uniform random variable taking positive values), so then $U^q = X^{q/p}$. And take $g(x) = |x|^{q/p}$ on domain $D=(0,\infty)$, which is convex if $q>0$ (both for $p>0$ and $p<0$) and concave if $q<0$. Jensen's inequality (the convex and concave versions for $q>0$, $q<0$ resp.) gives EXACTLY (!).
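For instance, $q=2$, $p=1$ gives $g(x)=x^2$ and $X=U$, so (!) reads $\mathbb E[U^2]^{1/2}\geq \mathbb E[U]$, which is QM $\geq$ AM; and $q=1$, $p=-1$ gives the convex $g(x)=1/x$ on $(0,\infty)$ and $X=U^{-1}$, so (!) reads $\mathbb E[U]\geq \mathbb E[U^{-1}]^{-1}$, which is AM $\geq$ HM.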

D.R.
  • 10,556
1

Hint for AM-GM:

Note that

$$(x_1x_2)(x_3x_4)\leq\left[\frac{x_1+x_2}{2}\right]^2\left[\frac{x_3+x_4}{2}\right]^2 \leq \left[\frac1{4}\sum_{i=1}^{4}x_i\right]^4.$$
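Both inequalities here are just the two-variable AM-GM: first applied to the pairs $(x_1,x_2)$ and $(x_3,x_4)$, then applied once more to the two block averages $\frac{x_1+x_2}{2}$ and $\frac{x_3+x_4}{2}$.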

Use this to prove by induction that

$$\left[\prod_{i=1}^{2^n}x_i\right]^{1/2^n} \leq \frac1{2^n}\sum_{i=1}^{2^n}x_i.$$
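One way to carry out the induction step, following the pattern of the four-variable display: split the $2^{n+1}$ numbers into two blocks of $2^n$, so that
$$\left(\prod_{i=1}^{2^{n+1}}x_i\right)^{1/2^{n+1}} = \sqrt{G_1 G_2} \;\leq\; \sqrt{A_1 A_2} \;\leq\; \frac{A_1+A_2}{2} = \frac{1}{2^{n+1}}\sum_{i=1}^{2^{n+1}} x_i,$$
where $G_1, A_1$ (resp. $G_2, A_2$) are the geometric and arithmetic means of the first (resp. second) block; the first inequality is the induction hypothesis and the second is the two-variable AM-GM.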

If $n$ is not a power of $2$, then choose $m$ and $q$ such that $n+q=2^m$ and apply the previous result to $x_1,x_2,\ldots,x_n,A_n,\ldots,A_n$, where $A_n$ is repeated $q$ times and is the arithmetic average

$$A_n=\frac1{n}\sum_{i=1}^{n}x_i.$$
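Carrying out that hint (a brief sketch): the previous result applied to these $2^m$ numbers gives
$$\left(x_1 x_2\cdots x_n A_n^{\,q}\right)^{1/2^m} \;\leq\; \frac{x_1+\dots+x_n+qA_n}{2^m} = \frac{nA_n+qA_n}{2^m} = A_n,$$
so $x_1\cdots x_n A_n^{\,q} \leq A_n^{2^m} = A_n^{\,n+q}$; dividing by $A_n^{\,q}$ and taking $n$-th roots gives $(x_1\cdots x_n)^{1/n} \leq A_n$, i.e. AM-GM for arbitrary $n$.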

RRL
  • 92,835
  • ok. seems convincing. What about the others? Roughly, I get the idea, but are there other methods? –  Jun 20 '14 at 14:27
  • There are -- as others are showing -- but that is the straightforward induction approach. The far right inequality follows directly from AM-GM: switch $x_i$ with $1/x_i$ – RRL Jun 20 '14 at 14:41