
I am reading the book by McCabe and Tremayne “Elements of modern asymptotic theory with statistical applications” and in Chapter 8 about Brownian motion I ran into this inequality:

$$2\sum_{i=1}^{n}(t_{i,n} - t_{i-1,n})^2 \ \leq\ 2\max(t_{i,n} - t_{i-1,n}) \sum_{i=1}^{n}(t_{i,n} - t_{i-1,n}).$$

In a statistical context, $(t_{i,n} - t_{i-1,n})$ is the variance of the increment $W(t_{i,n})-W(t_{i-1,n})$ of a Brownian motion. But in a mathematical sense it can, in my opinion, be seen as a distance. In short, the inequality says that the sum of squared distances is less than or equal to the maximum distance times the sum of those distances.

My question is: what is this inequality, and why does it hold? I tried searching for it on Google but found nothing (probably because I don't know its name). I also tried plugging in numbers, and it does indeed hold, but I struggle to understand why.

Rócherz
  • 4,241

2 Answers


Suppose you have a finite set of positive numbers $a$, $b$, $c$, $\ldots$, $z$. Take the maximum of them all and call it $M$. You agree that $M$ is greater than or equal to every number in your set: $M \ge a$, $M \ge b$, $M \ge c$, and so on. Since each number is positive, you can multiply both sides of each inequality by that number without reversing its direction: $Ma \ge a^2$, $Mb \ge b^2$, $Mc \ge c^2$, and so on. Then sum all the left- and right-hand sides of these inequalities: $$M(a+b+c+\ldots+z) \ge (a^2+b^2+c^2+\ldots+z^2).$$ That is your inequality right there.
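As a quick numeric sanity check of the argument above (with arbitrary, made-up positive values), you can verify that $M$ times the sum always dominates the sum of squares:

```python
import random

# Fixed seed so the check is reproducible.
random.seed(0)

# A finite set of (hypothetical) positive numbers.
xs = [random.uniform(0.01, 5.0) for _ in range(100)]

# Sum of squares vs. maximum times the sum.
lhs = sum(x * x for x in xs)
rhs = max(xs) * sum(xs)
print(lhs <= rhs)  # prints True
```

Each term $x_i^2$ is bounded by $\max(xs)\cdot x_i$, so the check holds for any choice of positive values, not just this sample.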


Let $x_1$, $x_2$, $\ldots$, $x_n$ be some positive reals. Also let $x^\ast=\max_{i=1,\ldots,n}\{x_i\}$. Then: $$\sum_{i=1}^{n}x^2_i\leq\sum_{i=1}^{n}x^\ast x_i= x^\ast\sum_{i=1}^{n}x_i.$$ Now plug in $x_i=(t_{i,n}-t_{i-1,n})$.
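To connect this back to the question, here is a small check with a hypothetical uneven partition $0 = t_0 < t_1 < \ldots < t_n$ of $[0,1]$, taking $x_i = t_{i,n} - t_{i-1,n}$ as in the answer:

```python
# Hypothetical partition points of [0, 1] (any increasing sequence works).
t = [0.0, 0.1, 0.25, 0.5, 0.7, 1.0]

# Increments x_i = t_i - t_{i-1}, all positive.
x = [b - a for a, b in zip(t, t[1:])]

# The book's inequality: 2 * sum of squared increments
# <= 2 * (max increment) * (sum of increments).
lhs = 2 * sum(d * d for d in x)
rhs = 2 * max(x) * sum(x)
print(lhs <= rhs)  # prints True
```

Note that as the partition is refined, $\max(x_i) \to 0$ while $\sum x_i$ stays equal to the interval length, which is exactly why this bound is useful in the Brownian-motion context.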
