
For a sequence $\{x_n\}_{n=1}^{\infty}$, define $$\Delta x_n:=x_{n+1}-x_n,~\Delta^2 x_n:=\Delta x_{n+1}-\Delta x_n,~(n=1,2,\ldots)$$ which are called the first-order and second-order differences, respectively.
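For readers who want to experiment numerically, both difference operators can be computed with `numpy.diff` (a small illustrative sketch, not part of the problem itself):

```python
import numpy as np

def first_difference(x):
    """Delta x_n = x_{n+1} - x_n."""
    return np.diff(x)

def second_difference(x):
    """Delta^2 x_n = Delta x_{n+1} - Delta x_n."""
    return np.diff(x, n=2)

# Example: for an arithmetic sequence the first difference is constant
# and the second difference vanishes identically.
x = np.arange(0.0, 10.0, 2.0)   # 0, 2, 4, 6, 8
print(first_difference(x))       # all entries equal 2
print(second_difference(x))      # all entries equal 0
```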

The problem is stated as follows:

Let $\{x_n\}_{n=1}^{\infty}$ be bounded and satisfy $\lim\limits_{n \to \infty}\Delta^2 x_n=0$. Prove or disprove: $\lim\limits_{n \to \infty}\Delta x_n=0.$

Intuitively, the conclusion is likely to be true. Since $\lim\limits_{n \to \infty}\Delta^2 x_n=0,$ consecutive differences $\Delta x_n$ become almost equal as $n$ increases, so $\{x_n\}$ eventually looks like an arithmetic sequence. If $\lim\limits_{n \to \infty}\Delta x_n \neq 0$, such a sequence could not stay bounded.

But how can this be proved rigorously?

Martin R
WuKong
  • The “equivalent” question for differentiable functions is discussed here: https://math.stackexchange.com/q/38811/42969. – Martin R Oct 18 '19 at 11:37

5 Answers


Yes: If $(x_n)$ is bounded and $\lim_{n \to \infty}\Delta^2 x_n = 0$ then $\lim_{n \to \infty}\Delta x_n = 0$. That is a consequence of the following general estimate:

If $(x_n)$ is a sequence with $|x_n| \le M$ and $|\Delta^2 x_n| \le K$ for all $n$, then $$ \tag{*} |\Delta x_n|^2 \le 4MK $$ for all $n$.

In our case $\lim_{n \to \infty}\Delta^2 x_n=0$, so that the above can be applied to tail sequences $(x_n)_{n \ge n_0}$ with $K$ arbitrarily small, and $\lim_{n \to \infty}\Delta x_n=0$ follows.

Proof of the claim. It suffices to prove $(*)$ for $n=0$. Without loss of generality assume that $\Delta x_0 \ge 0$. We have $$ x_n = x_0 + \sum_{j=0}^{n-1} \Delta x_j = x_0 + \sum_{j=0}^{n-1} \left( \Delta x_0 + \sum_{k=0}^{j-1} \Delta^2 x_k \right) \\ = x_0 + n \Delta x_0 + \sum_{j=0}^{n-1}\sum_{k=0}^{j-1} \Delta^2 x_k \, . $$ Using the given bounds $-M \le x_n \le M$ and $\Delta^2 x_n \ge -K$ it follows that $$ M \ge -M + n \Delta x_0 - \frac{(n-1)n}{2}K \\ \implies 0 \le \frac{(n-1)n}{2}K - n \Delta x_0 + 2M $$

If $K=0$ then $0 \le \Delta x_0 \le 2M/n$ implies $\Delta x_0 = 0$, and we are done. Otherwise the quadratic inequality can be rearranged (by “completing the square”) to $$ 0 \le \left(n - \left(\frac{\Delta x_0}{K} + \frac 12 \right) \right)^2 + \frac{4M}{K} - \left(\frac{\Delta x_0}{K} + \frac 12 \right)^2 \, . $$

Now choose the non-negative integer $n$ such that $\left| n - \left(\frac{\Delta x_0}{K} + \frac 12 \right) \right| \le \frac 12$. Then $$ 0 \le \frac 14 + \frac{4M}{K} - \left(\frac{\Delta x_0}{K} + \frac 12 \right)^2 = \frac{4M}{K} - \left(\frac{\Delta x_0}{K} \right)^2 - \frac{\Delta x_0}{K} \\ \le \frac{4M}{K} - \left(\frac{\Delta x_0}{K} \right)^2 $$ and the desired conclusion $(*)$ follows.
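As a numerical sanity check (not a proof), one can sample a concrete bounded sequence and compare both sides of $(*)$. For $x_n = \sin(n)$ the suprema $M$ and $K$ are well approximated by maxima over a long range, and the bound holds with room to spare:

```python
import numpy as np

# Sample x_n = sin(n) over a long range; the empirical maxima below
# approximate the true suprema M = sup|x_n| and K = sup|Delta^2 x_n|.
n = np.arange(100_000)
x = np.sin(n)

dx  = np.diff(x)        # Delta x_n
d2x = np.diff(x, n=2)   # Delta^2 x_n

M = np.max(np.abs(x))
K = np.max(np.abs(d2x))
lhs = np.max(np.abs(dx)) ** 2

# The estimate (*): |Delta x_n|^2 <= 4 M K.
print(lhs, "<=", 4 * M * K, ":", lhs <= 4 * M * K)
```

For this particular sequence one can even compute the suprema in closed form ($\sup|\Delta x_n| = 2\sin\frac12$, $K = 4\sin^2\frac12$), which shows the left-hand side is about a quarter of the right-hand side here.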


Remarks: There is a “similar” inequality for differentiable functions:

Let $f: \Bbb R \to \Bbb R$ be twice differentiable. Then $$ \tag{**}\sup_{x \in \Bbb R} \left| f'\left( x\right) \right| ^{2}\le 4\sup_{x \in \Bbb R} \left| f\left( x\right) \right| \sup_{x \in \Bbb R} \left| f''\left( x\right) \right|$$

which goes back to Edmund Landau.

The proofs resemble each other: We have $$ f(t) = f(0) + t f'(0) + \int_{u=0}^t \int_{v=0}^u f''(v) \, dv \, du $$ which implies $$ 0 \le \frac{t^2}2 \sup_{x \in \Bbb R} \left| f''\left( x\right) \right| - t f'(0) + 2 \sup_{x \in \Bbb R} \left| f\left( x\right) \right| \, . $$ Then $t$ is chosen such that the right-hand side is minimal. The same is done in the above proof for sequences, except that $n$ is restricted to integers and cannot be chosen arbitrarily.

Landau also proved that the factor $4$ in $(**)$ is best possible. It would be interesting to know if $4$ is also the best possible factor for sequences in $(*)$.

Martin R
  • How to guarantee $\exists n \in \mathbb{N^+}$ such that $\left| n - \left(\frac{\Delta x_0}{K} + \frac 12 \right) \right| \le \frac 12$? – WuKong Oct 18 '19 at 14:56
  • @mengdie1982: For every real number $x \ge 0$ there is an integer $n \ge 0$ with $|n - x| \le \frac 12$. Just round the real number to the nearest integer. – Martin R Oct 21 '19 at 11:35

Suppose that $\Delta x_n \not\to 0$. Then there is - without loss of generality, replace $x_n$ with $-x_n$ if necessary - a $c > 0$ such that $\Delta x_n > 2c$ for infinitely many $n$. Now for every $\varepsilon > 0$ there exists an $N_{\varepsilon}$ such that $\lvert \Delta^2 x_n\rvert < \varepsilon$ for all $n \geqslant N_{\varepsilon}$. Pick an $n_1 \geqslant N_{\varepsilon}$ with $\Delta x_{n_1} > 2c$. Then $$\Delta x_{n_1 + k} \geqslant \Delta x_{n_1} - k\varepsilon > c$$ for $0 \leqslant k \leqslant c/\varepsilon$. It follows that $$x_{n_1+k+1} - x_{n_1} = \sum_{\kappa = 0}^k \Delta x_{n_1 + \kappa} > (k+1)\cdot c$$ for $0 \leqslant k \leqslant c/\varepsilon$, and thus

$$\sup_n x_n - \inf_n x_n \geqslant \frac{c^2}{\varepsilon}$$ for every $\varepsilon > 0$, which says that $x_n$ is unbounded.
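To see the growth quantitatively, one can build the extremal sequence from this argument: start with $\Delta x_{n_1} = 2c$ and let each step lose exactly $\varepsilon$, so that $|\Delta^2 x_n| = \varepsilon$. The partial sums then spread by at least $c^2/\varepsilon$ (an illustrative sketch with arbitrary sample values for $c$ and $\varepsilon$):

```python
import numpy as np

c, eps = 1.0, 0.01          # sample values: c bounds Delta x below, eps bounds |Delta^2 x|
K = int(np.floor(c / eps))  # number of steps during which Delta x stays above c

# Differences Delta x_k = 2c - k*eps for k = 0..K, so Delta^2 x_k = -eps exactly.
dx = 2 * c - eps * np.arange(K + 1)
assert np.all(dx > c - 1e-9)        # Delta x indeed stays above c on this range

spread = np.sum(dx)                  # x_{n_1 + K + 1} - x_{n_1}
print(spread, ">=", c**2 / eps)      # the spread exceeds c^2 / eps
```

Shrinking `eps` makes the guaranteed spread `c**2 / eps` arbitrarily large, which is exactly the contradiction with boundedness.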

Daniel Fischer
  • Isn't it sufficient to use that $\sum_{n=1}^N \Delta x_n=x_N-x_1$ and that $x_n$ is bounded? – user Oct 17 '19 at 13:37
  • @user No. Because $\Delta x_n = (-1)^n$ leads to a bounded $x_n$. You have to prove that the limit on $\Delta^2x_n$ prevents any kind of alternating shenanigans like that from happening. – Arthur Oct 17 '19 at 13:38
  • @Arthur Yes of course! I was confused on that point assuming the $\sum$ convergent. – user Oct 17 '19 at 13:40
  • Since $0 \leqslant k \leqslant c/\varepsilon$, is $(k+1)c\geq \frac{c^2}{\varepsilon}$? – WuKong Oct 17 '19 at 14:24
  • @mengdie1982 Not quite, the last doesn't hold for the small $k$, but for $k = \lfloor c/\varepsilon\rfloor$ we have $(k+1)c > (c/\varepsilon)\cdot c$. – Daniel Fischer Oct 17 '19 at 14:27
  • @DanielFischer Sir, can we alter the last few steps to avoid the upper/lower limit? In fact, since $\varepsilon>0$ is arbitrary, we can always choose a sufficiently small $\varepsilon$ such that $c^2/\varepsilon$ becomes as large as desired. Thus, there exist at least two terms whose difference can be as large as desired. Hence $\{x_n\}$ cannot be bounded. – WuKong Oct 17 '19 at 15:34
  • @mengdie1982 I don't understand what "alternate the last few steps" means, but "two items whose difference can be as large as desired" is the point I arrive at. I have then taken the supremum and infimum of all the $x_n$ to get rid of specific indices, but that's not necessary. – Daniel Fischer Oct 17 '19 at 15:56

Let $\{x_n\}$ be bounded by $X$ (i.e. $|x_n|< X$ for all $n$) and $\lim_{n\to\infty} \Delta^2x_n = 0$. Take an arbitrary $\varepsilon>0$. I will show that there is an $N\in \Bbb N$ such that $|\Delta x_n|<\varepsilon$ for all $n>N$, thus showing that $\lim_{n\to\infty}\Delta x_n = 0$.

First fix some natural number $m\geq \frac{4X}{\varepsilon} + 1$ (but also make sure that $m\geq 3$). Next, fix some $N$ such that $|\Delta^2x_n|<\frac{\varepsilon}{m-2} = Y$ for all $n>N$. Now assume, for contradiction, that there is some $n>N$ such that $\Delta x_n\geq\varepsilon$. Then we have $$ \begin{align} x_{m+n} - x_n &= \Delta x_n + \Delta x_{n+1} + \Delta x_{n+2} + \cdots + \Delta x_{n+m-1}\\ &\geq \Delta x_n + (\Delta x_{n} - Y) + (\Delta x_n - 2Y) + \cdots + (\Delta x_{n} - (m-2)Y)\\ &= (m-1)\Delta x_n - \frac{(m-1)(m-2)}{2}Y\\ &\geq (m-1)\varepsilon - \frac{(m-1)(m-2)}{2}Y\\ &= (m-1)\left(\varepsilon - \frac{m-2}2 \cdot \frac{\varepsilon}{m-2}\right)\\ &\geq \left(\frac{4X}\varepsilon + 1 - 1\right)\cdot \frac\varepsilon2\\ &= 2X \end{align} $$ But $|x_n|<X$ and $|x_{n+m}|<X$, so we can't have $x_{n+m} - x_n\geq 2X$. Thus we have a contradiction. So we must have $\Delta x_n < \varepsilon$ for all $n>N$. A very similar contradiction argument shows that $\Delta x_n > -\varepsilon$. It follows that $\Delta x_n\to 0$.


It may look like $\frac{4X}\varepsilon + 1$ and $\frac{\varepsilon}{m-2}$ are pulled out of thin air, and that it just magically works out in the end. This is not the case. They are derived the following way:

We want some natural number $m$ to indicate how many $\Delta x_n$ terms we are adding together, and we want some $Y$ to bound $\Delta^2x_n$. With those bounds named, but without knowing what they are, we can actually do most of the working-out above. We get $$ \begin{align} x_{m+n} - x_n &\geq\Delta x_n + (\Delta x_{n} - Y) + (\Delta x_n - 2Y) + \cdots + (\Delta x_{n} - (m-2)Y)\\ &\geq \varepsilon + (\varepsilon - Y) + (\varepsilon - 2Y) + \cdots + (\varepsilon - (m-2)Y)\\ & \geq (m-1)\left(\varepsilon - \frac{m-2}{2}Y\right) \end{align} $$ We want $\varepsilon - (m-2)Y\geq 0$. (We don't want to add enough terms that we allow $\Delta x_{n+m}$ to become negative again. That's wasting terms.) And we want the final $(m-1)\left(\varepsilon - \frac{m-2}{2}Y\right)$ to be at least $2X$. Solving these two inequalities gives $m\geq \frac{4X}{\varepsilon} + 1$ and $Y\leq \frac{\varepsilon}{m-2}$, which is what I used in the proof above.

Note that we don't really need $m\geq 3$. If $\frac{4X}\varepsilon \leq 1$, and we happen to pick $m = 2$, then we can choose whichever value we want for $Y$, and the argument works out in the end. However, because the general expressions require division by $m-2$, I added the $m\geq 3$ requirement for simplicity.
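The two parameter choices can be checked mechanically: with $m \ge \frac{4X}{\varepsilon}+1$ and $Y = \frac{\varepsilon}{m-2}$, the accumulated sum indeed reaches $2X$ (a quick check with arbitrary sample values for $X$ and $\varepsilon$):

```python
import math

X, eps = 5.0, 0.1                       # sample bound on |x_n| and sample tolerance
m = max(3, math.ceil(4 * X / eps) + 1)  # number of terms, as in the proof
Y = eps / (m - 2)                       # bound on |Delta^2 x_n|

# The lower bound derived in the answer for x_{m+n} - x_n:
lower_bound = (m - 1) * (eps - (m - 2) / 2 * Y)

print(lower_bound, ">=", 2 * X)         # reaches the contradiction threshold 2X
```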

Arthur
  • I think that this might possibly be the heaviest $\epsilon$-$\delta$ type proof I have actually worked through from beginning to end. I do not envy anyone who gets this on an assignment, and I sincerely hope it never appears on any exam. – Arthur Oct 17 '19 at 14:21
  • Nice. Would you agree that the proof by contradiction is easier to find than yours? – Daniel Fischer Oct 17 '19 at 14:31
  • @DanielFischer I couldn't avoid some contradiction myself. I think it may be possible to avoid by going through some of the inequalities in reverse order, but that seems way too opaque for my taste. $\frac{4X}\varepsilon + 1$ is bad enough. – Arthur Oct 17 '19 at 14:34
  • Yep. I meant "suppose it doesn't converge to $0$ and deduce that the sequence cannot be bounded". I started that way, and everything just unrolled naturally from that point. – Daniel Fischer Oct 17 '19 at 14:38
  • It may be instructive to think of the continuous analogue: If $|f| \le M$ and $|f''|\le \epsilon$ then $|f'| \le 4 M \epsilon$ (or something like that). – Martin R Oct 17 '19 at 14:43
  • @DanielFischer Yeah, I think you're right. The final expression is not how I would naturally think to express what you had deduced, without some intermediate explanation. That being said, mine isn't too hard to actually find either. It comes rather naturally from "What inequalities do I need, and how do I make that happen?" – Arthur Oct 17 '19 at 14:44
  • @MartinR The continuous analogue is a nice comparison, but unfortunately you can't translate between statements about differences of sequences and statements about derivatives that easily. One can always find results that hold in one world and not the other. That being said, the fundamental theorem of calculus and the mean value theorem makes the continuous version much easier to prove. – Arthur Oct 17 '19 at 14:46
  • @Arthur: Yes, but in this case it turned out that the results (and the proofs) are quite similar. – Martin R Oct 21 '19 at 11:39

I wonder if this proof is correct.

Assume $\Delta x_n$ does not converge to $0$; then there are infinitely many $n$ with $\Delta x_n>c$ (or $\Delta x_n<-c$). Since $\Delta^2x_n \to 0$, for every $\epsilon>0$ there is an $N>0$ such that $|\Delta x_{n+1}-\Delta x_n|<\epsilon$ for all $n>N$. Since infinitely many $n$ satisfy $\Delta x_n>c$, for such an $n$ with $n>N$ we can write $|\Delta x_{n+1}-c|<\epsilon$. And for sufficiently large $m$, we can say that $\sup_{m}x_m-\inf_{m}x_m=2\epsilon M$, where $M$ is the number of $m$'s matching the two conditions, and this gives a contradiction.

hskimse

Since $x_n$ is bounded, choose $M$ so that $|x_n|\le M$.

Suppose that $\lim\limits_{n\to\infty}\Delta x_n\ne0$. Then $\exists\epsilon\gt0:\forall n_0,\exists n\ge n_0:|\Delta x_n|\ge\epsilon$.

Let $\delta=\frac{\epsilon^2}{6M}$. Since $\lim\limits_{n\to\infty}\Delta^2x_n=0$, choose $n_0$ so that if $n\ge n_0$, we have $\left|\Delta^2x_n\right|\le\delta$.

Choose $n\ge n_0$ so that $|\Delta x_n|\ge\epsilon$.

Let $k=\left\lceil\frac{6M}\epsilon\right\rceil$. Note that $(k-1)\delta\lt\epsilon$ and $k\epsilon\ge6M$.

By the choice of $n_0$ and $n$, $$ |\Delta x_{n+j}|\ge\epsilon-j\delta $$ Therefore, $$ \begin{align} |x_{n+k}-x_n| &\ge\sum_{j=0}^{k-1}(\epsilon-j\delta)\\ &=k\epsilon-\frac{k(k-1)}2\delta\\[3pt] &\gt\frac{k\epsilon}2\\[9pt] &\ge3M \end{align} $$ which contradicts the choice of $M$. Thus, $$ \lim_{n\to\infty}\Delta x_n=0 $$
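The arithmetic behind the choices $\delta=\frac{\epsilon^2}{6M}$ and $k=\left\lceil\frac{6M}\epsilon\right\rceil$ can be verified directly (sample values for $M$ and $\epsilon$ chosen purely for illustration):

```python
import math

M, eps = 2.0, 0.05               # sample bound on |x_n| and sample epsilon
delta = eps**2 / (6 * M)         # bound on |Delta^2 x_n|
k = math.ceil(6 * M / eps)       # number of terms summed

assert (k - 1) * delta < eps     # so each |Delta x_{n+j}| >= eps - j*delta stays positive
assert k * eps >= 6 * M

# The telescoped lower bound on |x_{n+k} - x_n|:
total = k * eps - k * (k - 1) / 2 * delta
print(total, ">", 3 * M)         # exceeds 3M, contradicting |x_n| <= M
```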

robjohn
  • In fact, the argument above shows that $$ 2\|x\|_\infty\ge m\|\Delta x\|_\infty-\frac{m^2}2\left\|\Delta^2x\right\|_\infty $$ Therefore, $$ \begin{align} \|\Delta x\|_\infty &\le\frac2m\|x\|_\infty+\frac{m}2\left\|\Delta^2x\right\|_\infty\\ &=2\sqrt{\|x\|_\infty\left\|\Delta^2x\right\|_\infty} \end{align} $$ where $$ m=\sqrt{\frac{4\|x\|_\infty}{\left\|\Delta^2x\right\|_\infty}}\ge1 $$ – robjohn Oct 18 '19 at 09:07
  • I had not seen that your computations lead to the same estimate, my answer was derived independently :) – Martin R Oct 18 '19 at 10:57
  • Although our answers use similar estimates, I can tell yours was not copied from mine. However, it was your result that prompted me to add my comment. I had thought I might be able to get a better constant, but ended up with yours. – robjohn Oct 18 '19 at 13:26
  • I just found out that the corresponding result for $C^2$ functions goes back to Landau, who also proved that the constant $2$ is best possible. I'll read the Landau article later, and see if that gives some inspiration to show that the constant is also best possible in the discrete case. – Martin R Oct 18 '19 at 13:35