
A somewhat silly mathematical diversion was proposed to me by a friend, and I have reduced the question to the following one:

Let $f(x)=e^{-\frac{1}{x^2}}$. Given some $x\in \mathbb{R}$ with $0< x < 1$, find asymptotics for the sequence of values $a_n$ defined by iterating the Newton map associated to $f$, with initial value $a_0=x$. Since $f'(x) = \frac{2}{x^3}e^{-\frac{1}{x^2}}$, the Newton map is $N(x) = x - \frac{f(x)}{f'(x)} = x - \frac{x^3}{2}$; assuming I did not make a calculational blunder, we have the recurrence relation \begin{equation} a_{n+1} = a_n - \frac{a_n^3}{2},\end{equation} and one can easily see by the monotone convergence theorem that this sequence converges to $0$ for any initial $x$ in the specified range.
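(For concreteness, here is a quick numerical sketch of that iteration; the helper name `newton_step` is mine, just for illustration.)

```python
# Quick numerical sketch of the iteration a_{n+1} = a_n - a_n^3 / 2.

def newton_step(a):
    """One step of the Newton map N(x) = x - x^3/2 for f(x) = exp(-1/x^2)."""
    return a - a**3 / 2

a = 0.5  # any a_0 in (0, 1) behaves the same way
values = [a]
for _ in range(10000):
    a = newton_step(a)
    values.append(a)

# Positive and strictly decreasing, hence convergent by monotone convergence.
assert all(v > 0 for v in values)
assert all(u > v for u, v in zip(values, values[1:]))
print(values[-1])  # roughly 0.01 after 10^4 steps
```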

Now, what I really desire are estimates on this convergence, but there are immediate difficulties: $f$ is not analytic at zero, where its Taylor series is identically zero. What can one say about the rate of convergence in this situation, if anything? The standard convergence proofs require $f'(y)\neq 0$ at the root $y$, or at least some $s\in \mathbb{N}$ such that $f^{(s)}(y)\neq 0$.

I know that this topic has been hammered to death by students in much less pathological situations, so please forgive me if this initially seems like spam. I am somewhat skeptical that general considerations for Newton's method can say anything here, but this is also an iterated function system for a cubic polynomial, so I am mildly hopeful that a Dynamicist can shed some light. I am a reasonably mathematically mature graduate student in SCV, but have almost no exposure to dynamics.

Alp Uzman
npdotrand

2 Answers


Rewrite the recurrence in the difference form $a_{n+1} - a_n = - \frac{a_n^3}{2}$ and, heuristically, consider the related differential equation

$$a'(x) = - \frac{a(x)^3}{2}.$$

This admits a closed-form solution given by

$$\begin{align*} \int - \frac{2a'(x)}{a(x)^3} \, dx &= \int 1 \, dx \\ \frac{1}{a(x)^2} &= x + C \\ a(x) &= \frac{1}{\sqrt{x + C}} \end{align*}$$

which suggests that we ought to expect $\boxed{ a_n \stackrel{?}{=} O \left( \frac{1}{\sqrt{n}} \right) }$, and suggests we try the substitution $b_n = \frac{1}{a_n^2}$. This changes the recurrence to

$$\begin{align*} b_{n+1} &= \frac{b_n}{\left( 1 - \frac{1}{2b_n} \right)^2} \\ &= b_n + 1 + \frac{3}{4b_n} + O \left( \frac{1}{b_n^2} \right) \end{align*}$$
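(As a quick numerical check of this expansion, not part of the original argument: the remainder after the $\frac{3}{4b_n}$ term should decay like $\frac{1}{b_n^2}$; multiplying it by $b_n^2$ exposes the next series coefficient, $\frac{1}{2}$.)

```python
# Check that b/(1 - 1/(2b))^2 = b + 1 + 3/(4b) + O(1/b^2):
# the remainder times b^2 should settle near 1/2, the next series coefficient.

def exact(b):
    return b / (1 - 1 / (2 * b)) ** 2

def truncated(b):
    return b + 1 + 3 / (4 * b)

for b in [10.0, 100.0, 1000.0]:
    remainder = exact(b) - truncated(b)
    print(b, remainder * b**2)  # tends toward 0.5 as b grows
```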

which suggests a rate of growth for $b_n$ that looks like $n + \frac{3 \log n}{4} + O(1)$. It's not so hard to at least get a lower bound on $b_n$ (hence an upper bound on $a_n$) from here; applying the simple inequality $\frac{1}{1 - x} \ge 1 + x$ for $x \in [0, 1)$ (valid here since $\frac{1}{2b_n} = \frac{a_n^2}{2} < \frac{1}{2}$) gives

$$b_{n+1} \ge b_n \left( 1 + \frac{1}{2b_n} \right)^2 \ge b_n + 1$$

and induction starting from $a_0 = x, b_0 = \frac{1}{x^2}$ gives

$$b_n \ge n+\frac{1}{x^2} \Leftrightarrow \boxed{ a_n \le \frac{1}{\sqrt{n+\frac{1}{x^2}}} }.$$
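(This bound is easy to check numerically; here is a small sketch, with a helper name of my own choosing, verifying it along a few orbits.)

```python
import math

def check_upper_bound(x, steps=2000):
    """Verify a_n <= 1/sqrt(n + 1/x^2) along the orbit of a_0 = x."""
    a = x
    for n in range(steps + 1):
        assert a <= 1 / math.sqrt(n + 1 / x**2) + 1e-12
        a = a - a**3 / 2

for x in [0.1, 0.5, 0.9]:
    check_upper_bound(x)
print("upper bound verified")  # no assertion fires
```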

So $x$ turns out to matter relatively little. We can get a matching bound in the other direction by proving that

$$\frac{x}{(1 - \frac{1}{2x})^2} \le x + 1 + \frac{1}{x-\frac{1}{2}}$$

for all $x \ge 1$. The difference of the two sides works out to $\frac{x-1}{(2x-1)^2}$, so this is clear. Together with the lower bound $b_n \ge n + \frac{1}{x^2} \ge n + 1$ (recall $0 < x < 1$, so $\frac{1}{x^2} > 1$), this gives

$$b_{n+1} \le b_n + 1 + \frac{1}{b_n - \frac{1}{2}} \le b_n + 1 + \frac{1}{n + \frac{1}{2}}$$

which telescopes to

$$b_n \le n + \frac{1}{x^2} + H_n' \Leftrightarrow \boxed{ a_n \ge \frac{1}{\sqrt{n + H_n' + \frac{1}{x^2}}} }$$

where $H_n' = \sum_{k=0}^{n-1} \frac{1}{k + \frac{1}{2}}$ is a slightly offset version of the harmonic numbers, and $H_n' = \log n + O(1)$ as with the usual harmonic numbers. We could be more precise, but there's not much point, since this argument already loses a constant factor in front of that logarithm.
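(The two boxed bounds squeeze $a_n \sqrt{n}$ toward $1$, since the $\frac{1}{x^2}$ and $\log n$ corrections are negligible against $n$. A short sketch, with a helper `orbit` of my own, makes this visible.)

```python
import math

def orbit(x, n):
    """Return a_n for the recurrence a_{k+1} = a_k - a_k^3/2 with a_0 = x."""
    a = x
    for _ in range(n):
        a = a - a**3 / 2
    return a

x = 0.5
for n in [10**3, 10**4, 10**5]:
    print(n, orbit(x, n) * math.sqrt(n))  # creeps up toward 1 from below
```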

Qiaochu Yuan

On the interval $a_n \in [2^{-k-1}, 2^{-k}]$, the value of $\frac{a_n^3}{2}$ is between $2^{-3k-4}$ and $2^{-3k-1}$, so it takes between $2^{2k}$ and $2^{2k+3}$ steps to cross that interval.

Therefore, up to a constant factor, it takes $4^k$ steps to zero out the $k^{\text{th}}$ bit after the, uh, binary point. Since $\sum_{k=1}^m 4^k = O(4^m)$, it takes $O(4^m)$ steps to zero out $m$ bits, though of course this only kicks in once those bits don't all start out zero in $a_0$. In other words, since it takes $O(4^m)$ steps for the sequence to go below $2^{-m}$, we get convergence at the rate of $a_n = O(1/\sqrt n)$.
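(One can watch this doubling argument numerically; the helper `steps_below` is my own. Counting steps until the iterate drops below $2^{-m}$, each extra bit should cost roughly four times as many steps.)

```python
def steps_below(threshold, a=0.5):
    """Number of iterations of a -> a - a^3/2 until a < threshold."""
    n = 0
    while a >= threshold:
        a = a - a**3 / 2
        n += 1
    return n

prev = None
for m in range(2, 9):
    t = steps_below(2.0 ** -m)
    if prev is not None:
        print(m, t, t / prev)  # ratio settles toward 4 as m grows
    prev = t
```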

Misha Lavrov