4

I've been trying to do an exercise that goes as follows: Let $u_0\in\mathbb{R}$ and define for all $n\in\mathbb{N}$, $u_{n+1}=u_n+\exp(-u_n)$. Determine an asymptotic equivalent of this sequence to two terms. I'd like to see how a second-year university student could tackle this problem, as it is usually given to students at an end-of-year oral exam (ideally a solution without any magic tricks, like simply guessing what the equivalent would be and taking the difference). I tried tackling the problem in a manner similar to the following classic problem:

Let $u_0\in[0,\frac{\pi}{2}[$, and for all $n\in\mathbb{N}$, $u_{n+1}=\sin(u_n)$. Determine an asymptotic equivalent of $(u_n)$.

Here you'd use the fixed point theorem to show that $\lim_{n\to\infty}u_n=0$ and then examine for some $\alpha\in\mathbb{R}$: $$u_{n+1}^\alpha-u_n^\alpha=_{+\infty}u_n^\alpha\left(\left(1-\frac{u_n^2}{6}+o(u_n^2)\right)^\alpha-1\right)=_{+\infty}-\alpha\frac{u_n^{\alpha+2}}{6}+o(u_n^{\alpha+2})$$

And so by taking $\alpha=-2$ and summing on both sides we obtain: $$u_n\sim\sqrt{\frac{3}{n}}$$
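(Spelling out the summation step: since $u_{n+1}^{-2}-u_n^{-2}\to\frac{1}{3}$, Cesàro's lemma gives $$u_n^{-2}-u_0^{-2}=\sum_{k=0}^{n-1}\left(u_{k+1}^{-2}-u_k^{-2}\right)\sim\frac{n}{3},$$ hence $u_n^{-2}\sim\frac{n}{3}$, assuming $u_0\neq 0$.)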

But this exercise seems to resist this technique quite neatly, and all I've been able to show is that the sequence diverges. With a few Python simulations, for $u_0$ relatively close to $0$, the sequence seems to be approximately equal to $\ln(n)$, but how could I even conjecture such a result without using Python, and how could I solve the problem without any conjectures at all?
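For reference, a minimal sketch of the kind of simulation I ran (the starting value $u_0 = 0.1$ is arbitrary):

```python
import math

# Iterate u_{n+1} = u_n + exp(-u_n) and compare u_n with ln(n).
u = 0.1  # arbitrary starting value u_0, relatively close to 0
for n in range(1, 100001):
    u += math.exp(-u)  # u now holds u_n
    if n in (10, 100, 1000, 10000, 100000):
        print(f"n = {n:>6}   u_n = {u:.6f}   ln(n) = {math.log(n):.6f}")
```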

Edit: The solution should not use Stolz-Cesaro as it isn't in the exam syllabus. The sequence has to be treated in its most general case as stated above and I'd appreciate an answer that motivates every step.

J.J.T
  • 1,077
  • Check this: https://math.stackexchange.com/a/2800008/42969, and this: https://math.stackexchange.com/q/3731853/42969 – Martin R Feb 28 '25 at 15:15
  • These don't really answer my question as they're either specific examples for a given $u_0$ or the answer is to plot the results, which as I stated is against the spirit of the exercise in this context (oral exam) – J.J.T Feb 28 '25 at 15:22

3 Answers

3

First observe that the sequence is strictly increasing and diverges; this follows from the inequality $u_{n+m} \geq u_n + m \exp(-u_{n+m})$.
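In detail: since the sequence is increasing, $e^{-u_k} \geq e^{-u_{n+m}}$ for every $k \leq n+m$, so $$u_{n+m} = u_n + \sum_{k=n}^{n+m-1} e^{-u_k} \geq u_n + m\,e^{-u_{n+m}}.$$ If $u_n$ converged to a finite limit $L$, letting $n \to \infty$ with $m$ fixed would give $L \geq L + m\,e^{-L}$, which is impossible, so the sequence diverges to $+\infty$.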

Observe that, if we interpret $u_n$ as $u(n)$ for some smooth function $u$, then the condition $u_{n+1} = u_n + \exp(-u_n)$ can be thought of as $$u'(n) \cong \frac{u(n+1) - u(n)}{1} = \exp(-u(n)).$$ Because of this, I define the following ODE: $$\begin{cases} v(0) = u_0 \\ v'(t) = e^{-v(t)} \end{cases}$$

The second condition is equivalent to $(e^{v(t)})' = 1$, and so we have $e^{v(t)} = t + e^{u_0}$, which gives $v(t) = \log(t + e^{u_0})$. This suggests that $u_n \sim \log(n)$.
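As a quick numerical sanity check (not part of the argument), one can compare the iterates $u_n$ with the ODE prediction $\log(n + e^{u_0})$; the two agree closely (the choice $u_0 = 1$ below is arbitrary):

```python
import math

# Compare the iterates u_n with the ODE prediction v(n) = log(n + exp(u_0)).
u0 = 1.0  # arbitrary starting value
u = u0
for n in range(1, 10001):
    u += math.exp(-u)  # u now holds u_n
    if n in (10, 100, 1000, 10000):
        pred = math.log(n + math.exp(u0))
        print(f"n = {n:>5}   u_n = {u:.6f}   prediction = {pred:.6f}")
```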

One idea is to analyze the recursion in a way similar to what we did with the ODE, so introducing the difference operator $\Delta$ we have

$$\Delta (e^{u_n} ) = e^{u_{n+1}} - e^{u_n} = \frac{e^{u_{n+1}} - e^{u_n}}{u_{n+1} - u_n}(u_{n+1} - u_{n}) = \exp(\theta_n e^{-u_n} )$$ for some $\theta_n \in (0,1)$ (I used Lagrange's mean value theorem and the recursion formula). Now the quickest way to conclude is to use the divergence of the sequence and the Stolz-Cesaro theorem to compute

$$\lim_{n \to \infty}\frac{e^{u_n}}{n} = \lim_{n \to \infty}\frac{ \Delta(e^{u_n}) }{ \Delta(n) } = \lim_{n \to \infty} \exp(\theta_n e^{-u_n} ) = 1$$

Now we use the fact that if $a_n, b_n \to +\infty$ (or $0^+$) and $a_n \sim b_n$, then $\log(a_n) \sim \log(b_n)$.

We can show it here directly using the following trick:

$$ \lim_{n \to \infty} \frac{u_n }{\log(n)} = \lim_{n \to \infty} \frac{\log( \frac{e^{u_n}}{n})}{\log(n)} + 1 = \frac{ \log(1)}{\infty} + 1 = 1$$

Paul
  • 1,489
3

Letting $f(x) = x + e^{-x}$, since $f(x)>x$ for every $x$ we get that $(u_n)_{n\geq 0}$ is strictly increasing; since $f$ has no fixed point, and any limit of $(u_n)_{n\geq 0}$ would be a fixed point of $f$, we deduce that $(u_n)_{n\geq 0}$ diverges.

Just like Paul suggested, we can then informally pretend that $u_n = u(n)$ is a function of a continuous variable, and write $$ u'(n) \approx u_{n+1}-u_n = e^{-u_n} . $$ Like Paul described, this suggests $u_n \sim \ln n$. A sequence with a simpler asymptotic would then be $v_n := e^{u_n}$: $$ v_{n+1} - v_n = e^{u_{n+1}} - e^{u_n} = e^{u_n}\left( e^{e^{-u_n}}-1 \right) = e^{u_n} \left( e^{-u_n} + O(e^{-2u_n})\right) = 1 + O(e^{-u_n}) $$ with a $O$ that is uniform over $n\geq 1$ (since $u_n \geq u_1 \geq \inf f = 1$ for every $n\geq 1$). Summing it, we get $$ v_n = v_1-1 + n + O\left(\sum_{j=1}^{n-1} e^{-u_j} \right) = v_1-1 + n + O(u_n-u_1) . $$ But because $u_n \to \infty$, we know that $O(u_n) = o(v_n)$, hence $v_n \sim n$ and $u_n \sim \ln n$.
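Numerically (a quick check, not part of the proof), $v_n/n$ indeed tends to $1$, while $v_n - n$ grows very slowly, consistent with the $O(u_n - u_1)$ error term (the choice $u_0 = 0.5$ is arbitrary):

```python
import math

# Track v_n = exp(u_n) along the recursion; v_n/n should tend to 1,
# while v_n - n should grow slowly (consistent with the O(u_n - u_1) term).
u = 0.5  # arbitrary starting value u_0
for n in range(1, 1000001):
    u += math.exp(-u)  # u now holds u_n
    if n in (100, 10000, 1000000):
        v = math.exp(u)
        print(f"n = {n:>7}   v_n/n = {v / n:.6f}   v_n - n = {v - n:.3f}")
```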

2

Not a complete answer. We are iterating the function $f(x) = x + \exp(-x)$, so we should try to understand something about the behavior of this function! First, as you seem to have already observed, it satisfies $f(x) > x$, so it has no fixed points and the sequence diverges. Moreover, if $x$ is negative then $\exp(-x)$ is large, so the sequence quickly becomes positive, and it's plausible that we can assume WLOG that $u_0 \ge 0$.

The main thing that sticks out to me is that $\exp(-x)$ is decreasing (rapidly) as a function of $x$. This means the sequence can't grow too quickly; for example, it is not consistent with the sequence growing linearly, since then the increments would decay like $\exp(-cn)$, which is summable, forcing the sequence to stay bounded.

More generally, we see that the sequence of finite differences

$$u_{n+1} - u_n = \exp(-u_n)$$

must have the property that

  • $\sum \exp(-u_n)$ diverges
  • at the same rate as $u_n$ itself.

This first property rules out $u_n$ growing as fast as $n^{\varepsilon}$ for any $\varepsilon > 0$. So a natural next step is to check whether $u_n$ can grow logarithmically. And if $u_n \approx \ln n$ (a common choice of growth rate slower than $n^{\varepsilon}$), then $\exp(-u_n) \approx \frac{1}{n}$ and $\sum_{k=1}^n \frac{1}{k} \approx \ln n$, so this is a consistent rate of growth for the sequence to have.
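For the first of these points: if we had $u_n \geq c\,n^{\varepsilon}$ for all large $n$ and some $c > 0$, then $$\sum_n \exp(-u_n) \leq \sum_n \exp(-c\,n^{\varepsilon}) < \infty,$$ contradicting the divergence of the telescoping sum $\sum_{k=0}^{n-1} \exp(-u_k) = u_n - u_0$.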

Having guessed the dominant growth rate (but I think in a reasonably well-motivated way, and without Python), we can now write

$$u_n = \ln n + r_n$$

which gives $\exp(-u_n) = \frac{\exp(-r_n)}{n}$, so

$$u_n = \ln n + r_n = u_1 + \sum_{k=1}^{n-1} \frac{\exp(-r_k)}{k}.$$

Note that, in order to match the $\ln n$ on the LHS, $r_n$ can't be asymptotic to a nonzero constant $r$ (or else the $\ln n$ would be multiplied by $\exp(-r)$), so $r_n \to 0$. This means we can try to approximate $\exp(-r_k) \approx 1 - r_k$, which gives

$$\ln n + r_n \approx u_1 + H_{n-1} - \sum_{k=1}^{n-1} \frac{r_k}{k}$$

so we are looking for $r_n$ such that

  • $\sum_{k=1}^{n-1} \frac{r_k}{k}$ converges (to a constant affected by $u_0$)
  • at rate $r_n$ itself.

One of the simplest options we could check is $r_n \approx \frac{r}{n}$, and it has this property: the tail $\sum_{k \geq n} \frac{r}{k^2}$ is itself of order $\frac{r}{n}$. So at this point we have, plausibly,

$$\boxed{u_n \stackrel{?}{=} \ln n + \frac{r}{n} + O \left( \frac{1}{n^2} \right) }$$

where $r$ is determined by $u_0$, and then we could try to prove it.

Qiaochu Yuan
  • 468,795