
Under what conditions are the inverses of asymptotic functions themselves asymptotic? (A simple example where this is not the case is the pair of functions $\log(x)$ and $\log(2x)$: they are asymptotic, since $\log(2x) = \log 2 + \log x$, but their inverses $e^y$ and $e^y/2$ are not.)

MITjanitor
Palafox

2 Answers


This isn't meant to be comprehensive, but here is a fairly straightforward result that is also general enough to handle your question about $\pi(x)$. I'll state it in a way that might feel a bit backwards at first — but this allows us to give a concise definition by making use of the existing concept of uniform continuity.

Let $f: \mathbb R^+ \to \mathbb R^+$ be a strictly increasing function with $\lim_{x\to\infty} f(x) = \infty$, and define the conjugate function $g(x) := \log f(\exp(x))$ (which also strictly increases to infinity). If $g^{-1}$ is uniformly continuous on some interval $[a,\infty)$, then $y \sim f(x)$ implies $x \sim f^{-1}(y)$.

If $f(x) = x^n$, then the conjugate is just the linear function $nx$, which is of course uniformly continuous. A more interesting case is $f(x) = x /\log x$: its conjugate is $x - \log x$, whose derivative is bounded away from $0$ on $[2,\infty)$, and thus it has a uniformly continuous inverse. In fact it's not hard to see that this property is preserved by multiplication and division by slowly-growing functions like $\log x$ or $\log \log x$ or even $\exp(\sqrt{\log x})$, so it applies to most of the counting functions you'll encounter in Hardy and Wright.
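
As a quick numerical sanity check (just an illustrative sketch, not part of the argument; `f_inverse` is an ad-hoc bisection helper written only for this snippet), take $f(x) = x/\log x$, perturb $y = f(x)\,(1 + x^{-1/2})$ so that $y \sim f(x)$, and watch the ratio $f^{-1}(y)/x$ drift toward $1$, as the criterion predicts:

```python
import math

def f(x):
    # f(x) = x / log x is strictly increasing for x > e
    return x / math.log(x)

def f_inverse(y, lo=3.0, hi=1e15):
    # Invert the increasing function f on [lo, hi] by plain bisection.
    # Ad-hoc helper for this check; assumes f(lo) < y < f(hi).
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

for k in (2, 3, 4, 5, 6):
    x = 10.0 ** k
    y = f(x) * (1 + x ** -0.5)    # y ~ f(x): relative error x^(-1/2) -> 0
    print(k, f_inverse(y) / x)    # this ratio creeps toward 1 as k grows
```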

This result might be considered "obvious" for a fixed function $f$, so it could easily be taken for granted. But the general principle is not so hard to prove, either: assume that the conditions hold and $y \sim f(x)$, so $y / f(x) \to 1$ as $x\to\infty$. Taking logs, we have $\log y - \log f(x) \to 0$, so letting $X = \log x$ and $Y = \log y$ we get $Y - g(X) \to 0$. Now applying uniform continuity of $g^{-1}$ gives $g^{-1}(Y) - X \to 0$, which unravels to $\log f^{-1}(y) - \log x \to 0$, in other words $x \sim f^{-1}(y)$.
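
For comparison, the counterexample in the question shows why some hypothesis on $g^{-1}$ is needed: for $f(x) = \log x$ the conjugate is $g(x) = \log\log(e^x) = \log x$, whose inverse $g^{-1}(x) = e^x$ is not uniformly continuous on any $[a,\infty)$. And indeed the conclusion fails there: $y = \log(2x) \sim \log x = f(x)$, yet $f^{-1}(y) = 2x \not\sim x$.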

Erick Wong

In fact, this is a problem about the asymptotic solution of transcendental equations.

For equations with real variables, we have the following conclusion.

$\mathbf{Theorem\ 1.}$ Let $f(\xi)$ be continuous and strictly increasing in an interval $a<\xi<\infty$, and $$f(\xi)\sim \xi, \quad \xi \to \infty.$$ Denote by $\xi(u)$ the root of the equation $f(\xi)=u$ which lies in $(a,\infty)$ when $u>f(a)$. Then $$\xi(u)\sim u, \quad u\to \infty.$$
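
For instance, take $f(\xi) = \xi + \log\xi$ on $(1,\infty)$: it is continuous, strictly increasing, and $f(\xi) \sim \xi$. The root of $\xi + \log\xi = u$ satisfies $$\xi(u) = u - \log\xi(u) = u - \log u + o(1),$$ so indeed $\xi(u) \sim u$, even though $\xi(u) - u$ is unbounded.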

Fortunately, this conclusion can be generalized to complex variables, but there are several additional constraints.

$\mathbf{Theorem\ 2.}$ Suppose that $f(z)$ is holomorphic in a region containing a closed annular sector $\mathbf{S}$ with vertex at the origin and angle less than $2\pi$, and that $f(z)\sim z$ as $z\to\infty$ in $\mathbf{S}$. Let $\mathbf{S}_1$ and $\mathbf{S}_2$ be closed annular sectors with vertices at the origin, $\mathbf{S}_1$ being properly interior to $\mathbf{S}$, and $\mathbf{S}_2$ properly interior to $\mathbf{S}_1$. Then, for all sufficiently large $|u|$ with $u\in\mathbf{S}_2$, the equation $u=f(z)$ has a root $z(u)$ in $\mathbf{S}_1$, and $$z(u)\sim u, \quad u\to \infty \text{ in } \mathbf{S}_2.$$
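
As a concrete illustration, take $f(z) = z + 1/z$, which is holomorphic away from the origin and satisfies $f(z) \sim z$ as $z \to \infty$. The equation $u = z + 1/z$ has the roots $z = \tfrac12\bigl(u \pm \sqrt{u^2 - 4}\bigr)$; the large root satisfies $$z(u) = \tfrac12\Bigl(u + u\sqrt{1 - 4/u^2}\Bigr) = u - \frac{1}{u} + O\!\left(\frac{1}{u^3}\right) \sim u,$$ while the other root tends to $0$ and so eventually leaves any annular sector.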

MathRoc
  • The proof can be found on page 14 of the book Asymptotics and Special Functions by Frank Olver. – MathRoc Mar 28 '20 at 07:22
  • Theorem 1 seems awfully restrictive at first glance. It seems to only apply to functions that are very nearly equal to the identity function, e.g. $x + \frac12 \sin x$. Is there an application that I’m missing? – Erick Wong Jul 07 '20 at 22:01
  • @ErickWong You're quite right. Theorem 1 is just used to deduce Theorem 2. – MathRoc Jun 27 '21 at 22:32