
The Context:

I was reading the book The Road to Reality by Roger Penrose. In his calculus section he spends a paragraph or two on the notion of $C^{\infty}$-smoothness, and one exercise asks the reader to prove that this puzzling function is infinitely differentiable:

$$f(x)=\begin{cases} e^{-1/x} & x>0 \\ 0 & x\leq 0 \end{cases}$$

The exercise is labeled in the book as difficult and recommended for those with good experience in analysis. Presumably this means real analysis students or fairly strong calculus students, which is what I thought I was.

I searched the web for a proof, and I found one built on the idea that, for each $n$th derivative of this piecewise function to be continuous at $0$, the limits of that derivative as $x \to 0^+$ and as $x \to 0^-$ (or just from both sides of $0$) must match. This makes intuitive sense and uses an argument about neighborhoods, which I have simply accepted as an internal axiom at this point in my career.

The argument came down to showing that, for $x>0$, every derivative takes the form of the same exponential multiplied by a polynomial composed with the reciprocal function. It was then stated that the exponential factor tends to zero faster than the polynomial factor tends to infinity, which is to say that $e^x$ grows faster than any polynomial.
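To spell out the form I mean (this is my paraphrase of the proof I found, so the polynomials are only sketched): for $x>0$, induction gives

$$f^{(n)}(x)=p_n\!\left(\frac{1}{x}\right)e^{-1/x}$$

for some polynomial $p_n$ (for instance $p_1(t)=t^2$), and substituting $t=1/x$ turns the one-sided limit into

$$\lim_{x\to 0^{+}}p_n\!\left(\frac{1}{x}\right)e^{-1/x}=\lim_{t\to\infty}\frac{p_n(t)}{e^{t}}=0,$$

which matches the limit $0$ coming from the zero piece on the left.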

The Question:

My question is: what is a rigorous notion of 'fastness', or growth rate? I've seen that you can compare the asymptotic behavior of two functions (or of their derivatives) to get an insight into an ordering of their growth, but I have never had the rigorous intuition needed to give a complete, non-circular proof that one function grows faster than another, using a definition of 'growth rate' that isn't just a list of functions already known to 'grow' faster than others.

The issue with my notion of growth is that, in my mind, it can mean multiple things: instantaneous rate of change, average rate of change, how fast a rate of change itself grows (or changes), or even what a function approaches asymptotically. None of these feel abstract enough. (Is there a single number that can quantify the 'fastness' of a function?)

For example, the question of which grows faster, $f(x)=Ax\sin(x)$ or $g(x)=Bx$, seems to have ambiguities. It would be awesome to gain better insight into this, and maybe a more liberating way of thinking than the box I seem to be stuck in. This question is also open to a discussion of sequences and series.
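To make the ambiguity concrete (my own worked example, assuming $A, B > 0$): for $x \ge 0$ we have

$$|Ax\sin(x)| \le Ax,$$

so in the big-$O$ sense both functions are $O(x)$, yet the ratio

$$\frac{f(x)}{g(x)} = \frac{A}{B}\sin(x)$$

oscillates forever and has no limit as $x \to \infty$, and $f$ even returns to $0$ at every multiple of $\pi$. Depending on which notion of 'growth' one commits to, the comparison comes out differently.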

  • Usually when authors talk about rates of growth they mean the formalism of something like the various o-notations (the standard definitions are recalled after these comments). In your case, both of those functions would be classified as having the same "growth rate" (really an upper or lower bound on the size of the function) because they can be bounded arbitrarily tightly by an envelope of the form $Cx$. Usually, though, the concept of a growth rate is reserved for functions we can call "sufficiently monotonic", i.e. functions that are indeed monotonic once the parameter is small enough (e.g. close to $0$) or large enough. – Ninad Munshi Jun 15 '24 at 05:26
  • You may be interested in Landau notation. It's often used to provide meaningful bounds on the asymptotic behavior of functions. https://en.wikipedia.org/wiki/Big_O_notation – CyclotomicField Jun 15 '24 at 05:27
  • Continuing from my last comment - intuitively you can use standard families of monotonic functions to bound any function and describe the "speed" at which they approach a limit. – Ninad Munshi Jun 15 '24 at 05:29
  • For $C^{\infty}$-smoothness ($x=0$ is the only nontrivial point), mathematical induction such as described in this MSE answer will work (also this MSE answer). For some common ways to define the same growth rate of functions, see this 22 March 2003 sci.math post. – Dave L. Renfro Jun 15 '24 at 11:30
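For reference, the standard Landau definitions the comments point to can be stated as follows (these are the usual textbook formulations, not anything specific to this question): as $x \to \infty$,

$$f(x)=O(g(x)) \iff \exists\, C>0,\ x_0 \text{ such that } |f(x)| \le C\,|g(x)| \text{ for all } x \ge x_0,$$

$$f(x)=o(g(x)) \iff \lim_{x\to\infty}\frac{f(x)}{g(x)}=0.$$

With these, "$e^x$ grows faster than any polynomial" becomes the precise statement $x^n = o(e^x)$ for every fixed $n$.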

0 Answers