Let $f \in C^{\infty}(\mathbb{R})$ and suppose $f(x_0) = 0$. Does there always exist some $\epsilon > 0$ such that one of the following three conditions is satisfied?

  • $f(x) > 0$ for every $x \in (x_0, x_0 + \epsilon)$
  • $f(x) < 0$ for every $x \in (x_0, x_0 + \epsilon)$
  • $f(x) = 0$ for every $x \in (x_0, x_0 + \epsilon)$

My thoughts

I know that this isn't true if we just require differentiability, thanks to the famous example $x^2 \cdot \sin(1/x)$. In fact, simply choosing $n$ large enough puts $x^n \cdot \sin(1/x)$ into any class $C^k(\mathbb{R})$ for a fixed finite $k$.

However, I'm not sure whether requiring $C^{\infty}$ circumvents this type of scenario. Perhaps infinite smoothness rules out any such kinks? The example I gave seems to suggest that we'd need something like "$x^{\infty}$", which would be uniformly zero on $(-1,1)$.

Sambo
  • I don't think $C^\infty$ will magically fix stuff. Analyticity would, but not $C^\infty$. I don't have an example though, but I'd bet I'm right. – mathworker21 Jul 30 '19 at 00:28
  • @mathworker21 I'd tend to agree with you. Maybe it's possible to give the standard bump function a "jiggle" so that it oscillates near where it's zero? – Sambo Jul 30 '19 at 00:31
  • 1
    What about $\exp(-1/x^2) \sin(1/x)$? – Nate Eldredge Jul 30 '19 at 00:40
  • @NateEldredge Is that $C^{\infty}$? If so, it does serve as a counter-example. – Sambo Jul 30 '19 at 00:43
  • Yes, that example by Nate is $C^{\infty}$ (provided that you define it to be $0$ at $x=0$); in fact, for that function, all its derivatives at the origin vanish. This just goes to show that by using a very flat (near the origin) exponential, you can "mask" a lot of the bad behaviour of $\sin(1/x)$. So, you're right: $C^{\infty}$ functions can still be very badly behaved; it is usually analytic functions that have a lot of very nice properties. – peek-a-boo Jul 30 '19 at 00:51
  • If you would make that into an answer I'd be happy to accept it! (Preferably with a quick proof of smoothness) – Sambo Jul 30 '19 at 00:54
  • See find monotonic interval for real $C^\infty$ function and Smooth function with infinite oscillation. In the second of these I mention that the $\exp(-1/x^2)\sin(1/x)$ example was first given in Dini's 1878 treatise on real analysis, and I'm pretty sure this was the first published example of a function failing to have the property at the beginning of your question. – Dave L. Renfro Jul 30 '19 at 05:52

1 Answer


The counterexample given here was suggested by @Nate Eldredge in the comments; I'm just elaborating on its properties :)


The function $f: \Bbb{R} \to \Bbb{R}$ defined by \begin{align} f(x) = \begin{cases} e^{-1/x^2} \sin\left( \frac{1}{x}\right) & \text{if $x \neq 0$} \\ 0 & \text{if $x=0$} \end{cases} \end{align} is easily seen to be $C^{\infty}$ away from the origin, and at the origin one can show that all the derivatives vanish. The most straightforward proof I know is by direct verification (a fully rigorous proof goes by induction on the form of the derivative).

The rapid oscillatory behaviour of $f$ near the origin shows that there is no $\varepsilon > 0$ for which any of the conditions you stated holds. (I suggest you use Wolfram Alpha to plot this function to see just how quickly things approach $0$ at the origin, and how fast the function is oscillating.)
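To make the oscillation concrete, here's a small numerical sanity check (my own sketch, not part of the original answer). The sample points $x_n = \frac{1}{(n + 1/2)\pi}$ are chosen so that $\sin(1/x_n) = (-1)^n$, so $f$ alternates in sign at points closer and closer to $0$:

```python
import math

def f(x):
    # the counterexample: exp(-1/x^2) * sin(1/x), extended by 0 at the origin
    return 0.0 if x == 0 else math.exp(-1.0 / x**2) * math.sin(1.0 / x)

# At x_n = 1/((n + 1/2)*pi) we have sin(1/x_n) = (-1)^n, so the sign of f
# flips between consecutive sample points.  (For much larger n the exponential
# underflows double precision, but the mathematical pattern continues.)
signs = []
for n in range(1, 7):
    x_n = 1.0 / ((n + 0.5) * math.pi)
    signs.append(f(x_n) > 0)
print(signs)  # alternates False/True: no interval (0, eps) has a single sign
```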


Here's a rough idea of the proof of smoothness at the origin. Let's first show that $f'(0)$ exists and equals $0$. For $x\neq 0$, we have \begin{align} \left | \dfrac{f(0 + x) - f(0)}{x} \right| &= \left| \dfrac{e^{-1/x^2} \sin (1/x)}{x} \right| \\ & \leq \left| \dfrac{e^{-1/x^2}}{x} \right| \cdot 1 \end{align} Now, "exponentials dominate polynomials": the factor $e^{-1/x^2}$ goes to $0$ much faster than $1/x$ blows up, so as $x \to 0$, the RHS tends to $0$ as well.
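Just to illustrate how strong this domination is, here's a quick tabulation of the bound $e^{-1/x^2}/x$ (a numeric sketch of my own, not part of the proof):

```python
import math

# The bound on the difference quotient: e^{-1/x^2} / x.
# The exponential factor dies off far faster than 1/x grows.
bounds = []
for x in [0.5, 0.2, 0.1, 0.05]:
    bounds.append(math.exp(-1.0 / x**2) / x)
    print(x, bounds[-1])
# each value is dramatically smaller than the previous one
```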

In general, you can show that for $x \neq 0$, the $k^{\text{th}}$ derivative looks like an exponential term multiplied by a trigonometric term multiplied by a polynomial in $\dfrac{1}{x}$. That is, there exist polynomials $P,Q$ (depending on $k$) such that \begin{align} f^{(k)}(x) = e^{-1/x^2} \left( P\left(\dfrac{1}{x} \right) \cdot \sin\left(\dfrac{1}{x} \right) + Q\left(\dfrac{1}{x} \right) \cdot \cos \left(\dfrac{1}{x} \right)\right) \end{align}

And as $x \to 0$, the limit will be $0$, again because "exponentials dominate polynomials" (the trigonometric terms are bounded by $1$, so they don't really matter).
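As a sanity check on the $k = 1$ case: differentiating by hand gives $P(t) = 2t^3$ and $Q(t) = -t^2$ (my own computation, not stated in the answer), and one can compare this closed form against a centered finite difference and watch both collapse to $0$ as $x \to 0$:

```python
import math

def f(x):
    # exp(-1/x^2) * sin(1/x), extended by 0 at the origin
    return 0.0 if x == 0 else math.exp(-1.0 / x**2) * math.sin(1.0 / x)

def f_prime(x):
    # closed form of f' for x != 0, with P(t) = 2t^3 and Q(t) = -t^2
    t = 1.0 / x
    return math.exp(-t**2) * (2 * t**3 * math.sin(t) - t**2 * math.cos(t))

h = 1e-7
for x in [0.5, 0.3, 0.2, 0.1]:
    centered = (f(x + h) - f(x - h)) / (2 * h)
    print(x, f_prime(x), centered)
# the two columns agree, and both head rapidly toward 0
```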

peek-a-boo