How can I prove that the function $f(x)=\sin\frac{1}{x}$ for $x\neq 0$, $f(0)=0$, has an antiderivative? Setting $F(x)=\int^{x}_{0}\sin(1/t)\,dt$, the fundamental theorem of calculus already gives $F'(x)=f(x)$ for $x\neq 0$, so it remains to show that $F$ has derivative $0$ at $x=0$, but I have no idea how to prove it.
- You can try a substitution to get an integrand where the existence of the limit of the difference quotients is more easily seen. – Daniel Fischer Mar 20 '14 at 17:07
2 Answers
We can substitute $u = t^{-1}$ to get a more convenient expression:
$$\begin{align} \left\lvert \frac{F(x) - F(0)}{x}\right\rvert &= \frac{1}{\lvert x\rvert} \left\lvert \int_0^x \sin (t^{-1})\,dt\right\rvert \\ &= \frac{1}{\lvert x\rvert} \left\lvert \int_{1/\lvert x\rvert}^\infty \frac{\sin u}{u^2}\,du\right\rvert \tag{symmetry}\\ &= \frac{1}{\lvert x\rvert} \left\lvert \left[-\frac{\cos u}{u^2}\right]_{1/\lvert x\rvert}^\infty - 2 \int_{1/\lvert x\rvert}^\infty \frac{\cos u}{u^3}\,du\right\rvert\\ &= \frac{1}{\lvert x\rvert} \left\lvert \lvert x\rvert^2 \cos \frac{1}{\lvert x\rvert} - 2 \int_{1/\lvert x\rvert}^\infty \frac{\cos u}{u^3}\,du\right\rvert\\ &\leqslant \lvert x\rvert + \frac{1}{\lvert x\rvert} \int_{1/\lvert x\rvert}^\infty \frac{2}{u^3}\,du\\ &= 2\lvert x\rvert. \end{align}$$
Thus $F'(0) = 0$.
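As an illustrative sanity check (not part of the proof), one can evaluate the difference quotient $F(x)/x$ numerically via the substituted form $\int_{1/x}^\infty \sin(u)\,u^{-2}\,du$ and watch it respect the bound $2\lvert x\rvert$ derived above. This is a minimal sketch using `mpmath.quadosc`, which handles oscillatory integrals over infinite intervals; the helper `F` below is ad hoc, not part of the answer:

```python
# Sketch: numerically check |F(x)/x| <= 2|x| for small x > 0,
# where F(x) = int_{1/x}^infty sin(u)/u^2 du after the substitution u = 1/t.
import mpmath as mp

mp.mp.dps = 25  # working precision (decimal digits)

def F(x):
    a = 1 / mp.mpf(x)
    # quadosc needs the oscillation period of the integrand (2*pi for sin u)
    return mp.quadosc(lambda u: mp.sin(u) / u**2, [a, mp.inf],
                      period=2 * mp.pi)

for s in ('0.1', '0.01', '0.001'):
    x = mp.mpf(s)
    # |F(x)/x| should not exceed the bound 2x from the estimate above
    print(s, float(abs(F(x) / x)), float(2 * x))
```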
I just wrote up another solution to this question on my blog. The idea is to consider the following function
$$G(x)=\begin{cases}x^2\cos \frac 1x, & \text{ if } x \ne 0,\\ 0, & \text{ if }x=0.\end{cases}$$
which is differentiable everywhere. Indeed,
$$G'(x)=\begin{cases} \sin \frac 1x + 2x \cos \frac 1x, & \text{ if } x \ne 0,\\ 0, & \text{ if }x=0.\end{cases}$$
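The $x \ne 0$ branch is just the product and chain rules; the value at $0$ follows directly from the difference quotient, since $\left\lvert x \cos \frac 1x \right\rvert \leqslant \lvert x\rvert$:
$$G'(0) = \lim_{x \to 0} \frac{G(x) - G(0)}{x} = \lim_{x \to 0} x \cos \frac 1x = 0.$$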
Hence, $G' = f + h$ where
$$h(x)=\begin{cases} 2x \cos \frac 1x, & \text{ if } x \ne 0,\\ 0, & \text{ if }x=0.\end{cases}$$
Since $h$ is continuous (at $0$ because $\lvert h(x)\rvert \leqslant 2\lvert x\rvert$), it has an antiderivative $H$ by the fundamental theorem of calculus, thus giving us $f = G' - h = (G-H)'$. In other words, $G-H$ is an antiderivative of $f$.
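As a quick symbolic sanity check (illustrative only, not part of the argument), the formula for $G'$ away from $0$ can be confirmed with sympy:

```python
# Sketch: verify G'(x) = sin(1/x) + 2x*cos(1/x) for x != 0.
import sympy as sp

x = sp.symbols('x', real=True, nonzero=True)
G = x**2 * sp.cos(1 / x)
Gp = sp.diff(G, x)

# The difference should simplify to zero.
assert sp.simplify(Gp - (sp.sin(1 / x) + 2 * x * sp.cos(1 / x))) == 0
print(Gp)  # 2*x*cos(1/x) + sin(1/x)
```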
- I know this is quite an old question, but could anyone explain the part after "Hence, $G' = f + h$"? I don't understand why we get $f=(G-H)'$ from $G'=f+h$. – Question Apr 27 '20 at 14:56
- @Question Since $G' = f + h$, we have $f = G' - h$. Now recall that $H$ is an antiderivative of $h$, that is, $H' = h$. It therefore follows that $f = G' - h = G' - H' = (G-H)'$. – Digitallis Apr 09 '25 at 12:53