It's true that I'm not familiar with too many exotic functions, but I don't understand why there exist functions that cannot be described by a Taylor series. What makes it okay to describe any particular function with such a series? Is there any difference for different number sets, in the case of complex numbers, maybe? Could somebody provide an example?
-
A Taylor series exists if and only if the function is infinitely differentiable at some $a$. – Doug M Feb 02 '17 at 03:46
-
And even then, the Taylor series doesn't always converge to the function in any neighborhood. @DougM – Thomas Andrews Feb 02 '17 at 03:48
-
@ThomasAndrews Indeed, I thought about going there, but then decided to keep it to the point. Radius of convergence is then a whole 'nuther thing. – Doug M Feb 02 '17 at 03:50
-
Related questions: Motivating infinite series; Why doesn't a Taylor series converge always?; Is it possible for a function to be smooth everywhere, analytic nowhere, yet Taylor series at any point converges in a nonzero radius? – Winther Feb 02 '17 at 03:52
-
The Taylor series represents the function when the Taylor remainder tends to zero; otherwise, the Taylor series doesn't represent the function. – Feb 02 '17 at 04:03
-
If you want a set of functions whose Taylor series exist and converge everywhere to the function in question, then you want to consider analytic (entire) functions. This includes polynomials, $\sin$, $\cos$, the exponential, and many more; however, it is a very restrictive property that rules out most "exotic" functions. – Winther Feb 02 '17 at 04:08
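As a numerical illustration of that convergence (a minimal sketch; the truncation length is my own choice), the Maclaurin series of $\sin$ tracks the function even well away from the origin once enough terms are kept:

```python
import math

def sin_series(x, terms=40):
    """Partial Maclaurin series of sin: sum of (-1)^k x^(2k+1) / (2k+1)!."""
    return sum((-1)**k * x**(2*k + 1) / math.factorial(2*k + 1)
               for k in range(terms))

for x in [1.0, 5.0, 10.0]:
    print(x, sin_series(x), math.sin(x))  # agree to many decimal places
```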
-
Okay, thanks. What about the sine integral function? What is the deal with that? I assume it cannot be described by a series; how else could one go about it? – smaude Feb 02 '17 at 04:29
-
@smaude Do you mean the sine integral, $\text{Si}(x) = \int_0^x \frac{\sin(y)}{y}{\rm d}y$? That function is entire, so its Taylor series converges everywhere to the correct function. For example, expanding about $x=0$, the series starts out like $x - \frac{x^3}{18} + \frac{x^5}{600} - \ldots$. – Winther Feb 02 '17 at 04:36
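For what it's worth, a quick numerical check of that expansion; this sketch assumes SciPy is available, whose `scipy.special.sici` returns the pair $(\mathrm{Si}(x), \mathrm{Ci}(x))$:

```python
from scipy.special import sici  # sici(x) returns (Si(x), Ci(x))

x = 0.5
si_exact, _ = sici(x)
si_series = x - x**3 / 18 + x**5 / 600   # first three terms about x = 0
print(si_exact, si_series)               # agree to about six decimal places
```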
-
I don't understand why there are close votes. Could someone enlighten me? – Simply Beautiful Art Feb 02 '17 at 14:22
5 Answers
We have the somewhat famous function
$$f(x)=\begin{cases}e^{-1/x^2}&x\neq 0\\ 0&x=0, \end{cases}$$
which is infinitely differentiable at $0$ with $f^{(n)}(0)=0$ for all $n$. So even though the function is infinitely differentiable, its Taylor series around $0$ is identically zero and does not converge to the value of the function for any $x\neq 0$.
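To see numerically just how flat $f$ is at the origin, here is a minimal sketch (names are my own) comparing $f$ with its Taylor series at $0$, which is identically zero:

```python
import math

def f(x):
    """The classic smooth-but-not-analytic function."""
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

# The Taylor series of f at 0 is 0 + 0*x + 0*x^2 + ..., i.e. identically 0,
# yet f(x) > 0 for every x != 0:
for x in [0.5, 0.1, 0.05]:
    print(x, f(x))   # e.g. f(0.1) is about 3.7e-44: tiny, but not 0
```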
Technically, any function that is infinitely differentiable at $a$ has a Taylor series at $a$. Whether you find that Taylor series useful depends on what you want the series to do.
For example, given a $g$ infinitely differentiable at $0$, we know that for each $n$ there exist $C,\epsilon>0$ such that:
$$\left|g(x)-\sum_{k=0}^{n} \frac{g^{(k)}(0)}{k!}x^k\right| \leq C|x|^{n+1}$$
for all $|x|<\epsilon$.
So the partial sums of the Taylor series are, in some sense, always the "best" polynomials for agreeing with the function.
So what happens with our function $f$ above is that $f(x)$ converges to $0$ faster than any power $x^n$ as $x \to 0$.
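A small numerical sketch of that decay (redefining $f$ so the snippet stands alone): the ratio $f(x)/x^n$ heads to $0$ as $x \to 0$, no matter how large $n$ is:

```python
import math

def f(x):
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

n = 50  # even a very high power loses to f near 0
for x in [0.1, 0.08, 0.05]:
    print(x, f(x) / x**n)  # roughly 3.7e6, 9.7e-14, 2.1e-109: heading to 0
```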
What we don't always get, for real functions, is a Taylor series that converges to the function on an interval around the point.
In complex numbers, things become intriguing. It turns out that if you define differentiation on complex functions in a relatively simple way, then any function that is differentiable in a neighborhood of a point is infinitely differentiable at that point, and its Taylor series converges to the function in some "ball" centered on that point.
-
I've never understood why so many find this surprising; I guess it is because many are taught (incorrectly) that Taylor series must converge to the function. I was taught from the beginning that a Taylor series simply provides an approximation in terms of derivatives around a point and might happen to converge. In the neighborhood of $0$, the function $e^{-1/x^2}$ is approximated quite well by $0$. – Brevan Ellefsen Feb 02 '17 at 04:02
-
@BrevanEllefsen: That's not really why. It's because you would intuitively expect a smooth function to be predictable, and it seems weird that the derivatives of a function do not contain enough information to predict it at a nearby point. – user541686 Feb 02 '17 at 07:59
-
Yes, but that function does have a Taylor series; it just doesn't agree with the function anywhere but at $0$. – Klangen Feb 02 '17 at 14:21
-
@BrevanEllefsen But any infinitely differentiable function in complex numbers does have a converging Taylor series, so the question for reals is interesting. I wouldn't say it is surprising that there are counter-examples in the reals, but it is interesting. – Thomas Andrews Feb 02 '17 at 15:24
-
@ThomasAndrews good points, I see what you're saying. I suppose that, for me at least, the really surprising thing is that there aren't counter-examples of the same kind among functions in complex numbers. For me the really surprising part about Taylor Series is that they don't have to converge everywhere or just at a single point - they can often converge in a small radius (or disk) of convergence. Nevertheless, have a good day sir! – Brevan Ellefsen Feb 02 '17 at 15:39
-
One of my favorite "proofs" that complex numbers "exist" is that the Taylor series at the real number $a$ for the function $\frac{1}{1+x^2}$ has radius of convergence $\sqrt{1+a^2}$. This is entirely a statement about real numbers, but it suggests there is some root of $1+x^2$ that is $\sqrt{1+a^2}$ distance away from each real number $a$. (If $p(x)$ has all real roots, then the radius of convergence at $a$ of $\frac{1}{p(x)}$ is always the distance to the nearest root to $a$.) @BrevanEllefsen – Thomas Andrews Feb 02 '17 at 15:43
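For what it's worth, that claim is easy to probe numerically. A minimal SymPy sketch (variable names are mine; the crude root-test estimate converges slowly):

```python
import sympy as sp

x, t = sp.symbols('x t')
a = sp.Integer(1)                 # real expansion point; try other values
f = 1 / (1 + x**2)

N = 30
g = f.subs(x, a + t)              # recentre so that t = x - a
ser = sp.series(g, t, 0, N).removeO()
coeffs = [ser.coeff(t, n) for n in range(N)]

# Root test: the radius of convergence is ~ |c_n|^(-1/n) for large n
n, c = max((k, v) for k, v in enumerate(coeffs) if v != 0)
print(float(abs(c)) ** (-1.0 / n))   # ~1.43, approaching sqrt(1+a^2) = 1.414...
```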
-
I would recommend replacing ""ball" centered around that point" with something like "disk in the complex plane centered at that point". I think it will be understandable to strictly more people that way. – Mark S. Jun 09 '17 at 16:32
-
@Mehrdad: The best way then to dispel this is to show you can modify a smooth function "locally" (i.e. within some interval only) while keeping smoothness and having the same values elsewhere. Thus it is a much more restrictive and profound property if the function is not only smooth but can be holographically reconstructed from an arbitrarily small piece, which is what "analytic" means, and the most profound bit of all is that when you go to the complex plane, existence of even a single derivative is sufficient (if differentiable everywhere) to guarantee this. – The_Sympathizer Jul 17 '17 at 03:36
-
@ThomasAndrews, me too! That was the example that suddenly made me go - hang on... Maybe these complex number things aren't just pointless abstraction :) – goblin GONE Nov 02 '18 at 14:02
-
Do I understand correctly that your letter $x$ implies real analysis only? If I grasp your answer, your $f(x)$ would be non-differentiable at $z=0$ if the function were instead $f(z)$, on account of an essential singularity at that point in the complex plane. Is this right, please? – thb Sep 12 '19 at 17:42
If the Lagrange error term does not tend to zero as $n \to \infty$, then the function will not equal its Taylor series.
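Concretely, expanding about $a$, the Lagrange form of the remainder after $n$ terms is
$$R_n(x) = \frac{f^{(n+1)}(\xi)}{(n+1)!}(x-a)^{n+1}$$
for some $\xi$ between $a$ and $x$; the function equals its Taylor series at $x$ exactly when $R_n(x) \to 0$ as $n \to \infty$.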
You can also read more on this in Appendix $1$ of Introduction to Calculus and Analysis $1$ by Courant and John. Hope it helps.
In addition to all the comments here, I would like to add the curious Weierstrass function, which is known for being nowhere differentiable despite being continuous everywhere:
$$ W(x) = \sum_{n=0}^\infty a^n\cos(b^n\pi x),$$
where $0 < a < 1$, $b$ is a positive odd integer, and $ab > 1 + \frac{3}{2}\pi$. Consequently, it does not have a Taylor series at any point.
You can find a visualization of $W$ here.
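Here is a rough numerical sketch (the parameter values are my own, chosen to satisfy Weierstrass's conditions) approximating $W$ by partial sums and watching the difference quotients at $0$ grow in magnitude rather than settle toward a derivative:

```python
import math

A, B = 0.5, 13   # 0 < A < 1, B an odd integer, A*B = 6.5 > 1 + 3*pi/2

def W(x, terms=60):
    """Partial-sum approximation of the Weierstrass function."""
    return sum(A**n * math.cos(B**n * math.pi * x) for n in range(terms))

# As h shrinks, the difference quotients at 0 blow up rather than converge:
for h in [1e-2, 1e-4, 1e-6]:
    print(h, (W(h) - W(0)) / h)
```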
-
@dmtri While I'm unsure of a formal definition of a fractal, differentiable functions must look "linear" if you "zoom in" enough. A fractal's self-similarity on "zooming in" seems to rule this out. – Mark Schultz-Wu Aug 23 '18 at 16:02
I think the intuition you want is this: the functions that are not complex-differentiable* (also known as holomorphic) are the ones that cannot be described by a Taylor series.
And to give another example that is perhaps even more unexpected than the one given by Thomas Andrews:
$$f(x) = \begin{cases} e^{-1/x} & \text{if } x > 0 \\ 0 & \text{otherwise}\end{cases}$$
This function is smooth, yet it is identically zero on an infinitely long interval and nonzero elsewhere, so no Taylor series can describe it; the underlying reason is that it is not holomorphic.
*If you're not familiar with complex differentiation, it's like real differentiation, with $h$ complex:
$$f'(z) = \lim_{h \to 0} \frac{f(z + h) - f(z)}{h}$$
For details, see here.
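As a small illustration of why the direction of $h$ matters (the example functions are my own choice): for the holomorphic $z^2$ the quotient is direction-independent, while for the non-holomorphic $\bar z$ it is not:

```python
# Approximate the complex difference quotient along different directions of h
def quotient(f, z, h):
    return (f(z + h) - f(z)) / h

z = 1 + 1j
for direction in [1, 1j, (1 + 1j) / abs(1 + 1j)]:
    h = 1e-6 * direction
    print(quotient(lambda w: w**2, z, h),           # ~2z for every direction
          quotient(lambda w: w.conjugate(), z, h))  # ~conj(h)/h: depends on direction
```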
The existence of functions that cannot be described by Taylor series is actually completely intuitive; take the indicator function of the rational numbers viewed as a subset of the reals, for example. Try to keep in mind that functions can be really... arbitrary.
Much more subtle is the existence of smooth functions that aren't analytic; Thomas Andrews gives the standard example of such a beast. Fwiw, my understanding of why this is possible: yes, there are functions that change behaviour suddenly at a point, but the change at that point is so gradual, so gentle, so smooth, that none of the function's derivatives can see it happening; therefore, the Taylor series can't, either.