7

I hope my question is clear:
I understand what a Taylor polynomial does. It approximates an analytic function at the point $x=a$ in such a way that the $n$th-order Taylor polynomial matches the function up to its $n$th derivative at $x=a$.

Something I could never answer myself though is:
Why does each additional Taylor term improve the approximation of the function in the vicinity of the point $x=a$? Can't it be that a Taylor polynomial provides a worse approximation in the vicinity of $x=a$ when taking more Taylor terms (i.e. choosing a later truncation for $n$)?
If this is the case:
Is it also safe to say that truncating a Taylor series at larger $n$ also improves the approximation of $f(x)$ for points far away from $x=a$?

And a last question:
Why is an analytic function $f(x)$ almost always equal to the Taylor series of the form $f(x)=\sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!} (x-a)^n$?
Are there functions where this is not the case? And is there a clear-cut proof of why an infinite Taylor series equals an analytic function in general?

AlpaY
  • 111
  • 2
    They don't, in general. Tautologically, they improve the approximation only for those functions for which they improve the approximation. – MoonLightSyzygy Dec 22 '19 at 17:33
  • 1
    Note that the approximation error can be zero. Indeed, any polynomial has a Taylor expansion with $R(x)=0$. (Since the best approximation to a polynomial is the polynomial itself!) – manooooh Dec 22 '19 at 17:36
  • Yes, there are functions that are $C^\infty$ but not analytic. For instance, the function defined as $e^{-1/x}$ for $x > 0$ and $0$ for $x \leq 0$ is not analytic at $0$. – tommy1996q Dec 22 '19 at 17:36
  • 1
    This related question and its answer look like they cover part of this. – amd Dec 22 '19 at 18:04
  • 2
    If you are far away from the expansion point, then taking a higher degree polynomial often makes things worse: Consider for example $f(x) =\sin(x)$. The zero order expansion $\sin(x) \approx 0$ is much better than (say) the first order expansion $\sin(x) \approx x$ for $|x| \gg 1$. – PhoemueX Dec 22 '19 at 18:05
  • 1
    It improves the accuracy (at least, eventually and within the convergence disk) because there are error estimates that converge to $0$ as the number of terms increases. And if it does not then the approximation is poor and shouldn't be used. Analytic function is always equal to the sum of its Taylor series close enough to the expansion point. That is one of the definitions of "analytic", so it requires no proof. What does require proof is that a given function is analytic. – Conifold Dec 22 '19 at 18:32

1 Answer

8

Why does each additional Taylor term improve the approximation of the function in the vicinity of the point $x=a$?

It does not always do that.

Can't it be that a Taylor polynomial provides a worse approximation in the vicinity of $x=a$ when taking more Taylor terms (i.e. choosing a later truncation for $n$)?

It could be the case that taking some additional terms provides a worse approximation. The idea is merely that the series eventually converges to the correct value. So you will get a better approximation at whatever point you are looking at (within the radius of convergence) if you take enough additional terms.
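
To make "enough additional terms" concrete, here is the standard Lagrange form of the remainder (a general fact about Taylor polynomials, not something specific to this question): if $f$ is $(n+1)$-times differentiable and $p_n$ is its Taylor polynomial of degree $n$ about $a,$ then for each $x$ there is some $\xi$ between $a$ and $x$ with $$f(x) - p_n(x) = \frac{f^{(n+1)}(\xi)}{(n+1)!}\,(x-a)^{n+1}.$$ For a function such as $\cos,$ whose derivatives are all bounded by $1,$ this gives $|f(x) - p_n(x)| \leq \frac{|x-a|^{n+1}}{(n+1)!},$ a bound that goes to $0$ as $n \to \infty$ for every fixed $x,$ even though it may grow for a while before it starts to shrink.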

Is it also safe to say that truncating a Taylor series at larger $n$ also improves the approximation of $f(x)$ for points far away from $x=a$?

Not always. A particular increase in $n$ might improve the approximation at a closer point but make it worse farther away.

For example, consider the Taylor series for $\cos(x)$ about $x = 0.$ Consider the Taylor polynomials \begin{align} p_0(x) &= 1,\\ p_2(x) &= 1 - \frac12 x^2,\\ p_4(x) &= 1 - \frac12 x^2 + \frac1{24}x^4,\\ p_6(x) &= 1 - \frac12 x^2 + \frac1{24}x^4 - \frac1{720}x^6.\\ \end{align}

Now evaluate these at $x= 1$. We get \begin{align} p_0(1) &= 1,\\ p_2(1) &= 0.5,\\ p_4(1) &\approx 0.54167,\\ p_6(1) &\approx 0.54028,\\ \end{align} each of which gets progressively closer to $\cos(1),$ which is approximately $0.54030.$

But at $x= 5$ we get \begin{align} p_0(5) &= 1,\\ p_2(5) &= -11.5,\\ p_4(5) &\approx 14.54,\\ p_6(5) &\approx -7.16,\\ \end{align} whereas $\cos(5)$ is approximately $0.28.$ So we see that from $p_0$ to $p_4$ the approximation just keeps getting worse, and it does not even begin to improve until $p_6.$ Continuing with higher-order polynomials, $p_8(5) \approx 2.52$ and $p_{10}(5) \approx -0.16$. The absolute error of $p_{10}$ is a little less than $0.45,$ which is the first time we get an error less than the error of $p_0,$ which is about $0.72.$
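
If you want to check these numbers yourself, here is a minimal Python sketch (my illustration, not part of the original answer) that sums the Maclaurin series of $\cos$ up to a given even degree:

```python
import math

def cos_taylor(x, degree):
    """Partial sum of the Maclaurin series of cos(x) up to the given even degree."""
    return sum((-1)**k * x**(2*k) / math.factorial(2*k) for k in range(degree // 2 + 1))

for x in (1, 5):
    print(f"x = {x}, cos(x) = {math.cos(x):.5f}")
    for degree in (0, 2, 4, 6, 8, 10):
        p = cos_taylor(x, degree)
        print(f"  p_{degree}({x}) = {p:9.5f}, |error| = {abs(p - math.cos(x)):.5f}")
```

At $x = 1$ the error shrinks with every term, while at $x = 5$ it first grows before the factorial in the denominator eventually wins.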

Now, whether $x = 5$ is really in the "vicinity" of $x = 0$ is something you might argue, but considering that the radius of convergence of the Taylor series about $x = 0$ is infinite, $x = 5$ is not really that far away. And we could always do a similar analysis for a function such as $\frac1{10000}\cos(10000x),$ for which the behavior of the Taylor polynomials at $x = 0.0005$ is analogous to the behavior we examined at $x = 5$ above.

Why is an analytic function $f(x)$ almost always equal to the Taylor series of the form $$f(x)=\sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!} (x-a)^n ?$$ Are there functions where this is not the case?

I would not say "almost always." There are some "nice" functions, such as polynomials or sinusoidal functions, whose Taylor series have an infinite radius of convergence; in general, however, the radius of convergence is finite, which means that the Taylor series you find around a particular point is wrong (in fact, it does not even converge) on a far greater part of the number line than the part on which it is correct.
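
As an illustration (a standard textbook example, not one from the question): the function $\frac1{1+x^2}$ is infinitely differentiable on all of $\mathbb{R},$ yet its Taylor series about $x = 0,$ $$\frac{1}{1+x^2} = \sum_{n=0}^{\infty} (-1)^n x^{2n},$$ converges only for $|x| < 1$; outside that interval the series diverges even though the function itself is perfectly well behaved there.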

On the other hand, if by "almost always" you just mean that the Taylor series is almost always correct within some neighborhood of the point we take it about, you can delete the word "almost." By definition, if a function $f$ is a real analytic function then at every real number $x_0$ the Taylor series of $f$ around $x=x_0$ is correct on some neighborhood of $x_0.$

And is there a clear-cut proof of why an infinite Taylor series equals an analytic function in general?

It is true by definition.

A more interesting question is whether an infinitely differentiable function always has a Taylor series at every point that is accurate on an interval about that point. The answer is no. An example that is often cited is $$ f(x) = \begin{cases} e^{-1/x^2} & x\neq 0, \\ 0 & x = 0, \end{cases} $$ whose Taylor series around $x = 0$ is simply zero, which is the correct value of the function only at the single point $x = 0$ itself. See this answer for further discussion.
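
If you want to see this concretely, here is a short SymPy sketch (my illustration, not part of the original answer) that checks that the derivatives of the nonzero branch all tend to $0$ at the origin, while the function itself is positive at every $x \neq 0$:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(-1 / x**2)  # the branch of the function for x != 0; f(0) is defined to be 0

# Each derivative of the x != 0 branch tends to 0 as x -> 0 (SymPy's default is the
# limit from the right; the left-hand limit agrees), so every Taylor coefficient at 0
# vanishes and the Maclaurin series is identically zero.
expr = f
for n in range(5):
    print(f"f^({n})(x) -> {sp.limit(expr, x, 0)} as x -> 0")
    expr = sp.diff(expr, x)

# Yet the function itself is strictly positive at every x != 0, so the Taylor series
# agrees with f only at the single point x = 0.
print("f(1/2) =", f.subs(x, sp.Rational(1, 2)).evalf())  # e^{-4}, about 0.0183
```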

David K
  • 108,155