I am taking some online courses in machine learning, and the instructor noted that a Taylor series can be used to approximate any function. A student left a comment saying that a Taylor series cannot approximate ANY function, but noted that the instructor probably stated it unequivocally because it can certainly approximate any function we would need for the class.
I am curious: what functions cannot be approximated by a Taylor series?
Two things lead me to believe that any function CAN be approximated by a Taylor series:
- In the context of regression, it seems I can always keep adding terms and adjusting their coefficients (or solving for them to reduce the error) to get an ever better approximation of the points in my data set (see the sketch after this list).
- The ambiguity of the word *approximate* suggests that we don't have to meet a strict criterion to say that a Taylor series approximates a function.
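
To make the first point concrete, here is a minimal sketch of that intuition (the data, noise level, and degrees are all made-up choices). Strictly speaking, `np.polyfit` does least-squares polynomial regression rather than a Taylor expansion, whose coefficients come from derivatives at a single point, but it captures the "keep adding terms to reduce the error" idea:

```python
# Minimal sketch: fitting polynomials of increasing degree to a fixed
# data set drives the error at the data points down.  The data here are
# made-up noisy samples of sin(x), purely for illustration.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 50)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)

for degree in (1, 3, 5, 9):
    coeffs = np.polyfit(x, y, degree)      # least-squares coefficients
    residual = y - np.polyval(coeffs, x)   # error at the data points
    print(f"degree {degree}: RMSE = {np.sqrt(np.mean(residual**2)):.4f}")
```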
With those things in mind, I can also see the point of the opposite view. For example, a function with a large discontinuity such as
$f(x) = \begin{cases} \sin(x)/x^2 & \text{if } x < 0 \\ 2x & \text{if } x > 100 \end{cases}$
seems like it wouldn't be well approximated by a Taylor series. But then again, without some definition of *approximate*, I don't see how we can say either way.
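
Here is a small numerical sketch of that suspicion, assuming we expand the left branch around an arbitrary point $x_0 = -5$ where it is smooth (the expansion point and truncation order are my choices, nothing canonical):

```python
# Sketch: expand sin(x)/x^2 (the left branch of f) in a Taylor series
# around x0 = -5, truncate it, and evaluate the resulting polynomial both
# near the expansion point and far away, where f(x) = 2x instead.
import sympy as sp

x = sp.symbols('x')
left_branch = sp.sin(x) / x**2

# Taylor polynomial around x0 = -5 with terms up to (x + 5)**7.
taylor = sp.series(left_branch, x, -5, 8).removeO()

print(float(taylor.subs(x, -4.9)))  # ~0.041, close to sin(-4.9)/(-4.9)**2
print(float(taylor.subs(x, 150)))   # enormous, nowhere near f(150) = 300
```

Near $x_0$ the truncated series tracks $\sin(x)/x^2$ closely, but at $x = 150$ the polynomial is astronomically far from $300$, which seems to support the student's comment, at least for a single expansion point.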