21

Unfortunately, I don't have much detail to give here, but is the general idea to cancel out the constant that appears when taking repeated derivatives?

For instance, say my function was $f(x)=f_0+f_1x+f_2x^2+\dotsb$

Then $f'(x)=f_1+2f_2x+\dotsb$.

And if the expansion is centered around $x=0$, then \begin{align}f'(0)&=f_1 \\ f''(0)&=2f_2\\ f'''(0)&=3\cdot 2f_3. \end{align}

Therefore \begin{align} f_0&=f(0) \\ f_1&=\frac{f'(0)}{1} \\ f_2&=\frac{f''(0)}{2} \end{align}

And so forth. Is that where the factorial comes from?

It is quite clear for a polynomial, but how does it work for a trig function such as $\sin(x)$, other than by appealing to Taylor's formula?
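
For what it's worth, the pattern can be checked numerically; here is a short sympy sketch (assuming sympy is available) comparing $f^{(n)}(0)/n!$ with the Maclaurin coefficients of $\sin x$:

```python
# Compare f^(n)(0) / n! with the coefficient of x^n in the
# Maclaurin series of sin(x), for n = 0, ..., 7.
import sympy as sp

x = sp.symbols('x')
f = sp.sin(x)
truncated = sp.series(f, x, 0, 8).removeO()   # x - x^3/3! + x^5/5! - x^7/7!

deriv = f
for n in range(8):
    coeff_from_derivative = deriv.subs(x, 0) / sp.factorial(n)
    assert sp.simplify(coeff_from_derivative - truncated.coeff(x, n)) == 0
    deriv = sp.diff(deriv, x)                 # move on to the next derivative
```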

Tom
  • 211
  • 1
    Sure. Differentiate $x^n$ a total of $n$ times, or integrate $1$ a total of $n$ times. – André Nicolas Oct 08 '12 at 16:32
  • If that's the case, how can a trig function be explained? – Tom Oct 08 '12 at 16:40
  • Sine and cosine have (reciprocal) factorials. Everybody does. The examples of missing, or apparently missing, factorials are things like $1/(1-x)$ and its relatives like $\log(1+x)$ and $\arctan x$, where factorials are produced by the differentiation process and largely cancel the factorial that comes from $x^n$. – André Nicolas Oct 08 '12 at 18:05
  • 1
    Ignoring differentiability issues and rigor, you can obtain the coefficients in a purely algebraic manner by following the method I used in my answer at power series expansion. – Dave L. Renfro Oct 08 '12 at 19:26

5 Answers

14

Start with the fundamental theorem of calculus: $$ f(x) = f(x_0) + \int_{x_0}^x f^\prime(y) \mathrm{d} y $$ and reapply it to $f'(y)$: $$ f(x) = f(x_0) + \int_{x_0}^x \left( f^\prime(x_0) + \int_{x_0}^y f^{\prime\prime}(z) \mathrm{d} z \right) \mathrm{d} y = f(x_0) + f^\prime(x_0) \int_{x_0}^x \mathrm{d} y + \underbrace{\int_{x_0}^x \left( \int_{x_0}^y f^{\prime\prime}(z) \mathrm{d} z\right)\mathrm{d} y}_{\mathcal{R}_2(x)} $$ Repeat this with $f^{\prime\prime}(z)$: $$ f(x) = f(x_0) + f^\prime(x_0) \underbrace{\int_{x_0}^x \mathrm{d} y}_{I_1(x)} + f^{\prime\prime}(x_0) \underbrace{\int_{x_0}^x \int_{x_0}^y \mathrm{d}z \mathrm{d} y}_{I_2(x)} + \underbrace{\int_{x_0}^x \int_{x_0}^y \int_{x_0}^z f^{(3)}(w) \mathrm{d} w \mathrm{d} z \mathrm{d} y}_{\mathcal{R}_3(x)} $$ and by continuing, we get: $$ f(x) = f(x_0) + f^\prime(x_0) \int_{x_0}^x \mathrm{d} y + \cdots + f^{(k)}(x_0) \underbrace{\int_{x_0}^{x} \int_{x_0}^{y_1} \int_{x_0}^{y_2} \cdots \int_{x_0}^{y_{k-1}} \mathrm{d} y_{k} \cdots\mathrm{d} y_3 \mathrm{d} y_2 \mathrm{d} y_1}_{I_k(x)} + \mathcal{R}_{k+1}(x) $$ The iterated integrals $I_k(x)$ are easy to evaluate. They can be defined recursively: $$ I_0(x) = 1, \quad I_k(x) = \int_{x_0}^x I_{k-1}(y) \mathrm{d} y $$ giving $I_k(x) = \frac{1}{k!} (x-x_0)^k$, which is where the factorial comes from.
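
For anyone who wants to verify the closed form for $I_k$, here is a short sympy sketch (assuming sympy is available; the symbol names are only illustrative) that iterates the recursion and compares against $(x-x_0)^k/k!$:

```python
# Iterate I_0(x) = 1, I_k(x) = integral from x0 to x of I_{k-1}(y) dy,
# and check that I_k(x) = (x - x0)^k / k! for the first few k.
import sympy as sp

x, y, x0 = sp.symbols('x y x0')

I = sp.Integer(1)                                # I_0(x) = 1
for k in range(1, 6):
    I = sp.integrate(I.subs(x, y), (y, x0, x))   # I_k(x)
    assert sp.simplify(I - (x - x0)**k / sp.factorial(k)) == 0
```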

Elmina
  • 139
Sasha
  • 71,686
  • 1
    Very insightful! The last two steps involve calculating the antiderivatives and observing that the denominator corresponds to the factorial. – Nick Mar 02 '21 at 21:37
  • Not easy to understand for someone who's not super familiar with integrals, such as me. – zzzgoo Mar 26 '23 at 12:30
6

Watch this video to see why there are factorials: http://www.youtube.com/watch?v=QMJvRNFhEGc This guy is a simple but effective teacher.

In short, the Taylor Series expansion is derived from the Power Series formula.

The Power Series formula is: $f(x) = a_0 + a_1x^1 + a_2x^2 + a_3x^3 + ...$

For example, taking multiple derivatives of the single term $f(x) = ax^5$ leads to:

$f(x) = ax^5$

$f'(x) = 5ax^4$ then...

$f''(x) = 20ax^3$ but wait!! Instead, write the second derivative as $5\cdot 4\,a\,x^3$

$f'''(x) = 5\cdot 4\cdot 3\,a\,x^2$

$f''''(x) = 5\cdot 4\cdot 3\cdot 2\,a\,x$

$f'''''(x) = 5\cdot 4\cdot 3\cdot 2\cdot 1\,a$

Evaluating the fifth derivative at $0$ and solving for $a$ yields $a = f'''''(0)/5!$. Now insert this value of $a$ back into the Power Series term $ax^5$. So, $[f'''''(0)/5!]x^5$.
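
Here is the same arithmetic as a small sympy sketch (assuming sympy is installed), just to confirm the cancellation:

```python
# Differentiate a*x**5 five times, evaluate at 0, and recover a via 5!.
import sympy as sp

x, a = sp.symbols('x a')
term = a * x**5

fifth_derivative = sp.diff(term, x, 5)                        # 5*4*3*2*1*a = 120*a
recovered_a = fifth_derivative.subs(x, 0) / sp.factorial(5)   # divide by 5!
assert sp.simplify(recovered_a - a) == 0
```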

It's not easy to typeset this here, but hopefully the video will help.

Ritch
  • 69
5

It sounds like you already accept that the $n!$ terms make sense when you're talking about polynomials. For other functions like $\sin{x}$, the whole motivation for Taylor series is to approximate those functions by polynomials, so in my opinion the $n!$ terms appear because that is precisely the property mathematicians wanted from Taylor series when they first invented them: that any function, $\sin{x}$, $\ln{x}$, etc., could look like a polynomial.

Alternatively, maybe this can help you see it: if we have the Taylor series for $f(x)$ at $0$, $$ f(0) + f'(0)x + \frac{1}{2} f''(0) x^2 + \frac{1}{3!} f'''(0) x^3 + \ldots$$ then if we differentiate this series once, we get $$ f'(0) + f''(0) x + \frac{1}{2} f'''(0) x^2 + \ldots $$ which is the Taylor series for $f'(x)$ at $0$! Notice that all the terms "shifted" downwards, allowing us to recover the familiar form of the Taylor series.
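
If it helps, this "shift" can be checked symbolically; the sketch below (assuming sympy, and using $e^x\cos x$ purely as an example function) differentiates a truncated series term by term and compares it with the series of $f'$:

```python
# Differentiating the truncated series of f reproduces the series of f'.
import sympy as sp

x = sp.symbols('x')
f = sp.exp(x) * sp.cos(x)      # an arbitrary smooth example

series_of_f = sp.series(f, x, 0, 6).removeO()                    # up to x^5
series_of_fprime = sp.series(sp.diff(f, x), x, 0, 5).removeO()   # up to x^4

assert sp.expand(sp.diff(series_of_f, x) - series_of_fprime) == 0
```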

  • That makes sense. So in your opinion, it was extended to other functions that aren't naturally polynomials based on Taylor's theorem. – Tom Oct 08 '12 at 16:56
  • @Tom, yeah, that's my opinion. But others may have a different interpretation, so you should stick around and see what other people have to say. – Christopher A. Wong Oct 08 '12 at 17:54
3

Since other answers have already covered the "real" reason, let's try to make an "intuitive" explanation.

Say you're trying to approximate the function $f(x)$ with a 3rd-degree polynomial $P(x)=ax^3+bx^2+cx+d$ around the point $x=0$. The least you can ask is that their values and their first, second, and third derivatives match at $x=0$. So:

$$f(0) = P(0) = d$$ $$f'(0) = P'(0) = c$$ $$f''(0) = P''(0) = 2b$$ $$f'''(0) = P'''(0) = 6a$$

Put otherwise: $$d = f(0)$$ $$c = f'(0)$$ $$b = \frac{1}{2} f''(0)$$ $$a = \frac{1}{6} f'''(0)$$

Of course, the reasoning would be similar for a general $n$th degree polynomial $Q(x) = a_n x^n + a_{n-1} x^{n-1} + ... + a_1 x + a_0$, where

$$f^{(k)}(0) = Q^{(k)}(0) = k! a_k $$

Therefore:

$$a_k = \frac{1}{k!} f^{(k)}(0)$$

Of course, what we've stated here still holds for Taylor polynomials not centered around $x=0$.
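
As a concrete illustration (a sketch assuming sympy, with $f=\sin$ chosen only as an example), solving the matching conditions above recovers exactly the $1/k!$ coefficients:

```python
# Solve for a, b, c, d so that P and its first three derivatives
# agree with sin at x = 0.
import sympy as sp

x, a, b, c, d = sp.symbols('x a b c d')
P = a*x**3 + b*x**2 + c*x + d
f = sp.sin(x)

conditions = []
Pk, fk = P, f
for k in range(4):                       # match P, P', P'', P''' at 0
    conditions.append(sp.Eq(Pk.subs(x, 0), fk.subs(x, 0)))
    Pk, fk = sp.diff(Pk, x), sp.diff(fk, x)

print(sp.solve(conditions, [a, b, c, d]))
# -> {a: -1/6, b: 0, c: 1, d: 0}, i.e. a_k = f^(k)(0) / k!
```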

David
  • 3,144
1

$\forall x \in \mathbb{R}:\ \int_0^x t \,dt = \frac{x^2}{2!}$

$\forall x \in \mathbb{R}:\ \int_0^x \frac{t^2}{2!} \,dt = \frac{x^3}{3!}$

$\forall x \in \mathbb{R}:\ \int_0^x \frac{t^3}{3!} \,dt = \frac{x^4}{4!}$

$\forall x \in \mathbb{R}:\ \int_0^x \frac{t^4}{4!} \,dt = \frac{x^5}{5!}$
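
These identities can be confirmed mechanically; here is a brief sympy check (assuming sympy is installed):

```python
# Integrating t^k / k! from 0 to x gives x^(k+1) / (k+1)!.
import sympy as sp

x, t = sp.symbols('x t')
for k in range(1, 5):
    lhs = sp.integrate(t**k / sp.factorial(k), (t, 0, x))
    assert sp.simplify(lhs - x**(k + 1) / sp.factorial(k + 1)) == 0
```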

Timothy
  • 860