
Background Information

In my previous post, I asked how to derive the limit definition of $e$ from the series definition, and the answer given by @peek-a-boo is very thought-provoking. I followed his idea, and tried to cook up an argument which derives the limit definition of $e^x$ for $x \in \mathbb{R}$ from the series definition.

The series definition and limit definition are listed below:

Definition 1. (through series) $$e^x := \sum_{n=0}^{\infty} \frac{x^n}{n!} , x \in \mathbb{R}$$ Definition 2. (through limit) $$e^x := \lim_{n\rightarrow \infty}\left(1 + \frac{x}{n} \right)^n , x \in \mathbb{R}$$

And here are some important theorems and a lemma that will be used in the proof:

a. Monotone convergence theorem (for a monotone sequence of real numbers)

a1. Let $(a_n)$ be a real non-decreasing sequence which is bounded above, then its limit exists when $n \rightarrow \infty$, and we have $$\lim_{n \rightarrow \infty} a_n = \sup_n a_n < \infty$$ a2. Let $(a_n)$ be a real non-increasing sequence which is bounded below, then its limit exists when $n \rightarrow \infty$, and we have $$\lim_{n \rightarrow \infty} a_n = \inf_n a_n > -\infty$$

b. Ratio test

Theorem. Let $(a_n)$ be a sequence of nonzero real or complex numbers, and suppose $L = \lim_{n \rightarrow \infty}\left|\frac{a_{n+1}}{a_n}\right|$ exists. Then: $$\begin{cases} L<1 &\Rightarrow \text{the series $\sum_n a_n$ converges absolutely} \\ L>1 &\Rightarrow \text{the series $\sum_n a_n$ diverges} \\ L=1 &\Rightarrow \text{the test is inconclusive} \end{cases}$$

c. One important lemma for convergence of sequences

Lemma. Let $(a_n)$ be a sequence of real or complex terms, s.t. $\lim_{n\rightarrow \infty} n(a_n - 1) = 0$, then we have $\lim_{n \rightarrow \infty} a_n^n = 1$.
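As a quick numerical sanity check (a sketch, not part of the proof, using the hypothetical test sequence $a_n = 1 + 1/n^2$, for which $n(a_n - 1) = 1/n \rightarrow 0$):

```python
# Sanity check of the lemma with a_n = 1 + 1/n^2:
# here n*(a_n - 1) = 1/n -> 0, so a_n^n should tend to 1.
def a(n):
    return 1 + 1 / n**2

for n in (10, 100, 10_000):
    print(n, a(n) ** n)  # approaches 1 as n grows
```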

Below I will post my own proof, and I hope to see different ideas on it.


My Attempt

Fix an arbitrary $x \in \mathbb{R}$, and write $t_n = \frac{x^n}{n!}$ for the terms and $s_n = \sum_{k=0}^{n} \frac{x^k}{k!}$ for the partial sums of the series. For $x \neq 0$ (the case $x = 0$ is trivial, as the series then equals $1$), the ratio test gives

$$\begin{aligned} \lim_{n\rightarrow \infty} \left| \frac{t_{n+1}}{t_n} \right| &= \lim_{n\rightarrow \infty} \left| \frac{x^{n+1}}{(n+1)!} \frac{n!}{x^n} \right|\\ &= \lim_{n\rightarrow \infty} \left| \frac{x}{n+1}\right| \\ &= 0 \\ &< 1 \end{aligned}$$

So we have the series

$$\sum_{n=0}^{\infty} \frac{x^n}{n!}$$

converges absolutely for every $x \in \mathbb{R}$, and absolute convergence implies convergence. So we may indeed define $e^x$ as

$$\forall x \in \mathbb{R}, e^x := \sum_{n=0}^{\infty} \frac{x^n}{n!}$$
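For concreteness, the convergence can be illustrated numerically (a sketch comparing truncated sums against Python's built-in `math.exp`):

```python
import math

def partial_sum(x, n):
    """s_n = sum_{k=0}^{n} x^k / k!"""
    return sum(x**k / math.factorial(k) for k in range(n + 1))

for x in (-2.0, 0.5, 3.0):
    print(x, partial_sum(x, 30), math.exp(x))  # 30 terms already agree closely
```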

Now we derive the limit definition from the series definition. Suppose first that $x\geq 0$, and let

$$a_n = \left( 1 + \frac{x}{n} \right)^n$$

and by Binomial Theorem, we have

$$\begin{aligned} a_n &=\left( 1 + \frac{x}{n} \right)^n \\ &= \sum_{k=0}^{n} \binom{n}{k} \frac{x^k}{n^k} \\ &= \sum_{k=0}^{n} \frac{n!}{k! (n-k)!} \frac{x^k}{n^k} \\ &= \sum_{k=0}^{n} \frac{x^k}{k!} \frac{n-k+1}{n} \frac{n-k+2}{n} \cdots \frac{n-1}{n} \frac{n}{n} \\ &= \sum_{k=0}^{n} \frac{x^k}{k!} \prod_{j=0}^{k-1} \left( 1 - \frac{j}{n} \right) \\ &\leq \sum_{k=0}^{n} \frac{x^k}{k!} \\ &= s_n \end{aligned}$$

Next, to see that the limit of $(a_n)$ exists: since $x\geq 0$, we have

$$\begin{aligned} a_{n+1} &= \sum_{k=0}^{n+1} \frac{x^k}{k!} \prod_{j=0}^{k-1} \left( 1 - \frac{j}{n+1} \right) \\ &\geq \sum_{k=0}^{n+1} \frac{x^k}{k!} \prod_{j=0}^{k-1} \left( 1 - \frac{j}{n} \right) \\ &\geq \sum_{k=0}^{n} \frac{x^k}{k!} \prod_{j=0}^{k-1} \left( 1 - \frac{j}{n} \right) \\ &= a_n \end{aligned}$$

so $(a_n)$ is non-decreasing; it is also bounded above, since $a_n \leq s_n \leq e^x$ for every $n$. By the monotone convergence theorem the limit of $(a_n)$ exists, and

$$\lim_{n \rightarrow \infty} a_n = \sup_n a_n \leq \lim_{n \rightarrow \infty} s_n$$
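Both facts used here, that $(a_n)$ is non-decreasing and that $a_n \leq s_n$, can be checked numerically (a sketch for a sample value $x \geq 0$):

```python
import math

def a(x, n):
    """a_n = (1 + x/n)^n"""
    return (1 + x / n) ** n

def s(x, n):
    """Partial sum s_n = sum_{k=0}^{n} x^k / k!"""
    return sum(x**k / math.factorial(k) for k in range(n + 1))

x = 2.0
seq = [a(x, n) for n in range(1, 50)]
assert all(u <= v for u, v in zip(seq, seq[1:]))     # non-decreasing in n
assert all(a(x, n) <= s(x, n) for n in range(1, 50)) # bounded above by s_n
print(seq[-1], s(x, 49))  # a_n stays below its limit e^2
```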

Next, fix $m \in \mathbb{N}$ and take any $n > m$; then we have

$$\begin{aligned} a_n &= \sum_{k=0}^{n} \frac{x^k}{k!} \prod_{j=0}^{k-1} \left( 1 - \frac{j}{n} \right) \\ &\geq \sum_{k=0}^{m} \frac{x^k}{k!} \prod_{j=0}^{k-1} \left( 1 - \frac{j}{n} \right) \\ \end{aligned}$$

and this implies

$$\begin{aligned} \lim_{n\rightarrow \infty}a_n &\geq \lim_{n\rightarrow \infty} \sum_{k=0}^{m} \frac{x^k}{k!} \prod_{j=0}^{k-1} \left( 1 - \frac{j}{n} \right) \\ &= \sum_{k=0}^{m} \frac{x^k}{k!} \lim_{n\rightarrow \infty}\prod_{j=0}^{k-1} \left( 1 - \frac{j}{n} \right) \\ &= \sum_{k=0}^{m} \frac{x^k}{k!} \\ &= s_m \end{aligned}$$

Since this holds for every $m$, letting $m \rightarrow \infty$ gives

$$\lim_{m\rightarrow \infty}s_m \leq \lim_{n\rightarrow \infty}a_n$$

Thus, we get

$$e^x = \lim_{m\rightarrow \infty}s_m \leq \lim_{n\rightarrow \infty}a_n \leq \lim_{n\rightarrow \infty}s_n = e^x$$

and by squeeze theorem, we have

$$\forall x \geq 0, \lim_{n\rightarrow \infty}a_n = e^x$$

For the case $x\leq0$, we use the identity

$$\left(1 + \frac{x}{n} \right)^n \left(1 - \frac{x}{n} \right)^n = \left(1 - \frac{x^2}{n^2} \right)^n$$

since we have proved

$$\forall x \geq 0, \lim_{n\rightarrow \infty} \left(1 + \frac{x}{n} \right)^n = e^x$$

if we can prove

$$\lim_{n\rightarrow \infty} \left(1 - \frac{x^2}{n^2} \right)^n = 1$$

then we will get

$$\forall x \geq 0, \lim_{n\rightarrow \infty} \left(1 - \frac{x}{n} \right)^n = e^{-x}$$

which is exactly the statement for $x \leq 0$ (replace $x$ by $-x$). So it remains to prove this limit. Fix an arbitrary $x \in \mathbb{R}$ and apply the lemma with $a_n = 1 - \frac{x^2}{n^2}$:

$$\lim_{n\rightarrow \infty} n\left(\left( 1 - \frac{x^2}{n^2}\right) - 1\right) = \lim_{n\rightarrow \infty} \left(-\frac{x^2}{n}\right) = 0$$

and this implies that we indeed have $$\lim_{n\rightarrow \infty}\left(1 - \frac{x^2}{n^2} \right)^n = 1 $$
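Numerically (a sketch, not part of the proof), the convergence $(1 - x^2/n^2)^n \rightarrow 1$ is already visible for moderate $n$:

```python
# For fixed x, (1 - x^2/n^2)^n behaves like exp(-x^2/n) for large n,
# which tends to 1.
x = 3.0
for n in (10, 100, 10_000):
    print(n, (1 - x**2 / n**2) ** n)
```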

and thus we indeed get

$$\forall x \geq 0, \lim_{n\rightarrow \infty} \left(1 - \frac{x}{n} \right)^n = e^{-x}$$

which is equivalent to

$$\forall x \leq 0, \lim_{n\rightarrow \infty} \left(1 + \frac{x}{n} \right)^n = e^{x}$$

Hence, we have

$$\forall x \in \mathbb{R}, \sum_{n=0}^{\infty} \frac{x^n}{n!} = \lim_{n\rightarrow \infty} \left(1 + \frac{x}{n} \right)^n = e^{x}$$
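As a final sanity check (again just a sketch), both definitions agree numerically for positive and negative $x$:

```python
import math

def series(x, terms=60):
    """Truncated series definition of e^x."""
    return sum(x**k / math.factorial(k) for k in range(terms))

def limit(x, n=10**6):
    """Limit definition of e^x, evaluated at a large finite n."""
    return (1 + x / n) ** n

for x in (-3.0, 0.0, 2.5):
    print(x, series(x), limit(x), math.exp(x))  # all three agree closely
```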


My Question

The proof is quite algebra-heavy, and I am wondering whether there are any different proofs, for example those using clever theorems/results in real analysis. And as @peek-a-boo has said in his comment, maybe the Dominated Convergence Theorem or other interesting theorems can offer a very simple proof. But I currently have no idea about it.

All ideas are welcome. Thanks for your help!

  • Well, this question is much more different than the other one you asked, and now you should look up all the dupes (and the various others which I didn’t explicitly link to). Anyway, using the dominated convergence theorem gives an easy proof even for all $z\in \Bbb{C}$, in particular it’s not really reliant on the ordering of the real line. – peek-a-boo Jan 24 '25 at 05:12
  • Your help is greatly appreciated. I haven't done much deeper things in real analysis like the Dominated Convergence Theorem yet, but I will read the wiki article to know more about it. – Yunxuan Zhang Jan 24 '25 at 05:15
  • I’m pretty confident that we can find a concrete enough proof of it without all the measure-theoretic baggage. So if you want to know that, you should ask that separately (or edit your question accordingly), but I believe as written, this question is a dupe of some of those others I linked to. – peek-a-boo Jan 24 '25 at 05:20
  • Well I currently have no idea how to cook up a simple and clear proof. This is why I am asking for more ideas. – Yunxuan Zhang Jan 24 '25 at 05:26
  • See related https://math.stackexchange.com/a/637687/72031 – Paramanand Singh Jan 24 '25 at 06:12
  • Another approach which requires more effort (but perhaps enjoyable as well) is to start with any definition of the symbol $e^x$ or $\exp(x)$ and, using that definition, establish all the properties of the function. Thus for example using the limit definition you can prove that $(e^x)'=e^x$ and directly obtain its Taylor series. – Paramanand Singh Jan 24 '25 at 06:14
  • I wrote a few posts on theories of logarithm and exponential function based on various definitions and you may have a look at them: post 1, post 2, post 3. – Paramanand Singh Jan 24 '25 at 06:17
  • @ParamanandSingh Thanks for your comments. The lemma you used is quite useful to me, and I appreciate your articles very much. – Yunxuan Zhang Jan 24 '25 at 06:30
  • Since you have also mentioned the lemma concerning $n(a_n-1)\to 0$, you should note that the lemma itself can be used to derive all the properties of $\exp(z)$ starting with the definition $(1+(z/n))^n$ for $z\in\mathbb{C}$. – Paramanand Singh Jan 24 '25 at 06:36
  • @ParamanandSingh Yes, I know, this is why I said this lemma is very useful to me. – Yunxuan Zhang Jan 24 '25 at 12:54

2 Answers


The case $x\geq 0$ I think you handled well based on ideas from your previous question, while for $x<0$ the sort of ‘reflection’ trick you used seems to be the standard approach (it’s very briefly alluded to in Wikipedia). So this amount of algebraic work seems necessary for keeping the proof elementary. But anyway, if you know a little bit about $\limsup$ (just to make life less stressful with $\epsilon$-$N$’s) and that the tail of a convergent series converges to $0$, the DCT is easy.

Besides, I think the DCT, even in this special case, is important enough that it’s worth writing an answer for, even though the question itself about exponentials isn’t new. The reason why the DCT is so important is that it provides us a very general condition under which limits and series (or more generally integrals) commute, and this allows us to more naturally consider the limits which arise in such ‘practical’ situations, without having to rely on specific properties of $\Bbb{R}$, like its total ordering, or having various lemmas lying around. So, it’s one of those things that you prove once and for all and then happily apply in 95% of situations.


Discrete Dominated Convergence Theorem.

Recall that a sequence of real numbers is really just a function $f:\Bbb{N}\to\Bbb{R}$; and this can be thought of as a function $f:\Bbb{Z}\to\Bbb{R}$ in a very natural way (by setting it to $0$ for negative integers), or we could look at any infinite subset of $\Bbb{Z}$ as our indexing set for the sequence. In what follows I shall adopt this function notation because I don't want doubly-indexed sequences. We can actually be more general than this, but I think this level of generality is more than enough in practice.

Theorem (Discrete DCT: Interchanging Limits with Series).

Let $V$ be a Banach space, and let $\{f_n:\Bbb{Z}\to V\}_{n=1}^{\infty}$ be a collection of $V$-valued sequences. Suppose the following:

  • Pointwise convergence: $f_n$ converges pointwise to some $f:\Bbb{Z}\to V$, i.e. for each $k\in\Bbb{Z}$, the limit $f(k):=\lim\limits_{n\to\infty}f_n(k)$ exists in $V$.
  • Domination: For each $k\in\Bbb{Z}$, define $g(k):= \sup\limits_{n\geq 1}\|f_n(k)\|$ and suppose that the series $\sum\limits_{k\in\Bbb{Z}}g(k)$ is finite.

Then, each $f_n$ and $f$ is absolutely-summable (hence by completeness of $V$ also summable) and we have that $\sum\limits_{k\in\Bbb{Z}}\|f_n(k)-f(k)\|\to 0$ as $n\to\infty$; which by the triangle inequality of course implies that $\sum\limits_{k\in\Bbb{Z}}f_n(k)\to \sum\limits_{k\in\Bbb{Z}}f(k)$ as $n\to\infty$.

The proof is pretty straightforward; if you’re uncomfortable with general Banach spaces, just replace $V$ with $\Bbb{R}$ or $\Bbb{C}$ and replace all instances of $\|\cdot\|$ with absolute values $|\cdot|$.

  • First, observe that $\|f(k)\|=\lim\limits_{n\to\infty}\|f_n(k)\|\leq g(k)$, and by definition $\|f_n(k)\|\leq g(k)$. So, summability of $g$ implies absolute summability of each $f_n$ and of $f$ (hence by completeness of $V$ this implies summability).
  • Next, let $\phi_n(k):=\|f_n(k)-f(k)\|$. By hypothesis, for each $k\in\Bbb{Z}$, we have $\phi_n(k)\to 0$ as $n\to\infty$. Fix any $n,N\in\Bbb{N}$. Then, \begin{align} \sum_{k\in\Bbb{Z}}\phi_n(k)&=\sum_{|k|\leq N}\phi_n(k)+\sum_{|k|>N}\phi_n(k)\\ &\leq \sum_{|k|\leq N}\phi_n(k)+\sum_{|k|>N}2g(k). \end{align} Now, take $\limsup$ of both sides to get \begin{align} \limsup_{n\to\infty}\sum_{k\in\Bbb{Z}}\phi_n(k)&\leq \limsup_{n\to\infty}\left(\sum_{|k|\leq N}\phi_n(k)+\sum_{|k|>N}2g(k)\right)\\ &=\lim_{n\to\infty}\left(\sum_{|k|\leq N}\phi_n(k)+\sum_{|k|>N}2g(k)\right)\tag{since the limit exists}\\ &=0+\sum_{|k|>N}2g(k). \end{align} Now, since $g$ is summable, it follows that the tail of the series vanishes as $N\to\infty$ (this is very easy to prove right from the definitions), so by taking $N\to\infty$, the RHS vanishes. This shows \begin{align} \limsup_{n\to\infty}\sum_{k\in\Bbb{Z}}\phi_n(k)&\leq 0. \end{align} Thus, we have equality, and if a $\limsup$ of non-negative quantities is $0$, then the limit exists and is also $0$. This completes the proof.

So, notice that I only used very few things: absolute summability and completeness of $V$ imply summability, and that the tail of a convergent series vanishes, and some basic facts about $\limsup$.


Application to the exponential.

Fix any $z\in\Bbb{C}$. Then, the binomial expansion gives \begin{align} \left(1+\frac{z}{n}\right)^n&=\sum_{k=0}^n\frac{z^k}{k!}\prod_{j=0}^{k-1}\left(1-\frac{j}{n}\right),\tag{$*$} \end{align} where the empty product for $k=0$ is treated as $1$. This motivates the following definition: $f_n:\Bbb{Z}_{\geq 0}\to\Bbb{C}$ defined as \begin{align} f_n(k):=\chi_{[0,n]}(k)\cdot \frac{z^k}{k!}\prod_{j=0}^{k-1}\left(1-\frac{j}{n}\right), \end{align} where $\chi_{[0,n]}(k)$ equals $1$ if $0\leq k\leq n$ and is $0$ otherwise. With this, we can write $\left(1+\frac{z}{n}\right)^n=\sum\limits_{k\geq 0}f_n(k)$. Observe further that

  • for each $k\geq 0$, we have $f_n(k)\to\frac{z^k}{k!}\equiv f(k)$ as $n\to\infty$.
  • for all $k\geq 0,n\geq 1$, we have $|f_n(k)|\leq \frac{|z|^k}{k!}$, and the latter is obviously summable by the ratio/root test.

Hence, by the discrete DCT, the limit as $n\to\infty$ of the series exists and can be interchanged with the series: \begin{align} \lim_{n\to\infty}\left(1+\frac{z}{n}\right)^n&=\lim_{n\to\infty}\sum_{k\geq 0}f_n(k)=\sum_{k\geq 0}f(k)=\sum_{k\geq 0}\frac{z^k}{k!}. \end{align} In particular, we’re not really using the ordering on $\Bbb{R}$ (not directly anyway). So, notice that the DCT provides us with the rigorous justification for something we’re all tempted to do: simply let $n\to\infty$ in the summation in $(*)$ and let $n\to\infty$ in each of the products.
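The complex case can likewise be checked numerically with the standard `cmath` module (a sketch, not part of the argument):

```python
import cmath

def series(z, terms=60):
    """Truncated sum of z^k / k!, computed term by term to avoid factorials."""
    total, term = 0 + 0j, 1 + 0j
    for k in range(terms):
        total += term
        term *= z / (k + 1)
    return total

z = 1 + 2j
n = 10**6
print((1 + z / n) ** n)  # binomial-limit approximation
print(series(z))         # series value
print(cmath.exp(z))      # library reference
```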

peek-a-boo

Let $$f_n(x)=\sum_{k=0}^n {x^k\over k!},\quad g_n(x)= \left (1+{x\over n}\right )^n$$ Then $$f_n'(x)=f_{n-1}(x),\quad \left (1+{x\over n}\right ) g_n'(x)=g_n(x)\quad (*)$$ The sequences $f_n(x)$ and $g_n(x)$ converge uniformly for $|x|\le a,$ and so do $f_n'(x)$ and $g_n'(x).$ Thus the limits of $f_n(x)$ and $g_n(x)$ represent differentiable functions $f(x)$ and $g(x),$ respectively. Moreover $(*)$ implies $$ f'(x)=f(x),\quad g'(x)=g(x),\ \ f(0)=1=g(0)\ \ (**)$$ Thus $h(x)=f(x)-g(x)$ satisfies $h'(x)=h(x)$ and $h(0)=0.$ This implies $h(x)=0.$

Remark The uniform convergence of $f_n(x)$ follows from the Weierstrass M-test. The uniform convergence of $g_n(x)$ can be proved by the Arzelà-Ascoli theorem, since the functions $g_n(x)$ and $g_n'(x)$ are bounded on $|x|\le a.$ Strictly speaking, the Arzelà-Ascoli theorem only implies that $g_n(x)$ contains a uniformly convergent subsequence, but every convergent subsequence of $g_n(x)$ converges to the same limit by $(**).$ In this way we avoid any involved calculations.

  • Nice work! Thanks for your idea! – Yunxuan Zhang Jan 24 '25 at 12:45