6

I am trying to show the function defined by: $$E(x)=\lim_{n\to\infty}\left(1+\frac{x}{n}\right)^n$$ satisfies the property: $$E(x)E(y)=E(x+y)$$


Assuming $E$ is well defined, I can interchange products and limits (?). We have:

$$\begin{aligned} E(x)E(y) &=\lim_{n\to\infty}\left(1+\frac{x}{n}\right)^n\lim_{n\to\infty}\left(1+\frac{y}{n}\right)^n \\ &=\lim_{n\to\infty}\left[\left(1+\frac{x}{n}\right)\left(1+\frac{y}{n}\right)\right]^n \\ &=\lim_{n\to\infty}\left[1+\frac{x+y}{n}+\frac{xy}{n^2}\right]^n\\ &=\lim_{n\to\infty} \sum_{k=0}^n\binom{n}{k}\left(1+\frac{x+y}{n}\right)^{n-k}\left(\frac{xy}{n^2}\right)^k \\ & = E(x+y)+\lim_{n\to\infty} \sum_{k=1}^n\binom{n}{k}\left(1+\frac{x+y}{n}\right)^{n-k}\left(\frac{xy}{n^2}\right)^k\end{aligned}$$

This is where I am stuck. I can see that, for any fixed $k$: $$\binom{n}{k}\left(1+\frac{x+y}{n}\right)^{n-k}\left(\frac{xy}{n^2}\right)^k=\frac{1}{k!}\left(\frac{xy}{n}\right)^k\left(1+\frac{x+y}{n}\right)^{n-k}\prod_{r=0}^{k-1}\left(1-\frac{r}{n}\right)\overset{n\to\infty}{\to} 0$$ But because the sum gains more terms as $n$ increases, I feel this is not sufficient to argue that its limit is $0$. Is that right? If so, I'm not sure how to go about it and would appreciate some help.
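
For what it's worth, a rough numerical check (not a proof, just arbitrary sample values) does suggest the $k\geq 1$ tail of the sum vanishes as $n$ grows:

```python
# Rough sanity check (not a proof): compare E_n(x)E_n(y) with E_n(x+y)
# and the size of the k >= 1 tail of the binomial expansion above.

def tail(x, y, n):
    """Sum over k = 1..n of C(n,k) (1+(x+y)/n)^(n-k) (xy/n^2)^k,
    accumulated via the ratio of consecutive terms so that no huge
    binomial coefficients are ever formed."""
    base = 1 + (x + y) / n
    ratio = x * y / n ** 2
    term = base ** n  # the k = 0 term
    total = 0.0
    for k in range(1, n + 1):
        term *= (n - k + 1) / k * ratio / base
        total += term
    return total

x, y = 1.3, -0.7  # arbitrary sample values
for n in (10, 100, 1000, 10000):
    prod = (1 + x / n) ** n * (1 + y / n) ** n   # E_n(x) * E_n(y)
    target = (1 + (x + y) / n) ** n              # E_n(x + y)
    print(f"n={n:>6}  product={prod:.6f}  E_n(x+y)={target:.6f}  tail={tail(x, y, n):+.2e}")
```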

K. 622
  • 933
  • 1
    If you know that $E(x)\neq 0$ for all $x\in\mathbb{R}$, then an easy proof is given by considering the sequence $x_n=\dfrac{1+\frac{x+y}{n}}{\left(1+\frac{x}{n}\right)\left(1+\frac{y}{n}\right)}$ and noting that $n(x_n-1)\to 0$, so that $x_n^n\to 1$. See this answer https://math.stackexchange.com/a/3000717/72031 – Paramanand Singh Nov 23 '18 at 04:34

3 Answers

4

You do not need to deal with that hard expansion. Here is another way, using the Squeeze theorem, for $xy\ge 0$; the case $xy\le 0$ is similar. Notice that for any $a>0$ and large enough $n$ (say $n>a$) we have $$1+\frac{x+y}{n}\le 1+\frac{x+y}{n}+\frac{xy}{n^2}\le 1+\frac{x+y}{n}+\frac{xy}{an}=1+\frac{x+y+\frac{xy}{a}}{n}.$$ Taking $n$-th powers and passing to the limit gives $$\lim_{n\to\infty}\left(1+\frac{x+y}{n}\right)^n\le \lim_{n\to\infty}\left(1+\frac{x+y}{n}+\frac{xy}{n^2}\right)^n\le \lim_{n\to\infty}\left(1+\frac{x+y+\frac{xy}{a}}{n}\right)^n,$$ or, using the definition, $$E(x+y)\le E(x)E(y)\le E\!\left(x+y+\frac{xy}{a}\right).$$ Since this is true for any $a>0$, letting $a$ tend to $\infty$ and applying the Squeeze theorem we obtain $$E(x+y)\le E(x)E(y)\le E(x+y),$$ which yields $$E(x+y)=E(x)E(y).$$

Mostafa Ayaz
  • 33,056
  • 1
    Thank you, that is easier indeed. I assume you mean $a\to\infty$ as opposed to $\to 0$? – K. 622 Nov 22 '18 at 22:48
  • Yes, that's because of how I plugged $a$ into the inequality. – Mostafa Ayaz Nov 23 '18 at 07:28
  • 2
    @Mostafa Ayaz: What is the argument for letting $a$ tend to $\infty$? It seems that you use continuity of $E$ at a very early stage. – Jens Schwaiger Nov 27 '18 at 08:14
  • The idea I used is that $ax$ grows much more slowly than $x^2$, for any $a>0$ and sufficiently large $x$. – Mostafa Ayaz Nov 27 '18 at 08:19
  • If you let $a\to\infty$ from the beginning, the inequality $\frac{xy}{n^2}\lt\frac{xy}{an}$ would be compromised and would be close to an equality, since $n\to\infty$ as well. Then the whole sandwich argument would not work, I think. – lazare Jul 04 '24 at 22:23
  • I agree with Jens and @lazare. This answer is wrong. – Anne Bauval Jul 05 '24 at 15:04
3

I don't recommend trying to do it this way. The cleanest proof I know is a bit less direct than this. First, show that $E(x)$ is the unique solution to the differential equation $E'(x) = E(x)$ with initial condition $E(0) = 1$, and more generally that $C E(x)$ is the unique solution with initial condition $E(0) = C$.

For existence you'll want to exchange a limit and a derivative and you'll need to be careful about that, but once you've justified that exchange, $\frac{d}{dx} \left( 1 + \frac{x}{n} \right)^n = \left( 1 + \frac{x}{n} \right)^{n-1}$ so it's clear that the two limits are the same. Morally the point is that the limit definition of $E(x)$ is attempting to solve this differential equation using Euler's method.
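
In case it helps to see that step spelled out: by the chain rule, $$\frac{d}{dx}\left(1+\frac{x}{n}\right)^n = n\left(1+\frac{x}{n}\right)^{n-1}\cdot\frac{1}{n}=\left(1+\frac{x}{n}\right)^{n-1}=\frac{\left(1+\frac{x}{n}\right)^n}{1+\frac{x}{n}},$$ which differs from the original sequence only by the factor $1+\frac{x}{n}\to 1$, so (granting the exchange of limit and derivative) the two limits agree.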

Uniqueness is easier: if $E_1(x)$ and $E_2(x)$ are two solutions compute the derivative of $\frac{E_1(x)}{E_2(x)}$. (Well, first show that solutions are always positive, so we can take this quotient.)
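
Spelling that computation out, with $E_1' = E_1$ and $E_2' = E_2$: $$\frac{d}{dx}\,\frac{E_1(x)}{E_2(x)}=\frac{E_1'(x)E_2(x)-E_1(x)E_2'(x)}{E_2(x)^2}=\frac{E_1(x)E_2(x)-E_1(x)E_2(x)}{E_2(x)^2}=0,$$ so the quotient is constant, and evaluating at $x=0$ shows the constant is $E_1(0)/E_2(0)$.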

Once you have this everything is very easy: $E(x) E(y)$ and $E(x + y)$, as functions of $x$ with $y$ fixed, are both solutions to the differential equation $f'(x) = f(x)$ with initial condition $f(0) = E(y)$. And then we're done by uniqueness.
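
Explicitly, with $y$ fixed, $E'=E$ and the chain rule give $$\frac{d}{dx}\bigl(E(x)E(y)\bigr)=E(x)E(y),\qquad\frac{d}{dx}\,E(x+y)=E(x+y),$$ and both functions equal $E(y)$ at $x=0$, so uniqueness forces them to coincide.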

Qiaochu Yuan
  • 468,795
  • Thanks, that's a clever detour. I wasn't aware of the source of the limit definition (Euler's method), thank you for mentioning that. – K. 622 Nov 22 '18 at 22:50
  • But proving the uniqueness requires heavy, lengthy theorems, not simple ones. – lazare Jul 04 '24 at 21:33
  • 1
    @lazare: in this special case there's a very easy argument, as I indicated above: compute the derivative of $\frac{E_1(x)}{E_2(x)}$ where $E_1, E_2$ are two solutions. – Qiaochu Yuan Jul 04 '24 at 21:43
  • So, for proving existence, the product limit formula that we already get from Euler's method is not enough? – lazare Jul 04 '24 at 22:36
  • By saying $E_1$ and $E_2$ do you mean, for example, $E_1=5E(x)$ and $E_2=35E(x)$? – lazare Jul 04 '24 at 22:39
  • 1
    @lazare: are you asking a question about existence or uniqueness? Your first question was about uniqueness so that's what I addressed. And yes, for example, those are two solutions. – Qiaochu Yuan Jul 04 '24 at 22:50
  • What I understand is that you say that, for your proof, we first need to prove the existence and uniqueness of $E(x)$. I don't understand why at the point I raised, but I will try to understand your proofs of both existence and uniqueness before going to the next step. – lazare Jul 04 '24 at 22:53
  • 1
    @lazare: existence and uniqueness are two different claims so let's discuss them separately. Existence can be proven either by showing that $\lim_{n \to \infty} \left( 1 + \frac{x}{n} \right)^n$ is a solution to the differential equation or that $\sum_{n=0}^{\infty} \frac{x^n}{n!}$ is a solution; either way will work fine. Uniqueness can be proven, as I have already said, by calculating the derivative $\frac{d}{dx} \frac{E_1(x)}{E_2(x)}$ where $E_1$ and $E_2$ are two nonzero solutions. You'll find it's zero, so the ratio is a constant, meaning $E_1(x) = C E_2(x)$ for some constant $C$. – Qiaochu Yuan Jul 04 '24 at 23:01
  • 1
    Neither of these arguments requires the Picard-Lindelof theorem, if that's what you're worried about. – Qiaochu Yuan Jul 04 '24 at 23:02
  • Thank you for your answers. Yes, I'm really scared of the Picard-Lindelof theorem. I will try the calculation for your uniqueness proof and then let you know if I got it right. – lazare Jul 04 '24 at 23:39
  • So I'm OK with $\frac{d}{dx} \frac{E_1(x)}{E_2(x)}=0$, but then, to prove that a function whose derivative is $0$ is constant, you need the mean value theorem or at least some kind of painful math; maybe you even need the Picard-Lindelof theorem. – lazare Jul 05 '24 at 00:11
  • If it's OK for you to accept that $f(x)=C$ when $f'(x)=0$ without proving it, then the solution in this video for showing that $E(a+b)=E(a)E(b)$ is probably the best: https://www.youtube.com/watch?v=TgrT2nTl6IM&lc=UgxJnSeG__4YyCKis254AaABAg.A4NV2BMcXhRA4O5SFI1WRe – lazare Jul 05 '24 at 00:22
1

In the excellent book

Analysis 1. 6th corrected edition (German). Springer-Lehrbuch. Berlin: Springer. xiv, 398 pp. (2001)

on page 78 (Exercise 14.) you find the following (approximately translated):

The exponential function as the limit of $\left(1+\frac{x}{n}\right)^n$. Show that $E(x)=\lim_{n\to\infty} E_n(x)$, where $E_n(x)=\left(1+\frac{x}{n}\right)^n$, exists and that this limit equals $e^x$.

Hint: Following Example 4.7 [existence of the limit, proved by application of the AGM inequality; AGM inequality = inequality between the arithmetic and geometric means], the sequence is (eventually) monotonically increasing, and $\left(1+\frac{p}{n}\right)\leq\left(1+\frac{1}{n}\right)^p$ implies $E_n(p)\leq e^p$, i.e., $E_n(x)\leq E_n(y)\leq e^p$ for $-n\leq x\leq y\leq p$. Using the AGM inequality implies $$ \left(1+\frac{x}{n}\right)^n \left(1+\frac{y}{n}\right)^n\leq \left(1+\frac{x+y}{2n}\right)^{2n}$$ and

$$\left(1+\frac{x+y}{n-1}\right)^{n-1} \left(1+\frac{xy}{n}\right)\leq \left(1+\frac{x}{n}+\frac{y}{n}+\frac{xy}{n^2}\right)^n=\left(1+\frac{x}{n}\right)^n\left(1+\frac{y}{n}\right)^n,$$ implying $E(x)E(y)\leq E(x+y)\leq E(x)E(y)$, i.e. $E(x)E(y)=E(x+y)$.
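
For completeness, here is how I read the two applications of the AGM inequality (valid once $n$ is large enough that every factor involved is nonnegative). For the first, apply it to $n$ copies of $1+\frac{x}{n}$ and $n$ copies of $1+\frac{y}{n}$: $$\left[\left(1+\frac{x}{n}\right)^n\left(1+\frac{y}{n}\right)^n\right]^{\frac{1}{2n}}\leq\frac{n\left(1+\frac{x}{n}\right)+n\left(1+\frac{y}{n}\right)}{2n}=1+\frac{x+y}{2n}.$$ For the second, apply it to $n-1$ copies of $1+\frac{x+y}{n-1}$ and one copy of $1+\frac{xy}{n}$: $$\left[\left(1+\frac{x+y}{n-1}\right)^{n-1}\left(1+\frac{xy}{n}\right)\right]^{\frac{1}{n}}\leq\frac{(n-1)\left(1+\frac{x+y}{n-1}\right)+1+\frac{xy}{n}}{n}=1+\frac{x+y}{n}+\frac{xy}{n^2}.$$ Letting $n\to\infty$ in the first display gives $E(x)E(y)\leq E(x+y)$, and in the second $E(x+y)\leq E(x)E(y)$.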