We need to prove that $\lim\limits_{x \to \infty} \ln((1 + \frac{1}{x})^x) = \lim\limits_{x \to \infty} x \cdot \ln(1 + \frac{1}{x}) = 1$ and would like to use Taylor series.
To do this, when we expand $\ln(1 + \frac{1}{x})$, we should get a series like $\frac{1}{x} + \frac{a_1}{x^2} + \frac{a_2}{x^3} + \dots$; multiplying by $x$ would turn it into $1 + \frac{a_1}{x} + \frac{a_2}{x^2} + \dots$, which tends to $1$ as $x \to \infty$.
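Just as a numerical sanity check (not a proof, and only using the standard library), the product does seem to approach $1$:

```python
import math

# Sanity check: x * ln(1 + 1/x) should approach 1 as x grows.
for x in [10, 100, 1_000, 1_000_000]:
    print(x, x * math.log(1 + 1 / x))
# 10        0.953...
# 100       0.995...
# 1000      0.9995...
# 1000000   0.9999995...
```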
The Taylor expansion of $f(x) = \ln(1 + \frac{1}{x})$ at some $a$ is $\sum_{n = 0}^\infty \frac{f^{(n)}(a)}{n!}(x-a)^n = f(a) + f'(a)(x-a) + \frac{f''(a)}{2}(x-a)^2 + \frac{f'''(a)}{6}(x-a)^3 + ...$.
$f'(a) = -\frac{1}{a^2 + a}$, $f'(a) \cdot (x-a) = - \frac{x-a}{a^2 + a}$
$f''(a) = \frac{2a + 1}{(a^2 + a)^2}$, $\frac{f''(a)}{2} \cdot (x-a)^2 = \frac{(2a + 1)(x-a)^2}{2(a^2 + a)^2}$
So the Taylor series begins like this:
$\ln(1 + \frac{1}{x}) = \ln(1 + \frac{1}{a}) - \frac{x-a}{a^2 + a} + \frac{(2a + 1)(x-a)^2}{2(a^2 + a)^2} - \dots$
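To convince myself these first terms are right, here is a quick numerical check with a concrete value $a = 10$ (just a sketch, assuming I computed the derivatives correctly):

```python
import math

# Compare ln(1 + 1/x) near x = a with the three-term Taylor polynomial around a.
a = 10.0
x = 10.5

exact = math.log(1 + 1 / x)
taylor = (math.log(1 + 1 / a)
          - (x - a) / (a**2 + a)
          + (2 * a + 1) * (x - a)**2 / (2 * (a**2 + a)**2))

print(exact)   # ~0.09097
print(taylor)  # ~0.09098 -- the difference is of order (x - a)**3
```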
Since the first term of this expansion doesn't involve $x$ at all, but we want the leading term to be $\frac{1}{x}$, we would like the constant term to be $0$ and the next term to be $\frac{1}{x}$. To make the first term $0$, $a$ would need to go to infinity.
$\lim\limits_{a \to \infty} \left( \ln(1 + \frac{1}{a}) - \frac{x-a}{a^2 + a} + \frac{(2a + 1)(x-a)^2}{2(a^2 + a)^2} - \dots \right) = \lim\limits_{a \to \infty} \left(- \frac{x-a}{a^2 + a} + \frac{(2a + 1)(x-a)^2}{2(a^2 + a)^2} - \dots \right)$.
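In fact, for fixed $x$ every displayed term seems to vanish as $a \to \infty$ (a quick symbolic check, assuming sympy is available):

```python
import sympy as sp

# For fixed x, each term of the expansion around a vanishes as a -> oo.
x, a = sp.symbols('x a', positive=True)

t0 = sp.log(1 + 1 / a)
t1 = -(x - a) / (a**2 + a)
t2 = (2 * a + 1) * (x - a)**2 / (2 * (a**2 + a)**2)

print([sp.limit(t, a, sp.oo) for t in (t0, t1, t2)])  # [0, 0, 0]
```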
Now, this isn't in the form $\frac{1}{x} + \frac{a_1}{x^2} + \frac{a_2}{x^3} + ...$ that we wanted.
Do we need to pick some other $a$?
I found https://math.stackexchange.com/a/1071689/1095885, which says:
When x is very large, using Taylor
$\log\left(1+\frac{1}{x}\right)=\frac{1}{x}-\frac{1}{2 x^2}+\frac{1}{3 x^3}+O\left(\left(\frac{1}{x}\right)^4\right)$
The series $\frac{1}{x}-\frac{1}{2 x^2}+\frac{1}{3 x^3} + \dots$, which would be perfect, seems to be a Laurent series rather than a Taylor series, though.
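For what it's worth, the quoted expansion matches what I get by substituting $u = \frac{1}{x}$ and expanding $\ln(1 + u)$ around $u = 0$ (a quick check, again assuming sympy is available):

```python
import sympy as sp

# Maclaurin series of ln(1 + u), then substitute u = 1/x.
u, x = sp.symbols('u x', positive=True)

series_u = sp.series(sp.log(1 + u), u, 0, 4)
print(series_u)                           # u - u**2/2 + u**3/3 + O(u**4)
print(series_u.removeO().subs(u, 1 / x))  # the 1/x, -1/(2*x**2), 1/(3*x**3) terms
```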
Edit: When substituting $u = \frac{1}{x}$ and taking the Taylor series of $\ln(1 + u)$, why don't we have to use the chain rule?