Most of the arguments given in other answers contain a curious fallacy. In the binomial expansion of $(1+1/n)^{n}$ both the number of terms and each individual term depend on $n$, so taking limits term by term is not justified. A proper proof requires more analysis; I have presented such a proof in detail in my blog post.
Update: Upon the OP's request (see comments below) I have provided the full proof here. It's a variation on the proof given in the blog post. But first, a few remarks on the fallacy in the other answers are in order. Taking limits term by term is valid in two general settings:
1) When the number of terms is finite and independent of the limit variable $n$.
2) When the number of terms is infinite (the case of an infinite series), provided the convergence is uniform.
The current example is an expression of the type $$f(n) = \sum_{k = 0}^{n}g(n, k)$$ where each term $g(n, k)$ depends on $n$ and the total number of terms also depends on $n$. Here is a counterexample. Let $g(n, k) = 1/n$ for all $k, n$, and consider $$f(n) = \sum_{k = 1}^{n}g(n, k) = \sum_{k = 1}^{n}\frac{1}{n} = 1$$ If we take limits term by term we get $\lim_{n \to \infty}g(n, k) = 0$, so the term-by-term limit gives the infinite series $$0 + 0 + 0 + \cdots = 0$$ But $f(n) = 1$ for every $n$, so the actual limit is $1$.
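Not part of the original argument, but a quick numerical sketch makes the counterexample vivid: the sum stays at $1$ for every $n$, while every individual term shrinks to $0$.

```python
def f(n):
    # f(n) = sum of n copies of 1/n, which equals 1 for every n
    return sum(1.0 / n for _ in range(1, n + 1))

def g(n, k):
    # each individual term g(n, k) = 1/n tends to 0 as n grows
    return 1.0 / n

for n in (10, 1000, 100000):
    print(n, f(n), g(n, 5))  # f(n) stays near 1 while g(n, 5) shrinks
```

The term-by-term limit ($0 + 0 + \cdots = 0$) and the limit of the sum ($1$) disagree, which is exactly the fallacy described above.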
Now I come to the proof of the relation $$\lim_{n \to \infty}\left(1 + \frac{1}{n}\right)^{n} = 1 + 1 + \frac{1}{2!} + \frac{1}{3!} + \cdots\tag{1}$$ Let us put $$f(n) = \left(1 + \frac{1}{n}\right)^{n}, g(n) = \left(1 - \frac{1}{n}\right)^{-n}, E(n) = \sum_{k = 0}^{n}\frac{1}{k!}$$ Note that the series $\sum (1/k!)$ converges, so $\lim_{n \to \infty}E(n)$ exists. For $n > 1$ I show that $$f(n) \leq E(n) \leq g(n)\tag{2}$$ The first inequality $f(n)\leq E(n)$ is clear from the binomial expansion of $f(n)$: the $k$th term is $$\binom{n}{k}\frac{1}{n^{k}} = \frac{1}{k!}\prod_{j = 0}^{k - 1}\left(1 - \frac{j}{n}\right) \leq \frac{1}{k!}$$ which is at most the corresponding term of $E(n)$. Since $n > 1$ we have $1/n < 1$, so we may apply the binomial theorem for general exponents to get $$g(n) = 1 + n\cdot\frac{1}{n} + \frac{n(n + 1)}{2!}\cdot\frac{1}{n^{2}} + \cdots$$ an infinite series with positive terms. Here the $k$th term is $$\frac{n(n + 1)\cdots(n + k - 1)}{k!\,n^{k}} = \frac{1}{k!}\prod_{j = 0}^{k - 1}\left(1 + \frac{j}{n}\right) \geq \frac{1}{k!}$$ so each term of this series is greater than or equal to the corresponding term of $E(n)$; moreover $g(n)$ has infinitely many terms while $E(n)$ has only finitely many. Thus $E(n) \leq g(n)$ for all $n > 1$, and the inequalities in $(2)$ are established.
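As a sanity check (not needed for the proof), the sandwich $(2)$ can be verified numerically for small $n$:

```python
from math import factorial

def f(n):
    return (1 + 1 / n) ** n

def g(n):
    return (1 - 1 / n) ** (-n)

def E(n):
    # partial sum of the exponential series, E(n) = sum_{k=0}^{n} 1/k!
    return sum(1 / factorial(k) for k in range(n + 1))

for n in (2, 10, 100, 1000):
    assert f(n) <= E(n) <= g(n)  # the inequality (2) for n > 1
    print(n, f(n), E(n), g(n))
```

All three quantities visibly close in on the same value as $n$ grows, which is what the squeeze argument below exploits.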
Next, we know that $\lim_{n \to \infty}f(n) = e$ (I won't discuss the existence of this limit as it is a given assumption in the question, though its existence does require proof). I now show that $$\lim_{n \to \infty}\frac{f(n)}{g(n)} = 1\tag{3}$$ so that $\lim_{n \to \infty}g(n)$ also exists and equals $e$. Once this is done we can apply the squeeze theorem to $(2)$ and obtain $\lim_{n \to \infty}E(n) = e$, which is precisely equation $(1)$. It thus remains only to establish $(3)$.
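For the curious, the conclusion $\lim E(n) = e$ is also easy to observe numerically; the partial sums of the exponential series converge very quickly (the tail after the $n$th term is smaller than $\frac{1}{n\cdot n!}$):

```python
from math import e, factorial

def E(n):
    # partial sum of the exponential series, E(n) = sum_{k=0}^{n} 1/k!
    return sum(1 / factorial(k) for k in range(n + 1))

for n in (5, 10, 15):
    print(n, E(n), abs(E(n) - e))  # error shrinks factorially fast
```

Already at $n = 15$ the partial sum agrees with $e$ to roughly machine precision.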
Clearly we have $$\frac{f(n)}{g(n)} = \left(1 - \frac{1}{n^{2}}\right)^{n} = c(n)\text{ (say)}$$ and since $$0 < 1 - \frac{1}{n^{2}} < 1$$ it follows that $$1 - n\cdot\frac{1}{n^{2}} \leq c(n) \leq 1 - \frac{1}{n^{2}}$$ (the first inequality follows from Bernoulli's inequality $(1 + x)^{n} \geq 1 + nx$ for $n \geq 1$ and $x > -1$, applied with $x = -1/n^{2}$; the second holds because a number in $(0, 1)$ decreases when raised to a power $n \geq 1$). Applying the squeeze theorem to these inequalities we get $\lim_{n \to \infty}f(n)/g(n) = \lim_{n \to \infty}c(n) = 1$, and thus equation $(3)$ is verified. This completes the proof of $(1)$.
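A final numerical sketch (again, just an illustration, not part of the proof) confirms that $c(n)$ is squeezed between $1 - 1/n$ and $1 - 1/n^{2}$ and tends to $1$:

```python
def c(n):
    # c(n) = f(n) / g(n) = (1 - 1/n^2)^n
    return (1 - 1 / n**2) ** n

for n in (10, 100, 10000):
    lower = 1 - 1 / n        # Bernoulli lower bound: 1 - n * (1/n^2)
    upper = 1 - 1 / n**2     # upper bound since the base lies in (0, 1)
    assert lower <= c(n) <= upper
    print(n, lower, c(n), upper)
```

Both bounds tend to $1$, so the squeeze theorem forces $c(n) \to 1$, exactly as claimed in $(3)$.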