How can one handle $$a_n := e^{- (1+\varepsilon)n} \sum_{k=0}^{n-1} \frac{(1+\varepsilon)^k n^k}{k!} $$ analytically, where $\varepsilon > 0$?

What we obviously know is: $a_n \leq 1$.

If we take $J_n \sim \gamma (n, n)$, where $\gamma(n,n)$ denotes the gamma distribution with shape $n$ and rate $n$, then $a_n = \Bbb P( J_n > 1 + \varepsilon)$. I put the $\varepsilon$ there because my intuition says that $J_n \to 1$. Indeed:

$$J_n \sim \sum_{k=1}^{n} X_k \sim \frac 1 n \sum_{k=1}^n Y_k \to 1$$

weakly as $n\to\infty$ by the law of large numbers, where the $X_k \sim \text{Exp}(n)$ and the $Y_k \sim \text{Exp}(1)$ are independent. So, since $\delta_1 (\partial(1+\varepsilon,\infty))=0$, the portmanteau theorem gives $a_n \to 0$.
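
A quick numerical sanity check of this representation (a sketch only, assuming SciPy is available):

```python
from scipy.stats import gamma, poisson

eps = 0.5  # any fixed epsilon > 0
for n in [10, 50, 200, 1000]:
    # a_n written as a Poisson(n*(1+eps)) cumulative probability over k = 0, ..., n-1
    a_n = poisson.cdf(n - 1, n * (1 + eps))
    # P(J_n > 1 + eps) for J_n ~ Gamma(shape=n, rate=n), i.e. scale = 1/n
    tail = gamma.sf(1 + eps, a=n, scale=1.0 / n)
    print(n, a_n, tail)  # the two columns agree and tend to 0
```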

But what I want is an analytical approach to this, ideally with a bound for $a_n$, e.g. $a_n \leq e^{-\phi (n)}$.

Falrach

2 Answers


We can use the same probabilistic approach as in this earlier post. If $X$ is a $\text{Pois}(\lambda)$ random variable with $\lambda = n(1+\varepsilon)$, we can write $$ P(X<n)=\sum_{k=0}^{n-1} \frac{(1+\varepsilon)^k n^ke^{- (1+\varepsilon)n}}{k!}. $$ This equals $$ P\left(\frac{X-n(1+\varepsilon)}{\sqrt{n(1+\varepsilon)}}<\frac{-n\varepsilon}{\sqrt{n(1+\varepsilon)}}\right). $$ By the central limit theorem, $\frac{X-n(1+\varepsilon)}{\sqrt{n(1+\varepsilon)}}\to \mathcal{N}(0,1)$ in distribution. If $c$ is any fixed constant, then $\frac{-n\varepsilon}{\sqrt{n(1+\varepsilon)}}<c$ for all sufficiently large $n$, so $$ \limsup_{n\to\infty}P\left(\frac{X-n(1+\varepsilon)}{\sqrt{n(1+\varepsilon)}}<\frac{-n\varepsilon}{\sqrt{n(1+\varepsilon)}}\right)\le\lim_{n\to\infty}P\left(\frac{X-n(1+\varepsilon)}{\sqrt{n(1+\varepsilon)}}<c\right)=P(\mathcal{N}(0,1)<c). $$ Letting $c\to -\infty$ gives $$ \lim_{n\to \infty}e^{- (1+\varepsilon)n}\sum_{k=0}^{n-1} \frac{(1+\varepsilon)^k n^k}{k!}=0. $$
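
As a rough numerical illustration of this argument (a sketch only, assuming SciPy; the normal value is an approximation, not a bound):

```python
import numpy as np
from scipy.stats import norm, poisson

eps = 0.5
for n in [10, 100, 1000]:
    exact = poisson.cdf(n - 1, n * (1 + eps))           # P(X < n) for X ~ Pois(n(1+eps))
    clt = norm.cdf(-n * eps / np.sqrt(n * (1 + eps)))   # normal approximation of the same event
    print(n, exact, clt)  # both tend to 0 as n grows
```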

EDIT: Note that $E[e^{tX}]=\exp(\lambda(e^t-1))$, where $\lambda=n(1+\varepsilon)$. By Markov's inequality applied to $e^{-tX}$, for $t>0$, $$\begin{eqnarray} P(X<n)=P(-tX>-nt)&\le &e^{nt}E[e^{-tX}]\\ &=&e^{nt}\exp(n(1+\varepsilon)(e^{-t}-1))\\ &=&\left[\exp(t +(1+\varepsilon)(e^{-t}-1))\right]^n. \end{eqnarray}$$ Note that $$ \left[t +(1+\varepsilon)(e^{-t}-1)\right]'\Big|_{t=0}=-\varepsilon<0. $$ This implies there exists $t_0>0$ such that $\exp(t_0 +(1+\varepsilon)(e^{-t_0}-1))=e^{-c}<1$. This gives $$ a_n\le e^{-cn} $$ for some $c>0$, and hence $a_n\to 0$.
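
For an explicit constant (a standard Chernoff-type optimization, not spelled out above): the derivative $1-(1+\varepsilon)e^{-t}$ vanishes at $t_0=\ln(1+\varepsilon)$, where the exponent equals $\ln(1+\varepsilon)-\varepsilon<0$, so one may take $$ c=\varepsilon-\ln(1+\varepsilon)>0, \qquad a_n\le e^{-n\left(\varepsilon-\ln(1+\varepsilon)\right)}. $$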

Myunghyun Song
  • This is a good answer, but I am interested in an analytical approach. Still, your representation of $a_n$ suggests that a bound exists, which is what I am interested in. – Falrach Jan 16 '19 at 10:58
  • 1
    @Falrach Probabilistic method can give an exponential bound of the form $a_n\le e^{-cn}$ for some $c>0$. – Myunghyun Song Jan 16 '19 at 14:55

Let $$b_n=e^{-(1+\varepsilon)n}\sum_{k=n}^{\lfloor(1+\varepsilon)n\rfloor}{\frac{(1+\varepsilon)^kn^k}{k!}}.$$

Then $a_n+b_n \leq 1$, because the full series $\sum_{k \ge 0}\frac{(1+\varepsilon)^kn^k}{k!}$ equals $e^{(1+\varepsilon)n}$. Moreover, the terms defining $b_n$ are nondecreasing in $k$ on this range, and there are at least $\varepsilon n$ of them, so $$b_n \geq n\varepsilon\, e^{-(1+\varepsilon)n}\frac{(1+\varepsilon)^nn^n}{n!}.$$

Now, note that for any $k < n$, $$\frac{(1+\varepsilon)^kn^k}{k!} \leq (1+\varepsilon)^{-1}\frac{(1+\varepsilon)^{k+1}n^{k+1}}{(k+1)!},$$ since the ratio of consecutive terms is $\frac{(1+\varepsilon)n}{k+1}\ge 1+\varepsilon$ when $k+1 \le n$.

Iterating this, then bounding the geometric series by $\sum_{k=1}^{\infty}(1+\varepsilon)^{-k}=\varepsilon^{-1}$ and using the lower bound on $b_n$ together with $b_n \le 1$, we get $$a_n < e^{-(1+\varepsilon)n}\frac{(1+\varepsilon)^nn^n}{n!}\sum_{k=1}^n{(1+\varepsilon)^{-k}} \leq \varepsilon^{-2}n^{-1}b_n \leq \frac{1}{n\varepsilon^2}.$$
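
A small numerical check of this bound (a sketch only, assuming SciPy is available):

```python
from scipy.stats import poisson

eps = 0.5
for n in [10, 100, 1000, 10000]:
    a_n = poisson.cdf(n - 1, n * (1 + eps))  # exact value of a_n
    bound = 1.0 / (n * eps ** 2)             # the 1/(n*eps^2) bound derived above
    print(n, a_n, bound, a_n <= bound)       # the bound holds; a_n actually decays much faster
```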

Aphelli