
I'm supposed to calculate:

$$\lim_{n\to\infty} e^{-n} \sum_{k=0}^{n} \frac{n^k}{k!}$$

By using WolframAlpha, I might guess that the limit is $\frac{1}{2}$, which is a pretty interesting and nice result. I wonder in which ways we may approach it.
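Before proving anything, the limit is easy to probe numerically. Here is a small Python sketch (the helper name `poisson_partial` is mine, not from the thread) that builds the sum term by term, so that neither $n^k$ nor $k!$ overflows a float:

```python
import math

def poisson_partial(n):
    """Evaluate e^(-n) * sum_{k=0}^{n} n^k / k! without overflow.

    Each term is built from the previous one: the k-th term equals
    the (k-1)-th term times n/k, starting from e^(-n) at k = 0.
    """
    term = math.exp(-n)  # k = 0 term
    total = term
    for k in range(1, n + 1):
        term *= n / k
        total += term
    return total

for n in (10, 100, 500):
    print(n, poisson_partial(n))
```

The printed values drift down toward $\tfrac12$ as $n$ grows, consistent with the guess from WolframAlpha. (We stop at $n=500$ because $e^{-n}$ underflows double precision near $n\approx 745$.)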

Mark Viola
user 1591719

9 Answers


The probabilistic way:

This is $P[N_n\leqslant n]$ where $N_n$ is a random variable with Poisson distribution of parameter $n$. Hence each $N_n$ is distributed like $X_1+\cdots+X_n$ where the random variables $(X_k)$ are independent and identically distributed with Poisson distribution of parameter $1$.

By the central limit theorem, $Y_n=\frac1{\sqrt{n}}(X_1+\cdots+X_n-n)$ converges in distribution to a standard normal random variable $Z$, in particular, $P[Y_n\leqslant 0]\to P[Z\leqslant0]$.

Finally, $P[Z\leqslant0]=\frac12$ and $[N_n\leqslant n]=[Y_n\leqslant 0]$ hence $P[N_n\leqslant n]\to\frac12$, QED.
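The probabilistic argument can be illustrated with a quick Monte Carlo check (my own sketch, not part of the original answer): sample $X_1+\cdots+X_n$ with $X_i\sim\text{Poisson}(1)$ using Knuth's multiplication method and estimate $P[N_n\leqslant n]$.

```python
import math
import random

def poisson1_sample(rng):
    """Draw one Poisson(1) variate via Knuth's multiplication method."""
    threshold = math.exp(-1.0)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def estimate_prob(n, trials, seed=0):
    """Monte Carlo estimate of P[X_1 + ... + X_n <= n], X_i i.i.d. Poisson(1)."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    hits = sum(
        1
        for _ in range(trials)
        if sum(poisson1_sample(rng) for _ in range(n)) <= n
    )
    return hits / trials

est = estimate_prob(n=200, trials=10_000)
print(est)  # sits slightly above 1/2, approaching it as n grows
```

The estimate lands close to $\tfrac12$, as the central limit theorem predicts (slightly above it, since the convergence is from above for this one-sided event).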


The analytical way, completing your attempt:

Hence, I know that what I need to do is to find $\lim\limits_{n\to\infty}I_n$, where $$ I_n=\frac{e^{-n}}{n!}\int_{0}^n (n-t)^ne^tdt.$$

To begin with, let $u(t)=(1-t)e^t$, then $I_n=\dfrac{e^{-n}n^n}{n!}nJ_n$ with $$ J_n=\int_{0}^1 u(t)^n\mathrm dt. $$

Now, $u(t)\leqslant\mathrm e^{-t^2/2}$ hence $$ J_n\leqslant\int_0^1\mathrm e^{-nt^2/2}\mathrm dt\leqslant\int_0^\infty\mathrm e^{-nt^2/2}\mathrm dt=\sqrt{\frac{\pi}{2n}}. $$

Likewise, the function $t\mapsto u(t)\mathrm e^{t^2/2}$ is decreasing on $t\geqslant0$ hence $u(t)\geqslant c_n\mathrm e^{-t^2/2}$ on $t\leqslant1/n^{1/4}$, with $c_n=u(1/n^{1/4})\mathrm e^{-1/(2\sqrt{n})}$, hence $$ J_n\geqslant c_n\int_0^{1/n^{1/4}}\mathrm e^{-nt^2/2}\mathrm dt=\frac{c_n}{\sqrt{n}}\int_0^{n^{1/4}}\mathrm e^{-t^2/2}\mathrm dt=\frac{c_n}{\sqrt{n}}\sqrt{\frac{\pi}{2}}(1+o(1)). $$

Since $c_n\to1$, all this proves that $\sqrt{n}J_n\to\sqrt{\frac\pi2}$. Stirling's formula shows that the prefactor $\frac{e^{-n}n^n}{n!}$ is equivalent to $\frac1{\sqrt{2\pi n}}$. Regrouping everything, one sees that $I_n\sim\frac1{\sqrt{2\pi n}}n\sqrt{\frac\pi{2n}}=\frac12$.
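As a numerical sanity check on the claim $\sqrt{n}\,J_n\to\sqrt{\pi/2}$, one can estimate $J_n=\int_0^1\big((1-t)e^t\big)^n\,dt$ with a simple midpoint rule (a rough sketch of mine; the truncation of the integration range is justified by the bound $u(t)^n\le e^{-nt^2/2}$ from the argument above):

```python
import math

def J_scaled(n, steps=100_000):
    """Midpoint-rule estimate of sqrt(n) * J_n, J_n = integral_0^1 ((1-t) e^t)^n dt.

    Since ((1-t) e^t)^n <= exp(-n t^2 / 2), the integrand is negligible
    beyond t = 20/sqrt(n); truncate there and evaluate via exp/log to
    avoid raising a tiny base to a huge power.
    """
    b = min(1.0, 20.0 / math.sqrt(n))
    h = b / steps
    total = 0.0
    for i in range(steps):
        t = (i + 0.5) * h
        total += math.exp(n * (math.log1p(-t) + t))
    return math.sqrt(n) * total * h

val = J_scaled(10**6)
print(val, math.sqrt(math.pi / 2))
```

For $n=10^6$ the two printed numbers agree to about three decimal places; the residual gap is the $O(1/\sqrt n)$ correction visible in the sharper expansions below on this page.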

Moral: The probabilistic way is shorter, easier, more illuminating, and more fun.

Caveat: My advice in these matters is, clearly, horribly biased.

Did
    One could simply say that for natural $n$ the median for the Poisson variable is $n$ (which, I believe, is nontrivial), and thus $P[N_n – user35953 May 18 '20 at 18:48
    @Did, you are the reason we don't have Nobel prize in mathematics: they are too envious) – Alex Jul 21 '20 at 10:25

Edited. I justified the application of the dominated convergence theorem.

By a simple calculation,

$$ \begin{align*} e^{-n}\sum_{k=0}^{n} \frac{n^k}{k!} &= \frac{e^{-n}}{n!} \sum_{k=0}^{n}\binom{n}{k} n^k (n-k)! \\ (1) \cdots \quad &= \frac{e^{-n}}{n!} \sum_{k=0}^{n}\binom{n}{k} n^k \int_{0}^{\infty} t^{n-k}e^{-t} \, dt\\ &= \frac{e^{-n}}{n!} \int_{0}^{\infty} (n+t)^{n}e^{-t} \, dt \\ (2) \cdots \quad &= \frac{1}{n!} \int_{n}^{\infty} t^{n}e^{-t} \, dt \\ &= 1 - \frac{1}{n!} \int_{0}^{n} t^{n}e^{-t} \, dt \\ (3) \cdots \quad &= 1 - \frac{\sqrt{n} (n/e)^n}{n!} \int_{0}^{\sqrt{n}} \left(1 - \frac{u}{\sqrt{n}} \right)^{n}e^{\sqrt{n}u} \, du. \end{align*}$$

We remark that

  1. In $\text{(1)}$, we utilized the famous formula $ n! = \int_{0}^{\infty} t^n e^{-t} \, dt$.
  2. In $\text{(2)}$, the substitution $t + n \mapsto t$ is used.
  3. In $\text{(3)}$, the substitution $t = n - \sqrt{n}u$ is used.

Then, in view of Stirling's formula, it suffices to show that

$$\int_{0}^{\sqrt{n}} \left(1 - \frac{u}{\sqrt{n}} \right)^{n}e^{\sqrt{n}u} \, du \xrightarrow{n\to\infty} \sqrt{\frac{\pi}{2}}.$$

The idea is to introduce the function

$$ g_n (u) = \left(1 - \frac{u}{\sqrt{n}} \right)^{n}e^{\sqrt{n}u} \mathbf{1}_{(0, \sqrt{n})}(u) $$

and take the pointwise limit of the integrand as $n \to \infty$. Interchanging the limit and the integral is justified once we find a dominating function for the sequence $(g_n)$. But notice that if $0 < u < \sqrt{n}$, then

$$ \log g_n (u) = n \log \left(1 - \frac{u}{\sqrt{n}} \right) + \sqrt{n} u = -\frac{u^2}{2} - \frac{u^3}{3\sqrt{n}} - \frac{u^4}{4n} - \cdots \leq -\frac{u^2}{2}. $$

From this we have $g_n (u) \leq e^{-u^2 /2}$ for all $n$, and $g_n (u) \to e^{-u^2 / 2}$ pointwise as $n \to \infty$. Therefore, by the dominated convergence theorem and the Gaussian integral,

$$ \int_{0}^{\sqrt{n}} \left(1 - \frac{u}{\sqrt{n}} \right)^{n}e^{\sqrt{n}u} \, du = \int_{0}^{\infty} g_n (u) \, du \xrightarrow{n\to\infty} \int_{0}^{\infty} e^{-u^2/2} \, du = \sqrt{\frac{\pi}{2}}. $$
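This dominated-convergence step can be checked numerically (my own sketch): evaluate the integral by a midpoint rule, truncating at $u=20$ since $g_n(u)\le e^{-u^2/2}$, and verify the dominating bound along the way.

```python
import math

def g_integral(n, steps=100_000):
    """Midpoint-rule estimate of integral_0^{sqrt(n)} (1 - u/sqrt(n))^n e^{sqrt(n) u} du.

    Since g_n(u) <= exp(-u^2 / 2), the mass beyond u = 20 is negligible,
    so the domain is truncated there.
    """
    rn = math.sqrt(n)
    b = min(rn, 20.0)
    h = b / steps
    total = 0.0
    for i in range(steps):
        u = (i + 0.5) * h
        # log g_n(u) = n log(1 - u/sqrt(n)) + sqrt(n) u  <=  -u^2/2
        log_g = n * math.log1p(-u / rn) + rn * u
        assert log_g <= -u * u / 2 + 1e-9  # the dominating bound from the text
        total += math.exp(log_g)
    return total * h

val = g_integral(10**6)
print(val, math.sqrt(math.pi / 2))
```

For $n=10^6$ the estimate agrees with $\sqrt{\pi/2}$ to about three decimals, and the in-loop assertion confirms $g_n(u)\le e^{-u^2/2}$ at every quadrature node.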

Sangchul Lee
    Your second equation is closely related to [this question](http://math.stackexchange.com/questions/71447/large-n-asymptotic-of-int-0-infty-left-1-x-n-rightn-1-exp-x) (which I answered). – robjohn Jun 19 '12 at 15:47

Integration by parts yields $$ \frac{1}{k!}\int_x^\infty e^{-t}\,t^k\,\mathrm{d}t=\frac{1}{k!}x^ke^{-x}+\frac{1}{(k-1)!}\int_x^\infty e^{-t}\,t^{k-1}\,\mathrm{d}t\tag{1} $$ Iterating $(1)$ gives $$ \frac{1}{n!}\int_x^\infty e^{-t}\,t^n\,\mathrm{d}t=e^{-x}\sum_{k=0}^n\frac{x^k}{k!}\tag{2} $$ Thus, we get $$ e^{-n}\sum_{k=0}^n\frac{n^k}{k!}=\frac{1}{n!}\int_n^\infty e^{-t}\,t^n\,\mathrm{d}t\tag{3} $$ Now, I will reproduce part of the argument I give here, which develops a full asymptotic expansion. Additionally, I include some error estimates that were previously missing. $$ \begin{align} \int_n^\infty e^{-t}\,t^n\,\mathrm{d}t &=n^{n+1}e^{-n}\int_0^\infty e^{-ns}\,(s+1)^n\,\mathrm{d}s\\ &=n^{n+1}e^{-n}\int_0^\infty e^{-n(s-\log(1+s))}\,\mathrm{d}s\\ &=n^{n+1}e^{-n}\int_0^\infty e^{-nu^2/2}\,s'\,\mathrm{d}u\tag{4} \end{align} $$ where $t=n(s+1)$ and $u^2/2=s-\log(1+s)$.

Note that $\frac{ss'}{1+s}=u$; thus, when $s\ge1$, $s'\le2u$. Moreover, $s=1$ corresponds to $u=\sqrt{2(1-\log 2)}>3/4$. This leads to the bound $$ \begin{align} \int_{s\ge1} e^{-nu^2/2}\,s'\,\mathrm{d}u &\le\int_{3/4}^\infty e^{-nu^2/2}\,2u\,\mathrm{d}u\\ &=\frac2ne^{-\frac{9n}{32}}\tag{5} \end{align} $$ Since $2u\ge1$ on this range, $(5)$ also shows that $$ \int_{s\ge1}e^{-nu^2/2}\,\mathrm{d}u\le\frac2ne^{-\frac{9n}{32}}\tag{6} $$

For $|s|<1$, we get $$ u^2/2=s-\log(1+s)=s^2/2-s^3/3+s^4/4-\dots\tag{7} $$ We can invert the series to get $s'=1+\frac23u+O(u^2)$. Therefore, $$ \begin{align} \int_0^\infty e^{-nu^2/2}\,s'\,\mathrm{d}u &=\int_{s\in[0,1]} e^{-nu^2/2}\,s'\,\mathrm{d}u+\color{red}{\int_{s>1} e^{-nu^2/2}\,s'\,\mathrm{d}u}\\ &=\int_0^\infty\left(1+\frac23u\right)e^{-nu^2/2}\,\mathrm{d}u-\color{darkorange}{\int_{s>1}\left(1+\frac23u\right)e^{-nu^2/2}\,\mathrm{d}u}\\ &+\int_0^\infty e^{-nu^2/2}\,O(u^2)\,\mathrm{d}u-\color{darkorange}{\int_{s>1} e^{-nu^2/2}\,O(u^2)\,\mathrm{d}u}\\ &+\color{red}{\int_{s>1} e^{-nu^2/2}\,s'\,\mathrm{d}u}\\ &=\sqrt{\frac{\pi}{2n}}+\frac2{3n}+O\left(n^{-3/2}\right)\tag{8} \end{align} $$ The red and orange integrals decrease exponentially by $(5)$ and $(6)$.

Plugging $(8)$ into $(4)$ yields $$ \int_n^\infty e^{-t}\,t^n\,\mathrm{d}t=\left(\sqrt{\frac{\pi n}{2}}+\frac23\right)\,n^ne^{-n}+O(n^{n-1/2}e^{-n})\tag{9} $$ The argument above can be used to prove Stirling's approximation, which says that $$ n!=\sqrt{2\pi n}\,n^ne^{-n}+O(n^{n-1/2}e^{-n})\tag{10} $$ Combining $(9)$ and $(10)$ yields $$ \begin{align} e^{-n}\sum_{k=0}^n\frac{n^k}{k!} &=\frac{1}{n!}\int_n^\infty e^{-t}\,t^n\,\mathrm{d}t\\ &=\frac12+\frac{2/3}{\sqrt{2\pi n}}+O(n^{-1})\tag{11} \end{align} $$
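The refined asymptotic $(11)$ can be checked directly against the exact sum (a numerical sketch; the helper name `poisson_partial` is mine):

```python
import math

def poisson_partial(n):
    """e^(-n) * sum_{k=0}^{n} n^k / k!, built term by term to avoid overflow."""
    term = math.exp(-n)
    total = term
    for k in range(1, n + 1):
        term *= n / k
        total += term
    return total

for n in (50, 200, 500):
    exact = poisson_partial(n)
    approx = 0.5 + (2.0 / 3.0) / math.sqrt(2 * math.pi * n)
    print(n, exact, approx, exact - approx)
```

The residual in the last column shrinks much faster than $1/\sqrt{n}$, as the $O(n^{-1})$ error term in $(11)$ predicts.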

robjohn
    Would the downvoter care to comment ([he asked, expecting the answer "no"](http://www.montypython.net/scripts/cheese.php))? – robjohn May 12 '18 at 14:01
  • hey rob, sadly the link in your answer is dead :( – tired Jun 10 '18 at 14:17
  • Yeah too many downvoters here :/ I upvoted this. Very Nice answer. – mick Jun 29 '18 at 21:49
    The asymptotic approximation in $(11)$ can be improved by extending the approximations to get $$ \frac12+\frac1{\sqrt{2\pi n}}\left(\frac23-\frac{23}{270n}+\frac{23}{3024n^2}+\frac{259}{77760n^3}+O\!\left(\frac1{n^4}\right)\right) $$ and the big-$O$ term is close to $-\frac1{900n^4}$. – robjohn Mar 17 '20 at 18:42

\begin{align}
&\color{#00f}{\lim_{n \to \infty}\left[e^{-n}\sum_{k = 0}^{n}\frac{n^{k}}{k!}\right]}
\\[3mm]&=\lim_{n \to \infty}\left[e^{-n}\sum_{k = 0}^{n}\exp\left(k\ln n - \ln k!\right)\right]
\\[3mm]&=\lim_{n \to \infty}\left\{e^{-n}\sum_{k = 0}^{n}\exp\left(n\ln n - \ln n! - \frac{(k - n)^{2}}{2n}\right)\right\}
\\[3mm]&=\lim_{n \to \infty}\left\{\frac{e^{-n}n^{n}}{n!}\sum_{k = 0}^{n}\exp\left(-\frac{(k - n)^{2}}{2n}\right)\right\}
\\[3mm]&=\lim_{n \to \infty}\left[\frac{e^{-n}n^{n}}{n!}\int_{0}^{n}\exp\left(-\frac{(k - n)^{2}}{2n}\right)\,\mathrm{d}k\right]
\\[3mm]&=\lim_{n \to \infty}\left[\frac{e^{-n}n^{n}}{n!}\int_{-n}^{0}\exp\left(-\frac{k^{2}}{2n}\right)\,\mathrm{d}k\right]
=\lim_{n \to \infty}\left[\frac{e^{-n}n^{n}}{n!}\,\sqrt{2n}\int_{-\sqrt{n/2}}^{0}e^{-k^{2}}\,\mathrm{d}k\right]
\\[3mm]&=\lim_{n \to \infty}\left[\frac{\sqrt{2}\,n^{n + 1/2}e^{-n}}{n!}\int_{-\infty}^{0}e^{-k^{2}}\,\mathrm{d}k\right]
=\lim_{n \to \infty}\left[\frac{\sqrt{2}\,n^{n + 1/2}e^{-n}}{n!}\,\frac{\sqrt{\pi}}{2}\right]
\\[3mm]&=\frac{1}{2}\,\lim_{n \to \infty}\left[\frac{\sqrt{2\pi}\,n^{n + 1/2}e^{-n}}{n!}\right]
=\color{#00f}{\Large\frac{1}{2}}
\end{align}

Lucian
Felix Marin
    The second equal sign is a complete mystery. The passage from a sum to an integral also needs justification but it probably holds. – Did Mar 31 '14 at 16:25
    Would you care answering @Did 's question regarding the second equal sign? I am miffed as well. Thank you, Felix Marin. – Hans May 11 '15 at 17:35
    @Did's question has to be answered otherwise this seems to be an suitable answer – tired Oct 04 '15 at 12:12
    yeah for me too, can you please answer @Did 's question? – user153330 Mar 01 '16 at 20:40
  • Felix, I'm just curious ... How did you arrive at that second equality? I don't see it. – Mark Viola May 16 '17 at 16:19
  • @MarkViola Note that $n^{k}/k!$ has a "sharp" maximum at $k = n$. Roughly speaking, it's a Gaussian. But the sum is just up to $n$. So, it was pretty obvious that the final result must be $1/2$. When I "went from the sum to the integral" I closely followed the *Laplace Method for Sums* as explained in the book *Analytic Combinatorics* by Philippe Flajolet and Robert Sedgewick, Cambridge University Press, 2009 (see page 761). It's clearer whenever we switch the origin with $k \mapsto n - k$. I didn't do it. – Felix Marin May 19 '17 at 19:19
    Is this supposed to justify the various unsubstantiated claims this post relies on? It does not. – Did Feb 08 '18 at 17:56
  • Could you please write more detail about the second equality?@FelixMarin – Li Taiji Feb 28 '19 at 11:24
  • @LiTaichi The $\exp$ function has a "sharp maximum" at $k = n \gg 1$. It's sufficient to expand its argument around $k = n$. See my previous comment about the Flajolet et al. book. – Felix Marin Feb 28 '19 at 19:42
  • @FelixMarin The two exponents are not equal at all, even if The exp function has a "sharp maximum" at $k=n$, how to prove the second equality is correct? – Li Taiji Mar 01 '19 at 04:21

If you'd like to see a formal solution using calculus methods, check this article: http://www.emis.de/journals/AMAPN/vol15/voros.pdf

qoqosz
    I wish you had copied the article here. It seemed small enough. References have a habit of pointing to NULL over time. – Steven Alexis Gregory Aug 22 '15 at 05:18
  • Please try to describe as much here as possible in order to make the answer self-contained. Links are fine as support, but they can go stale and then an answer which is nothing more than a link loses its value. Please read [this post](http://meta.stackexchange.com/questions/225370/your-answer-is-in-another-castle-when-is-an-answer-not-an-answer). – robjohn Jun 20 '19 at 03:14

The sum is related to the partial exponential sum, and thus to the incomplete gamma function, $$\begin{eqnarray*} e^{-n} \sum_{k=0}^{n} \frac{n^k}{k!} &=& e^{-n} e_n(n) \\ &=& \frac{\Gamma(n+1,n)}{\Gamma(n+1)}, \end{eqnarray*}$$ since $e_n(x) = \sum_{k=0}^n x^k/k! = e^x \Gamma(n+1,x)/\Gamma(n+1)$. But $$\begin{eqnarray*} \Gamma(n+1,n) &=& \sqrt{2\pi}\, n^{n+1/2}e^{-n}\left(\frac{1}{2} + \frac{1}{3}\sqrt{\frac{2}{n\pi}} + O\left(\frac{1}{n}\right) \right). \end{eqnarray*}$$ The first term in the asymptotic expansion for $\Gamma(n+1,n)$ can be found by applying the saddle point method to $$\Gamma(n+1,n) = \int_n^\infty dt\, t^n e^{-t}.$$ The higher order terms are in principle straightforward to compute. Using Stirling's approximation, we find $$e^{-n} \sum_{k=0}^{n} \frac{n^k}{k!} = \frac{1}{2} + \frac{1}{3}\sqrt{\frac{2}{n\pi}} + O\left(\frac{1}{n}\right).$$ Thus, the limit is $1/2$, as found by @sos440 and @robjohn. This limit is a special case of DLMF 8.11.13.

I just noticed a comment that suggests this be done using high school level math. If this is a standard exercise at your high school, maybe they covered the incomplete gamma function! ;-)
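The identity $e^{-n}\sum_{k=0}^n n^k/k! = \Gamma(n+1,n)/\Gamma(n+1)$ can be verified numerically without any special-function library, by integrating $\int_n^\infty t^n e^{-t}\,dt$ directly (a rough sketch for small $n$; both helper names are mine):

```python
import math

def regularized_upper_gamma(n, steps=200_000):
    """Estimate Gamma(n+1, n) / Gamma(n+1) = (1/n!) * integral_n^inf t^n e^{-t} dt.

    The integrand decays like t^n e^{-t}; for small n, truncating the
    integral at t = n + 60 leaves a negligible tail. Plain midpoint rule.
    """
    width = 60.0
    h = width / steps
    total = 0.0
    for i in range(steps):
        t = n + (i + 0.5) * h
        total += math.exp(n * math.log(t) - t)
    return total * h / math.factorial(n)

def poisson_partial(n):
    """e^(-n) * sum_{k=0}^{n} n^k / k!, built term by term."""
    term = math.exp(-n)
    total = term
    for k in range(1, n + 1):
        term *= n / k
        total += term
    return total

for n in (1, 5, 20):
    print(n, regularized_upper_gamma(n), poisson_partial(n))
```

The two columns agree to many digits, confirming that the partial exponential sum and the regularized incomplete gamma function are the same quantity.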

user26872
    (+1) I derived this asymptotic expansion [here](http://www.whim.org/nebula/math/gammaratio.html) in answer to a question on sci.math. I computed a few terms past $\frac12$: $$ \frac12+\frac{1}{\sqrt{2\pi n}}\left(\frac23-\frac{23}{270n}+\frac{23}{3024n^2}+\dots\right) $$ – robjohn Jun 20 '12 at 03:14
  • @robjohn: Thanks for the link, I'll have a look. By the way, I voted up your nice solution a couple of hours ago. I like your short and sweet derivation of (3). – user26872 Jun 20 '12 at 03:47
  • @robjohn As steven gregory says above "I wish you had copied the article here. It seemed small enough. References have a habit of pointing to NULL over time" . Please, could you upload somewhere again? I'm curious about it. – vesszabo Jul 04 '17 at 21:13

I do not know how much this will help you.

For a given $n$, the result is $\dfrac{\Gamma(n+1,n)}{n\ \Gamma(n)}$ which has a limit equal to $\dfrac12$ as $n\to\infty$.

Tunk-Fey
Claude Leibovici

I thought that it might be instructive to post a solution to a generalization of the OP's question. Namely, evaluate the limit

$$\lim_{n\to\infty}e^{-n}\sum_{k=0}^{N(n)}\frac{n^k}{k!}$$

where $N(n)=\lfloor Cn\rfloor$ and $C>0$ is an arbitrary constant. To that end, we now proceed.


Let $N(n)=\lfloor Cn\rfloor$, where $C>0$ is an arbitrary constant. We denote by $S(n)$ the sum of interest,

$$S(n)=e^{-n}\sum_{k=0}^{N}\frac{n^k}{k!}$$

Applying the analogous methodology presented by @SangchulLee, it is straightforward to show that

$$S(n)=1-\frac{(N/e)^{N}\sqrt{N}}{N!}\int_{(N-n)/\sqrt{N}}^{\sqrt{N}}e^{\sqrt{N}x}\left(1-\frac{x}{\sqrt N}\right)^N\,dx\tag7$$

We note that the integrand is positive and bounded above by $e^{-x^2/2}$. Therefore, we can apply the Dominated Convergence Theorem along with Stirling's Formula to evaluate the limit as $n\to\infty$.

There are three cases to examine.

Case $1$: $C>1$

If $C>1$, then both the lower and upper limits of integration on the integral in $(7)$ approach $\infty$ as $n\to \infty$. Therefore, we find

$$\lim_{n\to \infty}e^{-n}\sum_{k=0}^{\lfloor Cn\rfloor}\frac{n^k}{k!}=1$$

Case $2$: $C=1$

If $C=1$, then the lower limit is $0$ while the upper limit approaches $\infty$ and we find

$$\lim_{n\to \infty}e^{-n}\sum_{k=0}^{n}\frac{n^k}{k!}=\frac12$$

Case $3$: $C<1$

If $C<1$, then the lower limit approaches $-\infty$ while the upper limit approaches $\infty$, and we find

$$\lim_{n\to \infty}e^{-n}\sum_{k=0}^{\lfloor Cn\rfloor}\frac{n^k}{k!}=0$$


To summarize we have found that

$$\bbox[5px,border:2px solid #C0A000]{\lim_{n\to \infty}e^{-n}\sum_{k=0}^{\lfloor Cn\rfloor}\frac{n^k}{k!}=\begin{cases}1&,C>1\\\\\frac12&, C=1\\\\0&, C<1\end{cases}}$$
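The boxed three-case result is easy to confirm numerically (a sketch of mine; `partial_sum` is a hypothetical helper name):

```python
import math

def partial_sum(n, C):
    """e^(-n) * sum_{k=0}^{floor(C*n)} n^k / k!, term by term to avoid overflow."""
    N = int(math.floor(C * n))
    term = math.exp(-n)
    total = term
    for k in range(1, N + 1):
        term *= n / k
        total += term
    return total

n = 500
for C in (0.5, 1.0, 1.5):
    print(C, partial_sum(n, C))
```

Already at $n=500$ the three cases separate sharply: the $C=0.5$ value is essentially $0$, the $C=1$ value sits near $\tfrac12$, and the $C=1.5$ value is essentially $1$.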

Mark Viola
  • This is clever and I wonder if you could take it further: if $\lim\limits_{n\to\infty}e^{-n}\sum_{k=0}^{n+g_C(n)}\frac{n^k}{k!} = C$, can you characterize $g_C(n)$ at all, along the lines of things like the critical-value results for connectivity in random graph models? – Steven Stadnicki Sep 23 '20 at 23:06
  • @StevenStadnicki First, thank you! Much appreciated. And yes, I thought about further generalizations, similar to the one you suggest, which should be fairly straightforward. – Mark Viola Sep 23 '20 at 23:18
  • On the off chance you haven't seen it before, a further generalization is $$\lim_{n\to\infty} e^{-n} \sum_{k=0}^{n + \lfloor z\sqrt{n}\rfloor} \frac{n^k}{k!} = \Phi(z)$$ where $\Phi$ is the standard normal CDF. Your results then follow by direct comparison from $$\Phi(-\infty) = 0, \Phi(0) = \frac{1}{2}, \Phi(+\infty) = 1$$ – Brian Moehring Dec 09 '20 at 05:55

On this page there is a nice collection of proofs.

I add another proof which also uses Stirling's formula.

$\displaystyle e^{-n}\sum\limits_{k=0}^n\frac{n^k}{k!} = e^{-n}\sum\limits_{k=0}^n\frac{k^k (n-k)^{n-k}}{k!(n-k)!} \hspace{4cm}$ e.g. here

Dropping the $k=0$ and $k=n$ terms (each equals $e^{-n}n^n/n!\sim 1/\sqrt{2\pi n}\to 0$) and applying Stirling's formula to $k!$ and $(n-k)!$ gives

$\displaystyle \lim\limits_{n\to\infty} e^{-n}\sum\limits_{k=1}^{n-1}\frac{e^k e^{n-k}}{\sqrt{2\pi k (1+\mathcal{O}(1/k))}\sqrt{2\pi (n-k)(1+\mathcal{O}(1/(n-k)))}} $

$\displaystyle = \lim\limits_{n\to\infty} \frac{1}{2\pi}\frac{1}{n}\sum\limits_{k=1}^{n-1}\frac{1}{\sqrt{\frac{k}{n}\left(1-\frac{k}{n}\right)}} =\frac{1}{2\pi} \int\limits_0^1\frac{dx}{\sqrt{x(1-x)}}=\frac{\Gamma(\frac{1}{2})^2}{2\pi~\Gamma(1)} = \frac{1}{2}$
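The Abel-type identity in the first line above can be checked exactly in rational arithmetic, for instance with Python's `fractions` module (my own sketch, using the convention $0^0=1$):

```python
from fractions import Fraction
from math import factorial

def lhs(n):
    """sum_{k=0}^{n} n^k / k! as an exact rational."""
    return sum(Fraction(n**k, factorial(k)) for k in range(n + 1))

def rhs(n):
    """sum_{k=0}^{n} k^k (n-k)^(n-k) / (k! (n-k)!), with the convention 0^0 = 1."""
    def pw(a):
        return a**a if a > 0 else 1  # 0^0 = 1
    return sum(
        Fraction(pw(k) * pw(n - k), factorial(k) * factorial(n - k))
        for k in range(n + 1)
    )

for n in range(1, 21):
    assert lhs(n) == rhs(n)
print("identity verified for n = 1..20")
```

Since both sides are computed as exact fractions, the equality here is verified with no floating-point error at all.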

user90369