I was wondering if it is possible to get a link to a rigorous proof that $$\displaystyle \lim_{n\to\infty} \left(1+\frac {x}{n}\right)^n=\exp x$$

  • Well often this is taken as the definition of exp(x), so I suppose it depends on your definition. – Three Apr 11 '13 at 22:43
  • Isn't the definition of $\exp(x)$ "the function that is equal to its derivative?" – Lord Soth Apr 11 '13 at 22:45
  • @LordSoth Consider $x\mapsto 0$. – Git Gud Apr 11 '13 at 22:46
  • @LordSoth, another is that it is the inverse of $\ln x = \int_1^x \frac{d u}{u}$... – vonbrand Apr 11 '13 at 22:47
  • maybe you want $\lim \left ( 1+ \frac{1}{n}\right )^n=e$ then it implies $\lim \left (1 +\frac{x}{n} \right )^n=e^x$? – clark Apr 11 '13 at 22:47
  • @Git Gud, so?. I guess I did not get your point. – Lord Soth Apr 11 '13 at 22:47
  • @LordSoth The function I mentioned is equal to its derivative and yet it's not $\exp$ as one would normally expect. – Git Gud Apr 11 '13 at 22:48
  • @Git Gud, there is no reason to be absolutely precise here, and I guess my message is clear enough. I think people did not define $\exp(x)$ as it is in the question (historically), they were looking for a non-trivial function that would be equal to its derivative. I do not have a reference right now though. – Lord Soth Apr 11 '13 at 22:50
  • @LordSoth, actually that's false. $\exp(x)$ was originally discovered by a Bernoulli as the limit of compound interest -- in fact, exactly as the OP has written it. Only later was the calculus studied: http://en.wikipedia.org/wiki/Exponential_function – Three Apr 11 '13 at 22:56
  • @Three I suggest you read http://www-history.mcs.st-and.ac.uk/HistTopics/e.html – Lord Soth Apr 11 '13 at 22:59
  • How do you **define** $\exp$? This is really a matter of definition. What tools do you have available? Can you use continuity of $\exp$? Can you use $\log$? &c... Whenever you make this kind of questions, you **must** state what definitions and available tools are, *always*. Else we're just guessing what you want. – Pedro Apr 11 '13 at 23:56
  • http://math.stackexchange.com/questions/365029/intuitive-proofs-that-lim-limits-n-to-infty-left1-frac-xn-rightn-ex/1825161#1825161 –  Jun 20 '16 at 20:46
  • @GuyFsone : You should not vote to close this one. This one is much more well-received, and if you check the right column, you'll see that lots of questions are linked to this one. Also, in general please think twice before closing a question with the tag "faq". Those questions are the "abstract duplicate" which are used by many others to locate duplicates. See [here](https://math.meta.stackexchange.com/questions/1868/list-of-generalizations-of-common-questions) for more such examples. –  Nov 11 '17 at 04:08
  • Is this a duplicate? Yes, all the duplicates are duplicate to each other. We close duplicate question since we want to direct users to the "best" duplicate, where they can find different good answers. The question you linked might be more "fancy", but the technique is identical. And obviously this questions generate better answers. And of course closing a question will make a difference: one cannot answer closed questions. That's another reason why we want to choose the best duplicates, so that all future good answers are posted at the same place. –  Nov 11 '17 at 16:50
  • And please remember to @ ping a user when you want them to get a notification @GuyFsone –  Nov 11 '17 at 16:51
  • Thanks I did not know that. And also that option does not work on my phone which is what I used:) – Guy Fsone Nov 11 '17 at 16:56
  • Is this a duplicate or not? The original question in the link I gave is fancier than this one. Also, if someone asked a question with a one-line description like this one nowadays, it would be closed as off-topic, since people would ask what the OP has tried. In addition, it is a bit sad that this post has more upvotes and reactions than the earlier post. Even if this one is closed, nothing will happen to the linked questions on the right, so to me your excuse is pointless. I thought twice. – Guy Fsone Feb 05 '18 at 11:18

12 Answers


I would like to cite here an awesome German mathematician, Konrad Königsberger. He writes in his book Analysis I as follows:

Fundamental lemma. For every sequence of complex numbers $w_n$ with a limit $w$ it is true that $$\lim_{n \to \infty} \Bigl(1 + \frac{w_n}{n}\Bigr)^n = \sum_{k=0}^\infty \frac{w^k}{k!}.$$ Proof. For every $\varepsilon > 0$ and sufficiently large index $K$ we have the following estimates: $$\sum_{k=K}^\infty \frac{(|w|+1)^k}{k!} < \frac \varepsilon 3 \quad\mbox{and}\quad |w_n| \le |w|+1.$$ Therefore if $n \ge K$ then $$\left|\Bigl(1 + \frac{w_n}{n}\Bigr)^n - \exp w \right| \le \sum_{k=0}^{K-1} \left|{n \choose k}\frac{w_n^k}{n^k} - \frac{w^k}{k!}\right| + \sum_{k=K}^n{n\choose k} \frac{|w_n|^k}{n^k} + \sum_{k=K}^\infty \frac{|w|^k}{k!}.$$ The third sum is smaller than $\varepsilon / 3$ by our estimates. We can bound the middle one from above using $${n \choose k} \frac 1 {n^k} = \frac{1}{k!} \prod_{i = 1}^{k-1} \Bigl(1 - \frac i n \Bigr) \le \frac 1 {k!}.$$ Combining this with $|w_n| \le |w| + 1$, $$\sum_{k=K}^n {n \choose k} \frac{|w_n|^k}{n^k} < \sum_{k=K}^n \frac{(|w|+1)^k}{k!} < \frac \varepsilon 3.$$ Finally, the first sum converges to $0$ because $w_n \to w$ and ${n \choose k} n^{-k} \to \frac 1 {k!}$, so we can choose $N > K$ such that it is smaller than $\varepsilon / 3$ as soon as $n > N$.

Really brilliant.
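As a quick numerical sanity check of the lemma (not part of Königsberger's argument), one can compare $(1+w_n/n)^n$ with $\exp w$ for a sample complex sequence; the choice $w_n = w + 1/n$ below is an arbitrary illustration:

```python
import cmath

# Sanity check of the fundamental lemma: for a complex sequence
# w_n -> w, (1 + w_n/n)^n should approach exp(w).
# The sequence w_n = w + 1/n is an arbitrary illustrative choice.
def lemma_lhs(w, n):
    w_n = w + 1.0 / n
    return (1 + w_n / n) ** n

w = 1 + 2j
print(abs(lemma_lhs(w, 10**6) - cmath.exp(w)))  # small for large n
```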

  • I examined proof technique for $w_n = w$ (no sequence) then went full bore. Agree - brilliant! I can use it to show the exp power series takes addition to multiplication. Did not have to get in the weeds with rearrangements, absolute convergence/commutativity, etc. – CopyPasteIt Jul 09 '17 at 23:10
  • +1.... This appears to be the only complete and rigorous answer (so far) that considers all complex $w$. And provides a reference (as the proposer requested). And actually has more than what was asked... A masterful exposition by K.K. – DanielWainfleet Aug 30 '18 at 15:49
  • A simpler proof of this Fundamental Lemma, for the case $w_n$ are all real, can be found at Proposition 4.6.1 in Koski's Lecture Notes on Probability and Random Processes (https://www.math.kth.se/matstat/gru/sf2940/lectnotemat5.pdf). – xFioraMstr18 Feb 16 '20 at 16:37
  • ( on page 133 ) – xFioraMstr18 Feb 16 '20 at 16:44

From the very definition (one of many, I know):

$$e:=\lim_{n\to\infty}\left(1+\frac{1}{n}\right)^n$$

we can try the following, depending on what you have read so far in this subject:

(1) Deduce that

$$e=\lim_{n\to\infty}\left(1+\frac{1}{f(n)}\right)^{f(n)}\;,\;\;\text{as long as}\;\;f(n)\xrightarrow[n\to\infty]{}\infty$$

and then from here ($\,x\neq0\,$ , but this is only a light technicality)

$$\left(1+\frac{x}{n}\right)^n=\left[\left(1+\frac{1}{n/x}\right)^{n/x}\right]^x\xrightarrow[n\to\infty]{}e^x$$

(2) For $\,x>0\,$, substitute $\,mx=n\,$. Note that $\,n\to\infty\implies m\to\infty\,$, and

$$\left(1+\frac{x}{n}\right)^n=\left(\left(1+\frac{1}{m}\right)^m\right)^x\xrightarrow[n\to\infty\iff m\to\infty]{}e^x$$

I'll leave it to you to work out the case $\,x<0\,$ (hint: arithmetic of limits and "going" to denominators)
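For readers who want to see the convergence numerically before working out the $x<0$ case (a sanity check, not a proof; the sample values are arbitrary):

```python
import math

# (1 + x/n)^n should approach e^x for positive and negative x alike.
def pow_limit(x, n):
    return (1 + x / n) ** n

for x in (2.0, -2.0, 0.5):
    print(x, abs(pow_limit(x, 10**7) - math.exp(x)))  # all tiny
```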

  • Side note: The difficult part here is deducing that the first limit holds over real $n$, as opposed to just natural $n$. – Simply Beautiful Art Aug 10 '19 at 15:01
  • @SimplyBeautfulArt It's not *that* difficult. For anyone who wants to learn more about defining real exponents with limits, I recommend Paramanand Singh's [blog post](https://paramanands.blogspot.com/2014/05/theories-of-exponential-and-logarithmic-functions-part-3.html?m=0). – Poder Rac Aug 13 '20 at 15:24

Firstly, let us give a definition to the exponential function, so we know the function has various properties:

$$ \exp(x) := \sum_{n=0}^{\infty} \frac{x^n}{n!}$$

so that we can prove (as $\exp$ is a power series):

  • The exponential function has radius of convergence $\infty$, and is thus defined on all of $\mathbb R$
  • As a power series is infinitely differentiable inside its circle of convergence, the exponential function is infinitely differentiable on all of $\mathbb R$
  • We can then prove that the function is strictly increasing, and thus by the inverse function theorem (http://en.wikipedia.org/wiki/Inverse_function_theorem) we can define what we know as the "log" function

Knowing all of this, here is hopefully a sufficiently rigorous proof (at least for positive $a$):

As $\log(x)$ is continuous and differentiable on $(0,\infty)$, the function $f(x)=\log(1+x)$ is continuous on $[0,\frac{a}{n}]$ and differentiable on $(0,\frac{a}{n})$, so by the mean value theorem we know there exists a $c \in [0,\frac{a}{n}]$ with

$$f'(c) = \frac {\log(1+ \frac{a}{n} ) - \log(1)} {\frac {a}{n} - 0 } $$ Since $f'(c)=\frac{1}{1+c}$, this reads $\frac{1}{1+c} = \frac{n}{a}\log\left(1+\frac{a}{n}\right)$, so $$ \Longrightarrow \log\left[{\left(1+\frac{a}{n}\right)^n}\right] = \frac{a}{1+c}$$ $$ \Longrightarrow \left(1+\frac{a}{n}\right)^n = \exp\left({\frac{a}{1+c}}\right)$$

for some $c \in [0,\frac{a}{n}]$ . As we then want to take the limit as $n \rightarrow \infty$, we get that:

  • As $c \in [0,\frac{a}{n}]$ and $\frac{a}{n} \rightarrow 0$ as $n \rightarrow \infty$, by the squeeze theorem we get that $ c \rightarrow 0$ as $n \rightarrow \infty$
  • As $ c \rightarrow 0$ as $n \rightarrow \infty$, $\frac{a}{1+c} \rightarrow a$ as $n \rightarrow \infty$
  • As the exponential function is continuous on $\mathbb R$, the limit can pass inside the function, so we get that as $\frac{a}{1+c} \rightarrow a$ as $n \rightarrow \infty$

$$ \exp(\frac{a}{1+c}) \rightarrow \exp(a) $$ as $n \rightarrow \infty$. Thus we can conclude that

$$ \lim_{n \to \infty} (1+\frac{a}{n})^n = e^a$$

(Of course, this is ignoring that one needs to prove that $\exp(a)=e^a$, but this is hardly vital for this question)
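The mean value theorem step can also be checked numerically: since $f'(c)=\frac{1}{1+c}$, one can solve for $c$ explicitly as $c=\frac{a}{n\log(1+a/n)}-1$ and verify both $c\in[0,\frac{a}{n}]$ and the displayed identity (a sketch; $a$ and the values of $n$ are chosen arbitrarily):

```python
import math

# Solve for the MVT point c with 1/(1+c) = (n/a) * log(1 + a/n)
# and verify c lies in [0, a/n] and (1 + a/n)^n = exp(a/(1+c)).
def mvt_point(a, n):
    return a / (n * math.log1p(a / n)) - 1

a = 3.0  # arbitrary positive sample
for n in (10, 1000, 100000):
    c = mvt_point(a, n)
    assert 0 <= c <= a / n
    assert math.isclose((1 + a / n) ** n, math.exp(a / (1 + c)), rel_tol=1e-9)
```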

Andrew D
  • If we're just about to *define* the exponential function (or at least show that it equals something), it seems to me the assumption of its continuity is highly suspicious... – DonAntonio Apr 11 '13 at 23:36
  • This is true - although I can't see how this proof is anything more than showing that the various definitions of the exponential function are equivalent, and thus I would presume continuity would have been proved before trying to prove statements such as this one (for example, in our lectures we defined it in terms of a power series, which means we can prove it is continuous fairly straightforwardly) – Andrew D Apr 11 '13 at 23:39
  • I agree with that, @Andrew D, but then perhaps mentioning *some other* definition from which continuity follows and then use that in it...perhaps too long a detour for a beginner, but absolutely possible indeed. – DonAntonio Apr 11 '13 at 23:42
  • @DonAntonio The log's continuity assumption is just fine, though. Since $\exp$ is its inverse, it is continuous. – Pedro Apr 11 '13 at 23:50
  • Yeah, thankfully that is covered by the inverse function theorem (which I've now linked/discussed above, along with some other things) – Andrew D Apr 11 '13 at 23:51

Another answer, assuming $x>0$:

Let $f(x)=\ln(x)$. Then we know that $f'(x)=1/x$. Also, by the definition of derivative, we can write $$ \begin{align} f'(x)&=\lim_{h\to 0}\frac{f(x+h)-f(x)}{h}\\ &=\lim_{h\to 0}\frac{\ln(x+h)-\ln(x)}{h}\\ &=\lim_{h\to 0}\frac{1}{h}\ln\frac{x+h}{x}\\ &=\lim_{h \to 0}\ln\left(\frac{x+h}{x}\right)^\frac{1}{h}\\ &=\lim_{h\to 0}\ln\left(1+\frac{h}{x}\right)^\frac{1}{h} \end{align} $$ Then, using the fact that $\ln(x)$ is a continuous function for all $x$ in its domain, we can exchange the $\lim$ and $\ln$: $$ f'(x)=\ln\lim_{h\to 0}\left(1+\frac{h}{x}\right)^\frac{1}{h} $$ Now, let $m=1/h$. Then $m\to\infty$ as $h\to 0^+$, and $$ f'(x)=\ln\lim_{m\to\infty}\left(1+\frac{1}{mx}\right)^m $$ Now, assuming $x>0$, define $n=mx^2$, so that $n\to\infty$ as $m\to\infty$. Then we can write $$ f'(x)=\ln\lim_{n\to\infty}\left[\left(1+\frac{x}{n}\right)^n\right]^{1/x^2} $$ and from before, we still have $f'(x)=1/x$, so $$ \ln\lim_{n\to\infty}\left[\left(1+\frac{x}{n}\right)^n\right]^{1/x^2}=\frac{1}{x} $$ Exponentiating both sides, we find $$ \lim_{n\to\infty}\left[\left(1+\frac{x}{n}\right)^n\right]^{1/x^2}=e^{1/x} $$ Finally, raising both sides to the $x^2$, we find $$ \lim_{n\to\infty}\left(1+\frac{x}{n}\right)^n=e^x $$ EDIT: This idea actually works for all reals: if we use $f(x)=\ln|x|$ instead, then we eventually get $$ e^x=\lim_{n\to\infty}\left|1+\frac{x}{n}\right|^{n}=\lim_{n\to\infty}\left(1+\frac{x}{n}\right)^n, $$ where the last equality comes from the fact that $n$ eventually dominates $x$, so that the absolute value function becomes redundant.

This leaves the case where $x=0$, but that is a trivial matter.
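The closing remark about the absolute value can be illustrated numerically (the value $x=-5$ is an arbitrary choice): once $n>|x|$, $1+x/n$ is positive, so the absolute value changes nothing.

```python
import math

# Once n > |x|, 1 + x/n > 0, so |1 + x/n|^n equals (1 + x/n)^n,
# and both approach e^x; x = -5 is an arbitrary negative sample.
x = -5.0
n = 10**7
assert 1 + x / n > 0
val = abs(1 + x / n) ** n
print(abs(val - math.exp(x)))  # small
```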

Mike Bell

For any fixed value of $x$, define

$$f(u)= {\ln(1+ux)\over u}$$

By L'Hopital's Rule,

$$\lim_{u\to 0^+}f(u)=\lim_{u\to 0^+}\frac{\ln(1+ux)}{u}=\lim_{u\to 0^+}\frac{x}{1+ux}=x$$

Now exponentiate $f$:

$$e^{f(u)}=(1+ux)^{1/u}$$

By continuity of the exponential function, we have

$$\lim_{u\to 0^+}(1+ux)^{1/u}=\lim_{u\to 0^+}e^{f(u)}=e^x$$

All these limits have been shown to exist for the (positive) real variable $u$ tending to $0$, hence they must exist, and be the same, for the sequence of reciprocals of integers, $u=1/n$, as $n$ tends to infinity, and the result follows:

$$\lim_{n\rightarrow\infty}\left(1+{x\over n}\right)^n = e^x$$
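As a numerical check of the L'Hopital step (with $x>0$ and sample values of $u=1/n$ chosen arbitrarily), $f(u)=\ln(1+ux)/u$ should approach $x$ as $u\to 0^+$:

```python
import math

# f(u) = ln(1+ux)/u should tend to x as u -> 0+; test along u = 1/n.
# For ux < 1 the alternating series gives x - x^2*u/2 <= f(u) <= x,
# so |f(u) - x| <= x^2 * u comfortably.
x = 2.0
for n in (10, 10**3, 10**6):
    u = 1.0 / n
    f = math.log1p(u * x) / u
    assert abs(f - x) <= x * x * u
```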

Barry Cipra

Consider the functions $u$ and $v$ defined for every $|t|\lt\frac12$ by $$ u(t)=t-\log(1+t),\qquad v(t)=t-t^2-\log(1+t). $$ The derivative of $u$ is $u'(t)=\frac{t}{1+t}$, which has the sign of $t$, hence $u(t)\geqslant0$. The derivative of $v$ is $v'(t)=1-2t-\frac{1}{1+t}$, which has the sign of $(1+t)(1-2t)-1=-t(1+2t)$ which has the sign of $-t$ on the domain $|t|\lt\frac12$ hence $v(t)\leqslant0$. Thus:

For every $|t|\lt\frac12$, $$ t-t^2\leqslant\log (1+t)\leqslant t. $$

The function $z\mapsto\exp(nz)$ is nondecreasing on the same domain hence $$ \exp\left(nt-nt^2\right)\leqslant(1+t)^n\leqslant\exp\left(nt\right). $$ In particular, using this for $t=x/n$, one gets:

For every $|x|<\frac12n$, $$ \exp\left(x-\frac{x^2}{n}\right)\leqslant\left(1+\frac{x}n\right)^n\leqslant\mathrm e^x. $$

Finally, $x^2/n\to 0$ when $n\to\infty$ and the exponential is continuous at $0$, hence we are done.
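The sandwich is easy to probe numerically (a sanity check only; the sample values are arbitrary and must satisfy $|x|<\frac12 n$):

```python
import math

# Check the bounds exp(x - x^2/n) <= (1 + x/n)^n <= exp(x),
# valid for |x| < n/2 as derived above.
def check(x, n):
    assert abs(x) < n / 2
    mid = (1 + x / n) ** n
    assert math.exp(x - x * x / n) <= mid <= math.exp(x)

for x in (-3.0, 0.5, 4.0):
    for n in (10, 100, 10**5):
        check(x, n)
```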

Facts/Definitions used:

  • The logarithm has derivative $t\mapsto1/t$.
  • The exponential is the inverse of the logarithm.
  • We need to evangelize the use of $\leqslant$ and $\geqslant$ in MSE. – Pedro Aug 10 '13 at 04:11
  • I used this in an application to lower bound $(1+x/n)^n$, thank you. – JP McCarthy Aug 16 '16 at 11:55
  • Didier, I like this approach. So (+1). I was wondering if you've seen a way to establish the same lower bound for $\left(1+\frac xn\right)^n$ by using the limit definition of the exponential function and without appealing to calculus. The upper bound is trivial for $x>-n$. – Mark Viola Jan 05 '17 at 19:39
  • @Dr.MV This reduces to showing $1+t\geqslant \exp(t-t^2)$, that is, $\frac1{1+t}\leqslant\exp(-t+t^2)$. What you call the trivial upper bound yields $\frac1{1+t}=1-\frac{t}{1+t}\leqslant\exp\left(-\frac{t}{1+t}\right)$ hence if $\frac{t}{1+t}\geqslant t-t^2$, we are done. This is asking that $t\geqslant (t-t^2)(1+t)=t(1-t^2)$ hence indeed, we are done for every $t$ in $(0,1)$. This fails for $t$ negative but similar arguments might work. – Did Jan 08 '17 at 09:17

Using a single interval upper and lower Riemann sum for $\ln x:=\int_1^x\dfrac 1t\operatorname dt$, we get $\dfrac x{n+x}\le\ln(1+\frac xn)\le\dfrac xn$.

So, $e^{\frac x{n+x}}\le 1+\dfrac xn\le e^{\frac xn}$.

Now $e^{\frac {nx}{n+x}}\le (1+\dfrac xn)^n\le e^ x$.

Let $n\to\infty $.

I think I saw this in Best and Penner's Calculus.
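A quick numerical check of both the one-rectangle Riemann-sum bounds and the resulting squeeze (for a sample $x>0$, chosen arbitrarily):

```python
import math

# Check x/(n+x) <= ln(1 + x/n) <= x/n (one-rectangle Riemann sums)
# and the squeeze e^{nx/(n+x)} <= (1 + x/n)^n <= e^x, for x > 0.
x = 2.5
for n in (1, 10, 10**4):
    L = math.log1p(x / n)
    assert x / (n + x) <= L <= x / n
    assert math.exp(n * x / (n + x)) <= (1 + x / n) ** n <= math.exp(x)
```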


$ (1+x/n)^n = \sum_{k=0}^n \binom{n}{k}\frac{x^k}{n^k} $

Now just prove that $\binom{n}{k}\frac{x^k}{n^k}$ approaches $\frac{x^k}{k!}$ as $n$ approaches infinity, and you will have proven that your limit matches the Taylor series for $\exp(x)$.
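The per-term claim is easy to test numerically (Python 3.8+ for `math.comb`; the cutoffs are arbitrary):

```python
import math

# binom(n, k) / n^k should approach 1/k! for each fixed k;
# equivalently, k! * binom(n, k) / n^k should approach 1.
def term_ratio(n, k):
    return math.comb(n, k) * math.factorial(k) / n**k

for k in (0, 1, 5, 10):
    print(k, term_ratio(10**6, k))  # all close to 1
```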

  • This is not enough; there are infinitely many terms, so you need to show that you can swap two limits here. – Qiaochu Yuan Apr 11 '13 at 23:17
  • What you want to do is work with $\limsup$ and $\liminf$ here, and show $e^x\leq\liminf $ and $e^x\geq \limsup$ – Pedro Apr 11 '13 at 23:53
  • How would you show that you can swap the two limits? – amarney Mar 26 '17 at 22:54
  • This equality is also only valid if $n$ is an integer (the index on the sum should go to infinity in full generality, and would then only be valid for $x/n < 1$) – D. Zack Garza Jun 28 '20 at 03:06

Let $x>0$ and $t\in[1,1+\frac{x}{n}]$. On this interval we have: $$\frac{1}{1+\frac{x}{n}}\leq \frac{1}{t}\leq 1$$ $$\int_1^{1+\frac{x}{n}}\frac{dt}{1+\frac{x}{n}}\leq \int_1^{1+\frac{x}{n}}\frac{dt}{t}\leq \int_1^{1+\frac{x}{n}} dt$$ $$\frac{x}{n+x}\leq \ln\Big(1+\frac{x}{n}\Big)\leq\frac{x}{n}$$ Since $e^x$ is increasing on $\mathbb{R}$, we have: $$e^{\frac{x}{n+x}}\leq 1+\frac{x}{n}\leq e^{\frac{x}{n}}$$ Since $t\mapsto t^n$ is increasing on $(0,\infty)$, raising everything to the $n$-th power gives: $$e^{\frac{nx}{n+x}}\leq \Big(1+\frac{x}{n}\Big)^n\leq e^{x}$$ As $n\to\infty$ we know $\frac{nx}{n+x}\to x$, so $e^{\frac{nx}{n+x}}\to e^x$ by continuity of the exponential. So, by the Squeeze Theorem, we have proved the limit.


There is at most one function $g$ on $\mathbb{R}$ such that $$g'(x)=g(x)\text{ for all } x\text{ in }\mathbb{R}\quad\text{and}\quad g(0)=1\,.$$ If you let $f_n(x)=(1+x/n)^n$ and can demonstrate that it converges compactly to some function $f$, then you can demonstrate that $f'(x)=f(x)$ and $f(0)=1$. Likewise, if you take $f_n(x)=\sum_{k=0}^n x^k/k!$ and demonstrate that this sequence converges compactly, you can show that its limit satisfies the same conditions. Thus it doesn't matter which definition you use. The uniqueness criterion is what you should probably have in mind when you think of "the exponential".
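One can at least observe this agreement numerically: the two sequences named above approach the same values pointwise (a sketch; the sample point $x=1.5$ and the cutoffs are arbitrary):

```python
import math

# The two candidate sequences for exp: a power and a partial sum.
def f_pow(x, n):
    return (1 + x / n) ** n

def f_series(x, n):
    return sum(x**k / math.factorial(k) for k in range(n + 1))

x = 1.5
print(abs(f_pow(x, 10**7) - f_series(x, 30)))  # tiny
```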

J. W. Tanner

This is one of the ways in which it is defined. The equivalence of the definitions can be proved easily, I guess. If for example you take the exponential function to be the inverse of the logarithm:

$\log(\lim_n(1 + \frac{x}{n})^n) = \lim_n n \log(1 + \frac{x}{n}) = \lim_n n \cdot[\frac{x}{n} - \frac{x^2}{2n^2} + \dots] = x$

EDIT: The logarithm is defined as usual: $\log x = \int_1^x \frac{dt}{t}$. The first identity follows from the continuity of the logarithm, the second is just an application of one of the properties of the logarithm ($\log a^b = b \log a$), while to obtain the third it suffices to have the Taylor expansion of $\log(1+x)$.
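The middle step, $n\log(1+\frac{x}{n})\to x$, can be checked numerically (the sample $x$ and cutoffs are arbitrary):

```python
import math

# n * log(1 + x/n) -> x; the error is about x^2/(2n).
x = -1.3
for n in (10**2, 10**5, 10**8):
    err = abs(n * math.log1p(x / n) - x)
    assert err < x * x / n + 1e-12
```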

  • The very first equality requires, me believes, a justification that I cannot see as very easy unless we already assume quite a bit (say, continuity...). After that things get even tougher as we need power series and then also, apparently, differentiation. – DonAntonio Apr 11 '13 at 23:40
  • The logarithm is defined as $\int_1^x \frac{dt}{t}$, therefore, if we have integration we can also have continuity and differentiation, I suppose. –  Apr 11 '13 at 23:45
  • Perhaps so and also perhaps mentioning this could clear things out a little, since we don't know, apparently, what the OP's background is. – DonAntonio Apr 11 '13 at 23:47
  • I cannot but totally agree. Thank you for your suggestions, I am going to edit the post to make it clearer! –  Apr 12 '13 at 00:07

Start with the binomial theorem: $$\lim_{n\to\infty}\left(1+\frac{x}{n}\right)^n=\lim_{n\to\infty}\sum_{m=0}^n\frac{n!}{m!(n-m)!}\frac{x^m}{n^m}=\lim_{n\to\infty}\sum_{m=0}^n\frac{n!}{n^m(n-m)!}\frac{x^m}{m!}.$$ Define $$f(\mu,\nu)=\sum_{m=0}^{\nu}\frac{\mu!}{\mu^m(\mu-m)!}\frac{x^m}{m!}.$$ Thus you need to prove $$\lim_{n\to\infty}f(n,n)=\lim_{n\to\infty}\sum_{m=0}^n\frac{x^m}{m!},$$ which amounts to proving that $$\lim_{n\to\infty}f(n,n)=\lim_{\nu\to\infty}\left[\lim_{\mu\to\infty}f(\mu,\nu)\right]$$ and $$\forall{m\in\mathbb{N}},\lim_{\mu\to\infty}\frac{\mu!}{\mu^m(\mu-m)!}=1.$$ The latter can be proven rather easily using the principle of induction. The former follows immediately from Tannery's theorem. With this done, we have that $$\lim_{n\to\infty}\sum_{m=0}^n\frac{n!}{n^m(n-m)!}\frac{x^m}{m!}=\lim_{\nu\to\infty}\left(\sum_{m=0}^{\nu}\left[\lim_{\mu\to\infty}\frac{\mu!}{\mu^m(\mu-m)!}\right]\frac{x^m}{m!}\right)=\lim_{\nu\to\infty}\sum_{m=0}^{\nu}\frac{x^m}{m!}:=\exp(x).$$ Even if the definition of $\exp$ you are using is not via its Maclaurin series, its equivalence with any other definition can be established rather easily.
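The induction claim $\lim_{\mu\to\infty}\frac{\mu!}{\mu^m(\mu-m)!}=1$ for fixed $m$ is also easy to probe numerically (Python 3.8+ for `math.perm`; the cutoffs are arbitrary):

```python
import math

# mu! / ((mu-m)! * mu^m) = perm(mu, m) / mu^m should approach 1
# for each fixed m as mu grows.
def coeff(mu, m):
    return math.perm(mu, m) / mu**m

for m in (0, 1, 4, 12):
    print(m, coeff(10**7, m))  # all close to 1
```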
