To gain deeper insight into the Poisson and exponential random variables, I found that I could derive them as follows:

I consider an experiment consisting of a continuum of trials on an interval $[0,t)$. The outcome of the experiment is an ordered $n$-tuple, for some $n \in \mathbb{N}$, of distinct points on the interval. Every outcome is equally likely, and I measure the size of the set of tuples of $n$ distinct points by $I_n$:

$$ I_n = \int_0^{t} \int_0^{x_{n}} \int_0^{x_{n-1}} \cdots \int_0^{ x_2 } dx_1 dx_{2} dx_{3} \dots dx_{n-1} dx_{n} = \frac{ t^n } { n! }$$

It follows that on some interval $[0,t)$, the probability that the experiment results in a $k$-tuple, $k \in \mathbb{N}$, is

$$P(X(t) = k) = \frac{I_k}{\sum_{n = 0}^{\infty} I_n} = \frac{e^{-t} t^k}{k!}$$

And for $k = 0$, we have $P(X(t) = 0) = e^{-t}$.
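As a numerical sanity check (a standard-library sketch; the value of $t$ is an arbitrary choice for illustration), the normalized volumes $I_k / \sum_n I_n$ do reproduce the Poisson pmf:

```python
import math

t = 2.5  # arbitrary interval length, chosen for illustration

# I_n = t^n / n!, the volume of the set of ordered n-tuples in [0, t)
I = [t**n / math.factorial(n) for n in range(60)]  # truncate the infinite sum
total = sum(I)  # the partial sums converge to e^t

for k in range(6):
    poisson_pmf = math.exp(-t) * t**k / math.factorial(k)
    assert abs(I[k] / total - poisson_pmf) < 1e-12

print("I_k / sum_n I_n matches e^{-t} t^k / k! for k = 0..5")
```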


I was wondering whether some similar intuition can be applied to derive the Gaussian:

$$\frac{1}{\sqrt{2 \pi \sigma^2}} \exp \Big(-\frac{(x-\mu)^2}{2\sigma^2} \Big) \ \text{ or the standard normal, }\ \frac{e^{-x^2/2}}{\sqrt{2\pi}}$$

I think that such an intuition might be obtained by gaining more insight into each term of the series expansion of $\operatorname{erf}(x)$, as was done for the Poisson above: $$ \begin{align} \operatorname{erf}(x) &= \frac{1}{\sqrt \pi } \int_{-x}^{x} e^{-t^2}\, dt\\ &= \frac{2}{\sqrt \pi } \sum_{n=0}^{\infty} \frac{ (-1)^n x^{2n+1} }{ n! \,(2n+1) } \end{align} $$
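For reference, the Maclaurin series is $\operatorname{erf}(x)=\frac{2}{\sqrt\pi}\sum_{n=0}^{\infty}\frac{(-1)^n x^{2n+1}}{n!\,(2n+1)}$, obtained by integrating the expansion of $e^{-t^2}$ term by term. A quick standard-library check of its partial sums against `math.erf`:

```python
import math

def erf_series(x: float, terms: int = 30) -> float:
    """Partial sum of the Maclaurin series for erf(x)."""
    s = sum((-1)**n * x**(2 * n + 1) / (math.factorial(n) * (2 * n + 1))
            for n in range(terms))
    return 2.0 / math.sqrt(math.pi) * s

for x in (0.1, 0.5, 1.0, 2.0):
    assert abs(erf_series(x) - math.erf(x)) < 1e-12

print("partial sums of the series agree with math.erf")
```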

Any ideas aside from dismissal of the question are much appreciated!

  • I don't think there is much intuition beyond the intuition underlying the proof of the CLT. However, you can get a lot of nice visualization by explicitly computing the PDF of the sum of $n$ iid uniform $(-\sqrt{3},\sqrt{3})$ random variables (which is a piecewise polynomial that starts looking more and more like a standard Gaussian even for $n$ as small as $4$). – Ian May 04 '18 at 18:18
  • yes, central limit theorem is extremely interesting: https://en.wikipedia.org/wiki/Central_limit_theorem . Worth reading! Every distribution when iterated tends to the Gaussian in the end... – anonymous Jun 18 '18 at 04:04
  • You may find [this video](https://www.youtube.com/watch?v=ebewBjZmZTw) interesting. – B. Mehta Aug 13 '18 at 00:18
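Ian's comment above can be checked numerically: each uniform $(-\sqrt{3},\sqrt{3})$ summand has unit variance, so rescaling the sum of $n$ of them by $\sqrt{n}$ gives a standardized variable whose distribution is already close to Gaussian for $n=4$. A minimal standard-library simulation sketch (the trial count and seed are arbitrary):

```python
import math
import random

random.seed(0)

def standardized_uniform_sum(n: int) -> float:
    """Sum of n iid Uniform(-sqrt(3), sqrt(3)) variables, rescaled to unit variance."""
    s3 = math.sqrt(3.0)
    total = sum(random.uniform(-s3, s3) for _ in range(n))
    return total / math.sqrt(n)  # each summand has variance 1

n, trials = 4, 200_000
hits = sum(abs(standardized_uniform_sum(n)) <= 1.0 for _ in range(trials))
# For a standard Gaussian, P(|Z| <= 1) = 0.6827; the n = 4 sum is already close
print(f"empirical P(|Z| <= 1) = {hits / trials:.3f}  (Gaussian reference: 0.683)")
```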

1 Answer


Take an experiment with two outcomes, success (S) and failure (F), of probability $p$ and $q=1-p$ respectively. The probability of one S after one trial is $p$. After two trials, the probability of two S is $p^2$, of one S and one F is $2pq$, and of two F is $q^2$. In general, for $N$ trials, the probability of $k$ successes is given by the binomial distribution $P_S(k,N)= \frac{N!}{k!(N-k)!}p^k q^{N-k}$. These are just the coefficients of the terms in the expansion of $(p+q)^N$.

What happens if we take the limit of large $N$? The binomial coefficients at large $N$ approximate a Gaussian, which can be seen visually by looking at a row far down Pascal's Triangle. We can derive this from the formula above. To keep things simple, let's work with the symmetric case $p=q=1/2$, so that we have $$ P_S(k,N)= \frac{N!}{k!(N-k)!}\frac{1}{2^N} $$ An easy way to get the desired result is to exchange the index $k$ for one that starts from the center of the triangle, where the values are largest: $x\in (-N/2,N/2)$, $k=x+N/2$. Then we substitute this in and apply Stirling's approximation $n! \approx \sqrt{2\pi n}\, n^n e^{-n}$:

$$ P_S(x,N)= \frac{N!}{(N/2+x)!(N/2-x)!}\frac{1}{2^N} \approx \frac{2}{\sqrt{2\pi N}}\frac{1}{(1-\frac{4 x^2}{N^2})^{(N+1)/2}}\left(\frac{1-\frac{2x}{N}}{1+\frac{2x}{N}}\right)^x $$
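The quality of this Stirling approximation can be checked directly (a standard-library sketch; $N=100$ and the sample points are arbitrary choices):

```python
import math

N = 100  # moderate N, chosen for illustration

def exact(x: int) -> float:
    """Exact symmetric binomial probability at k = N/2 + x."""
    return math.comb(N, N // 2 + x) / 2**N

def stirling_approx(x: int) -> float:
    """The approximation obtained from Stirling's formula."""
    r = 2.0 * x / N
    return (2.0 / math.sqrt(2 * math.pi * N)
            * (1 - r * r) ** (-(N + 1) / 2)
            * ((1 - r) / (1 + r)) ** x)

for x in (0, 5, 10):
    rel_err = abs(stirling_approx(x) - exact(x)) / exact(x)
    assert rel_err < 1e-2  # already within ~1% at N = 100

print("Stirling approximation agrees with the exact binomial to ~1%")
```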

Finally, writing the last two factors as the exponential of their logarithms, we get $$ P_S(x,N) \approx \frac{2}{\sqrt{2\pi N }} e^{-\frac{1}{2}(N+1)\log(1-\frac{4 x^2}{N^2}) +x(\log(1-\frac{2x}{N})-\log(1+\frac{2x}{N})) } $$

Using the expansion $\log(1+x)=x-x^2/2+ \cdots$ and keeping everything to order $1/N$, $$ P_S(x,N) \approx \frac{2}{\sqrt{2\pi N}} e^{-\frac{2 x^2}{N} } $$

Now you can tack on a $dx$ and rescale with the change of variables $x \rightarrow \sqrt{N}\, x/2$, so that $x$ now measures displacement from the center in units of $\sqrt{N}/2$:

$$ P_S(x,N)dx \approx \frac{1}{\sqrt{2\pi}} e^{-\frac{ x^2}{2}} dx $$
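One can also verify this rescaled limit numerically: evaluating the exact binomial mass at $k = N/2 + x$ with $x = \sqrt{N}\,u/2$, and scaling by the Jacobian $\sqrt{N}/2$, tracks the standard normal density (a standard-library sketch; the values of $N$ and the sample points are arbitrary):

```python
import math

def scaled_binom(u: float, N: int) -> float:
    """Exact binomial mass at k = N/2 + x, x = round(sqrt(N)*u/2), times dx/du = sqrt(N)/2."""
    x = round(math.sqrt(N) * u / 2.0)
    return math.comb(N, N // 2 + x) / 2**N * math.sqrt(N) / 2.0

def phi(u: float) -> float:
    """Standard normal density."""
    return math.exp(-u * u / 2.0) / math.sqrt(2.0 * math.pi)

for N in (100, 1000):
    for u in (0.0, 1.0, 2.0):
        assert abs(scaled_binom(u, N) - phi(u)) < 0.01

print("rescaled binomial mass tracks the standard normal density")
```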

For general $p,q$ the derivation is similar; see, for example, http://scipp.ucsc.edu/~haber/ph116C/NormalApprox.pdf

This isn't exactly an infinite expansion as in your example, but there is a similar vein of thought in the conclusion that $\binom{N}{k}$ tends to a Gaussian shape for large $N$.

  • thanks for the answer. I'm looking for a derivation that is enlightening about the terms of the infinite series expansion of the exponential, or the cdf of the normal. – jaslibra Aug 16 '18 at 19:14