To gain deeper insight into the Poisson and exponential random variables, I found that I could derive them as follows:

I consider an experiment consisting of a continuum of trials on an interval $[0,t)$. The outcome of the experiment is an ordered $n$-tuple of distinct points on the interval, for some $n \in \mathbb{N}$. Every outcome is equally likely, and I measure the size of the set of ordered $n$-tuples by $I_n$:

$$ I_n = \int_0^{t} \int_0^{x_{n}} \int_0^{x_{n-1}} \cdots \int_0^{ x_2 } dx_1 dx_{2} dx_{3} \dots dx_{n-1} dx_{n} = \frac{ t^n } { n! }$$
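As a sanity check (my own, not part of the derivation), the value $I_n = t^n/n!$ can be verified by Monte Carlo: the ordered region $\{0 < x_1 < \cdots < x_n < t\}$ occupies a $1/n!$ fraction of the cube $[0,t]^n$, since each unordered sample corresponds to exactly one sorted tuple.

```python
import math
import random

def I_n_estimate(n, t, trials=200_000, seed=0):
    """Estimate the volume of {0 < x_1 < ... < x_n < t} by sampling
    the cube [0, t]^n and counting strictly increasing tuples."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        xs = [rng.uniform(0, t) for _ in range(n)]
        if all(xs[i] < xs[i + 1] for i in range(n - 1)):
            hits += 1
    # fraction of the cube that is ordered, times the cube's volume
    return (hits / trials) * t ** n

n, t = 3, 2.0
print(I_n_estimate(n, t), t ** n / math.factorial(n))
```

The two printed values agree to within Monte Carlo error.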

It follows that on the interval $[0,t)$, the probability that the experiment results in a $k$-tuple, $k \in \mathbb{N}$, is

$$P(X(t) = k) = \frac{I_k}{\sum_{n = 0}^{\infty} I_n} = \frac{e^{-t} t^k}{k!}$$

In particular, for $k = 0$ we have $P(X(t) = 0) = e^{-t}$: the probability that no point falls in $[0,t)$, which is exactly the exponential survival function.
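A quick numerical check (my own sketch) that the normalized weights $I_k / \sum_n I_n = e^{-t} t^k / k!$ form a valid pmf, i.e. the rate-1 Poisson on $[0,t)$:

```python
import math

def poisson_pmf(k, t):
    """P(X(t) = k) = e^{-t} t^k / k!, the normalized weight I_k / e^t."""
    return math.exp(-t) * t ** k / math.factorial(k)

t = 1.5
# the pmf sums to one over k = 0, 1, 2, ...
print(sum(poisson_pmf(k, t) for k in range(50)))
# the k = 0 term is e^{-t}
print(poisson_pmf(0, t), math.exp(-t))
```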

# Question:

I was wondering whether some similar intuition can be applied to derive the Gaussian:

$$\frac{1}{\sqrt{2 \pi \sigma^2}} \exp \big(-\frac{(x-\mu)^2}{2\sigma^2} \big) \ \text{ or, with } \mu = 0,\ \sigma^2 = \tfrac{1}{2}\text{, the density }\ \frac{e^{-x^2}}{\sqrt{\pi}}$$
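(As a side check, assuming nothing beyond the formula above: the density integrates to $1$ for any $\mu$ and $\sigma^2$, which a simple trapezoidal rule confirms numerically.)

```python
import math

def gaussian(x, mu, sigma2):
    """Gaussian density with mean mu and variance sigma2."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

def trapezoid(f, a, b, n=100_000):
    """Composite trapezoidal rule on [a, b] with n subintervals."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return s * h

mu, sigma2 = 0.5, 2.0
# integrate over a range wide enough that the tails are negligible
print(trapezoid(lambda x: gaussian(x, mu, sigma2), -20.0, 20.0))
```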

I think that such an intuition might be obtained by gaining more insight into each term in the expansion of $\operatorname{erf}(x)$, as was done for the Poisson: $$ \begin{align} \operatorname{erf}(x) &= \frac{1}{\sqrt \pi } \int_{-x}^{x} e^{-t^2}\, dt\\ &= \frac{2}{\sqrt \pi } \sum_{n=0}^{\infty} \frac{ (-1)^n x^{2n+1} }{ n!\,(2n+1) } \end{align} $$
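The series above converges quickly, which is easy to check against `math.erf` (a sketch of mine; note the alternating sign $(-1)^n$, which the expansion requires):

```python
import math

def erf_series(x, terms=30):
    """Truncated Maclaurin series for erf(x):
    (2/sqrt(pi)) * sum_n (-1)^n x^(2n+1) / (n! (2n+1))."""
    return (2 / math.sqrt(math.pi)) * sum(
        (-1) ** n * x ** (2 * n + 1) / (math.factorial(n) * (2 * n + 1))
        for n in range(terms)
    )

# compare the truncated series against the library implementation
print(erf_series(1.0), math.erf(1.0))
```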

Any ideas aside from dismissal of the question are much appreciated!