Could someone explain in plain English the parts of Gaussian PDF? Why $\pi$, why Euler constant etc.
$$f(x; \mu, \sigma^2) = \dfrac{1}{\sigma \sqrt{2\pi}} \exp\left\{-\frac{1}{2} \left( \frac{x - \mu}{\sigma}\right)^2\right\}.$$
The simplest cumulant-generating function of a non-constant variable is quadratic, say $i\mu t-\frac12\sigma^2t^2$ (see here for further motivation). It can be shown that the resulting distribution has mean $\mu$, variance $\sigma^2$, and the PDF you cited. Because of how distributions respond to linear transformations, we need only check the $\mu=0,\,\sigma=1$ case, i.e. prove$$\varphi(t):=\exp\left(-\frac12t^2\right)\implies\frac{1}{2\pi}\int_{\Bbb R}\varphi(t)e^{-itx}\,dt=\frac{1}{\sqrt{2\pi}}\exp\left(-\frac{x^2}{2}\right).$$(This integral is the PDF, by the inversion formula.) The proportionality constants boil down to the $\alpha=\frac12$ special case of$$\int_{\Bbb R}\exp(-\alpha y^2)\,dy=\sqrt{\frac{\pi}{\alpha}}.$$Again, by the substitution $y\mapsto y/\sqrt{\alpha}$ it suffices to verify the $\alpha=1$ case. That identity has many proofs; the first one here is the standard one in textbooks.
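As a quick numerical sanity check (not a proof), here is a short Python sketch that approximates $\int_{\Bbb R}\exp(-\alpha y^2)\,dy$ with a trapezoid rule and compares it against $\sqrt{\pi/\alpha}$, including the $\alpha=1$ case whose square should be $\pi$:

```python
import math

def gauss_integral(alpha, lim=40.0, n=100001):
    # Trapezoid-rule value of the integral of exp(-alpha*y^2) over [-lim, lim].
    # For large lim the tails are negligible, so this approximates the
    # integral over the whole real line.
    h = 2 * lim / (n - 1)
    total = sum(
        (0.5 if i in (0, n - 1) else 1.0) * math.exp(-alpha * (-lim + i * h) ** 2)
        for i in range(n)
    )
    return total * h

I1 = gauss_integral(1.0)
print(I1 ** 2, math.pi)  # squaring the alpha = 1 integral recovers pi

for alpha in (0.5, 2.0, 4.0):
    # rescaling y by sqrt(alpha) reduces every case to alpha = 1,
    # so the integral equals sqrt(pi / alpha)
    print(gauss_integral(alpha), math.sqrt(math.pi / alpha))
```

The $\alpha=\frac12$ line is exactly the constant that shows up in the normal PDF: $\sqrt{2\pi}$.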
I don't have a specific answer for why $\pi$ and Euler's number $e$ appear, but I found a general answer to my question. The first hint came from the page Gaussian Integral, which discusses the function $e^{-x^2}$ and its integral $\sqrt{\pi}$.
The second part of the hint was the page Normal distribution, where both expressions appear in the simplified PDF for variance $\frac12$ and mean $0$.
In other words, the general form of a PDF is $f(x)$ divided by $\int f(x)\,dx$, where $f(x)$ is an unnormalized distribution function; dividing by the total area makes the density integrate to $1$.
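To illustrate that normalization step numerically, here is a minimal Python sketch (the trapezoid integrator and the helper names are my own, not from the linked pages): it divides $f(x)=e^{-x^2}$ by its total integral $\sqrt{\pi}$ and checks that the result matches the textbook PDF with mean $0$ and variance $\frac12$:

```python
import math

def unnormalized(x):
    # f(x) = exp(-x^2), the raw bell curve before normalization
    return math.exp(-x * x)

def total_area(f, lim=30.0, n=100001):
    # trapezoid-rule approximation of the integral of f over the real line
    h = 2 * lim / (n - 1)
    total = sum((0.5 if i in (0, n - 1) else 1.0) * f(-lim + i * h) for i in range(n))
    return total * h

Z = total_area(unnormalized)  # numerically close to sqrt(pi)

def normal_pdf(x, mu=0.0, var=0.5):
    # the PDF formula from the question, specialized to mean 0, variance 1/2
    sigma = math.sqrt(var)
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# f(x) / Z agrees with the N(0, 1/2) density pointwise
for x in (-2.0, 0.0, 1.5):
    print(unnormalized(x) / Z, normal_pdf(x))
```

So the $\sqrt{\pi}$ (and, after rescaling to general variance, the $\sigma\sqrt{2\pi}$) in the denominator is exactly the area that has to be divided out.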