It's a standard exercise to find the Fourier transform of the Gaussian $e^{-x^2}$ and show that it is equal to itself. Although it is computationally straightforward, this has always somewhat surprised me. My intuition for the Gaussian is as the integrand of normal distributions, and my intuition for Fourier transforms is as a means to extract frequencies from a function. They seem unrelated, save for their use of the exponential function.

How should I understand this property of the Gaussian, or more generally, eigenfunctions of the Fourier transform? The Hermite functions (Hermite polynomials multiplied by a Gaussian) are eigenfunctions of the Fourier transform, and the Hermite polynomials play a central role in probability. Is this an instance of a deeper connection between probability and harmonic analysis?

  • I like this question! – Stephen Mar 31 '13 at 21:22
  • Without a doubt, the connection will go through the [characteristic function](http://en.wikipedia.org/wiki/Characteristic_function_(probability_theory)), which is a Fourier transform. – Raskolnikov Mar 31 '13 at 21:26
  • @user819023 : The set of all Gaussian functions is closed under convolution (Why? I don't know!). In other words, the set of all characteristic functions of Gaussians is closed under multiplication. This is not an answer, but just one way of putting your question across. – Rajesh D Apr 04 '13 at 09:05
  • [This Math Overflow post](https://mathoverflow.net/questions/40268/why-is-the-gaussian-so-pervasive-in-mathematics) calls it a minor miracle. – Chris Jones Aug 16 '17 at 18:51

3 Answers


The generalization of this phenomenon, from a probabilistic standpoint, is the Wiener-Askey Polynomial Chaos.

In general, there is a connection between orthogonal polynomial families in the Askey scheme and probability distribution/mass functions.

The orthogonality of these polynomials is defined through an inner product with a weight function -- a function that typically turns out to be, up to a scale factor, the pdf/pmf of some distribution.

In other words, we can use these orthogonal polynomials as a basis for a series expansion of a random variable:

$$z = \sum_{i=0}^\infty z_i \Phi_i(\zeta).$$

The random variable $\zeta$ follows a distribution we choose, and the orthogonal polynomial family to which $\Phi$ belongs is determined by that choice.

The deterministic coefficients $z_i$ can be computed easily via Galerkin projection: take the inner product of both sides with $\Phi_j$ and use orthogonality.
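As a concrete sketch (a toy example of my own, not from the answer): take $\zeta \sim N(0,1)$, so the Askey scheme prescribes the probabilists' Hermite polynomials $He_i$, and project the lognormal variable $z = g(\zeta) = e^{\zeta}$ onto that basis. The projection coefficients $z_i = \mathbb{E}[g(\zeta)\,He_i(\zeta)]/\mathbb{E}[He_i(\zeta)^2]$ can be computed with Gauss-Hermite quadrature:

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial

# Gauss-Hermite nodes/weights for the weight exp(-x^2/2);
# the raw weights sum to sqrt(2*pi), so normalize to the standard normal pdf.
nodes, weights = He.hermegauss(40)
weights = weights / np.sqrt(2 * np.pi)

g = np.exp  # the random variable z = g(zeta) = exp(zeta), with zeta ~ N(0, 1)

coeffs = []
for i in range(6):
    He_i = He.hermeval(nodes, [0] * i + [1])   # He_i evaluated at the nodes
    num = np.sum(weights * g(nodes) * He_i)    # E[g(zeta) He_i(zeta)]
    coeffs.append(num / factorial(i))          # E[He_i(zeta)^2] = i!

print(coeffs)
```

For this particular $g$ the coefficients have the closed form $z_i = e^{1/2}/i!$ (from the generating function $\sum_i He_i(x)\,t^i/i! = e^{xt - t^2/2}$), which the quadrature reproduces to high accuracy.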

So, yes. There is a very deep connection in this regard, and it is extremely powerful, particularly in engineering applications. Strangely, many mathematicians do not know this relationship!

See also: http://www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA460654 and the Cameron-Martin Theorem.


There's a simple reason why taking a Fourier-like transform of a Gaussian-like function yields another Gaussian-like function. Consider the property $$\mathcal{T}[f^\prime](\xi) \propto \xi\, \mathcal{T}[f](\xi)$$ of a transform $\mathcal{T}$. We will call an invertible transform $\mathcal{F}$ "Fourier-like" if both it and its inverse have this property.

Define a "Gaussian-like" function as one with the form $$f(x) = A e^{a x^2}.$$ Functions with this form satisfy $$f^\prime(x) \propto x f(x).$$ Writing $\hat{f} = \mathcal{F}[f]$ and taking a Fourier-like transform of each side (the property above handles the left side; applying it to the inverse transform shows $\mathcal{F}[x f(x)] \propto \hat{f}^\prime$) yields $$\xi \hat{f}(\xi) \propto \hat{f}^\prime(\xi).$$ This has the same form as the previous equation, so it is not surprising that its solutions have the Gaussian-like form $$\hat{f}(\xi) = B e^{b \xi^2}.$$
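As a quick numerical sanity check (a sketch of my own; the unitary transform convention, the grid, and the test function are assumptions, not part of the answer), the ordinary Fourier transform does map $e^{-x^2/2}$ to itself:

```python
import numpy as np

def fourier(f, xi, x):
    # Unitary Fourier transform, (2*pi)^{-1/2} * integral of f(x) e^{-i xi x} dx,
    # approximated by a Riemann sum on the grid x (the integrand decays fast).
    dx = x[1] - x[0]
    kernel = np.exp(-1j * np.outer(xi, x))
    return np.sum(f(x) * kernel, axis=1) * dx / np.sqrt(2 * np.pi)

x = np.linspace(-20.0, 20.0, 4001)   # wide grid: the truncated tails are negligible
xi = np.linspace(-3.0, 3.0, 61)
fhat = fourier(lambda t: np.exp(-t**2 / 2), xi, x)

# The transform should again be exp(-xi^2 / 2), up to quadrature error.
err = np.max(np.abs(fhat - np.exp(-xi**2 / 2)))
print(err)
```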

Jim Ferry

Besides the probabilistic viewpoint, one can visualize the Gaussian's transformation behaviour via the Dirac delta distribution: its Fourier transform is $$(\mathcal{F}\delta)(k)=\frac{1}{\sqrt{2\pi}},$$ which means the Dirac delta contains all frequencies at the same amplitude (which feels intuitively right to me). Since, in the sense of distributions, $$\delta(x)=\lim_{\alpha\to 0}\frac{1}{\alpha\sqrt{\pi}}e^{-x^2/\alpha^2},$$ the Dirac delta is informally an "infinitely narrow" Gaussian. Hence, as the Gaussian widens, fewer frequencies are contained, until the distribution of frequencies is again a Gaussian (and vice versa).
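This reciprocity between width in $x$ and width in $k$ can be checked numerically. The following Python sketch (my own illustration; the unitary convention and the particular values of $\alpha$ are arbitrary choices) shows the transform of the normalized Gaussian flattening toward the constant $1/\sqrt{2\pi}$ as $\alpha \to 0$:

```python
import numpy as np

def fourier(f, k, x):
    # Unitary Fourier transform, (2*pi)^{-1/2} * integral of f(x) e^{-ikx} dx,
    # approximated by a Riemann sum on the grid x.
    dx = x[1] - x[0]
    return np.sum(f(x) * np.exp(-1j * np.outer(k, x)), axis=1) * dx / np.sqrt(2 * np.pi)

x = np.linspace(-30.0, 30.0, 12001)
k = np.array([0.0, 1.0, 2.0])

# (alpha*sqrt(pi))^{-1} exp(-x^2/alpha^2) transforms to
# (2*pi)^{-1/2} exp(-alpha^2 k^2 / 4): as alpha -> 0 the transform flattens
# toward the constant (2*pi)^{-1/2}, the transform of the Dirac delta.
for alpha in (2.0, 0.5, 0.1):
    fhat = fourier(lambda t: np.exp(-t**2 / alpha**2) / (alpha * np.sqrt(np.pi)), k, x)
    print(alpha, np.round(np.abs(fhat), 4))
```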

  • This answer is problematic, since $\delta(x) = \lim_{\epsilon\to 0} f(x/\epsilon)/\epsilon$ holds for a lot of functions $f$ (in fact it is sufficient that $f$ is locally integrable with $\int f(x)\,dx = 1$), but generally you won't find an $\epsilon$ such that $f(x/\epsilon)/\epsilon$ is an eigenfunction of the Fourier transform. – Hyperplane Jun 20 '17 at 14:50