I understand "transform methods" as recipes, but beyond this they are a big mystery to me.

There are two aspects of them I find bewildering.

One is the sheer number of them. Is there a unified framework that includes all these transforms as special cases?

The second one is heuristic: what would lead anyone to discover such a transform in the course of solving a problem?

(My hope is to find a unified treatment of the subject that simultaneously addresses both of these questions.)

  • A lot of these transforms are just linear transformations on vector spaces of functions. The Fourier transform, for example, is just an orthogonal transformation on $L^2(\mathbb R)$, which I think of as changing the basis from a bunch of delta functions to a bunch of sinusoids. The Legendre transform is kind of interesting because [it turns out to be the exact analogue of the Fourier transform if you replace the ring $(\mathbb R,+,\times)$ with the "tropical semiring" $(\mathbb R\cup\{-\infty\},\max,+)$](http://www.encyclopediaofmath.org/index.php/Idempotent_analysis). –  Jun 08 '13 at 14:11
  • http://en.wikipedia.org/wiki/Pontryagin_duality#Pontryagin_duality_and_the_Fourier_transform with various twistings – yoyo Jun 08 '13 at 14:16
  • You might be interested in things like the [FHA cycle](http://en.wikipedia.org/wiki/Projection-slice_theorem#The_FHA_cycle)... – J. M. ain't a mathematician Jun 08 '13 at 15:51
  • I find this a little unsatisfying - I'd be interested in a more algebraic (as opposed to linear algebraic) unifying picture building off of the fact that the Mellin transform is the multiplicative analogue of the Laplace/Fourier transform, and the Legendre transform is the tropical analogue of the Fourier transform. I don't know if such a picture exists, but if it does, I would be very interested. (Or would everything in my picture just be a special case of a representation-theoretic picture?) – Davidac897 Jun 27 '13 at 15:52
  • in many cases an integral transform $ g(s)=s\int_{0}^{\infty}K(st)f(t) $ can be solved by using the Borel transform, not only for the case $ K(st)=e^{-st} $ – Jose Garcia Oct 31 '13 at 18:47
  • I had same question -- I'm getting some traction out of "Gelfand Theory" and "Banach Algebras". Of the names in the title that I recognize, their forms can be derived once one fixes a domain and looks for the Gelfand Transform. – cantorhead May 31 '15 at 18:27
  • I'm finding many treatments of Gelfand Transforms assume the Banach Algebra has a unit and/or is commutative, which would rule out a number of the transforms listed. I'm using Adam Bobrowski's Functional Analysis book which doesn't make this assumption. Instead it discusses the special properties that arise when the algebra has a unit or is commutative. – cantorhead Jun 01 '15 at 10:14
  • @Rahul Your link is dead now. – Cameron Williams Jan 18 '16 at 05:45
  • @Cameron: What would you like me to do about it? –  Jan 18 '16 at 06:33
  • @CameronWilliams: https://www.encyclopediaofmath.org/index.php/Idempotent_analysis – kjo Jan 18 '16 at 14:25

4 Answers


The essential idea of many transforms is to change the basis in the space of functions with the hope that in the new basis the problem will simplify.

Let me give a finite-dimensional example. Suppose we have a $2\times2$ matrix $A$ and we want to compute $A^{1000}$. A direct approach would not be very wise. However, if we first diagonalize $A$ as $PA_dP^{-1}$ (i.e. rotate the basis by $P$), the calculation becomes much easier: the answer is given by $PA_d^{1000}P^{-1}$, and computing powers of a diagonal matrix is a very simple task.
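In a numerical setting the same trick looks like this; a minimal NumPy sketch (the matrix values are my own arbitrary choice, not part of the answer):

```python
import numpy as np

# An arbitrary diagonalizable 2x2 matrix (column-stochastic, chosen for illustration).
A = np.array([[0.5, 0.4],
              [0.5, 0.6]])

# Diagonalize: A = P @ D @ inv(P), with the eigenvalues on the diagonal of D.
eigvals, P = np.linalg.eig(A)

# A^1000 = P @ D^1000 @ inv(P); powering D just powers each diagonal entry.
A_1000 = P @ np.diag(eigvals ** 1000) @ np.linalg.inv(P)
```

One can compare the result against `np.linalg.matrix_power(A, 1000)`: the two agree, but the diagonalized route costs one eigendecomposition instead of a thousand matrix multiplications.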

A somewhat analogous infinite-dimensional example would be the solution of the heat equation $u_t=u_{xx}$ using the Fourier transform $u(x,t)\rightarrow \hat{u}(\omega,t)$. The point is that in the Fourier basis the operator $\partial_{xx}$ becomes diagonal: it simply multiplies $\hat{u}(\omega,t)$ by $-\omega^2$. Therefore, in the new basis, our partial differential equation simplifies and becomes an ordinary differential equation.
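This diagonalization can be sketched in code; a hypothetical illustration assuming periodic boundary conditions on $[0,2\pi)$ and an initial profile of my own choosing:

```python
import numpy as np

# Solve u_t = u_xx on a periodic domain [0, 2*pi) by diagonalizing d^2/dx^2 with the FFT.
n = 256
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
u0 = np.sin(3 * x)                                      # initial temperature profile

omega = np.fft.fftfreq(n, d=2 * np.pi / n) * 2 * np.pi  # integer angular wavenumbers
u_hat0 = np.fft.fft(u0)

# In Fourier space each mode evolves independently:
# u_hat(omega, t) = u_hat(omega, 0) * exp(-omega^2 * t).
t = 0.1
u = np.fft.ifft(u_hat0 * np.exp(-omega**2 * t)).real
```

For this initial condition the exact solution is $e^{-9t}\sin 3x$, and the FFT-based result reproduces it to machine precision, because the single Fourier mode simply decays at its own rate $e^{-\omega^2 t}$.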

In general, the existence of a transform adapted to a particular problem is related to its symmetry. The new basis functions are chosen to be eigenfunctions of the symmetry generators. For instance, in the above PDE example we had translation symmetry with the generator $T=-i\partial_x$. In the same way, e.g. Mellin transform is related to scaling symmetry, etc.

Start wearing purple
  • How exactly does the Fourier transform change the basis on $L^2(\mathbb R)$? What are the 'before' and 'after' bases? – Potato Sep 01 '13 at 07:01
  • @Potato The "before" basis is composed of "functions" $\delta(x-y)$, where $x$ is the argument and $y$ is a parameter. I.e. we represent $f(x)$ as $\int_{\mathbb{R}}f(y)\delta(x-y)dy$ and look at $f(y)$ as being coordinates of $f(x)$. The "after" basis is formed by the functions $e^{i\omega x}$, where $x$ is again the argument and $\omega$ is a parameter. – Start wearing purple Sep 01 '13 at 09:17
  • I'm confused because neither of those functions are in $L^2$. Do you know of a good reference where I can learn more? (I'm already familiar with basic Fourier theory, just not this perspective.) – Potato Sep 01 '13 at 16:45
  • @Potato I just attempted to give an informal explanation and I am not sure whether it can be made mathematically rigorous. However, this is how Fourier transformation is usually understood by physicists. If $f(x)$ is interpreted as a wave function, the old and new basis correspond to its coordinate and momentum representation. (The basis of plane waves $e^{ipx}$ diagonalizes the momentum operator $\hat{p}=-i\partial_x$.) I will think about the reference, but nothing comes to mind immediately. – Start wearing purple Sep 01 '13 at 17:08
  • After thinking about it some more, it makes more sense. Thank you for the explanation. – Potato Sep 01 '13 at 17:14
  • If you want to look at Fourier transforming harmonic functions or the "delta function" rigorously, I think you need to look at tempered distributions. I was very happy to learn about them, so I could start understanding the usual physicist's hand-waving a lot better. – Tilman Vogel Nov 21 '13 at 09:49
  • @Startwearingpurple Can a separable Hilbert space have an uncountable orthonormal basis? – Ziyuan Jan 17 '16 at 23:49
  • I owe everyone an apology for taking so long to accept an answer. They are all very good, as well as difficult to digest. – kjo Jan 18 '16 at 14:32

There are various ways one can find connections between the Fourier, Laplace, Z, Mellin, Legendre, et al. transforms.

The rationale is to change the representation of a problem (e.g. a differential equation) in order to simplify it and thus solve it more easily.

For example, as stated before, most of the transforms stem directly from trying to solve specific Sturm-Liouville problems (e.g. Fourier, Laplace, etc.), so in this sense the different conditions of the problem specify a transform to use, for instance by diagonalising (or de-coupling, in physics parlance) the (differential) operator of a system description. This process then defines the kernel of the integral transform, which itself determines the type of integral transform (e.g. Laplace, Fourier, and so on).

The way this works, and how a certain transform comes to be applied, is the following:

  1. Given that the differential operator $d/dx$ has as eigenvector (eigenfunction) the exponential function $e^x$ (i.e. $de^x/dx = e^x$), it is natural to express the solution of a given differential equation in terms of (i.e. as an expansion in) the eigenfunctions of the (differential) operator. This naturally yields some of the known transforms, like Laplace and Fourier.

  2. An eigenvector/eigenfunction of an operator is a function which is left as is (up to a scalar factor) by the application of the operator; in other words, a function on which the operator has the simplest possible effect. It is natural and easier to express solutions with respect to these functions, as they have the simplest interaction with the (differential) operator that describes the system under study.

  3. Now one sees how analysing the (differential) operator in terms of its own eigenfunctions simplifies (the solution of) the problem. These eigenfunctions define the integral transform (its kernel), and this field of study (of which integral transforms are a part) is referred to as Spectral Theory (of Operators).
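Point 1 can be illustrated with a tiny numerical check (my own sketch, not from the answer): the exponential $e^{sx}$ really is an eigenfunction of $d/dx$, with eigenvalue $s$.

```python
import numpy as np

s = 2.0
x = np.linspace(0.0, 1.0, 2001)
f = np.exp(s * x)

# Numerical derivative d/dx via second-order central differences.
df = np.gradient(f, x)

# In the interior of the grid, d/dx e^{sx} should equal s * e^{sx}.
```

The boundary points use one-sided differences and are less accurate, so the comparison below is made on the interior of the grid only.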

For example, every integral transform is a linear operator, since the integral is a linear operator, and in fact if the kernel is allowed to be a generalized function then all linear operators are integral transforms (a properly formulated version of this statement is the Schwartz kernel theorem).

from Wikipedia, Integral transform

Another way to acquire a unified view of the transforms is to think of one transform that changes with respect to a change of the underlying domain of the problem. For example, the Mellin transform can be seen as the Fourier transform after the change of variables $x \to \log y$. The Laplace transform can be seen as the Fourier transform on a line instead of on a circle, by the change of variables $i\omega \to s$, or $e^{i\omega} \to e^s$.
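The change-of-variables claim can be checked numerically on a concrete example (my own sketch): the Mellin transform of $e^{-x}$ is $\Gamma(s)$, and substituting $x = e^t$ turns it into a two-sided Laplace/Fourier-type integral over the whole real line with the same value.

```python
import numpy as np
from math import gamma

def trapezoid(f, x):
    # Simple trapezoidal quadrature on a sampled grid.
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))

s = 2.5

# Mellin transform of f(x) = e^{-x}:
#   integral over (0, inf) of x^{s-1} e^{-x} dx = Gamma(s).
x = np.linspace(1e-8, 40.0, 400001)
mellin = trapezoid(x ** (s - 1) * np.exp(-x), x)

# After the substitution x = e^t the same quantity becomes an integral
# over the whole line:  integral of e^{s t} e^{-e^t} dt.
t = np.linspace(-20.0, 10.0, 300001)
substituted = trapezoid(np.exp(s * t - np.exp(t)), t)
```

Both quadratures agree with each other and with $\Gamma(2.5)$ to well within the discretization error, which is the point: the Mellin transform is "the Fourier/Laplace transform in logarithmic coordinates."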

For discrete applications, the Discrete Fourier transform arises when changing the variables $e^{i\omega} \to e^{i\omega_k}=e^{i\omega_0 k}=e^{i 2{\pi}k/n}$, or $\omega_k \to 2{\pi}k/n$ for $k=0,\dots,n-1$ (sampling the unit circle at regular angles, or sampling the frequency at $n$ regular intervals).

The Z transform can be seen either as the discrete analogue of the Laplace transform or as the discrete Fourier transform extended beyond the unit circle, i.e. $e^{i\omega_0 k} \to z^k$.
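This relationship is easy to verify directly: evaluating the z-transform $X(z) = \sum_k x[k]\,z^{-k}$ of a finite sequence at the $n$-th roots of unity $z = e^{i 2\pi m/n}$ reproduces the DFT. A small NumPy sketch, using an arbitrary test sequence of my own:

```python
import numpy as np

x = np.array([1.0, 2.0, 0.0, -1.0])   # arbitrary finite sequence
n = len(x)
k = np.arange(n)

# z-transform X(z) = sum_k x[k] * z^{-k}, sampled at the n points
# z = e^{i 2*pi*m/n} on the unit circle, m = 0, ..., n-1.
X = np.array([np.sum(x * np.exp(2j * np.pi * m / n) ** (-k)) for m in range(n)])
```

The sampled values coincide with `np.fft.fft(x)`, which uses the same $e^{-i 2\pi mk/n}$ convention.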

The Legendre transform can be seen as the Fourier transform on an idempotent semiring instead of a ring.

Apart from that, most transforms have a direct geometrical meaning, as well as a group-theoretic meaning and characterization.

If you want to generalise further, one can go into the non-linear domain and generalise the Fourier transform; one such generalisation is the Inverse Scattering Transform, which is used to solve non-linear differential equations.

Again, the rationale is to simplify the representation of the problem and/or express it in other known terms. Only in this case (at least for the IST) the problem is not of the Sturm-Liouville type but rather of the more general Riemann-Hilbert type.

Integral Transforms

Parseval's Theorem for Fourier and Legendre Transforms

Nikos M.

The closest thing to a general theory that leads to MANY of the above (though not all) is Sturm-Liouville theory. Basically, many of these transforms came about from the study of physical phenomena via linear differential equations, where, as previous answers have noted, specific transforms diagonalize the differential operator. It turns out that MANY physical phenomena of interest obey second-order differential equations of the Sturm-Liouville type. The same logic applies to other differential equations (or difference equations, in the case of the z-transform). Once you know which functions fundamentally solve a linear differential equation, you want to build up more solutions by an integral or sum over these fundamental solutions; this idea leads to many of the transforms above. The spectral theory of operators and ideas from Hilbert spaces generalize this to higher-order operators. Each of these equation types appears naturally in physical models of the world. I'll outline some of the differential equations I mean, the associated transform, and the physical applications in which they came about.

  1. Linear constant-coefficient ODEs with zero boundary conditions before $t=0$. The function $e^{st}$ solves these for some values of $s$. Superposing these leads to the Laplace transform; the Mellin transform is closely related. Such equations model kinematics, circuits.

  2. Linear constant-coefficient ODEs or PDEs in unbounded domains. Plane waves $e^{jkr}$ in multiple dimensions solve these for a continuum of $k$ values. Superposing these leads to the (multidimensional) Fourier transform. In bounded domains some of these dimensions reduce to summations instead of integrals. Under certain cylindrical symmetries the solutions are Bessel and Hankel functions, leading to the Hankel transform. Such equations model wave mechanics, heat conduction, potential theory, etc.

  3. Linear constant-coefficient difference equations in the variable $n$. The function $z^n$ will solve these equations for some particular values of $z$. Superposing these leads to the z transform. Linear recurrences appear in the mathematics of sequences and series, digital filters, and generating functions in probability.
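Item 3 can be made concrete on a toy recurrence (a sketch of my own, using the Fibonacci numbers): the $z^n$ ansatz turns the recurrence into a polynomial equation, and superposing the admissible $z$ values gives a closed form.

```python
import numpy as np

# Recurrence x[n] = x[n-1] + x[n-2] with x[0] = 0, x[1] = 1 (Fibonacci).
# The ansatz x[n] = z^n gives the characteristic equation z^2 - z - 1 = 0.
z1, z2 = np.roots([1, -1, -1])          # the two admissible values of z

# Superpose: x[n] = c1*z1^n + c2*z2^n, with c1, c2 fixed by the
# initial conditions x[0] = 0, x[1] = 1.
c = np.linalg.solve(np.array([[1.0, 1.0], [z1, z2]]), np.array([0.0, 1.0]))

def x(n):
    return (c[0] * z1**n + c[1] * z2**n).real
```

The closed form reproduces the iterated recurrence exactly (up to floating-point error); this is Binet's formula, recovered purely from the "superpose the eigen-solutions $z^n$" recipe.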

Some of the transforms you mention do not come from this family naturally arising from differential equations, namely the Legendre and Hilbert transforms. The Hilbert transform has the same general form of a linear integral transform, and could be considered unified with the rest. The Legendre transform, however, is something else entirely.


There is actually a unified treatment of transform theory. This is what chapter seven of Keener's book (Principles of Applied Mathematics) is about.

There is a relationship between linear differential equations, spectra, Green's functions, contour integration, Dirac deltas, and transforms.

The story goes like this. Given a linear (differential) operator $L$, form the operator $L - \lambda$ and, with given boundary conditions, find its Green's function $G(x, \xi, \lambda)$. Then

\begin{equation} \delta(x - \xi) = -\frac{1}{2 \pi \mathrm{i}} \int_{C_{\infty}} G(x, \xi , \lambda) \, d \lambda, \end{equation}

where the contour $C_{\infty}$ is a circle enclosing the entire spectrum of $L$.

This particular Dirac delta representation is the product of the direct and inverse transforms.

Here are a few examples:

  1. Sine transform pair: Functions continuously differentiable in $[0,1]$. \begin{equation} Lu = - u'' \quad , \quad u(0)=u(1)=0. \end{equation}


\begin{eqnarray*} U_k &=& 2 \int_0^1 d \xi \sin( k \pi \xi) u( \xi ) \\ u(x) &=& \sum_{k=1}^{\infty} U_k \sin k \pi x. \end{eqnarray*}

  2. Cosine transform pair: \begin{equation} Lu = - u'' \quad , \quad u'(0)=u'(1)=0. \end{equation}


\begin{eqnarray*} U_k &=& 2 \int_0^1 d \xi \cos( k \pi \xi) u( \xi ) \\ u(x) &=& \sum_{k=1}^{\infty} U_k \cos k \pi x. \end{eqnarray*}

  3. Sine transform integral

    Functions in $L^2[0, \infty)$. \begin{equation} Lu = - u'' \quad , \quad u(0)=0 \quad , \quad \lim_{x \to \infty} u(x) = 0. \end{equation}


\begin{eqnarray*} U(\mu) &=& \frac{2}{\pi} \int_0^{\infty} dx \, u(x) \sin \mu x \\ u(x) &=& \int_0^{\infty} d \mu \, U(\mu) \sin \mu x \end{eqnarray*}

  4. Cosine transform integral \begin{equation} Lu = - u'' \quad , \quad u'(0)=0 \quad , \quad \lim_{x \to \infty} u(x) = 0. \end{equation}


\begin{eqnarray*} U(\mu) &=& \frac{2}{\pi} \int_0^{\infty} dx \, u(x) \cos \mu x \\ u(x) &=& \int_0^{\infty} d \mu \, U(\mu) \cos \mu x \end{eqnarray*}

  5. Fourier transform: Functions in $L^2(-\infty, \infty)$. \begin{equation} Lu = - u'' \quad , \quad \lim_{x \to \pm \infty} u(x) = 0. \end{equation}


\begin{eqnarray*} U(\mu) &=& \int_{-\infty}^{\infty} dx \; u(x) \; \mathrm{e}^{\mathrm{i} \mu x} \\ u(x) &=& \frac{1}{2 \pi} \int_{-\infty}^{\infty} U(\mu) \mathrm{e}^{-\mathrm{i} \mu x} d \mu. \end{eqnarray*}

The list goes on: Mellin, Hankel, etc.; Keener shows the linear operators with boundary conditions that generate these.
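The first pair above is easy to verify numerically; a sketch of my own, using the test function $\sin 3\pi x$ (chosen so that $U_3 = 1$ should be the only nonzero coefficient):

```python
import numpy as np

def trapezoid(f, x):
    # Simple trapezoidal quadrature on a sampled grid.
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))

# Test function on [0, 1] satisfying the boundary conditions u(0) = u(1) = 0.
xi = np.linspace(0.0, 1.0, 4001)
u = np.sin(3 * np.pi * xi)

# Direct transform: U_k = 2 * integral_0^1 sin(k*pi*xi) u(xi) dxi,  k = 1..10.
U = np.array([trapezoid(2 * np.sin(k * np.pi * xi) * u, xi) for k in range(1, 11)])

# Inverse transform: u(x) = sum_k U_k sin(k*pi*x).
u_rec = sum(U[k - 1] * np.sin(k * np.pi * xi) for k in range(1, 11))
```

The computed coefficients vanish except for $U_3 \approx 1$, and summing the series reconstructs $u$, confirming that the pair really is a transform and its inverse on this function space.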

I do not know which operator and boundary conditions generate the Laplace transform; I opened a question on Mathematics Stack Exchange about this particular problem.

I am writing some notes about this matter here.

UPDATE: I solved the problem of connecting the Laplace transform to an ODE with boundary conditions. Please see here.


Herman Jaramillo