Does there exist any correspondence between difference equations and differential equations?

In particular, can one cast some classes of ODEs into difference equations or vice versa?

Alexey Bobrick
  • Correspondence in what sense? Could you suggest an example? – Pedro May 15 '12 at 16:19
  • Mostly I am interested to know if it is possible to reformulate one in terms of the other, but more generally, any studied relations would also be interesting. – Alexey Bobrick May 15 '12 at 16:30
  • They are a bit, but not totally different. – AD - Stop Putin - May 15 '12 at 16:34
  • This might generate some interesting answers, I hope. – AD - Stop Putin - May 15 '12 at 16:36
  • Huh, atm I'm trying to understand something in methods mentioned at http://www.risc.jku.at/research/combinat/software/HolonomicFunctions/index.php As far as I understood there is a way to treat certain difference and differential equations (correspondingly definite integrals and sums) in one fashion. I can say nothing more, but hope somebody here could clarify whether it is what you are looking for or not. – Yrogirg May 15 '12 at 17:05
  • Found a related note in JD Murray, Mathematical Biology I: An Introduction, [Chapter 2](http://math.tut.fi/courses/MAT-35006/pIIexam/Murray_Ch2.pdf): "*there is no simple connection between difference equation models and what might appear to be the continuous differential equation analogue, even though a finite difference approximation results in a discrete equation*". – np8 Dec 17 '20 at 20:32

5 Answers


First order example

Consider the difference equation $$\begin{equation*} x_{n+1} = a x_n,\tag{1} \end{equation*}$$ where $x_0$ is given, with solution $$\begin{equation*} x_n = a^n x_0. \tag{2} \end{equation*}$$

Let $x_n = x(t_n)$, where $t_{n+1} = t_n + h$ and $t_0 = 0$, so $t_n = n h$. Rewrite the difference equation (1) as $$\frac{x(t_{n}+h)-x(t_n)}{h} = \frac{(a-1)}{h} x(t_n).$$ If $h$ is small, this can be approximated by the differential equation $$\begin{equation*} x'(t) = \frac{a-1}{h}x(t),\tag{3} \end{equation*}$$ with solution $$\begin{equation*} x(t) = x(0) \exp \left(\frac{a-1}{h} t\right).\tag{4} \end{equation*}$$ Notice that at $t = t_n = n h$ we have $\exp\left(\frac{a-1}{h} t_n\right) = e^{n(a-1)} = a^{n} + O(n(a-1)^2)$, so for $n\ll 1/(a-1)^2$ we find good agreement with (2).

Going in the reverse direction, from the differential equation (3) to the difference equation (1), is called Euler's method.
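A quick numerical check of this correspondence (a minimal Python sketch; the particular values of $a$, $h$, $x_0$, and $n$ are arbitrary choices):

```python
import math

# Difference equation x_{n+1} = a x_n with x_0 given, and its
# continuum approximation x'(t) = ((a-1)/h) x(t), x(0) = x_0.
a, h, x0 = 1.01, 0.1, 1.0   # |a-1| small, as required for the approximation

n = 50
x_exact = x0 * a**n                          # solution (2) of the difference equation
t_n = n * h
x_ode = x0 * math.exp((a - 1) / h * t_n)     # solution (4) of the ODE at t_n

# Agreement is good while n << 1/(a-1)^2 (here 1/(a-1)^2 = 10^4).
print(x_exact, x_ode)
```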

There is a subtlety when approximating a difference equation by a differential equation. For this problem we require that $|a-1| \ll 1$ since we need $x_{n+1}- x_n = O(h) \ll 1$. Otherwise, we should expect (and will find) the approximation of (1) by (3) to be poor. Sometimes it is necessary to reformulate the difference equation so the difference between successive terms is small.${}^\dagger$

${}^\dagger$ For example, suppose $a$ in (1) were large and positive. Let $y_0=x_0$ and let $y_{n+1} = a^{1/p} y_n$, where $p$ is some large integer so $|a^{1/p} - 1|\ll 1$. Note that $y_{n p} = x_n$. The solution to the corresponding differential equation for $y$ will be a good approximation to $y_n$, and this in turn can be used to approximate $x_n$.
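The footnote's rescaling trick can also be checked numerically (a sketch; the values of $a$, $p$, and $h$ are arbitrary choices):

```python
import math

# Rescaling trick from the footnote: a is large, so substitute
# y_{n+1} = a**(1/p) * y_n with p chosen so |a**(1/p) - 1| << 1.
a, x0, h = 5.0, 1.0, 0.1
p = 1000
b = a ** (1 / p)              # b - 1 is small, so the ODE approximation applies to y

n = 3                          # compare x_n = a**n x_0 with y_{n p}
y_np = x0 * math.exp((b - 1) / h * (n * p * h))   # ODE approximation to y_{n p}
x_n = x0 * a**n
print(x_n, y_np)
```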

Second order example

Consider the differential equation for the harmonic oscillator $$\begin{equation*} x''+x = 0, \hspace{5ex} x(0)=1, \hspace{5ex}x'(0)=0, \tag{5} \end{equation*}$$ the solution for which is $x(t) = \cos t$. The simplest related difference equation is $$\begin{equation*} \frac{1}{h^2}(x_{n+2}-2x_{n+1}+x_n) + x_n = 0, \tag{6} \end{equation*}$$ with $x_0 = x_1 = 1$. (We show how to get (6) from (5) below.) There are standard techniques for solving simple recurrence relations such as (6) in closed form. We find $$\begin{equation*} x_n = \frac{1}{2} \left((1+i h)^{n}+(1 - i h)^{n}\right). \end{equation*}$$ Note that $x_n$ is real since $x_n^* = x_n$. This is the closed form for the solution a computer would find solving (6) iteratively. Recall that $n=t_n/h$. For small $h$, $x_n = \cos t_n + \frac{1}{2} h t_n \cos t_n + O(h^2)$, so the numerical solution to (6) will well approximate $\cos t_n$ for $h t_n \ll 1$. In the limit $h\to 0$, $x_n \to \cos t_n,$ as expected.
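Iterating (6) directly illustrates the convergence (a minimal Python sketch; $h$ and $N$ are arbitrary small and large choices):

```python
import math

# Iterate recurrence (6), rewritten as x_{n+2} = 2 x_{n+1} - (1 + h^2) x_n,
# and compare with the exact oscillator solution cos(t_n).
h = 0.001
N = 1000                      # so t_N = 1, comfortably within h * t_n << 1
x = [1.0, 1.0]                # x_0 = x_1 = 1
for n in range(N - 1):
    x.append(2 * x[-1] - (1 + h * h) * x[-2])

t_N = N * h
print(x[N], math.cos(t_N))    # close for small h * t_N
```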

Derivatives and finite differences

Morally, a difference equation is a discrete version of a differential equation, and a differential equation is a continuous version of a difference equation. Numerical integration of ODEs essentially rewrites a differential equation as a difference equation, which is then solved iteratively by a computer. This is a big subject with many subtleties, and the method extends to nonlinear and partial differential equations. Below we give a brief dictionary between finite difference and differential operators.

Define the shift operator $E$ such that $E x_n = x_{n+1}$. The difference operator $\Delta = E-1$ then gives $\Delta x_n = x_{n+1}-x_n$. These operators are connected to differentiation by Taylor series. Let $D x_n = x_n' = d x(t)/d t|_{t=t_n}$. Then $$x_{n+1} = E x_n = \sum_{k=0}^\infty \frac{h^k}{k!} D^k x_n = e^{h D}x_n.$$ Thus, as an operator, $$E = e^{h D},$$ and so $$D = \frac{1}{h} \ln E = \frac{1}{h} \ln(1+\Delta) = \frac{1}{h}\sum_{k=1}^\infty (-1)^{k+1} \frac{\Delta^k}{k}.$$ (Think of these operators as acting on polynomials, possibly of very high order.) This formalism gives us a way to convert any ODE into a difference equation and vice-versa. Notice that higher order derivatives can be approximated by $D^k\approx (\Delta/h)^k$. Thus, for example, $$x'' = D^2 x \rightarrow \frac{1}{h^2}\Delta^2x_n = \frac{1}{h^2}\Delta(x_{n+1}-x_n) = \frac{1}{h^2} (x_{n+2}-2 x_{n+1} + x_n).$$
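Since $\Delta^k$ annihilates polynomials of degree less than $k$, the series for $D$ terminates on polynomials and the dictionary above can be verified exactly; a sketch for $x(t) = t^3$, where only the first three terms survive:

```python
h = 0.5
t0 = 2.0

# Sample x(t) = t^3 on the grid t0, t0 + h, t0 + 2h, ...
x = [(t0 + n * h) ** 3 for n in range(10)]

def delta(seq):
    """Forward difference operator: (Delta x)_n = x_{n+1} - x_n."""
    return [b - a for a, b in zip(seq, seq[1:])]

# D = (1/h)(Delta - Delta^2/2 + Delta^3/3 - ...); for a cubic the
# series terminates after three terms, so the result is exact.
d1, d2, d3 = delta(x), delta(delta(x)), delta(delta(delta(x)))
D_x0 = (d1[0] - d2[0] / 2 + d3[0] / 3) / h

print(D_x0, 3 * t0**2)        # both 12.0 (up to rounding)
```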

When using Euler's method we let $D \approx \Delta/h$. We could just as well keep higher order terms in $\Delta$ to get recursion relations with three or more terms. It is a sign of the nontrivial nature of the subject that this simple change can lead to numerical instabilities. There are many named algorithms that improve and generalize Euler's method, building on the ideas sketched above. (See, for example, the Runge-Kutta methods, a family of robust algorithms used to numerically solve linear and nonlinear differential equations.)
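As an illustration of such an improved method, here is a sketch of the classical fourth-order Runge-Kutta scheme applied to the oscillator (5) (the step size $h$ and the horizon are arbitrary choices):

```python
import math

# Classical RK4 for the harmonic oscillator x'' + x = 0,
# written as the first-order system (x, v)' = (v, -x).
def f(state):
    x, v = state
    return (v, -x)

def rk4_step(state, h):
    def add(s, k, c):
        return (s[0] + c * k[0], s[1] + c * k[1])
    k1 = f(state)
    k2 = f(add(state, k1, h / 2))
    k3 = f(add(state, k2, h / 2))
    k4 = f(add(state, k3, h))
    return (state[0] + h / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
            state[1] + h / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

h, N = 0.1, 100               # integrate to t = 10
s = (1.0, 0.0)                # x(0) = 1, x'(0) = 0
for _ in range(N):
    s = rk4_step(s, h)

print(s[0], math.cos(10.0))   # RK4 global error is O(h^4)
```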


This is in response to Drew N's question about seeing directly that $$D = \frac{1}{h}\sum_{k=1}^\infty (-1)^{k+1} \frac{\Delta^k}{k}$$ is the differential operator. Here we show that $D t^n = n t^{n-1}$ for $n=0,1,\ldots$, which shows that $D$ is the differential operator on polynomials of arbitrarily high degree. (It is straightforward to extend this proof to $n\in\mathbb{C}$.) We have \begin{align*} D t^n &= \frac{1}{h} \sum_{k=1}^\infty (-1)^{k+1}\frac{1}{k}\Delta^k t^n \\ &= \frac{1}{h} \sum_{k=1}^\infty (-1)^{k+1}\frac{1}{k}(E-1)^k t^n \\ &= \frac{1}{h} \sum_{k=1}^\infty (-1)^{k+1}\frac{1}{k} \sum_{j=0}^k{k\choose j}(-1)^{k-j}E^j t^n & \textrm{binomial theorem} \\ &= \frac{1}{h} \sum_{k,j} (-1)^{j+1}\frac{1}{k}{k\choose j}(t+j h)^n \\ &= \frac{1}{h} \sum_{k,j} (-1)^{j+1}\frac{1}{k}{k\choose j} \sum_{l=0}^n {n\choose l}j^l h^l t^{n-l} & \textrm{binomial theorem} \\ &= \left.\frac{1}{h} \sum_{k,j,l} (-1)^{j+1}\frac{1}{k} {k\choose j}{n\choose l} (x D)^l x^j h^l t^{n-l}\right|_{x=1} & D = d/dx, (x D)^l = \underbrace{(x D)\cdots (x D)}_{l} \\ &= \left.-\frac{1}{h} \sum_{k,l} \frac{1}{k} {n\choose l} h^l t^{n-l} (x D)^l \sum_{j=0}^k {k\choose j}(-x)^j \right|_{x=1} \\ &= \left.-\frac{1}{h} \sum_{k,l} \frac{1}{k} {n\choose l} h^l t^{n-l} (x D)^l (1-x)^k \right|_{x=1} & \textrm{binomial theorem} \\ &= \left.-\frac{1}{h} \sum_{l} {n\choose l} h^l t^{n-l} (x D)^l \sum_{k=1}^\infty \frac{1}{k} (1-x)^k \right|_{x=1} \\ &= \left.\frac{1}{h} \sum_{l} {n\choose l} h^l t^{n-l} (x D)^l \log x\right|_{x=1} & \textrm{series for natural logarithm} \\ &= \frac{1}{h} \sum_{l=0}^n {n\choose l} h^l t^{n-l} \delta_{l1} & \textrm{see below} \\ &= \frac{1}{h} {n\choose 1} h t^{n-1} \\ &= n t^{n-1}. \end{align*} Note that $(x D)x^j = j x^j$ so $$(x D)^l x^j = j^l x^j.$$ Also
\begin{align*} (x D)^0 \log x|_{x=1} &= \log 1 = 0 \\ (x D)^1 \log x &= x \frac{1}{x} = 1 \\ (x D)^2 \log x &= (x D)1 = 0 \\ (x D)^3 \log x &= (x D)^2 0 = 0. \end{align*} Thus, $$(x D)^l \log x|_{x=1} = \delta_{l1},$$ where $\delta_{ij}$ is the Kronecker delta.

  • Great work. Very helpful, concise, and clear. Thank you. – wesssg Oct 07 '16 at 05:10
  • Why did you apply the limit only to the left side in formula `(3)`? I thought that `h` should not appear on the right side. – LmTinyToon Dec 16 '16 at 10:28
  • @LmTinyToon: For small $h$, the left hand side of (3) will be well-approximated by $x'(t)$. (But only as long as $x_{n+1}-x_n$ is also small.) The right hand side does not simplify in this limit and fundamentally depends on $h$. This is discussed in some detail in the last paragraph of "First order example" and in the footnote to that section. Cheers! – user26872 Dec 16 '16 at 21:40
  • Using operator formalism you establish $D = (1/h) \sum_{k=1}^\infty (-1)^{k+1} \Delta^{k}/k$, which I believe expands to $1/h\big[ (x_{n+1} - x_n) - (1/2) (x_{n+2} - 2x_{n+1} + x_n) + \dots \big]$. But I just don't intuit why this expression of $D$ must correctly give $Dx_n = x'(t_n)$. One can follow the operator formalism line by line, but is there a way to be convinced this really is $x'(t_n)$ just from the form of this expression alone? – Drew N Dec 16 '18 at 04:05
  • @DrewN: Good question! I added an addendum to address this. – user26872 Jan 02 '19 at 19:57
  • I don't understand how $\mathrm{Exp}(n(a-1)) = a^n + O(n(a-1)^2)$ with Taylor series. Can someone please explain that? – user3433489 Sep 05 '20 at 21:50
  • @user3433489: We expand in small $a-1$ for fixed $n$. We have \begin{align*} e^{n(a-1)} &= (e^{a-1})^n \\ &= (1+(a-1)+(a-1)^2/2+\ldots)^n \\ &= (a+(a-1)^2/2+\ldots)^n \\ &= a^n + n a^{n-1}(a-1)^2/2+\ldots \\ &= a^n + n(a-1)^2/2+\ldots \end{align*} In the last step we use that $a^{n-1} = (1+(a-1))^{n-1} = 1+O((n-1)(a-1))$. – user26872 Sep 07 '20 at 16:57
  • Okay I see! Thank you! – user3433489 Sep 07 '20 at 18:07
  • I am glad to help! – user26872 Sep 07 '20 at 18:18

Yes, obviously there is some correspondence: numerical solutions of ordinary differential equations are discrete dynamical systems, many proofs in bifurcation theory use continuous-time dynamical systems to analyze originally discrete problems, etc.

However, the most profound attempt to build a theory that unites difference and differential equations is the time scale calculus.


(Adding to user26872's answer as this was not obvious to me so it might help someone else going through his derivation)

The identity

$$x_{n+1} = Ex_n = \sum\limits_{k=0}^\infty \frac{h^k}{k!}D^k x_n=e^{hD}x_n$$

is true if we consider the following. Let $x_n = x(t_n)$ and assume that $x(t)$ is analytic in a neighborhood of the point $t_n$; then its Taylor series about $t_n$ is

$$x(t) = \sum\limits_{k=0}^\infty \frac{x^{(k)}(t_n)}{k!}(t-t_n)^k.$$

Evaluating this series at $t = t_n + h$ gives

$$x(t_n+h) = \sum\limits_{k=0}^\infty \frac{x^{(k)}(t_n)}{k!}h^k = \sum\limits_{k=0}^\infty \frac{h^k D^k}{k!}x(t_n) = e^{hD} x(t_n)$$

and so

$$x_{n+1} = Ex_n \leftrightarrow x(t_n + h) = E x(t_n)$$

thus giving

$$E = e^{hD}.$$
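One can see $E = e^{hD}$ in action by shifting a concrete function with a truncated exponential series (a sketch using $x(t) = \sin t$, whose derivatives cycle with period four; $t$, $h$, and the truncation order are arbitrary choices):

```python
import math

# Shift x(t) = sin(t) by h using truncated e^{h D} = sum_k h^k D^k / k!.
# Derivatives of sin cycle: sin, cos, -sin, -cos, sin, ...
t, h = 1.0, 0.3
derivs = [math.sin, math.cos,
          lambda u: -math.sin(u), lambda u: -math.cos(u)]

shifted = sum(h**k / math.factorial(k) * derivs[k % 4](t) for k in range(12))
print(shifted, math.sin(t + h))   # truncation error ~ h^12 / 12!
```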


One example:

You can take the Laplace transform of a discrete (sampled) function by writing it as the product of a continuous function and a train of shifted delta functions and summing: $$\sum_{k=0}^\infty\int_{0}^\infty f(t)\,\delta(t - kT)\,e^{-st}\,dt = \sum_{k=0}^\infty f(kT)e^{-skT},$$ where $T$ is the sampling period.

The (unilateral) Z transform of a discrete-time signal $x[k]$ is defined as $\sum_{k=0}^{\infty}x[k]z^{-k}$. Substituting $z = e^{sT}$ in the discretized Laplace transform above yields the Z transform, so the $s$ and $z$ domains are related by $s = \frac{1}{T}\log(z)$. This is obviously a nonlinear transformation; points to the left of the imaginary axis in the $s$ plane are mapped to the interior of the unit circle in the $z$ plane, while points to the right are mapped to the exterior. One can use the bilinear transform, $s = \frac{2(z -1)}{T(z + 1)}$, as a first order approximation to $s = \frac{1}{T}\log(z)$.

So it is possible to (approximately) transform any equation in the $s$ domain into a corresponding equation in the $z$ domain. This is useful for deriving a difference equation, since the Z transform of a shift satisfies $Z(x[k - n]) = z^{-n}X(z)$. Due to this property it is often possible to read off a difference equation directly from the Z transform of an expression, simply by inspecting the inverse powers of $z$ and their coefficients. In short, if the differential equation is amenable to the Laplace transform (i.e. linear and time invariant), this method gives a quite straightforward route to a difference equation representation.
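As a sketch (the transfer function $H(s) = 1/(s+1)$, the sampling period $T$, and the step input are arbitrary choices), applying the bilinear transform to $y' + y = x$ and clearing denominators gives a difference equation whose step response tracks the exact continuous one, $1 - e^{-t}$:

```python
import math

# Discretize H(s) = 1/(s + 1), i.e. y' + y = x, with the bilinear
# transform s = 2(z - 1) / (T (z + 1)).  Clearing denominators gives
#   y[k] = ((2 - T)/(2 + T)) y[k-1] + (T/(2 + T)) (x[k] + x[k-1]).
T = 0.01
a = (2 - T) / (2 + T)
b = T / (2 + T)

# Step response: x[k] = 1 for k >= 0; exact answer is y(t) = 1 - e^{-t}.
N = 100                        # simulate up to t = N T = 1
y, x_prev = 0.0, 0.0
for k in range(N + 1):
    x_k = 1.0
    y = a * y + b * (x_k + x_prev)
    x_prev = x_k

# Close to the exact value, up to the sampling of the step discontinuity.
print(y, 1 - math.exp(-1.0))
```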


First, see the correspondence between discrete and continuous dynamical systems: both are actions of a monoid on some set $X$.

For discrete dynamical systems the monoid that acts is $\mathbb{N}$. For continuous dynamical systems the monoid that acts is $\mathbb{R}$.

This action comes in the form of a map $(\cdot,\cdot):\mathbb{N}\times X \rightarrow X$ or $(\cdot,\cdot):\mathbb{R}\times X \rightarrow X$ that is compatible with the additive structure of $\mathbb{N}$ and $\mathbb{R}$, respectively.

I.e. for all $n,m \in \mathbb{N}$ or $\mathbb{R}$ and $x \in X$ we have $(0,x) = x$ and $(n,(m,x)) = (n + m,x)$.

Now to the story of difference and differential equations. A first order difference equation $x_{n+1} = f(x_n)$ defines a discrete dynamical system. Note that any difference equation can be converted to a system of first order difference equations (see higher order difference equations). Hence any difference equation defines a discrete dynamical system, with the $\mathbb{N}$-action given by $(n,x) = f^n(x)$.

If the differential equation is sufficiently smooth, there exists a unique solution $\phi_x^t$ through any point $x$. We use this flow as the $\mathbb{R}$-action, i.e. $(t,x) = \phi_x^t$. Hence any sufficiently smooth differential equation defines a continuous dynamical system.
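Both actions can be checked against the axioms $(0,x) = x$ and $(n,(m,x)) = (n+m,x)$; a minimal sketch, with the map $f$ and the flow of $x' = -x$ chosen as arbitrary concrete examples:

```python
import math

# Discrete action: (n, x) -> f^n(x) for the map f(x) = x/2 + 1.
def f(x):
    return x / 2 + 1

def act_discrete(n, x):
    for _ in range(n):
        x = f(x)
    return x

# Continuous action: the flow of x' = -x is (t, x) -> x e^{-t}.
def act_flow(t, x):
    return x * math.exp(-t)

x0 = 3.0
# Compatibility with addition: (n, (m, x)) = (n + m, x).
print(act_discrete(2, act_discrete(3, x0)), act_discrete(5, x0))
print(act_flow(0.7, act_flow(1.3, x0)), act_flow(2.0, x0))
```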

Jens Wagemaker