If a function is built from other functions whose derivatives are known, via composition, addition, etc., its derivative can be calculated using the chain rule and the like. But the integral of a product can't in general be expressed in terms of the integrals of its factors, and forget about composition! Why is this?

Rodrigo de Azevedo
    @Patrick: This doesn't exactly answer your question but is on related lines. http://math.arizona.edu/~mleslie/files/integrationtalk.pdf –  Feb 05 '11 at 20:42
  • Interesting! I wonder if maybe it has something to do with the fact that our set of 'elementary' functions isn't suitable for integration. – Venge Feb 05 '11 at 20:46
  • This question reminded me of [a tangentially related post on MathOverflow](http://mathoverflow.net/questions/2358/most-harmful-heuristic/16702#16702): "... *on formulas* differentiation is nice and integration is hard, but on *computable functions* differentiation is hard and integration is nice." -- Jacques Carette –  Feb 05 '11 at 21:34
  • Why is your default to assume that they are of comparable difficulty? Squaring is much easier than taking square roots, for example. – Qiaochu Yuan Feb 05 '11 at 22:35
  • Qiaochu: I think this is a good question even if one operation is harder. Why should integration be the "hard" one? – Michael Lugo Feb 05 '11 at 23:35
  • @sivaram: that's an interesting slideshow! The last slide is hilarious. – Myself Feb 06 '11 at 02:15
  • @Myself: True. The last slide is the reason why I have bookmarked that pdf file :) –  Feb 06 '11 at 02:19
  • I think most analysts out there would say integration is much easier than differentiation...but of course they have a different thing in mind than your question. – Matt Feb 06 '11 at 02:39
  • To rephrase Jacques's quote: differentiation is symbolically easy but numerically hard, while integration is numerically easy but symbolically hard. – J. M. ain't a mathematician Apr 20 '11 at 11:03
  • A similar question has been asked today in MO: http://mathoverflow.net/questions/66377/why-is-differentiating-mechanics-and-integration-art. It's nice to be able to refer MO to MSE for a change. :-) – lhf May 30 '11 at 01:12
  • $\int f(x) g(x) dx$ is like a sum of products. And in general you cannot factor out $a_1+a_2+a_3+...$ from $a_1 b_1 + a_2 b_2 + a_3 b_3 + ...$. So there is no recursive rule for integrals. – Calmarius Jan 09 '16 at 16:16
  • Differentiation is more difficult. – Weltschmerz Jan 29 '16 at 16:52
  • In case anybody else is looking for those slides by Martin Leslie: https://web.archive.org/web/20150226124236/http://math.arizona.edu/~mleslie/files/integrationtalk.pdf – Hans Lundmark Oct 23 '17 at 17:37
  • https://xkcd.com/2117/ – Torsten Schoeneberg Jul 23 '19 at 17:46

6 Answers


Here is an extremely generic answer. Differentiation is a "local" operation: to compute the derivative of a function at a point you only have to know how it behaves in a neighborhood of that point. But integration is a "global" operation: to compute the definite integral of a function in an interval you have to know how it behaves on the entire interval (and to compute the indefinite integral you have to know how it behaves on all intervals). That is a lot of information to summarize. Generally, local things are much easier than global things.
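The local/global contrast can be sketched numerically (a minimal illustration; the function, step size, and sample count are arbitrary choices):

```python
import math

def derivative_at(f, x, h=1e-6):
    # Local: only two samples from a small neighborhood of x are needed.
    return (f(x + h) - f(x - h)) / (2 * h)

def integral_over(f, a, b, n=100_000):
    # Global: f must be sampled across the whole interval [a, b].
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))  # midpoint rule

print(derivative_at(math.sin, 0.0))           # close to cos(0) = 1
print(integral_over(math.sin, 0.0, math.pi))  # close to 2
```

However accurate you want the derivative, you never look far from the point; the integral, by contrast, consumes information from the entire interval.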

On the other hand, if you can do the global things, they tend to be useful because of how much information goes into them. That's why theorems like the fundamental theorem of calculus, the full form of Stokes' theorem, and the main theorems of complex analysis are so powerful: they let us calculate global things in terms of slightly less global things.

Qiaochu Yuan
  • Patrick asked about the integration of "function terms", i.e. finite expressions. In your answer you omitted to say that such an expression is a global object to begin with. – Christian Blatter Feb 06 '11 at 13:26
  • The question is about *symbolic* derivation and integration. Nothing to do with the local/global behavior of the underlying functions. This answer pertains to *numerical* derivation and integration. –  May 12 '14 at 14:37
  • @YvesDaoust This is an old answer, for one. For another, that's not true. The symbolic integrability/differentiability is absolutely dependent on local/global contexts. In fact, the definitions of those things depend on those local/global characteristics, and the symbolic results follow from those definitions. In particular, the Risch algorithm requires the identification of terms that are identically zero; this cannot be done locally. – Emily Jun 04 '14 at 20:06
  • I'd also argue that numerical integration is often a local concept, as well. Plenty of quadrature rules will happily integrate right through a removable singularity of the integrand, provided it does not coincide with a quadrature point, for instance. – Emily Jun 04 '14 at 20:08
  • @Arkamis: I can't agree with you. Symbolic integration, seen as the inverse of symbolic derivation, is a pure abstract symbol manipulation. Saying that the primitive of $x^2$ is $\frac{x^3}3$ or its derivative is $2x$ doesn't tell you anything about the behavior of the function $x^2$. Actually, $x^2$ needn't be a function at all. –  Jun 05 '14 at 21:00
  • True, without an underlying notion of an integral/derivative, saying $d/dx\ x^2 = 2x$ is nothing more than arbitrary symbol mucking. But we do have an underlying sense of what is and what isn't a function, a derivative, and an integral, so those things do matter. Symbolic manipulations are mathematically meaningless when you elide the mathematical context, so trying to make other mathematical arguments about them is meaningless as well. – Emily Jun 05 '14 at 21:07
  • In fact, if we're going to just talk general abstract (nonsensical) symbol manipulation, then I declare the symbolic derivative of $x^2$ to be $4x$. You can't prove me wrong. – Emily Jun 05 '14 at 21:08
  • @YvesDaoust 'symbolic' derivation and 'symbolic' integration, in the sense you propose, are meaningless in a discussion about how 'hard' derivation is compared to integration. Both operations are simply a matter of looking up the result in a table. – Jonas Dahlbæk Oct 16 '14 at 10:37
  • @user161825: perfectly wrong. Symbolic derivatives are obtained by systematic application of simple rules. Symbolic integrals are (painfully) performed by humans using heuristics, trials and errors, and by computers using highly sophisticated algorithms. There can be no exhaustive tables. On this very site, you will find innumerable questions about "how do I integrate this?", and virtually none about "how do I derive this?" –  Oct 16 '14 at 11:45
  • @Arkamis: defining the derivative of $x^2$ being $4x$ is absurd, as it will not be consistent with the symbolic derivation rules: for instance by the product rule, $(x^2)'=(x.x)'=1.x+x.1=2.x$. The symbolic rules are pure symbol manipulation, there is no need to assign any mathematical interpretation to them. There is no right nor wrong, just a consistent set of rules. –  Oct 16 '14 at 11:58
  • @YvesDaoust The simple rules need to be applied to objects that you know how to differentiate in the first place. So basically, you are restricting yourself to differentiating a few known functions (i.e. from a table) and then considering a number of operations on those functions, say multiplication, addition and composition, whose action you then look up in a table. I dare you to tell me what $$\frac{d}{dt}\int_0^3 e^{-tx^2}dx$$ is without simply giving me the answer as an integral. Symbolically, certain integrals are easy to perform, just like certain functions are easy to derive. – Jonas Dahlbæk Oct 16 '14 at 12:21
  • @user161825: you are denying the obvious. Integration is way harder than derivation. The derivative of a closed formula is a closed formula. The vast majority of antiderivatives of closed formulas can't be expressed as closed formulas. –  Oct 16 '14 at 13:05
  • @user161825: you also seem to ignore that the OP is explicitly about symbolic derivation/integration. –  Oct 16 '14 at 13:07
  • @YvesDaoust It seems you are not understanding the point I am making. Maybe you should go back and read my comments again. – Jonas Dahlbæk Oct 16 '14 at 13:32
  • There ought to be some kind of clever way of saying this with topology. – J. Abrahamson Oct 29 '14 at 03:31
  • Whether a function is an antiderivative is a local property. – Acccumulation Dec 21 '21 at 01:10

The family of functions you generally consider (e.g., elementary functions) is closed under differentiation, that is, the derivative of such a function is still in the family. However, the family is not in general closed under integration. For instance, even the family of rational functions is not closed under integration, because $\int \frac{dx}{x} = \log x + C$.
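A one-line way to see the contrast for rational functions: the quotient rule keeps you inside the family, while a single integral already leaves it:

$$\frac{d}{dx}\,\frac{p(x)}{q(x)} = \frac{p'(x)\,q(x) - p(x)\,q'(x)}{q(x)^2} \quad \text{(rational again)}, \qquad \int \frac{dx}{x} = \log|x| + C \quad \text{(not rational)}.$$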

  • It seems to be hard to find [families of functions that are closed under integration](http://math.stackexchange.com/questions/474034/families-of-functions-closed-under-integration). – lhf Oct 12 '13 at 02:55
  • See also https://xkcd.com/2117/ and https://matheducators.stackexchange.com/a/16671 – lhf Jan 06 '20 at 17:58

I guess the OP asks about symbolic integration. Other answers have already dealt with the numeric case, where integration is easy and differentiation is hard.

If you recall the definition of the derivative, you can see it's just a subtraction and a division by a constant, inside a limit. Even if you can't make any algebraic simplifications, it won't get any more complex than that. But usually you can simplify a lot, because in the zero limit many terms fall out as being too small. From this definition it can be shown that if you know the derivatives of $f(x)$ and $g(x)$, then you can use them to express the derivatives of $f(x) \pm g(x)$, $f(x)g(x)$ and $f(g(x))$. This makes symbolic differentiation easy: you just apply the rules recursively.
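That recursive structure is concrete enough to code up in a few lines. This is an illustrative toy, not a real CAS: expressions are nested tuples such as `('*', 'x', 'x')` for $x \cdot x$, and the encoding and names are made up for the example:

```python
# Toy recursive symbolic differentiator over one variable 'x'.
def diff(e):
    if e == 'x':
        return 1
    if isinstance(e, (int, float)):      # constants
        return 0
    op, a, b = e
    if op == '+':                        # sum rule: (a+b)' = a' + b'
        return ('+', diff(a), diff(b))
    if op == '*':                        # product rule: (ab)' = a'b + ab'
        return ('+', ('*', diff(a), b), ('*', a, diff(b)))
    raise ValueError(op)

def evaluate(e, x):
    # Evaluate a tuple expression at a numeric point x.
    if e == 'x':
        return x
    if isinstance(e, (int, float)):
        return e
    op, a, b = e
    return evaluate(a, x) + evaluate(b, x) if op == '+' else evaluate(a, x) * evaluate(b, x)

d = diff(('*', 'x', 'x'))   # derivative of x*x
print(evaluate(d, 3.0))     # 6.0, i.e. 2x at x = 3
```

Every rule strictly shrinks the subproblems, so the recursion always terminates. As the rest of this answer argues, no such rule set exists for integration.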

Now about integration. Integration is basically an infinite sum of small quantities. So if you see $\int f(x) \, dx$, you can imagine it as an infinite sum $(f_1 + f_2 + ...) \, dx$, where the $f_i$ are consecutive values of the function.

This means that if you need to calculate $\int (a f(x) + b g(x)) \,dx$, you can imagine the sum $((af_1 + bg_1) + (af_2 + bg_2) + ...) \,dx$. Using associativity and distributivity, you can transform this into $a(f_1 + f_2 +...)\,dx + b(g_1 + g_2 + ...)\,dx$. So $\int (a f(x) + b g(x)) \, dx = a \int f(x) \,dx + b \int g(x) \, dx$.

But if you have $\int f(x) g(x) \, dx$, you have the sum $(f_1 g_1 + f_2 g_2 + ...) \,dx$, from which you cannot factor out the sums of the $f$s and the $g$s. This means there is no recursive rule for products.

Same goes for $\int f(g(x)) \,d x$. You cannot extract anything from the sum $(f(g_1) + f(g_2) + ...) \,d x$ in general.
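A quick numerical check makes the point, with midpoint sums standing in for the "infinite sums" above (the functions and interval are arbitrary examples):

```python
# Check on [0, 1]: the sum of products is not the product of the sums,
# so knowing the integrals of f and g alone cannot give the integral of f*g.
n = 100_000
h = 1.0 / n
xs = [(i + 0.5) * h for i in range(n)]   # midpoints of the subintervals

f = lambda x: x        # integral over [0, 1] is 1/2
g = lambda x: x * x    # integral over [0, 1] is 1/3

int_fg = h * sum(f(x) * g(x) for x in xs)   # integral of x^3 over [0, 1] is 1/4
int_f = h * sum(f(x) for x in xs)
int_g = h * sum(g(x) for x in xs)

print(round(int_fg, 6))           # ~0.25
print(round(int_f * int_g, 6))    # ~0.166667, i.e. 1/6 != 1/4
```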

So far, linearity is the only useful property. What about analogues of the differentiation rules? We have the product rule: $$\frac{d\, f(x)g(x)}{dx} = f(x) \frac{d g(x)}{dx} + g(x) \frac{d f(x)}{dx}.$$ Integrating both sides and rearranging the terms, we get the well-known integration by parts formula:

$$\int f(x) \frac{d g(x)}{dx} \, dx = f(x)g(x) - \int g(x) \frac{d f(x)}{dx} \, dx.$$

But this formula is only useful if the new integral, $\int g(x) \frac{d f(x)}{dx} \, dx$, is easier than the original $\int f(x) \frac{d g(x)}{dx} \, dx$.

And it's often hard to see when this rule is useful. For example, when you try to integrate $\ln(x)$, it's not obvious that you should view it as $1 \cdot \ln(x)$. The integral of $1$ is $x$ and the derivative of $\ln(x)$ is $\frac{1}{x}$, which leads to the very simple integrand $x \cdot \frac{1}{x} = 1$, whose integral is again $x$.
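Written out in full, that computation is:

$$\int \ln(x)\,dx = \int 1 \cdot \ln(x)\,dx = x\ln(x) - \int x \cdot \frac{1}{x}\,dx = x\ln(x) - x + C.$$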

Another well-known differentiation rule is the chain rule $$\frac{d f(g(x))}{dx} = \frac{d f(g(x))}{d g(x)} \frac{d g(x)}{dx}.$$

Integrating both sides, you get the reverse chain rule:

$$f(g(x)) = \int \frac{d f(g(x))}{d g(x)} \frac{d g(x)}{dx} \, dx.$$

But again it's hard to see when it is useful. For example, what about integrating $\frac{x}{\sqrt{x^2 + c}}$? Is it obvious to you that $\frac{x}{\sqrt{x^2 + c}} = 2x \cdot \frac{1}{2\sqrt{x^2 + c}}$, and that this is the derivative of $\sqrt{x^2 + c}$? I guess not, unless someone has shown you the trick.
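With the substitution $u = x^2 + c$ (so $du = 2x\,dx$) made explicit, the example reads:

$$\int \frac{x}{\sqrt{x^2+c}}\,dx = \int 2x \cdot \frac{1}{2\sqrt{x^2+c}}\,dx = \frac{1}{2}\int \frac{du}{\sqrt{u}} = \sqrt{u} + C = \sqrt{x^2+c} + C.$$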

For differentiation, you can mechanically apply the rules. For integration, you need to recognize patterns and even introduce cancellations to bring the expression into the desired form, and this requires a lot of practice and intuition.

For example how would you integrate $\sqrt{x^2 + 1}$?

First you turn it into a fraction:

$$\frac{x^2 + 1}{\sqrt{x^2+1}}$$

Then multiply and divide by 2:

$$\frac{2x^2 + 2}{2\sqrt{x^2+1}}$$

Separate the terms like this:

$$\frac{1}{2}\left(\frac{1}{\sqrt{x^2+1}}+\frac{x^2+1}{\sqrt{x^2+1}}+\frac{x^2}{\sqrt{x^2+1}} \right)$$

Play with 2nd and 3rd term:

$$\frac{1}{2} \left( \frac{1}{\sqrt{x^2+1}} + 1 \cdot \sqrt{x^2+1} + x \cdot 2x \cdot \frac{1}{2\sqrt{x^2+1}} \right)$$

Now you can see that the first term in the bracket is the derivative of $\operatorname{arsinh}(x)$, while the second and third terms together are the derivative of $x\sqrt{x^2+1}$. Thus the integral is:

$$\frac{\mathrm{arsinh}(x)}{2} + \frac{x\sqrt{x^2+1}}{2} + C$$

Were these transformations obvious to you? Probably not. That's why differentiation is mechanics while integration is an art.
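As a sanity check, differentiating the result term by term (mechanically, of course) recovers the integrand:

$$\frac{d}{dx}\left(\frac{\operatorname{arsinh}(x)}{2} + \frac{x\sqrt{x^2+1}}{2}\right) = \frac{1}{2\sqrt{x^2+1}} + \frac{\sqrt{x^2+1}}{2} + \frac{x^2}{2\sqrt{x^2+1}} = \frac{2x^2+2}{2\sqrt{x^2+1}} = \sqrt{x^2+1}.$$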

  • I honestly don't get why this doesn't have more Upvotes, and in fact had a negative score when I found it! FANTASTIC answer! – Brevan Ellefsen Oct 26 '16 at 06:13
  • **"While during differentiation you can mechanically apply the rules. During integration you need to recognize patterns and even need to introduce cancellations to bring the expression into the desired form and this requires lot of practice and intuition."** Sums it all up. –  Mar 21 '17 at 01:23
  • I don't think the boldface part "sums up" an *answer* to the question. It just expands *in what way* symbolic integration is harder; one can still ask *why* that is. – Torsten Schoeneberg Jul 23 '19 at 17:52

Answering an old question just because I saw it on the main page. From Roger Penrose (*The Road to Reality*):

... there is a striking contrast between the operations of differentiation and integration, in this calculus, with regard to which is the ‘easy’ one and which is the ‘difficult’ one. When it is a matter of applying the operations to explicit formulae involving known functions, it is differentiation which is ‘easy’ and integration ‘difficult’, and in many cases the latter may not be possible to carry out at all in an explicit way. On the other hand, when functions are not given in terms of formulae, but are provided in the form of tabulated lists of numerical data, then it is integration which is ‘easy’ and differentiation ‘difficult’, and the latter may not, strictly speaking, be possible at all in the ordinary way. Numerical techniques are generally concerned with approximations, but there is also a close analogue of this aspect of things in the exact theory, and again it is integration which can be performed in circumstances where differentiation cannot.
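Penrose's numerical point can be sketched in a few lines: dividing by a small step amplifies noise in tabulated data, while summing averages it out. (A minimal sketch; the function, noise level, and step sizes are arbitrary choices.)

```python
import math
import random

random.seed(0)
eps = 1e-4   # simulated measurement noise on the "tabulated" values
h = 1e-3     # step size for the finite difference

def noisy_sin(x):
    # Stand-in for a table of measured data: sin(x) plus small noise.
    return math.sin(x) + random.uniform(-eps, eps)

# Differentiation divides by a small number, so the noise alone can
# contribute an error of up to eps/h = 0.1 here.
d = (noisy_sin(1.0 + h) - noisy_sin(1.0 - h)) / (2 * h)
print(abs(d - math.cos(1.0)))

# Integration sums and rescales, so the noise largely averages out;
# the error stays around 1e-4 or (typically much) less.
n = 2000
H = 1.0 / n
s = H * sum(noisy_sin((i + 0.5) * H) for i in range(n))
print(abs(s - (1.0 - math.cos(1.0))))
```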

  • Just for the record: Penrose discusses these matters on p. 103-120. Yet he gives no good reason *why* this is so (esp. for the symbolic case). – vonjd May 31 '11 at 09:22

In the MIT lecture 6.001 "Structure and Interpretation of Computer Programs" by Sussman and Abelson this contrast is briefly discussed in terms of pattern matching. See the lecture video (at 3:56) or alternatively the transcript (p. 2 or see the quote below). The book used in the lecture does not provide further details.

Edit: Apparently, they discuss the Risch algorithm. It might be worthwhile to have a look at the same question on mathoverflow.SE: Why is differentiating mechanics and integration art?

And you know from calculus that it's easy to produce derivatives of arbitrary expressions. You also know from your elementary calculus that it's hard to produce integrals. Yet integrals and derivatives are opposites of each other. They're inverse operations. And they have the same rules. What is special about these rules that makes it possible to produce derivatives easily, while integrals are so hard? Let's think about that very simply.

Look at these rules. Every one of these rules, when used in the direction for taking derivatives, which is the direction of this arrow, the left side is matched against your expression, and the right side is the thing which is the derivative of that expression. The arrow is going that way. In each of these rules, the expressions on the right-hand side of the rule that are contained within derivatives are proper subexpressions of the expression on the left-hand side.

So here we see that the derivative of the sum, which is the expression on the left-hand side, is the sum of the derivatives of the pieces. So the rules, moving to the right, are reduction rules. The problem becomes easier: I turn a big complicated problem into lots of smaller problems and then combine the results, a perfect place for recursion to work.

If I'm going in the other direction, trying to produce integrals, there are several problems you see here. First of all, if I try to integrate an expression like a sum, more than one rule matches, and I don't know which one to take. They may be different, so I may have to explore different paths. Also, the expressions become larger in that direction. And when the expressions become larger, there's no guarantee that any particular path I choose will terminate, because it will only terminate by accidental cancellation. So that's why integrals are complicated searches and hard to do.

Lenar Hoyt

I will try to bring this to you in another way. Let us start by thinking in terms of something as simple as a straight line. If I give you the equation of a line $y = mx + c$, its slope can be easily determined which in this case is nothing but $m$.

Now let me make the question a bit trickier. Let me say that the line given above intersects the $x$ and $y$ axis at some points. I ask you to give me the area between the line, the abscissa and the ordinate.

This is obviously not as easy as finding the slope. You have to find the intersections of the line with the axes, getting two points, and then, taking the origin as a third point, find the area of the triangle. And this is not the only method of finding the area; as we know, there are loads of formulas for the area of a triangle.
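The two computations can be sketched side by side (sample values for $m$ and $c$; assumes $m \neq 0$ so the line actually crosses both axes):

```python
# The line y = m*x + c (sample values for the illustration).
m, c = 2.0, 4.0

slope = m                 # the slope is read off directly

x_int = -c / m            # intersection with the x-axis: (-c/m, 0)
y_int = c                 # intersection with the y-axis: (0, c)
area = abs(x_int) * abs(y_int) / 2   # right triangle with the origin

print(slope)  # 2.0
print(area)   # 4.0
```

Even in this simplest of cases, the "area" question needs extra steps (intercepts, a choice of triangle formula) that the "slope" question never does.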

Let us now view this in terms of curves. If the simple process of finding the slope in case of a line is translated to curves we get differential calculus which is a bit more complicated than the method of finding slopes of straight lines.

Add finding the area under the curve to that, and you get integral calculus, which, by our experience with straight lines, we know should be much harder than finding the slope, i.e. differentiation. Also, there is no single fixed method for finding the area of a figure; hence the many methods of integration.

Brian Tung
  • Reading this is unbearable...1/ Please space up by putting a double return to the line for new paragraphs. 2/ The full stop "." has a space **after** and not before. 3/ Sentences start with capital letters. – user88595 Jun 01 '14 at 10:19
  • I apologize for the errors I have caused. I typed the entire thing out on my phone and was in a hurry. I will make the required changes asap. – dhruv Jun 04 '14 at 19:32
  • Edited for formatting errors because the original answerer was evidently not going to come back and edit this ASAP. :-) ETA: I guess they did some of the clean-up... – Brian Tung Mar 12 '22 at 00:08