I know that Taylor series are infinite sums that represent certain functions, such as $\sin(x)$. But it has always made me wonder how they were derived. How is something like $$\sin(x)=\sum\limits_{n=0}^\infty \dfrac{(-1)^n x^{2n+1}}{(2n+1)!} = x-\dfrac{x^3}{3!}+\dfrac{x^5}{5!}-\dfrac{x^7}{7!}\pm\dots$$ derived, and how is it used? Thanks in advance for your answer.

If you'll forgive the self-reference, I wrote a [dedicated blog post](http://davidlowryduda.com/?p=1520) on getting the idea behind Taylor series across to my students, and sine was a key example! – davidlowryduda Mar 10 '14 at 04:53

As for *using* Taylor series, they are particularly useful when you're doing asymptotics of some kind, e.g. computing a limit or trying to understand the asymptotic behavior of a function. Taylor series are extremely useful tools for that. – TMM Mar 15 '14 at 00:19
9 Answers
$\newcommand{\+}{^{\dagger}} \newcommand{\angles}[1]{\left\langle #1 \right\rangle} \newcommand{\braces}[1]{\left\lbrace #1 \right\rbrace} \newcommand{\bracks}[1]{\left\lbrack #1 \right\rbrack} \newcommand{\ceil}[1]{\,\left\lceil #1 \right\rceil\,} \newcommand{\dd}{{\rm d}} \newcommand{\down}{\downarrow} \newcommand{\ds}[1]{\displaystyle{#1}} \newcommand{\expo}[1]{\,{\rm e}^{#1}\,} \newcommand{\fermi}{\,{\rm f}} \newcommand{\floor}[1]{\,\left\lfloor #1 \right\rfloor\,} \newcommand{\half}{{1 \over 2}} \newcommand{\ic}{{\rm i}} \newcommand{\iff}{\Longleftrightarrow} \newcommand{\imp}{\Longrightarrow} \newcommand{\isdiv}{\,\left.\right\vert\,} \newcommand{\ket}[1]{\left\vert #1\right\rangle} \newcommand{\ol}[1]{\overline{#1}} \newcommand{\pars}[1]{\left( #1 \right)} \newcommand{\partiald}[3][]{\frac{\partial^{#1} #2}{\partial #3^{#1}}} \newcommand{\pp}{{\cal P}} \newcommand{\root}[2][]{\,\sqrt[#1]{\vphantom{\large A}\,#2\,}\,} \newcommand{\sech}{\,{\rm sech}} \newcommand{\sgn}{\,{\rm sgn}} \newcommand{\totald}[3][]{\frac{{\rm d}^{#1} #2}{{\rm d} #3^{#1}}} \newcommand{\ul}[1]{\underline{#1}} \newcommand{\verts}[1]{\left\vert\, #1 \,\right\vert} \newcommand{\wt}[1]{\widetilde{#1}}$ Note that $$ \fermi\pars{x} = \fermi\pars{0} + \int_{0}^{x} \fermi'\pars{t}\,\dd t \,\,\,\stackrel{t\ \mapsto\ x - t}{=}\,\,\, \fermi\pars{0} + \int_{0}^{x}\fermi'\pars{x - t}\,\dd t $$
Integrating by parts: \begin{align} \color{#00f}{\fermi\pars{x}}&= \fermi\pars{0} + \fermi'\pars{0}x + \int_{0}^{x}t\fermi''\pars{x - t}\,\dd t \\[5mm] & = \fermi\pars{0} + \fermi'\pars{0}x + \half\,\fermi''\pars{0}x^{2} +\half\int_{0}^{x}t^{2}\fermi'''\pars{x - t}\,\dd t \\[8mm]& = \cdots = \color{#00f}{\fermi\pars{0} + \fermi'\pars{0}x + \half\,\fermi''\pars{0}x^{2} + \cdots + {\fermi^{{\rm\pars{n}}}\pars{0} \over n!}\,x^{n}} \\[2mm] & + \color{#f00}{{1 \over n!}\int_{0}^{x}t^{n} \fermi^{\rm\pars{n + 1}}\pars{x - t}\,\dd t} \end{align}
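The identity above can be sanity-checked numerically with $f = \sin$, whose derivatives cycle through $\sin, \cos, -\sin, -\cos$. This is a sketch (not part of the original derivation): the midpoint-rule step count and the test point are arbitrary choices.

```python
import math

def deriv_sin(k, t):
    """k-th derivative of sin, using the 4-cycle sin, cos, -sin, -cos."""
    return [math.sin, math.cos,
            lambda u: -math.sin(u), lambda u: -math.cos(u)][k % 4](t)

def taylor_with_remainder(x, n, steps=100_000):
    """Degree-n Taylor polynomial at 0 plus the integral remainder
    (1/n!) * ∫_0^x t^n f^(n+1)(x - t) dt, via the midpoint rule."""
    poly = sum(deriv_sin(k, 0.0) * x**k / math.factorial(k)
               for k in range(n + 1))
    h = x / steps
    integral = sum(((i + 0.5) * h)**n * deriv_sin(n + 1, x - (i + 0.5) * h)
                   for i in range(steps)) * h
    return poly + integral / math.factorial(n)

x = 1.3
print(abs(taylor_with_remainder(x, 3) - math.sin(x)))  # tiny: quadrature error only
```

The polynomial part alone is only an approximation; adding the remainder integral recovers $\sin(x)$ up to quadrature error, which is the content of the boxed identity.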


@BrightChancellor Thanks. It's just the essence without decorations. – Felix Marin Dec 31 '17 at 16:03

May I ask why have you used an upright $\mathrm{f}$ instead of an italic $f$? – Apoorv Potnis Feb 09 '20 at 10:54

@FelixMarin Would it break the equality when we add the term $f(0)$ to $f(x) = \int_{0}^{x} f'(t)dt$? – niebayes Jun 11 '20 at 15:17

Why doesn't the same trick work for directly deriving the Maclaurin series? When I skip the $t \mapsto x - t$ transformation, integration by parts gives $f(0) + f'(x)x - \text{(next integral)}$. – David Bandel Mar 22 '21 at 15:08
This is the general formula for the Taylor series:
$$\begin{align} &f(x) \\ &= f(a) + f'(a) (x-a) + \frac{f''(a)}{2!} (x - a)^2 + \frac{f^{(3)}(a)}{3!} (x - a)^3 + \dots + \frac{f^{(n)}(a)}{n!} (x - a)^n + \cdots \end{align}$$
You can find a proof here.
The series you mentioned for $\sin(x)$ is a special form of the Taylor series, called the Maclaurin series, centered at $a=0$.
The Taylor series is extremely powerful because it shows that every function can be represented as an infinite polynomial (with a few disclaimers, such as the interval of convergence)! This means that we can differentiate a function as easily as we can differentiate a polynomial, and we can compare functions by comparing their series expansions.
For instance, we know that the Maclaurin series expansion of $\cos(x)$ is $1-\frac{x^2}{2!}+\frac{x^4}{4!}-\dots$ and we know that the expansion of $\sin(x)$ is $x-\dfrac{x^3}{3!}+\dfrac{x^5}{5!}-\dfrac{x^7}{7!}\pm\dots$. If we do term-by-term differentiation, we can clearly confirm that the derivative of $\sin(x)$ is $\cos(x)$ by differentiating its series.
We can also use the Maclaurin series to prove that $e^{i\theta}=\cos{\theta}+i\sin{\theta}$ and thus $e^{\pi i}+1=0$ by comparing their series:
$$\begin{align} e^{ix} &{}= 1 + ix + \frac{(ix)^2}{2!} + \frac{(ix)^3}{3!} + \frac{(ix)^4}{4!} + \frac{(ix)^5}{5!} + \frac{(ix)^6}{6!} + \frac{(ix)^7}{7!} + \frac{(ix)^8}{8!} + \cdots \\[8pt] &{}= 1 + ix  \frac{x^2}{2!}  \frac{ix^3}{3!} + \frac{x^4}{4!} + \frac{ix^5}{5!}  \frac{x^6}{6!}  \frac{ix^7}{7!} + \frac{x^8}{8!} + \cdots \\[8pt] &{}= \left( 1  \frac{x^2}{2!} + \frac{x^4}{4!}  \frac{x^6}{6!} + \frac{x^8}{8!}  \cdots \right) + i\left( x  \frac{x^3}{3!} + \frac{x^5}{5!}  \frac{x^7}{7!} + \cdots \right) \\[8pt] &{}= \cos x + i\sin x \ . \end{align}$$
Also, you can use the first few terms of the Taylor series expansion to approximate a function if the function is close to the value on which you centered your series. For instance, we use the approximation $\sin(\theta)\approx \theta$ often in differential equations for very small values of $\theta$ by taking the first term of the Maclaurin series for $\sin(x).$
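The small-angle approximation can be checked numerically: the error of $\sin(\theta)\approx\theta$ should shrink like $\theta^3/3!$, the first dropped term of the Maclaurin series. A quick sketch:

```python
import math

# Error of the one-term approximation sin(θ) ≈ θ.
# The next series term is -θ³/3!, so the error should track θ³/6.
for theta in (0.1, 0.01):
    err = abs(math.sin(theta) - theta)
    print(theta, err, theta**3 / 6)
```

For $\theta = 0.01$ the error is already below $10^{-6}$, which is why the approximation is so common in differential equations.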

Be mindful when you say "any function can be *represented* by a polynomial". Really, what you mean is that any "sufficiently nice" function can be _approximated_ by a polynomial. Taylor series don't apply at all to discontinuous functions. And the polynomials give only approximations. – TacTics Mar 10 '14 at 04:34

@TacTics: Are you sure that the (infinite) polynomial is an approximation? I know any finite polynomial can only be an approximation of sin(x), but the Taylor series is not finite. – MSalters Mar 10 '14 at 10:49

Polynomials always have finite degree. It's better to explicitly call it an "infinite polynomial" or even better, a power series. – TacTics Mar 10 '14 at 15:53

@Mathemusician Sorry if this is something of a late comment, but I was wondering if the second last line in your proof is entirely valid. I always thought you could only rearrange the terms in a positive series. – k_g Mar 23 '15 at 22:19

@k_g Sorry if this is something of an even later comment, but I believe the second line is valid because you can rearrange the terms in any absolutely convergent series. Your skepticism is valid by the Riemann series theorem (saying that you can rearrange the terms of a conditionally convergent series to get a series that sums to anything or even diverges). But note that for cosine, taking the absolute value of each summand yields a series that is term-by-term less than the Taylor series for $\frac{1}{1-x^2}$, so the series is convergent by the comparison test. – William Chang Dec 28 '15 at 23:17

@k_g And since a similar method works for sine, we have that the Taylor series for $e^{ix}$ is absolutely convergent to a complex value. – William Chang Dec 28 '15 at 23:17

Taylor's theorem can be proved using only the Fundamental Theorem of Calculus, basic algebraic and geometric facts about integration, and some combinatorics. Although it's a little long to write out, the basic ideas are pretty simple.
The FTOC gives us: $$f(x) = f(a) + \int_a^x f'(x_1)dx_1$$ $$f'(x_1) = f'(a) + \int_a^{x_1} f''(x_2)dx_2$$ $$f''(x_2) = f''(a) + \int_a^{x_2} f'''(x_3)dx_3$$ $$\ldots$$ $$f^{(m)}(x_m) = f^{(m)}(a) + \int_a^{x_{m}} f^{(m+1)}(x_{m+1}) dx_{m+1},$$ where $f^{(m)}$ denotes the $m$th derivative of $f$. Substituting the second, third, ... expressions successively into the first gives: $$f(x) = f(a) + \int_{a<x_1<x} f'(a) dx_1 +\iint_{a<x_2<x_1<x} f''(a)dx_2dx_1 + \ldots + {\int \ldots \int}_{a<x_{m}< \ldots < x_1 < x} f^{(m)}(a)dx_{m} \ldots dx_1 + {\int \ldots \int}_{a<x_{m+1}< \ldots < x_1 < x} f^{(m+1)}(x_{m+1})\,dx_{m+1} \ldots dx_1 $$ For all the multiple integrals except the last one, the integrand is constant and can be pulled outside the integral. This gives us terms of the form: $$f^{(m)}(a){\int \ldots \int}_{a<x_{m}< \ldots < x_1 < x} \,dx_{m} \ldots dx_1$$ The ordering of variables $a<x_{m}< \ldots < x_1 < x$ is one of the $m!$ orderings of the variables $x_1,\ldots,x_m$. Each one of these orderings corresponds to a region in $m$-dimensional space. These regions are all disjoint, by symmetry (or change of variable) they all have the same volume, and their union is an $m$-cube with volume $(x-a)^m$.
From this we conclude: $${\int \ldots \int}_{a<x_{m}< \ldots < x_1 < x} dx_{m} \ldots dx_1 = \frac{(x-a)^m}{m!}.$$ Hence we have $$f(x) = f(a) + f'(a)(x-a) + f''(a)\frac{(x-a)^2}{2!} + \ldots + f^{(m)}(a)\frac{(x-a)^m}{m!} + {\int \ldots \int}_{a<x_{m+1}< \ldots < x_1 < x} f^{(m+1)}(x_{m+1})dx_{m+1} \ldots dx_1 $$ As to the last integral, we have bounds on the integrand: $$ \min_{a<y<x} f^{(m+1)}(y) \le f^{(m+1)}(x_{m+1}) \le \max_{a<y<x} f^{(m+1)}(y),$$ which gives us: $$f(x) = f(a) + f'(a)(x-a) + f''(a)\frac{(x-a)^2}{2!} + \ldots + f^{(m)}(a)\frac{(x-a)^m}{m!} + R_{m+1} $$ where $$\left(\min_{a<y<x} f^{(m+1)}(y) \right) \frac{(x-a)^{m+1}}{(m+1)!} \le R_{m+1} \le \left(\max_{a<y<x} f^{(m+1)}(y) \right) \frac{(x-a)^{m+1}}{(m+1)!}.$$ Note that this proof does not even require that $f^{(m+1)}$ be continuous. If $f^{(m+1)}(y)$ is continuous on $a \le y \le x$, then the more conventional form of the remainder follows immediately from the intermediate value theorem.
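The combinatorial step, that the ordered region is one of $m!$ congruent pieces of the cube, can be checked by Monte Carlo: a uniform random point in the cube should land in the region $x_m < \ldots < x_1$ with probability $1/m!$. A sketch (the seed and trial count are arbitrary):

```python
import math
import random

# Fraction of uniform points in the unit cube satisfying p[0] > p[1] > p[2],
# i.e. one fixed ordering out of 3! = 6; expect ≈ 1/6.
random.seed(0)
m, trials = 3, 200_000
hits = 0
for _ in range(trials):
    p = [random.random() for _ in range(m)]
    if p[0] > p[1] > p[2]:
        hits += 1
print(hits / trials, 1 / math.factorial(m))
```

The empirical frequency agrees with $1/m! $ up to sampling noise, confirming the volume formula $(x-a)^m/m!$ for the ordered region.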
Well, what we really want to do is approximate a function $f(x)$ around a value, $a$.
We will call our Taylor series $T(x)$. Naturally we want our series to have the exact value of $f(x)$ when $x = a$. For this, we will start our Taylor approximation with the constant term $f(a)$. We have $$T(x) = f(a)$$ as our first approximation, and it is good assuming the function doesn't change much near $a$.
We can obtain a much better approximation if our approximation had the same slope (or derivative) as $f(x)$ at $x = a$. We want $T'(a) = f'(a)$. The best way to accomplish this is to add the term $f'(a)(x-a)$ to our approximation. We now have $T(x) = f(a) + f'(a)(x-a)$. You can verify that $T(a) = f(a)$ and that $T'(a) = f'(a)$.
If we were to continue this process, we would derive the complete Taylor series, where $T^{(n)}(a) = f^{(n)}(a)$ for all $n \in \mathbb{Z}^{+}$ (that is, for every positive integer $n$).
This is where the series comes from. If you write it in summation notation you reach what Juan Sebastian Lozano Munoz posted.
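The derivative-matching property can be checked numerically. As a sketch (the step size $h$ and finite-difference formulas are arbitrary choices, not part of the answer), take the quadratic Taylor polynomial of $\cos$ at $a = 0$, namely $T(x) = 1 - x^2/2$, and compare value, slope, and curvature at $a$:

```python
import math

T = lambda x: 1 - x**2 / 2   # quadratic Taylor polynomial of cos at 0

h = 1e-5
d  = lambda g, x: (g(x + h) - g(x - h)) / (2 * h)          # central 1st derivative
d2 = lambda g, x: (g(x + h) - 2 * g(x) + g(x - h)) / h**2  # central 2nd derivative

print(T(0) - math.cos(0))          # value matches exactly
print(d(T, 0) - d(math.cos, 0))    # slopes match (both 0)
print(d2(T, 0) - d2(math.cos, 0))  # curvatures match (both ≈ -1)
```

Away from $a$ the polynomial drifts from $\cos$, which is exactly why higher-order terms are added.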
Like you said, Taylor series are meant to represent some function; let's call it $f(x)$. We often have functions, like $\sin(x)$ or $\log(x)$, that have a few easy-to-compute points near where we want to compute the value, and it is often useful to approximate things, so we can come up with an approximation method for $f(x)$.
Let some point $a$ be near our desired $x$ value. If $a$ is easy to compute, then an easy approximation for $f(x)$ would simply be $f(a)$. However, we might want to know $f(x)$ with a little more accuracy, so what we do is take our first derivative at $a$, $f'(a)$, and use that as a coefficient for an approximating polynomial: $$f(a) + f'(a)(x-a)$$ where instead of $x$, we use the difference between $x$ and $a$.
The general formula for the Taylor series expansion of $f(x)$, if $f$ is infinitely differentiable, is the following: $$f(x) = \sum\limits^{\infty}_{n = 0} \frac{f^{(n)}(a)}{n!} (x-a)^n$$ where $a$ is the point of approximation.
The reason for this has to do with power series: the Taylor series is a power series, as are our approximations. If we were to carry out our approximation over and over (an infinite number of times), we would get closer and closer to the actual function, until (at infinity) we reach it. The Taylor series is extremely important in both mathematics and applied fields, as it both deals with some fundamental properties of functions and provides an amazing approximation tool (polynomials are easier to compute than nearly any other function).
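The "closer and closer" claim is easy to see numerically. A sketch for $f = e^x$ at $a = 0$, where every derivative is $1$, so the partial sums are $\sum_{k=0}^{n} x^k/k!$ (the test point $0.5$ is an arbitrary choice):

```python
import math

def maclaurin_exp(x, n):
    """Degree-n partial sum of the Maclaurin series of e^x."""
    return sum(x**k / math.factorial(k) for k in range(n + 1))

for n in (1, 3, 7):
    print(n, abs(maclaurin_exp(0.5, n) - math.exp(0.5)))  # error shrinks with n
```

Each extra term cuts the error roughly by a factor of $x/n$, which is why the series converges so quickly for small $|x|$.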
If you want to find out more, here are some resources:
 MIT covers power series and Taylor series in this module of their single variable calculus course
 Khan Academy has a series (pun intended) on Taylor series.
 these Math. SE questions talk more about the applications of Taylor series.
Another way you can use Taylor series that I've always liked: using the definition of the derivative to show that $$\frac{d}{dx} e^x = e^x.$$
The definition is $$\lim \limits_{h \to 0} \frac{e^{x+h}  e^x}{h},$$
which is equal to
$$\lim \limits_{h \to 0} \frac{e^x(e^h  1)}{h}.$$
If we can show that $\lim \limits_{h \to 0} \frac{e^h  1}{h} = 1$, we'll be home free. This is where Taylor/MacLaurin series come in. We know that $e^x = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \dots$, so we can substitute:
$$\lim \limits_{h \to 0} \frac{\left(1 + h + \frac{h^2}{2!} + \frac{h^3}{3!} + \dots\right) - 1}{h}$$
$$\lim \limits_{h \to 0} \frac{h + \frac{h^2}{2!} + \frac{h^3}{3!} + \dots}{h}$$
$$\lim \limits_{h \to 0} 1 + \frac{h}{2!} + \frac{h^2}{3!} + \dots$$
$$ = 1$$
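This limit can also be checked numerically. A sketch (using `math.expm1`, which computes $e^h - 1$ without the catastrophic cancellation that `math.exp(h) - 1` would suffer for tiny $h$):

```python
import math

# (e^h - 1)/h should approach 1 as h → 0; by the series it equals
# 1 + h/2! + h²/3! + ..., so the gap from 1 shrinks like h/2.
for h in (1e-1, 1e-3, 1e-5):
    print(h, math.expm1(h) / h)
```

The printed values approach $1$ at the rate $h/2$ predicted by the series.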

If you don't know the derivative of $e^x$ (and you want to find it), how can you know and use the fact that the Taylor series of $e^x$ is $1+x+(1/2!)x^2+\cdots$? – Eric Dec 23 '17 at 04:30

@Eric You're totally right. I wrote this when I was much newer to this all... – MCT Apr 19 '21 at 03:07
If you want to kill two birds with one stone, Kenneth Iverson's *Elementary Functions* builds up to the Taylor series approximation of sine by way of polynomials and simple concepts like slope and area (slyly avoiding the dreaded buzzwords *differential* and *integral* and, bizarrely, avoiding even the word *calculus*). The style is always to show you the concept in action, and then tell you the name.
All of this while teaching you APL from scratch.
Disclaimer: I just read this book a week ago, and I'm gushing about it to everyone.
Taylor series can often be derived by doing arithmetic with known Taylor series.
Do you want the Taylor series for $\operatorname{sinc}(x) = \sin(x) / x$? Don't try to take the derivatives of $\operatorname{sinc}(x)$! Instead, compute
$$\operatorname{sinc}(x) = \sin(x) / x = x^{-1} \sum_{n=0}^{+\infty} \frac{(-1)^n x^{2n+1}}{(2n+1)!} = \sum_{n=0}^{+\infty} \frac{(-1)^n x^{2n}}{(2n+1)!} $$
In fact, if your goal was to compute the values of the derivative of $\operatorname{sinc}(x)$ at $0$, the easiest way is to first compute its Taylor series by the means above, and then read the values of the derivatives off from the coefficients of the Taylor series.
More complicated arithmetic is harder, but sometimes you only need a few terms and can just multiply things out.
Do you want the fourth order Taylor series for $\sin \sin x$? Compute
$$ \begin{align}\sin \sin x &= \sin\left( x - \frac{x^3}{6} + O(x^5) \right) \\&= \left( x - \frac{x^3}{6} + O(x^5) \right) - \frac{1}{6}\left( x - \frac{x^3}{6} + O(x^5) \right)^3 + O(x^5) \\&= \left(x - \frac{x^3}{6} + O(x^5)\right) - \frac{1}{6}\left(x^3 + O(x^5) \right) + O(x^5) \\&= x - \frac{x^3}{3} + O(x^5) \end{align} $$
where $O(x^5)$ just means that there are (possibly) more terms, but they all have an exponent on $x$ that is 5 or greater. (This $O$ notation can be given a more general meaning, but that's all that's needed here.)
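Both results can be checked numerically. A sketch: sum the sinc series directly and compare against $\sin(x)/x$, then check that $\sin\sin x - (x - x^3/3)$ is indeed $O(x^5)$-small for a small $x$ (the test points and term count are arbitrary choices):

```python
import math

def sinc_series(x, terms=10):
    """Partial sum of sum_n (-1)^n x^(2n) / (2n+1)!."""
    return sum((-1)**n * x**(2 * n) / math.factorial(2 * n + 1)
               for n in range(terms))

x = 0.7
print(abs(sinc_series(x) - math.sin(x) / x))        # essentially zero

x = 0.05
print(abs(math.sin(math.sin(x)) - (x - x**3 / 3)))  # of size ~x^5, i.e. tiny
```

Note that the series form also evaluates cleanly at $x = 0$, giving $\operatorname{sinc}(0) = 1$, where $\sin(x)/x$ itself is formally $0/0$.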
This simple derivation of the Taylor series of a function is taken from lecture 37 of the MIT course Single Variable Calculus. More specifically, this derivation is taken from these lecture notes. A similar derivation can also be found here.
Given an infinitely differentiable function $f(x)$, we want to represent it as the power series $$ f(x) = a_0 + a_1(x - c) + a_2(x-c)^2 + a_3(x-c)^3 + a_4(x-c)^4 + a_5(x-c)^5 + \cdots $$ The goal is to determine the coefficients $a_0,a_1,a_2,\dots$. The general strategy for doing this is substituting $x=c$ into $f(x)$ and its derivatives and picking out the coefficients. For example, $$ f(c) = a_0 $$ and so we know that $a_0$ has the value $f(c)$. Next, we can differentiate $f(x)$ to get $$ f'(x) = a_1 + 2 \cdot a_2(x-c) + 3 \cdot a_3(x-c)^2 + 4 \cdot a_4(x-c)^3 + 5 \cdot a_5(x-c)^4 + \cdots $$ Note that $$ f'(c) = a_1 $$ We can then differentiate $f'(x)$ to get $$ f''(x) = 2 \cdot a_2 + 3 \cdot 2 \cdot a_3(x-c) + 4 \cdot 3 \cdot a_4(x-c)^2 + 5 \cdot 4 \cdot a_5(x-c)^3 + \cdots $$ Note that $$ f''(c) = 2 \cdot a_2 $$ or $$ a_2 = \frac{f''(c)}{2} $$ We can repeat this process several times to get \begin{align} f'''(x) &= 3 \cdot 2 \cdot a_3 + 4 \cdot 3 \cdot 2 \cdot a_4(x-c) + 5 \cdot 4 \cdot 3 \cdot a_5(x-c)^2 + \cdots \\ f''''(x) &= 4 \cdot 3 \cdot 2 \cdot a_4 + 5 \cdot 4 \cdot 3 \cdot 2 \cdot a_5(x-c) + \cdots \\ \end{align} and we can substitute $x = c$ into these derivatives to get \begin{align} f'''(c) &= 3 \cdot 2 \cdot a_3 \\ f''''(c) &= 4 \cdot 3 \cdot 2 \cdot a_4 \end{align} Rearranging, we get \begin{align} a_3 &= \frac{f'''(c)}{3 \cdot 2} \\ a_4 &= \frac{f''''(c)}{4 \cdot 3 \cdot 2} \end{align} Therefore, the general pattern is \begin{align} a_0 &= f(c) = \frac{f(c)}{0!} \\ a_1 &= f'(c) = \frac{f'(c)}{1!} \\ a_2 &= \frac{f''(c)}{2} = \frac{f''(c)}{2!} \\ a_3 &= \frac{f'''(c)}{3 \cdot 2} = \frac{f'''(c)}{3!} \\ a_4 &= \frac{f''''(c)}{4 \cdot 3 \cdot 2} = \frac{f''''(c)}{4!} \\ &\vdots \\ a_n &= \frac{f^{(n)}(c)}{n!} \end{align} So, the power series representation of $f(x)$ is \begin{align} f(x) &= \frac{f(c)}{0!} + \frac{f'(c)}{1!}(x - c) + \frac{f''(c)}{2!}(x-c)^2 + \frac{f'''(c)}{3!}(x-c)^3 + \cdots \\ &= \sum_{n=0}^\infty \frac{f^{(n)}(c)}{n!} (x-c)^n \end{align}
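The relation $a_n = f^{(n)}(c)/n!$ can be read in the other direction: knowing the coefficients tells you the derivatives. A sketch using $f(x) = \frac{1}{1-x}$ at $c = 0$, whose power series is $\sum_n x^n$ (every $a_n = 1$), so the pattern predicts $f^{(n)}(0) = n! \cdot a_n = n!$; this agrees with the closed form $f^{(n)}(x) = n!/(1-x)^{n+1}$:

```python
import math

c = 0.0
for n in range(5):
    closed_form = math.factorial(n) / (1 - c)**(n + 1)  # n-th derivative of 1/(1-x) at c
    from_coeffs = math.factorial(n) * 1                  # n! * a_n, with a_n = 1
    print(n, closed_form, from_coeffs)                   # the two columns agree
```

This is the same trick used above for $\operatorname{sinc}$: once you have a series by any means, the derivatives at the center come for free.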