Excellent question! I'm glad you asked!

There are *lots* of reasons, but I would say the most fundamental are the following:

### 1. Because Taylor series approximate *using ONLY basic arithmetic*

I wish someone told me this back in school. It's why we study polynomials and Taylor series.

The *fundamental* mathematical functions we really *understand* deeply are $+$, $-$, $\times$, $\div$... to me, it's fair to say the study of polynomials is really the study of *"what can we do with basic arithmetic?"*

So when you prove that a function can be approximated by a Taylor series, what you're *really* saying is that you can evaluate that function to a desired precision **via basic arithmetic**.
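For instance, here's a minimal Python sketch of this idea (the function name and tolerance are my own choices): it approximates $e^x$ to a desired precision using nothing but $+$, $\times$, and $\div$, by summing the Taylor series $\sum_n x^n/n!$ until the terms become negligible.

```python
def exp_taylor(x, tol=1e-12):
    """Approximate e^x = sum of x^n / n! using ONLY basic arithmetic."""
    term = 1.0   # current term x^n / n!  (starts at n = 0)
    total = 0.0
    n = 0
    while abs(term) > tol:
        total += term
        n += 1
        term = term * x / n  # next term: one multiply, one divide
    return total
```

Each new term is obtained from the previous one by a single multiply and divide, which is exactly the kind of operation sequence a calculator's circuitry can execute.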

If this doesn't sound impressive, it's probably because someone else has already done the work so you don't have to. ;) To elaborate:

You probably type `sin(sqrt(2))` into a calculator and take it for granted that it gives you back an answer (and notice it's an **approximate** one!) without ever knowing how it actually does this. Well, there isn't a magic `sin` or `sqrt` circuit in your calculator. **Everything** is done via a sequence of $+$, $-$, $\times$, $\div$ operations, *because those are the only things it knows how to do*.

So how does it know *which* exact sequence of basic arithmetic operations to use? Well, frequently, someone has used Taylor series to derive the steps needed to approximate the function you want (see e.g. Newton's method). *You* might not have to do this if all you're doing is punching things into a calculator, because someone else has already done it for you.
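To make this concrete, here's a hypothetical sketch (in Python, with function names of my own invention) of how a calculator *could* evaluate `sin(sqrt(2))` using only basic arithmetic: Newton's method for the square root, then the Taylor series for sine.

```python
def sqrt_newton(a, iters=20):
    """Solve x^2 - a = 0 by Newton's method: x <- (x + a/x) / 2."""
    x = a if a > 1 else 1.0
    for _ in range(iters):
        x = (x + a / x) / 2.0
    return x

def sin_taylor(x, terms=15):
    """Taylor series sin x = x - x^3/3! + x^5/5! - ..."""
    term = x     # current term of the series (starts at n = 0)
    total = 0.0
    for n in range(terms):
        total += term
        # next term: multiply by -x^2 and divide by the next two factorial factors
        term = -term * x * x / ((2 * n + 2) * (2 * n + 3))
    return total

print(sin_taylor(sqrt_newton(2.0)))  # close to sin(sqrt(2))
```

Real hardware and math libraries use more refined tricks (argument reduction, minimax polynomials, and so on), but the principle is the same: everything bottoms out in a sequence of arithmetic operations that someone derived in advance.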

In other words: **Taylor series are the basic *building blocks* of fundamental functions.**

But that's not all. There's also another important aspect to this:

### 2. Taylor series allow *function composition* using **ONLY** basic arithmetic

To understand this part, consider that the Taylor series for $f(x) = g(h(x))$ is pretty easy to obtain: you just differentiate via the chain rule ($f'(x) = g'(h(x))\, h'(x)$, and so on for the higher derivatives) and now you have obtained the Taylor series for $f$ from the derivatives of $g$ and $h$ **using ONLY basic arithmetic**.

In other words, when $f$ is analytic and you've "solved" your problem for $g$ and $h$, you've "solved" it for $f$ too! (You can think of "solving" here to mean that we can evaluate something in terms of its individual building blocks that we already know how to evaluate.)
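As an illustration, here's a small Python sketch (the names `mul_trunc` and `compose_trunc` are my own) that composes two *truncated* Taylor series directly on their coefficient lists. Building the series for $\sin(e^x - 1)$ from the series for $\sin x$ and $e^x - 1$ takes nothing but additions and multiplications of coefficients:

```python
import math

def mul_trunc(a, b, N):
    """Product of two truncated power series (coefficient lists), mod x^N."""
    c = [0.0] * N
    for i, ai in enumerate(a[:N]):
        for j, bj in enumerate(b[:N - i]):
            c[i + j] += ai * bj
    return c

def compose_trunc(g, h, N):
    """Coefficients of g(h(x)) mod x^N (requires h[0] == 0), via Horner's
    rule on series -- only basic arithmetic on coefficients is used."""
    result = [0.0] * N
    for gk in reversed(g[:N]):
        result = mul_trunc(result, h, N)
        result[0] += gk
    return result

N = 10
# Taylor coefficients of sin(x): x - x^3/3! + x^5/5! - ...
sin_c = [((-1.0) ** ((k - 1) // 2) / math.factorial(k)) if k % 2 == 1 else 0.0
         for k in range(N)]
# Taylor coefficients of h(x) = e^x - 1 (note h(0) = 0): x + x^2/2! + ...
h_c = [0.0] + [1.0 / math.factorial(k) for k in range(1, N)]

f_c = compose_trunc(sin_c, h_c, N)  # series for sin(e^x - 1), mod x^10
```

Evaluating the resulting polynomial at a small $x$ agrees with $\sin(e^x - 1)$ to high accuracy: the "solved" pieces really do glue together.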

If composability seems like a trivial thing, well, it is most definitely *not*!! There are lots of other approximations for which composition only makes your life harder! Fourier series are one example. If you try to compose them arbitrarily (say, $\sin e^x$) you'll quickly run into a brick wall.

So, in other words, **Taylor series also provide a "glue" for these building blocks**.

That's a pretty good deal!!