## Limits are exact

You have a misunderstanding about limits! A limit, when it exists, is just a value. An *exact* value.

It doesn't make sense to talk about the limit reaching some value, or there being some error. $\lim_{x \to 1} x^2$ is just a number, and that number is *exactly* one.

What you are describing — these ideas about "reaching" a value with some "error" — are descriptions of the behavior of the expression $x^2$ as $x \to 1$. Among the features of this behavior is that $x^2$ is "reaching" one.

By its very definition, the limit is the *exact* value that its expression is "reaching". For $x$ near one, $x^2$ may be "approximately" one, but $\lim_{x \to 1} x^2$ is exactly one.
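A quick numerical sketch of the distinction (plain Python, purely illustrative): the *expression* $x^2$ takes values near one with a shrinking error as $x$ approaches one, but the *limit* is the exact value one.

```python
# The "error" |x^2 - 1| belongs to the expression x**2 at points near 1,
# not to the limit itself; the limit is exactly 1.
for n in range(1, 6):
    x = 1 + 10**(-n)          # points closer and closer to 1
    error = abs(x**2 - 1)     # this error shrinks toward 0
    print(f"x = {x}: |x^2 - 1| = {error:.2e}")
```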

## Taylor polynomials

In this light, nearly everything you've said in your post is not about Taylor *series*, but instead about Taylor *polynomials*. When a Taylor series exists, the Taylor *polynomial* is given simply by truncating the series to finitely many terms. (Taylor polynomials can exist in situations where Taylor series don't.)

In general, the definition of the $n$-th order Taylor polynomial at a point $a$ for an $n$-times differentiable function $f$ is the sum

$$ \sum_{k=0}^n \frac{f^{(k)}(a)}{k!} (x - a)^k $$

Taylor polynomials, generally, are not exactly equal to the original function. The only time that happens is when the original function is a polynomial of degree less than or equal to $n$.

The sequence of Taylor polynomials, as $n \to \infty$, may converge to something. The Taylor *series* is *exactly* the value that the Taylor polynomials converge to.

The error in the approximation of a function by a Taylor polynomial is something people study. One often speaks of the "remainder term" or the "Taylor remainder", which is precisely the error term. There are a number of theorems that put constraints on how big the error term can be.
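One standard such theorem is Taylor's theorem with the Lagrange form of the remainder: for a suitably differentiable $f$, there is some $\xi$ between $a$ and $x$ with

$$ f(x) = \sum_{k=0}^n \frac{f^{(k)}(a)}{k!}(x-a)^k + \frac{f^{(n+1)}(\xi)}{(n+1)!}(x-a)^{n+1}. $$

As a concrete sketch in Python (function names here are illustrative), one can watch this remainder shrink as the order grows, using $\exp$ and its Taylor polynomials at $a = 0$:

```python
import math

def taylor_poly_exp(x, n):
    """n-th order Taylor polynomial of exp at a = 0: sum of x^k / k! for k = 0..n."""
    return sum(x**k / math.factorial(k) for k in range(n + 1))

# The polynomial approximates exp(x) but is not equal to it;
# the error (the Taylor remainder) shrinks rapidly as n grows.
x = 1.0
for n in (1, 2, 4, 8):
    remainder = abs(math.exp(x) - taylor_poly_exp(x, n))
    print(f"n = {n}: remainder = {remainder:.2e}")
```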

## Taylor series can have errors!

Despite all of the above, one of the big surprises of real analysis is that a function might not be equal to its Taylor series! There is a notorious example:

$$ f(x) = \begin{cases} 0 & x = 0 \\ \exp(-1/x^2) & x \neq 0 \end{cases} $$

One can prove that $f$ is infinitely differentiable everywhere. However, all of its derivatives have the property that $f^{(k)}(0) = 0$, so its Taylor series around zero is simply the zero function. Since $f(x) > 0$ for every $x \neq 0$, the function is *not* equal to its Taylor series on any interval around zero.
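A short numerical sketch of why this example works (plain Python): near zero, $f$ is astonishingly flat, which is what forces every derivative at zero to vanish, yet $f$ is strictly positive away from zero.

```python
import math

def f(x):
    """The classic non-analytic function: f(0) = 0, f(x) = exp(-1/x^2) otherwise."""
    return 0.0 if x == 0 else math.exp(-1.0 / x**2)

# f is positive for x != 0, but vanishingly small near 0 --
# far smaller than any power of x -- so its Taylor series at 0 is 0.
for x in (0.5, 0.2, 0.1):
    print(f"f({x}) = {f(x):.3e}")
```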

This phenomenon motivates a definition:

A function $f$ is **analytic** at a point $a$ if there is an interval around $a$ on which $f$ is (exactly) equal to its Taylor series at $a$.

"Most" functions mathematicians actually work with are analytic functions (e.g. all of the trigonometric functions are analytic on their domain), or analytic except for obvious exceptions (e.g. $|x|$ is not analytic at zero, but it is analytic everywhere else).