The Taylor series expansion of a function $f$ about a point $a$ is a vector in the vector space with basis $\{(x-a)^0, (x-a)^1, (x-a)^2, \ldots, (x-a)^n, \ldots\}$. This vector space has countably infinite dimension. When $f$ is expressed as a linear combination of the basis vectors, the scalar multiple for the $n$-th basis vector is $\operatorname{Diff}_n f(a)/n!$ (i.e. $f^{(n)}(a)/n!$).
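For concreteness, these scalar multiples can be checked symbolically. A quick sketch (the choices $f = e^x$ and $a = 0$ are illustrative, and `sympy` is assumed to be available):

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(x)   # illustrative choice of f
a = 0           # illustrative expansion point

# scalar multiple of the n-th basis vector: Diff_n f(a) / n!
coeffs = [sp.diff(f, x, n).subs(x, a) / sp.factorial(n) for n in range(5)]
print(coeffs)  # → [1, 1, 1/2, 1/6, 1/24]
```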

The Fourier series expansion of a function $f$ on $[-\pi, \pi]$ is a vector in the vector space with basis $\{1, \sin(1x), \cos(1x), \sin(2x), \cos(2x), \ldots, \sin(nx), \cos(nx), \ldots\}$. This vector space has countably infinite dimension. When $f$ is expressed as a linear combination of the basis vectors, the scalar multiples for the $n$-th pair of basis vectors are $\frac{1}{\pi}\operatorname{Int}\{f\cdot\sin(nx)\}$ and $\frac{1}{\pi}\operatorname{Int}\{f\cdot\cos(nx)\}$, where $\operatorname{Int}$ denotes integration over $[-\pi, \pi]$.
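A numerical sketch of that coefficient formula (the odd function $\operatorname{sign}(x)$ is an illustrative choice, for which only the sine coefficients are nonzero):

```python
import numpy as np

x = np.linspace(-np.pi, np.pi, 200001)
dx = x[1] - x[0]
f = np.sign(x)  # illustrative odd function, so only sine terms survive

def inner(g, h):
    # Int{g . h}: the inner product, approximated by a Riemann sum over [-pi, pi]
    return float(np.sum(g * h) * dx)

# n-th sine coefficient: (1/pi) * Int{f . sin(nx)}
bs = [inner(f, np.sin(n * x)) / np.pi for n in range(1, 6)]
print(bs)  # b_n ≈ 4/(n*pi) for odd n, ≈ 0 for even n
```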

Questions:

The vector space for the Fourier series has an inner product, $\operatorname{Int}\{f\cdot g\}$, and it is this inner product that yields the coefficient expressions above, $\operatorname{Int}\{f\cdot\sin(nx)\}$ and $\operatorname{Int}\{f\cdot\cos(nx)\}$. Is there a similar inner-product-based derivation of the scalar multiples for the vector space spanned by the polynomial basis in the Taylor series?
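Part of what puzzles me is that the same trick does not work directly: under a candidate inner product $\operatorname{Int}\{f\cdot g\}$ on, say, $[-1, 1]$ (the interval is an illustrative choice), the monomials are not orthogonal, so $\operatorname{Int}\{f\cdot x^n\}$ does not pick out the $n$-th Taylor coefficient. A numerical sketch:

```python
import numpy as np

x = np.linspace(-1.0, 1.0, 100001)
dx = x[1] - x[0]

def inner(g, h):
    # candidate inner product Int{g . h} on [-1, 1], via a Riemann sum
    return float(np.sum(g * h) * dx)

# Gram matrix of the monomial basis {1, x, x^2, x^3}
G = [[inner(x**i, x**j) for j in range(4)] for i in range(4)]
print(np.round(G, 3))
# off-diagonal entries such as <1, x^2> = 2/3 are nonzero, so the
# monomials are not orthogonal under this inner product
```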

What is the relationship, if any, between the vector space produced by the Taylor series and that produced by the Fourier series? E.g. is one a subspace of the other?

When the Fourier series is taught, why isn't the Taylor series re-explained in the vector space framework used for the Fourier series? Would this approach not lead to a discussion of the implications of the choice of basis (and perhaps the choice of inner product) for function spaces?

Just as the Fourier series is generalized to the Fourier transform (the summation over the series becomes an integral), is there an equivalent generalization of the Taylor series?

Are there any recommended resources (books, courses, etc.) available which can help clarify my thinking regarding these issues?