Suppose $f:\mathbb{R} \to \mathbb{R}$ is a smooth function such that $f^{(n)}(0) = 0$ for all $n \in \mathbb{N}_{\geq 0}$ and that $f$ is not analytic. In particular, we assume that $f$ is not identically $0$ in any neighbourhood of $x=0$.

Does it follow that, for all $\epsilon>0$,

$$\lim_{n \to +\infty} \sup\{f^{(n)}(x) : x \in (-\epsilon, \epsilon)\} = +\infty$$

and

$$\lim_{n \to +\infty} \inf\{f^{(n)}(x) : x \in (-\epsilon, \epsilon)\} = -\infty?$$

Motivation: the standard examples we all know and love [e.g., $\exp (-\frac{1}{|x|})$ and $\exp\left(-\frac{1}{x^2}\right)$, extended by $0$ at the origin] have derivatives which exhibit extreme oscillatory behaviour near the origin as $n$ gets large.

For these functions, this makes sense intuitively. To ensure the function "smoothly" and "flatly" reaches $x=0$, the first derivative needs to rapidly become small in magnitude, which is only possible if the second derivative temporarily becomes large in magnitude, but then it's necessary for the second derivative to rapidly become small again (since $f^{(2)}(0) = 0$), which means that the third derivative has to do some work, and you can see a pattern developing.

  • Look at the radius of convergence of the Taylor series of $f(x) = e^{-1/x} 1_{x > 0}$ at $x=a$. – reuns Jul 17 '17 at 10:02
  • $f$ is analytic on $[b,c]$ iff there is $r > 0$ such that $\displaystyle\lim_{n \to \infty} r^n\frac{\sup_{x \in[b,c]} |f^{(n)}(x)|}{n!} = 0$. – reuns Jul 17 '17 at 10:07
  • @reuns Oh, I don't know how they immediately lead to an answer. If you post a full solution I'll be happy to award you the bounty. – MathematicsStudent1122 Jul 19 '17 at 21:44
  • Did you read my comments? Also let $\phi(x) = \int_{-2}^2 e^{-1/(1-(x-y)^2)} 1_{|x-y| < 1}\, dy$, which is $C^\infty$, constant on $[-1,1]$, and identically zero for $|x| > 3$. Can you use it to build a $C^\infty$ function with an arbitrary sequence of derivatives at $x=0$? – reuns Jul 19 '17 at 21:47

3 Answers


[Partial answer]

Let $\epsilon>0$. Pick $c \in (-\epsilon, \epsilon)$ with $|c|<1$ such that $f(c) \neq 0$ (possible, since $f$ is not identically $0$ in any neighbourhood of the origin). Given $n$, Taylor's theorem (not Taylor's series!) implies the existence of $\xi_n$ between $0$ and $c$ such that $$f(c)=\frac{f^{(n)}(\xi_n)}{n!}c^n,$$ since every lower-order term of the expansion vanishes ($f^{(k)}(0)=0$ for all $k$). Therefore, $$|f^{(n)}(\xi_n)|=\frac{n!\,|f(c)|}{|c|^n} .$$

Since $|c|<1$, this gives $n!\,|f(c)|\leq |f^{(n)}(\xi_n)|\leq \sup\{|f^{(n)}(x)| : x \in (-\epsilon,\epsilon)\}$, and hence $$\lim_{n \to \infty}\sup\{|f^{(n)}(x)| : x \in (-\epsilon,\epsilon)\}=+\infty.$$
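This divergence can be spot-checked numerically. In the sketch below, the flat function $f(x)=e^{-1/x^2}$ and the point $c=1/2$ are illustrative choices of mine, not part of the answer; any $0<|c|<1$ with $f(c)\neq 0$ works.

```python
import math

# Spot check of the divergence of the Taylor lower bound
# |f^(n)(xi_n)| = n! |f(c)| / |c|^n.  The flat function f(x) = exp(-1/x^2)
# and the point c = 1/2 are illustrative choices; any 0 < |c| < 1 with
# f(c) != 0 works.
c = 0.5
fc = math.exp(-1.0 / c**2)                 # f(c) = e^{-4} != 0
bounds = [math.factorial(n) * fc / abs(c)**n for n in range(1, 11)]
assert all(a < b for a, b in zip(bounds, bounds[1:]))  # strictly increasing
assert bounds[-1] > 1e6                    # already huge by n = 10
```

Each step multiplies the bound by $(n+1)/|c| > 1$, which is why the growth is so fast.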

This is the general case. If we impose additionally that for every $\epsilon>0$ there exists $c \in (-\epsilon,0)$ such that $f(c) \neq 0$, we can pick such a $c$ with $-1<c<0$, and again Taylor's theorem implies the existence of $\xi_n$ between $c$ and $0$ such that $$f(c)=\frac{f^{(n)}(\xi_n)}{n!}c^n,$$ and then $$f^{(n)}(\xi_n)=\frac{n!\,f(c)}{c^n} .$$

Suppose without loss of generality that $f(c)>0$. Since $c^{2n}>0$ and $|c|<1$, we have $(2n)!\,f(c)\leq f^{(2n)}(\xi_{2n})\leq \sup\{f^{(2n)}(x) : x \in (-\epsilon,\epsilon)\}$, and therefore $$\limsup_{n \to \infty} \left(\sup\{f^{(n)}(x) : x \in (-\epsilon,\epsilon)\}\right) =+\infty.$$ Analogously, since $c^{2n+1}<0$, we have $\inf\{f^{(2n+1)}(x) : x \in (-\epsilon,\epsilon)\}\leq f^{(2n+1)}(\xi_{2n+1}) \leq -(2n+1)!\,f(c)$, and therefore: $$\liminf_{n \to \infty} \left(\inf\{f^{(n)}(x) : x \in (-\epsilon,\epsilon)\}\right)=-\infty.$$
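The sign bookkeeping in the even/odd split can be checked in a few lines. The numbers $c=-1/2$ and $f(c)=1/4$ below are hypothetical, chosen only to satisfy $-1<c<0$ and $f(c)>0$:

```python
import math

# Bookkeeping for the even/odd split above: with -1 < c < 0 and f(c) > 0,
# the value f^(n)(xi_n) = n! f(c) / c^n is negative for odd n, positive
# for even n, and dominates n! f(c) in magnitude.  The numbers c = -1/2
# and f(c) = 1/4 are hypothetical, chosen only to satisfy the constraints.
c, fc = -0.5, 0.25
vals = [math.factorial(n) * fc / c**n for n in range(1, 9)]   # n = 1..8
assert all(v < 0 for v in vals[0::2])      # odd n: negative
assert all(v > 0 for v in vals[1::2])      # even n: positive
assert all(abs(v) >= math.factorial(n) * fc
           for n, v in enumerate(vals, start=1))
```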

These results certainly have some content which might interest you, but they are not exactly what you want. To be explicit, the following points must be made:

  • If you are willing to analyze the absolute value of the derivatives instead of the derivatives themselves, the above gives a complete answer.
  • If not, we have the following issues:
    1. The above argument only works for functions which have points arbitrarily close to $0$ from the left with non-zero values.
    2. We do not prove that the sequence $\sup\{f^{(n)}(x) : x \in (-\epsilon,\epsilon)\}$ goes to infinity, only that its $\limsup$ is $+\infty$ (and analogously for the $\liminf$).
Aloizio Macedo

Another partial answer. I suspect this is essentially Aloizio's, but I reference only the MVT and IVT instead of Taylor's Theorem. Assume $f$ takes some positive value at some $x_0\in(-\epsilon,0)$. (If $f$ takes some negative value in $(-\epsilon,0)$, a symmetric argument follows.) Note $x_0$ is negative. Then by the Mean Value Theorem on $[x_0,0]$, using $f(0)=0$, there exists $x_1\in(x_0,0)$ such that $f'(x_1)=\frac{f(0)-f(x_0)}{0-x_0}=\frac{1}{x_0}\,f(x_0)<0$. Since $|x_0|<\epsilon$, this implies $\left\lvert f'(x_1)\right\rvert>\frac{1}{\epsilon}\,f(x_0)$.

Inductively, assume that we have found $x_0<x_1<\cdots<x_{n-1}<x_n<0$, and that the sign of $f^{(n)}(x_n)$ is $(-1)^n$, and that $\left\lvert f^{(n)}(x_n)\right\rvert>\frac{1}{\epsilon^n}\,f(x_0)$. The above establishes this holds for $n=0$ and $n=1$, and models the inductive argument which follows.

Using the MVT applied to $f^{(n)}$ at endpoints $x_n$ and $0$, there is $c_{n+1}\in(x_n,0)$ such that $f^{(n+1)}(c_{n+1})=\frac{1}{x_n}\,f^{(n)}(x_n)$. By the Intermediate Value Theorem, since $f^{(n+1)}(0)=0$, then on $(c_{n+1},0)$, $f^{(n+1)}$ takes on all values between $0$ and $f^{(n+1)}(c_{n+1})=\frac{1}{x_n}\,f^{(n)}(x_n)$. In particular, it takes the value $(-1)^{n}\frac{1}{\epsilon^nx_n}f(x_0)$, which has the same sign as $f^{(n+1)}(c_{n+1})$ and is smaller in absolute value. So there is $x_{n+1}\in(c_{n+1},0)$ such that $f^{(n+1)}(x_{n+1})$ has sign $(-1)^{n+1}$ and $\left\lvert f^{(n+1)}(x_{n+1})\right\rvert=\frac{1}{\epsilon^n\lvert x_n\rvert}f(x_0)>\frac{1}{\epsilon^{n+1}}\,f(x_0)$.

This inductively proves that on $(-\epsilon,0)$, $f^{(n)}$ alternates sign and attains an absolute value at least $f(x_0)/\epsilon^{n}$, which goes to infinity when $\epsilon<1$. (It suffices to treat $\epsilon<1$, since the sup over a larger interval dominates the sup over a smaller one, and likewise the inf is dominated.) So assuming that $f$ takes a positive value at all (or a negative value, symmetrically) in $(-\epsilon,0)$, your two conjectures are true.
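The alternation conclusion can be sanity-checked numerically on a concrete flat function. The sketch below uses the closed form $f^{(n)}(x)=P_n(1/x)\,e^{-1/x^2}$ for $f(x)=e^{-1/x^2}$, where $P_0=1$ and $P_{n+1}(t)=-t^2P_n'(t)+2t^3P_n(t)$; the window $\epsilon=1/2$, the base point $x_0=-2/5$, and the grid are my own illustrative choices, not part of the argument.

```python
import math

# Numerical sanity check of the alternation conclusion, using the flat
# function f(x) = exp(-1/x^2), which is positive on (-eps, 0).  Its
# derivatives have the closed form f^(n)(x) = P_n(1/x) exp(-1/x^2), where
# P_0 = 1 and P_{n+1}(t) = -t^2 P_n'(t) + 2 t^3 P_n(t).

def next_poly(p):
    """p maps exponent -> coefficient of P_n; return P_{n+1}'s coefficients."""
    q = {}
    for k, c in p.items():
        if k > 0:                              # -t^2 * P_n'(t)
            q[k + 1] = q.get(k + 1, 0) - k * c
        q[k + 3] = q.get(k + 3, 0) + 2 * c     # 2 t^3 * P_n(t)
    return q

def deriv(n):
    """Return a callable evaluating f^(n) at x != 0."""
    p = {0: 1}
    for _ in range(n):
        p = next_poly(p)
    def d(x):
        t = 1.0 / x
        return sum(c * t**k for k, c in p.items()) * math.exp(-t * t)
    return d

eps, x0 = 0.5, -0.4                            # illustrative choices
f = deriv(0)
xs = [-eps * k / 400 for k in range(1, 400)]   # grid inside (-eps, 0)
for n in range(1, 6):
    dn = deriv(n)
    values = [dn(x) for x in xs]
    v = min(values) if n % 2 else max(values)  # extreme of sign (-1)^n
    assert (-1) ** n * v > 0                   # sign (-1)^n is attained
    assert abs(v) >= f(x0) / eps**n            # with the claimed magnitude
```

Here `min`/`max` simply pick out the most extreme sample of the expected sign, and the second assertion is the quantitative bound $|f^{(n)}|\geq f(x_0)/\epsilon^n$ from the induction.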

This means that for either conjecture to be false, the function would have to be identically zero on $(-\epsilon,0)$.

I won't repeat the argument, but on $(0,\epsilon)$ the same exponential growth bound goes through; what is lost is the alternation of signs. If $f(x_0)$ is positive, then the first conjecture holds by this argument, and if $f(x_0)$ is negative, then the second conjecture holds. But it remains to show that both conjectures hold simultaneously.

If some derivative of $f$ takes a positive (negative) value, then you can pick up the argument there to prove that the first (second) conjecture holds.

So the only smooth functions for which both conjectures might fail together would be one that is identically zero on $(-\epsilon,0)$ and has all of its derivatives positive (or all negative) on $(0,\epsilon)$. The simultaneous conjectures reduce to showing no such function exists.

Does there exist a smooth function on $(-\epsilon,\epsilon)$ such that $f^{(n)}(0)=0$ for all $n$, $f(x)=0$ for all $-\epsilon<x<0$, and $f^{(n)}(x)>0$ for all $n$, for all $0<x<\epsilon$?

2'5 9'2

Yes, the derivatives of a non-analytic smooth function on an interval are unbounded above and below.

By scaling $x$ it suffices to take $\epsilon>1$, so that $[-1,1]\subset(-\epsilon,\epsilon)$. Suppose there exists a finite $C<0$ such that $f^{(n)}(x)\geq C$ for all $n$ and all $x\in[-1,1].$ By adding $-Ce^{x+1}$, whose $n$-th derivative is $-Ce^{x+1}\geq -C$ on $[-1,1]$, we get a function with $f^{(n)}(x)\geq 0$ on $[-1,1]$ for every $n$, and this function is analytic on $[-1,1]$ if and only if the original one was. It is a theorem of S. Bernstein that a function with non-negative derivatives of all orders on an interval is real-analytic on that interval, so the derivatives of a non-analytic $f$ cannot be uniformly bounded below. Applying the same argument to $-f$, which is analytic exactly when $f$ is, shows they cannot be uniformly bounded above either. The proof of Bernstein's theorem is fairly short; see the question Bernstein's Theorem of Analytic Function Proof. If you prefer books, it is also at the end of Boas's A Primer of Real Functions, at least in the 1960 edition.
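A quick numerical spot check of the correction step; the bound $C=-3$ below is an arbitrary illustrative value, not from the answer.

```python
import math

# Spot check of the correction step above (C = -3 is an arbitrary
# illustrative lower bound): every derivative of g(x) = -C e^{x+1} is
# again -C e^{x+1}, and e^{x+1} >= 1 on [-1, 1], so g^(n)(x) >= -C there.
# Hence if f^(n)(x) >= C on [-1, 1], then (f + g)^(n)(x) >= C + (-C) = 0.
C = -3.0
xs = [-1 + k / 50 for k in range(101)]                   # grid on [-1, 1]
assert all(-C * math.exp(x + 1) >= -C for x in xs)       # g^(n) >= -C
assert all(C - C * math.exp(x + 1) >= 0 for x in xs)     # worst case f^(n) = C
```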
