So I've been playing around with some functions for a while, and started wondering about a slowest divergent function (as in $\lim_{x\to\infty} f(x)=\infty$), so I searched around for an answer.

I can see that there are ways to construct a new function that necessarily diverges more slowly than the original one. But then it struck me that this really resembles the properties of an open set, where there is no smallest element in $(0,\infty)$.

So the question is this: is it possible to recursively define a function $f(x)$ such that $\lim_{x\to\infty} f(x)=\infty$ and $$ \lim_{x\to\infty} \frac{g(x)}{f(x)}=\infty \quad \text{for every } g\neq f \text{ with } \lim_{x\to\infty} g(x)=\infty? $$ For an example of a function defined recursively, consider $$ f(x)=\frac{x^{1/f(x)}}{\ln(x)}, $$ and I have no idea how it behaves.

The reason I'm stressing recursion is because, despite $(0,\infty)$ having no smallest element, its elements can get arbitrarily close to the endpoint, and to do that with a function, I'm guessing recursion is the way to go.

Asaf Karagila
Passer By
  • Call two functions $f$ and $g$ (converging or diverging) equivalent if $\lim_{x\to\infty} \frac{g(x)}{f(x)}$ is neither zero nor infinite. Is the set of equivalence classes order-isomorphic to $(-\infty,\infty)$? If so, which subset of $(-\infty,\infty)$ corresponds to diverging functions? (I don't know the answers, but I feel the questions are related, so I just post this as a comment.) –  Mar 19 '15 at 13:53
  • I don't think that this set $X$ of equivalence classes is order-isomorphic to $\mathbb{R}$. The reason is that any countable subset of $X$ is necessarily bounded below (a proof of which I sketched in my answer), which is not true of $\mathbb{R}$. (Although some people may want to recheck my proof of this; I don't know if it's totally correct.) Hence $X$ is larger in some sense, maybe order-isomorphic to the long line? – shalop Mar 19 '15 at 18:27

3 Answers


No such $f$ exists. If $f$ is any function such that $\lim_{x \to \infty} f(x) = \infty$, then consider the function $\log(f)$. We have that $\lim_{x \to \infty} \log(f(x)) = \infty$. But the function $\log(f)$ diverges slower than $f$, because we have that $\lim_{x \to \infty} \frac{\log(f(x))}{f(x)}=0$.
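As a quick numerical sanity check (a Python sketch, not part of the proof), here is the ratio with the concrete choice $f(x)=x$; it visibly shrinks toward $0$ as $x$ grows, which is exactly the sense in which $\log(f)$ diverges more slowly than $f$:

```python
import math

# Illustrative only: with f(x) = x, the ratio log(f(x)) / f(x)
# tends to 0, so log(f) diverges strictly more slowly than f.
ratios = [math.log(x) / x for x in [10.0, 1e3, 1e6, 1e12]]
print(ratios)  # each entry is much smaller than the one before it
```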

Now your idea of recursion brings up another interesting point. We can recursively define a sequence of functions $f_n$ as follows: $f_0(x)=x$, and $f_{n+1}(x)=\log f_n(x)$. Then $f_{n+1}$ always diverges slower than $f_n$. One may then ask: for any function $f$ such that $\lim_{x \to \infty} f(x) = \infty$, does there exist $n \in \mathbb{N}$ such that $f_n$ diverges slower than $f$?
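Here is a small Python sketch of this sequence (the helper `f_n` is my own name for the $n$-fold iterated logarithm); even at a single large input you can see how dramatically each extra $\log$ slows things down:

```python
import math

def f_n(n, x):
    """n-fold iterated logarithm: f_0(x) = x, f_{n+1}(x) = log(f_n(x))."""
    for _ in range(n):
        x = math.log(x)
    return x

x = 1e10
vals = [f_n(n, x) for n in range(3)]
print(vals)  # each iterate is dramatically smaller than the last
```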

The answer is still no. To prove this, we'll construct a function $f$ which diverges slower than all of the $f_n$. Let $a_1=1$. Suppose $a_1 < \cdots < a_n$ have been constructed so that $f_{i+1}(a_{i+1})>\frac{1}{i}+f_i(a_i)$ for all $1 \leq i \leq n-1$. Then choose $a_{n+1}>a_n$ such that $f_{n+1}(a_{n+1})>\frac{1}{n}+f_n(a_n)$ [such $a_{n+1}$ exists since $f_{n+1}$ diverges]. Continuing inductively, we get a sequence $(a_n)$ such that $f_n(a_n) \to \infty$ [indeed $f_n(a_n) > f_1(a_1) + \sum_{i=1}^{n-1}\frac{1}{i}$, and the harmonic series diverges]. Define a function $f$ by linearly interpolating all of the points $(a_n, f_n(a_n))$. Then you can check that $\lim_{x \to \infty} f(x)= \infty$, but $f$ diverges slower than all of the $f_n$.
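For concreteness, here is a rough numerical sketch of the first couple of steps of this construction in Python (the doubling search is just one arbitrary way to find a suitable $a_{n+1}$; only a few steps fit in floating point, since the $a_n$ grow tower-exponentially fast):

```python
import math

def f_n(n, x):
    """n-fold iterated logarithm; f_0(x) = x."""
    for _ in range(n):
        x = math.log(x)
    return x

# Pick a_{n+1} > a_n with f_{n+1}(a_{n+1}) > 1/n + f_n(a_n),
# searching by repeated doubling. a_4 would already overflow a float.
a = [1.0]  # a_1 = 1
for n in range(1, 3):
    target = 1.0 / n + f_n(n, a[-1])
    x = a[-1] + 1.0
    while f_n(n + 1, x) <= target:
        x *= 2.0
    a.append(x)

print(a)  # a_2 and a_3 are astronomically larger than a_1
```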

We can do a similar construction to show that for any countable collection of divergent functions, we can find a function which diverges slower than all of these (which is stronger than your original question, pertaining to just one function).

  • Consider the recursively defined function $f(x)=\log(f(x))$; $\log(f(x))$ would be the same as $f(x)$ – Passer By Mar 19 '15 at 08:15
  • No such function exists, since the equation $\log u=u$ has no real solutions. You can't just define functions like that arbitrarily. You have to make sure that the equation doesn't contradict basic principles. – shalop Mar 19 '15 at 08:18
  • Oops, that was a blunder. Though I'm still not thoroughly convinced that there is no way of defining a function so that most elementary functions can't be somehow "absorbed" – Passer By Mar 19 '15 at 08:35
  • Is log the only way to construct such a function? Are there any other non-log functions? (not counting stuff like $\log\log$...) – Jus12 Jun 13 '15 at 04:09
  • @Jus12: You can use anything that diverges slower than the identity function $g(x)=x$. For example, $f(x)=\sqrt x$ would work just as well as log. – shalop Jun 15 '15 at 19:05

What you call a "recursive definition" isn't really a definition; it's a formula which may be satisfied by one function, several, or none at all.

Your question seems a little contradictory - you say in the second paragraph that you know no slowest function exists, since you can always construct a slower one, and then immediately go on to ask for a slowest function regardless. The fact that you have a novel method of defining a function doesn't change the fact that no such slowest function exists, as you said yourself. The fact that you can always construct a slower function means that you don't even need to consider new methods of defining your function, because the operative word is always. It would be a bit like if you proved to me that no rational number $a$ satisfies $a^2=2$, and I said "okay, but what if I used a really big denominator?". You would simply reply, "I said no rational number".

On the other hand, as discussed in Shalop's answer, if you're willing to relax your definition of a "slowest" function, then the idea of defining a sequence of functions via some recurrence relation may yield some interesting results. A very simple such relation would be $f_{n+1} = \sqrt[3]{f_n}$. As long as $f_0$ diverges, then so do all the $f_n$, and each diverges more slowly than the last.
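As a quick illustration (a Python sketch; unwinding the recurrence with $f_0(x)=x$ gives the closed form $f_n(x) = x^{1/3^n}$, which is what the helper below computes directly):

```python
# Sketch of f_{n+1} = f_n^(1/3) with f_0(x) = x, i.e. f_n(x) = x^(1/3^n).
# Every iterate still diverges, just more slowly than the previous one.
def f(n, x):
    return x ** (3.0 ** -n)

x = 1e12
vals = [f(n, x) for n in range(4)]
print(vals)  # a rapidly slowing, but still divergent, chain of functions
```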

Jack M

Let me try to connect your question with your observation about open sets. As stated by Shalop, no such function exists. You can see this, for example, as follows. Let $f(x)$ be a (positively) divergent function - a function such that $\lim\limits_{x\to\infty} f(x) = \infty$. Since you can define $g(x) := \log f(x)$, which diverges more slowly than $f(x)$, there is always a function which is "less divergent, but still divergent". Thus, there cannot exist a "least divergent" function.

This is exactly analogous to the properties of an open set, where, for example, the interval $(0, \infty)$ has no smallest element. The derivative of a function measures its rate of change - a function whose derivative is large and positive at a point $x$ is increasing quickly at that point. Now, if we think of functions $f(x)$ like $x, \log x, \log(\log x), \ldots$, we can look at how their derivatives behave.

In order for such a function to stop being divergent, intuitively, we need it to level out at some point. That is, its derivative - its rate of change - must become less than or equal to zero. After all, as soon as the derivative crosses zero, the function is no longer headed toward $+\infty$. In particular, the interval of derivative values for which $f(x)$ no longer heads toward $+\infty$ has an endpoint: it is $(-\infty, 0]$. Namely, the interval $(0, \infty)$ is the interval of derivative values for which $f(x)$ is still heading toward infinity - it's an open interval!

Granted, the description I gave is very non-rigorous, and only deals with increasing functions. To deal with functions like $x\exp(\sin(x))$ or $|x \sin(x)|$, we would need a different approach. Each still tends to $+\infty$ as $x \to \infty$, but their derivatives fluctuate between positive and negative values! To use the same approach, we could instead look at the largest value of these functions so far: the envelope function

$E(x) := \sup\limits_{0\leq y\leq x} f(y).$

Notice that $E(x)$ is now an increasing function: for all $x_2 \geq x_1 \geq 0$, $E(x_2)\geq E(x_1)$. Then in order for $f(x)$ not to be a divergent function, we require that $\lim\limits_{x\to\infty} E(x) < \infty$. But this can only happen if there is some point $x_0$ past which $E(x)$ stops increasing - i.e.

There exists $x_0 \geq 0$ such that there does not exist $x > x_0$ with $E(x) > E(x_0)$. If this is true, then there is an interval $I = (a,\infty)$ on which $E$ is constant, i.e. $E'(x) \equiv 0$ on $I$. This means that the derivative of $E$ once again takes values in the closed interval $(-\infty, 0]$, so that $f(x)$ cannot tend to $+\infty$.
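To make the envelope idea concrete, here is a small Python sketch that samples $f(x) = |x \sin x|$ on a grid and takes the running maximum as a discrete stand-in for $E(x)$ (the grid spacing and range are arbitrary illustrative choices):

```python
import math
from itertools import accumulate

# Discrete sketch of the envelope E(x) = sup over 0 <= y <= x of f(y),
# for the oscillating-but-divergent f(x) = |x sin(x)|.
xs = [k * 0.01 for k in range(10_000)]   # sample 0 <= x < 100
fs = [abs(x * math.sin(x)) for x in xs]
E = list(accumulate(fs, max))            # running maximum = discrete envelope

# E is nondecreasing even though f itself keeps dipping back to 0,
# and E still grows without bound as x grows.
print(E[-1])
```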

  • Re the 'derivative going non-positive' argument, that may be a necessary condition but it is not sufficient. The function `f(x) = x/2 + sin(x)` has a non-positive derivative much of the time, yet clearly diverges to positive infinity. – abligh Mar 19 '15 at 13:53