
I have two mathematical functions: log(log* n) and 2^(log* n). I want to determine the asymptotic growth of these two functions (in particular, I want to find big-Theta bounds), and then compare them. Can anyone share a formal or intuitive approach for solving this kind of problem?

Thanks.

  • Are you talking about the complexity of calculating the function, or the growth of the function itself? For your purposes, both go to infinity more slowly than almost anything else you've come across, with `log(log* n) < log* n < 2^(log* n)`. For example, for `n = 2^65536` we are talking `1.6 < 5 < 32`. For all practical purposes, all three are constants. – btilly Feb 06 '19 at 19:10
  • Thanks, and sorry for the unclear question. I am looking for the growth of the functions; I have corrected the question. – kayas Feb 06 '19 at 19:48
  • Umm... `2^(log n)` equals `n`. `log(log n)` is already in its simplest asymptotic form, you cannot simplify it further (and it grows very _s l o w l y_). – user58697 Feb 06 '19 at 19:53
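
To make the numbers in btilly's comment above concrete, here is a small Python sketch (my own illustration, not from the original thread). It uses a base-2 iterated logarithm and, as an assumption, a natural log for the outermost log, which is what reproduces the 1.6 figure:

```python
import math

def log_star(n):
    """Iterated base-2 logarithm: how many times log2 must be applied
    before the value drops to 1 or below."""
    count = 0
    x = n
    while x > 1:
        x = math.log2(x)   # math.log2 accepts arbitrarily large Python ints
        count += 1
    return count

n = 2 ** 65536
s = log_star(n)                  # 5
print(math.log(s), s, 2 ** s)    # ~1.609, 5, 32  ->  1.6 < 5 < 32
```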

1 Answer


This is an interesting one. Let's begin with a standard technique: it's hard to reason about exponentials directly, so let's take the logarithm of each function and see what happens. That leaves us with these functions:

log(log(log* n)) and log* n.

Now the question is how these compare against one another. Generally speaking, the log of a function grows more slowly than the function itself, provided the function keeps growing without bound. Using the fact that log k < k for all k ≥ 1: if we pick any n ≥ 2^2^2^2^2 (a power tower of five 2s, i.e. 2^65536), we'll have that log* n ≥ 4, so log log* n ≥ 2, and applying the inequality twice gives log log log* n < log log* n < log* n. That means log log log* n = O(log* n). Undoing the logarithm we took at the start (since 2^x is increasing, exponentiating both sides preserves the inequality), this gives log log* n = O(2^(log* n)).
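
As a quick numerical illustration (a minimal sketch of my own, not part of the original answer): both functions depend on n only through s = log* n, and log* n grows without bound as n does, so it is enough to compare log s against 2^s for increasing s:

```python
import math

# s plays the role of log*(n); it slowly tends to infinity as n grows.
for s in range(1, 11):
    lhs = math.log(s)    # log(log* n)
    rhs = 2 ** s         # 2^(log* n)
    print(s, round(lhs, 3), rhs, lhs < rhs)
```

Every row prints True and the gap widens rapidly, matching the conclusion that log(log* n) = O(2^(log* n)), and in fact o(2^(log* n)).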

templatetypedef