I have an algorithm that has a run time proportional to log(1) + log(2) + ... + log(N). Clearly this algorithm runs in O(N log(N)) time. However, I have an intuition that there might be a tighter bound, because the one I've produced uses the value of the largest logarithm term to bound all of the logarithm terms, even though many of the terms are much smaller. Am I correct? Is there a tighter bound that is still simple to express arithmetically?

Jordan

2 Answers

log(1) + log(2) + ... + log(N) = log(1 * 2 * 3 * ... * N)

which is equal to log(N!) (see Stirling's approximation),

which is at most log(N^N) = N log(N), since log(a^b) = b log(a).

So the sum is O(N log(N)), and to answer your other question, I can't see any tighter bound in this case. I hope this helps.
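A quick numerical check (a sketch, not part of the original answer) shows that the sum of logs not only stays below N log(N) but approaches it, so the bound is tight up to a constant factor:

```python
import math

def log_sum(n):
    """Compute S(n) = log(1) + log(2) + ... + log(n)."""
    return sum(math.log(k) for k in range(1, n + 1))

# If S(n) / (n * log(n)) approaches 1 as n grows, the sum is
# Theta(n log n), not merely O(n log n).
for n in (10, 1_000, 100_000):
    ratio = log_sum(n) / (n * math.log(n))
    print(f"n={n:>6}  S(n)/(n log n) = {ratio:.4f}")
```

The printed ratios increase toward 1 (slowly, since the gap is on the order of n / (n log n)), confirming there is no asymptotically tighter simple bound.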

Kashif Faraz Shamsi

Since log(a) + log(b) = log(ab), the complexity is log(n!).

log(n!) = n log(n) - n + O(log(n))

For large n, the right side is dominated by the term n log(n). That implies that log(n!) = O(n log(n)).
We can prove log(n!) = O(n log n) using Stirling's approximation; you can read further at Why is log(n!) O(nlogn)?
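The expansion above can be verified numerically (a sketch, not from the original answer), using `math.lgamma(n + 1)` to get log(n!) without overflow:

```python
import math

# Stirling: log(n!) = n*log(n) - n + O(log n).
# math.lgamma(n + 1) returns log(n!) directly, so huge n is fine.
for n in (10, 1_000, 1_000_000):
    exact = math.lgamma(n + 1)       # log(n!)
    stirling = n * math.log(n) - n   # leading terms of the expansion
    error = exact - stirling
    # The error should be small, on the order of log(n).
    print(f"n={n:>7}  log(n!)={exact:.2f}  n log n - n={stirling:.2f}  error={error:.2f}")
```

Even at n = 1,000,000 the error is single digits while n log(n) is over a million, so the n log(n) term indeed dominates.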

marvel308