I have an algorithm whose running time is proportional to log(1) + log(2) + ... + log(N). Clearly this algorithm runs in O(N log(N)) time. However, I have an intuition that there might be a tighter bound, because the one I've produced uses the value of the largest logarithm term to bound all of the terms, even though many of them are much smaller. Am I correct? Is there a tighter bound that is still simple to express arithmetically?
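To make the question concrete, here is a small Python sketch (the helper name `log_sum` is just for illustration) that computes the sum directly and compares it against the N log(N) bound:

```python
import math

def log_sum(n):
    """Sum of log(1) + log(2) + ... + log(n), i.e. log(n!)."""
    return sum(math.log(k) for k in range(1, n + 1))

n = 1000
exact = log_sum(n)
bound = n * math.log(n)
print(exact, bound, exact / bound)  # the sum is noticeably below the bound
```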
Jordan
It's not completely a duplicate, but the step sum(log(k), k=1..n) = log(n!) is trivial. – Paul Hankin Aug 06 '17 at 13:52
2 Answers
log(1) + log(2) + ... + log(N)
= log(1 * 2 * 3 * ... * N)
= log(N!)
Since N! ≤ N^N, we have log(N!) ≤ log(N^N), and since log(a^b) = b log(a), that gives N log(N).
To answer your other question: I can't see any tighter bound in this case. Stirling's approximation also gives log(N!) ≥ N log(N) - N, so log(N!) = Θ(N log(N)).
I hope this helps.
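A quick numerical check of the identity above (a sketch; `math.lgamma(n + 1)` computes log(n!) exactly enough for comparison):

```python
import math

for n in (10, 100, 1000):
    s = sum(math.log(k) for k in range(1, n + 1))  # log(1) + ... + log(n)
    assert math.isclose(s, math.lgamma(n + 1))     # equals log(n!)
    # ratio of the exact sum to the N log N bound; it creeps toward 1
    print(n, s, n * math.log(n), s / (n * math.log(n)))
```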
Kashif Faraz Shamsi
Since log(a) + log(b) = log(ab), the sum is log(n!).
By Stirling's approximation,
log(n!) = n log(n) - n + O(log(n))
For large n, the right-hand side is dominated by the term n log(n). That implies log(n!) = O(n log(n)) (in fact Θ(n log(n))), so there is no asymptotically tighter bound.
You can read further at Why is log(n!) O(nlogn)?
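The Stirling estimate can be checked numerically; a minimal sketch comparing log(n!) against its leading terms n log(n) - n (the gap is the O(log n) correction, roughly 0.5 log(2πn)):

```python
import math

for n in (10, 100, 1000, 10**6):
    exact = math.lgamma(n + 1)        # log(n!)
    stirling = n * math.log(n) - n    # leading terms of Stirling's formula
    # the difference grows only logarithmically in n
    print(n, exact, stirling, exact - stirling)
```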
marvel308