
If inserting an element into a map takes log(N) time, where N is the size of the map, then inserting the elements of an array one by one takes log(1) + log(2) + ... + log(N) = log(N!) time. But the usual best complexity for getting the elements sorted is N log(N). Where am I going wrong?
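For concreteness, here is a minimal sketch of what I mean, assuming C++ and a `std::multiset` (an ordered tree with the same O(log N) insertion as `std::map`, but keeping duplicates):

```cpp
#include <iostream>
#include <set>
#include <vector>

int main() {
    std::vector<int> arr = {5, 1, 4, 1, 3};

    // Insert the N elements one by one; the i-th insertion costs O(log i),
    // so the total work is log(1) + log(2) + ... + log(N) = log(N!).
    std::multiset<int> tree;
    for (int x : arr) tree.insert(x);

    // Traversing the tree in order yields the array sorted.
    for (int x : tree) std::cout << x << ' ';
    std::cout << '\n';  // prints: 1 1 3 4 5
}
```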

shri2k

1 Answer


Nowhere: O(log n!) == O(n log n). The proof is a bit of math.

First, we have log(n!) = log(1) + log(2) + ... + log(n) <= log(n) + log(n) + ... + log(n) = n log(n). On the other hand, we also get the following:

2 log(n!) = 2*(log(1) + log(2) + ... + log(n)) 
          = (log(1) + log(n)) + (log(2) + log(n-1)) + ... + (log(i) + log(n-i+1)) + ... + (log(n) + log(1)) 
          = log(1*n) + log(2*(n-1)) + ... + log(i*(n-i+1)) + ... + log(n*1) 
          >= log(n) + ... + log(n) = n log(n)

We get the inequality in the last step because i*(n-i+1) - n = (i-1)*(n-i) >= 0 for 1 <= i <= n, so each term i*(n-i+1) is at least n (it may seem a bit mysterious, but it basically says that products grow faster than sums).

So we have log(n!) <= n log(n) <= 2 log(n!). By the definition of the O-notation this means that O(log(n!)) = O(n log(n)).
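If you want to see that sandwich numerically, here is a quick sketch (assuming C++) that just sums the logs directly:

```cpp
#include <cmath>
#include <cstdio>

int main() {
    const int ns[] = {10, 100, 1000, 100000};
    for (int n : ns) {
        // log(n!) computed as log(1) + log(2) + ... + log(n)
        double log_fact = 0.0;
        for (int i = 1; i <= n; ++i) log_fact += std::log(static_cast<double>(i));
        double n_log_n = n * std::log(static_cast<double>(n));

        // The sandwich from above: log(n!) <= n log(n) <= 2 log(n!)
        std::printf("n=%7d  log(n!)=%12.1f  n log n=%12.1f  2 log(n!)=%12.1f\n",
                    n, log_fact, n_log_n, 2 * log_fact);
    }
}
```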

n314159
  • Okay, I got that, but doesn't this mean that its worst case, log(n!), is better than sorting an array, whose worst case can go up to n log(n)? – shri2k Dec 28 '19 at 10:45
  • I think you should think/read again about why we use the O-notation. The runtime of sorting the array is not exactly `O(n log n)`, and inserting an element into a map does not take exactly `log n` time. Both are rough estimates that are only true up to a constant factor. So for runtime analysis we don't differentiate between `log(n!)` and `2 log(n!)`, and it also does not make sense to do so, since the knowledge on which we base our analysis (the runtime of one insertion) is not exact to begin with, but is itself only true up to a constant factor. – n314159 Dec 28 '19 at 11:29