
Initially it seemed that it should be O(N log(N)), where N is the number of elements in the heap. But, assuming the worst case, it will take log(N) time to sift each element until N/2 nodes have been popped (since that would mean the height of the heap has been reduced by one), and then log(N)-1 time to sift each element until the next N/4 nodes have been popped.
Therefore it becomes a series like

N/2 * log(N) + N/4 * (log(N)-1) + N/8 * (log(N)-2) + ... + N/2^(log(N)) * (log(N) - height of heap)


Where the last term is basically N/N * 0 = 0.

I can't figure out the sum of this series. I tried integrating it in its standard form,

integral of N * (log(N) - x) / 2^(x+1) dx, limits 0 to log(N)

but Wolfram gave me a complicated answer.
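(For concreteness, the operation I'm analyzing is just popping every element off the heap. A minimal sketch using Python's heapq, purely for illustration, assuming a binary min-heap:)

    import heapq

    def pop_all(items):
        # build a heap, then pop every element off it; the total cost
        # of the pops is the sum I'm trying to work out
        heap = list(items)
        heapq.heapify(heap)                  # O(N) build
        out = []
        while heap:
            out.append(heapq.heappop(heap))  # each pop sifts the root down
        return out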


1 Answer


If you have n items in a heap, then popping the root item has a worst-case complexity of log(n). You then have n-1 items on the heap, and the complexity of popping the root item is log(n-1). So the series you want to sum is:

log(n) + log(n-1) + log(n-2) + log(n-3) + ... + log(n-n+1)

Or, easier to understand:

log(1) + log(2) + log(3) + ... + log(n)

https://stackoverflow.com/a/21152768/56778 explains how that is O(n log n), as well as Θ(n log n).

Alternatively, log(a) + log(b) is equal to log(a*b), so the summation of logs from 1 to n is equal to log(n!). See https://math.stackexchange.com/questions/589027/whats-the-formula-to-solve-summation-of-logarithms
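To spell out why that is Θ(n log n) (a standard bounding argument, added here for completeness):

    \log(n!) = \sum_{k=1}^{n} \log k \le \sum_{k=1}^{n} \log n = n \log n

    \log(n!) \ge \sum_{k=\lceil n/2 \rceil}^{n} \log k \ge \frac{n}{2} \log \frac{n}{2}

The first line gives O(n log n); the second gives Ω(n log n), since (n/2) log(n/2) = (n/2)(log n - log 2).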

See also Is log(n!) = Θ(n·log(n))?
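If you want a quick numerical sanity check (purely illustrative), the ratio of the exact sum to n log n creeps toward 1 as n grows:

    import math

    for n in (10, 1_000, 1_000_000):
        exact = sum(math.log2(k) for k in range(1, n + 1))  # log2(n!)
        print(n, exact / (n * math.log2(n)))                # ratio tends to 1 (slowly)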
