
I read everywhere that the time complexity of heapsort is O(n log n) in the worst case. But we also read everywhere that it is a common misconception that a heap is built in O(n log n); instead, a heap can be built in O(n). So, given that a heap can be built in O(n), look at the following sorting algorithm and tell me where my analysis of its time complexity goes wrong.

  1. Put n elements into a heap (time: O(n))
  2. Until the heap is empty, pop each element and copy it into an array. (Time: O(n). Why? Because in the same way that all elements can be put into a heap in O(n), they can all be extracted in O(n). Right?)

All in all, the complexity is O(n) + O(n), which is O(n). But this approach also needs O(n) additional memory.

I know the traditional heapsort has a time complexity of O(n log n) and a memory complexity of O(1). But isn't this heapsort too? And it gives O(n) even in the worst case, unlike the traditional heapsort algorithm.
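A minimal sketch of the algorithm described above (assuming Python's heapq module; the helper name heap_sort_with_buffer is made up for illustration). heapq.heapify performs the O(n) bottom-up construction of step 1, and the loop performs step 2:

```python
import heapq

def heap_sort_with_buffer(items):
    """Step 1: build a heap in O(n). Step 2: pop every element into a new list."""
    heap = list(items)
    heapq.heapify(heap)                      # bottom-up heap construction, O(n)
    out = []
    while heap:
        out.append(heapq.heappop(heap))      # each pop costs O(log n), not O(1)
    return out

print(heap_sort_with_buffer([5, 1, 4, 2, 3]))  # [1, 2, 3, 4, 5]
```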

  • Popping the top element is O(1), but you also need to rearrange the heap to maintain the heap property, which takes log(n) time. – nice_dev Jun 24 '19 at 13:51
  • No, it has a log(n) complexity. But the size of the heap is also shrinking. It is not log(n) for the elements that will be popped later. Similarly, when you create a heap, most of the elements don't take log(n) to be inserted. – Chirag Arora Jun 24 '19 at 13:53
  • @ChiragArora See the very detailed discussion in my suggested dupe. The most important point is that although the tree is shrinking, it doesn't shrink fast enough. – molbdnilo Jun 24 '19 at 13:55
  • @molbdnilo But we give a similar explanation for why a heap is built in O(n). Though insertion is a log(n) process, not every element takes log(n) time to be inserted as the heap grows. Moreover, the insertion and removal processes are very similar for a heap. It only makes sense for them to have the same time complexity. – Chirag Arora Jun 24 '19 at 13:59
  • Average time of insertion into a heap is O(1) and the worst case is O(log n). There seems to be a lot of study on this. See [this](https://stackoverflow.com/questions/39514469/argument-for-o1-average-case-complexity-of-heap-insertion) – SomeDude Jun 24 '19 at 14:29
  • Superficially, insertion and removal look similar, but the behavior is quite different. As discussed above, *on average* insertion is O(1) because half the time, the item will belong on the leaf level and there's no need to move it. But removing an item involves replacing the root with the last item on the heap and then sifting it down. Most often, that last item will get sifted all the way back down to the leaf level, which means log(n) operations. – Jim Mischel Jun 25 '19 at 17:27
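The asymmetry described in the comments above can be made visible with a small experiment (an illustrative sketch with a hand-rolled binary min-heap; the helper names are invented for this example). Bottom-up construction performs O(n) sift moves in total, while draining the heap by repeated extraction performs on the order of n log n moves:

```python
import random

def sift_down(a, i, n, moves):
    """Move a[i] down until the min-heap property holds within a[:n]."""
    while True:
        left, right, smallest = 2 * i + 1, 2 * i + 2, i
        if left < n and a[left] < a[smallest]:
            smallest = left
        if right < n and a[right] < a[smallest]:
            smallest = right
        if smallest == i:
            return
        a[i], a[smallest] = a[smallest], a[i]
        moves[0] += 1
        i = smallest

def build_heap(a, moves):
    """Bottom-up heap construction (Floyd's method): O(n) moves in total."""
    for i in range(len(a) // 2 - 1, -1, -1):
        sift_down(a, i, len(a), moves)

def drain_heap(a, moves):
    """Pop the minimum until the heap is empty: about n log n moves in total."""
    out, n = [], len(a)
    while n > 0:
        out.append(a[0])
        n -= 1
        a[0] = a[n]                # replace the root with the last element ...
        sift_down(a, 0, n, moves)  # ... and sift it back down, usually log(n) deep
    return out

random.seed(0)
data = [random.random() for _ in range(1 << 16)]
build_moves, drain_moves = [0], [0]
build_heap(data, build_moves)
drain_heap(data, drain_moves)
print(build_moves[0], drain_moves[0])  # draining does far more moves than building
```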

1 Answer


Note that you can't sort an array in O(n) without some additional information about your data. In fact, there is a lower bound of Ω(n log n) for sorting an array with any comparison-based algorithm, and for the same reason that lower bound applies to heapsort.

In other words, you can't ever sort arbitrary data in O(n) using comparisons alone. Any linear-time sorting algorithm has to assume some prior knowledge about the data (as counting sort and radix sort do). For more information on how to prove this Ω(n log n) lower bound, search for "decision trees".
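For reference, here is the standard decision-tree bound the answer alludes to (a sketch, not part of the original answer): any comparison sort must be able to produce all n! orderings of its input, so its decision tree has at least n! leaves, and a binary tree with n! leaves has height at least log2(n!):

```latex
% Height h of the decision tree of any comparison sort.
% n! >= (n/2)^(n/2), because the largest n/2 factors of n! are each >= n/2.
\[
  h \;\ge\; \log_2(n!) \;\ge\; \log_2\!\Big(\big(\tfrac{n}{2}\big)^{n/2}\Big)
    \;=\; \tfrac{n}{2}\log_2\tfrac{n}{2} \;=\; \Omega(n \log n).
\]
```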
