3

A way of finding the median of a given set of n numbers is to distribute them between two heaps: a max-heap containing the lower ceil(n/2) numbers and a min-heap containing the rest. Maintained this way, the median is the max of the first heap (averaged with the min of the second heap if n is even). Here's my C++ code that does this:

#include <iostream>
#include <queue>
#include <vector>
using namespace std;

int main() {
    priority_queue<int, vector<int> > left;                 // max-heap: lower half
    priority_queue<int, vector<int>, greater<int> > right;  // min-heap: upper half
    int n, a;
    cin >> n; // n = number of items
    for (int i = 0; i < n; i++) {
        cin >> a;
        if (left.empty())
            left.push(a);
        else if (left.size() <= right.size()) {
            if (a <= right.top())
                left.push(a);
            else {
                left.push(right.top());
                right.pop();
                right.push(a);
            }
        }
        else {
            if (a >= left.top())
                right.push(a);
            else {
                right.push(left.top());
                left.pop();
                left.push(a);
            }
        }
    }
    return 0;
}
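For reference, reading the current median back out of the two heaps could look like this sketch (current_median is just an illustrative name, not part of the code above):

```cpp
#include <queue>
#include <vector>

// Sketch: left is a max-heap holding the lower ceil(n/2) values, right a
// min-heap holding the rest, so the median is left.top(), averaged with
// right.top() when n is even. Assumes at least one value was inserted.
double current_median(
    const std::priority_queue<int, std::vector<int>>& left,
    const std::priority_queue<int, std::vector<int>, std::greater<int>>& right) {
    if (left.size() == right.size())
        return (left.top() + right.top()) / 2.0;  // n even: average the two middles
    return left.top();                            // n odd: left holds the extra element
}
```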

We know that the heapify operation has linear complexity. Does this mean that if we insert numbers one by one into the two heaps, as in the above code, we are finding the median in linear time?

kabir

3 Answers


Linear time heapify is for the cost of building a heap from an unsorted array as a batch operation, not for building a heap by inserting values one at a time.

Consider a max heap where you are inserting a stream of values in increasing order. Each new value is larger than everything already in the heap, so it is placed at the bottom and trickles all the way up to the top. Consider just the last half of the values inserted: at that point the heap has very nearly its full height, which is log(n), so each value trickles up through log(n) slots, and the cost of inserting those n/2 values alone is O(n log(n)).

If I present a stream of values in increasing order to your median-finding algorithm, about every other value displaces the smallest element of the min heap into the max heap, where the displaced element becomes the new maximum and trickles all the way up, so the cost of the median finding is O(n log(n)). In fact the heaps are doing deletes as well as insertions, but this is just a constant factor on top, so I think the overall complexity is still O(n log(n)).
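The difference can be seen by counting comparisons. The sketch below (the helper names and counting mechanism are mine, not from the question) builds a max-heap from an ascending sequence either one push at a time or with a single std::make_heap call:

```cpp
#include <algorithm>
#include <queue>
#include <vector>

// Global comparison counter shared by both build strategies.
static long long cmp_count = 0;

struct CountingLess {
    bool operator()(int a, int b) const { ++cmp_count; return a < b; }
};

// One push at a time: each new value is the maximum so far, so it sifts
// from a leaf all the way to the root; n pushes cost about n*log2(n)
// comparisons in total.
long long comparisons_push_one_by_one(int n) {
    cmp_count = 0;
    std::priority_queue<int, std::vector<int>, CountingLess> pq;
    for (int v = 0; v < n; ++v) pq.push(v);
    return cmp_count;
}

// Bottom-up heapify over the same data stays within O(n) comparisons.
long long comparisons_heapify(int n) {
    cmp_count = 0;
    std::vector<int> a(n);
    for (int v = 0; v < n; ++v) a[v] = v;
    std::make_heap(a.begin(), a.end(), CountingLess{});
    return cmp_count;
}
```

On an ascending input of 2^15 values, the one-at-a-time build performs on the order of n log n comparisons, while make_heap stays within a small constant times n.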

mcdowella
  1. When there is one element, the complexity of the step is Log 1 because of a single element being in a single heap.

  2. When there are two elements, the complexity of the step is Log 1 as we have one element in each heap.

  3. When there are four elements, the complexity of the step is Log 2 as we have two elements in each heap.

So, when there are n elements, the complexity of a step is Log n, as we have n/2 elements in each heap and

  • adding an element, as well as
  • removing an element from one heap and adding it to another,

takes O(Log n/2) = O(Log n) time.


So keeping track of the median of n elements is essentially done by performing:

2 * ( Log 1 + Log 2 + Log 3 + ... + Log n/2 ) steps.

The factor of 2 comes from performing the same step in 2 heaps.


The above summation can be handled in two ways. One way gives a tighter bound but it is encountered less frequently in general. Here it goes:

  • Log a + Log b = Log a*b (By property of logarithms)
  • So, the summation is actually Log ((n/2)!) = O(Log n!).

The second way is:

  • Each of the values Log 1, Log 2, ... Log n/2 is less than or equal to Log n/2
  • As there are a total n/2 terms, the summation is less than (n/2) * Log (n/2)
  • This implies the function is upper-bounded by (n/2) * Log (n/2)
  • Or, the complexity is O(n * Log n).

The second bound is looser but better known.
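Both bounds are easy to check numerically. In this sketch, sum_log2 and log2_factorial are hypothetical helper names; lgamma supplies log((n/2)!) for comparison:

```cpp
#include <cmath>

// Sum Log 1 + Log 2 + ... + Log m directly (base 2).
double sum_log2(int m) {
    double s = 0.0;
    for (int i = 1; i <= m; ++i) s += std::log2(i);
    return s;
}

// log2(m!) via the gamma function: log2(m!) = ln(Gamma(m + 1)) / ln(2).
double log2_factorial(int m) {
    return std::lgamma(m + 1.0) / std::log(2.0);
}
```

For m = 1000, sum_log2(1000) agrees with log2(1000!) (about 8530) and stays below the looser bound 1000 * log2(1000) (about 9966).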

displayName

This is a great question, especially since you can find the median of a list of numbers in O(N) time using Quickselect.

But the dual priority-queue approach gives you O(N log N), unfortunately.

Riffing on the binary heap wiki article: heapify is a bottom-up operation. You have all the data in hand, which allows you to be cunning and reduce the number of swaps/comparisons to O(N). You can build an optimal structure from the get-go.

Adding elements from the top, one at a time, as you are doing here, requires reorganizing the heap every time. That's expensive, so the whole operation ends up being O(N log N).

Richard