
It seems there are lots of improvements in .NET 4.0 related to concurrency, some of which might rely on concurrent priority queues. Is there a decent priority queue implementation available for reuse inside the framework?

SwDevMan81

9 Answers


There is an implementation as part of the "Samples for Parallel Programming with the .NET Framework" on MSDN. See ParallelExtensionsExtras.

Direct link to the source code for the file ConcurrentPriorityQueue.cs

Markus Jarderot
  • There appears to be a bug in that implementation. If you wrap it in a BlockingCollection and call Add five times, it puts the items in the wrong order in the exposed collection. If you dig down into the private members of the blocking ConcurrentPriorityQueue, you can see that the underlying CPQ itself contains the right data in the right order, but the exposed collection is out of order (the CPQ contains 0,1,2,3,4; the exposed collection contains 0,4,3,2,1). So don't use this version as part of a BlockingCollection. – Joe Aug 29 '12 at 05:44
  • But which one should we use? Do you know of anything better? – ironic Apr 29 '13 at 12:14
  • @Joe: I've also tried using it wrapped in a BlockingCollection, and your problem doesn't repro for me... so if you have more details on how to reproduce it, I'd appreciate them! – Timur Sadykov Apr 21 '15 at 04:18

You may need to roll your own. A relatively easy way would be to have an array of regular queues, with priority decreasing.

Basically, you would insert into the queue for the appropriate priority. Then, on the consumer side, you would go down the list, from highest to lowest priority, checking to see if the queue is non-empty, and consuming an entry if so.
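A minimal sketch of this approach (class and member names are illustrative, not from any framework), built from one `ConcurrentQueue<T>` per priority level:

```csharp
using System.Collections.Concurrent;

// Sketch of the "array of regular queues" idea: one ConcurrentQueue per
// priority level, where priority 0 is highest. Not production code.
public class SimplePriorityQueue<T>
{
    private readonly ConcurrentQueue<T>[] queues;

    public SimplePriorityQueue(int priorityLevels)
    {
        queues = new ConcurrentQueue<T>[priorityLevels];
        for (int i = 0; i < priorityLevels; i++)
            queues[i] = new ConcurrentQueue<T>();
    }

    public void Enqueue(int priority, T item) => queues[priority].Enqueue(item);

    // Scan from highest to lowest priority and take the first available item.
    public bool TryDequeue(out T item)
    {
        foreach (var queue in queues)
            if (queue.TryDequeue(out item))
                return true;
        item = default;
        return false;
    }
}
```

Each individual queue is lock-free, but note that the scan is not atomic across levels: a higher-priority item enqueued mid-scan can be missed until the next call.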

Steven Sudit
  • +1, I forgot to mention how to implement it because I thought it was pretty straightforward :) Without that my answer isn't complete. – Nick Martyshchenko Oct 25 '10 at 16:40
  • It's not much of an implementation. It could probably be improved by having a ManualResetEvent that's signaled after inserting into *any* of the queues, so that the consumer can poll once, then wait until the signal fires. In practice, it would be best for that wait to be finite and relatively short (perhaps a quarter of a second) so as to avoid circumstances where the signal is missed. – Steven Sudit Oct 25 '10 at 17:20
  • Paw's implementation locks unnecessarily, but the overall structure is a good starting point. – Steven Sudit Oct 25 '10 at 22:49

Maybe you can use my own implementation of a PriorityQueue. It implements a lot more than the usual push/pop/peek; those are features I added whenever I found a need for them. It also has locks for concurrency.

Comments on the code are much appreciated :)

using System.Collections.Generic;
using System.Linq;

public class PriorityQueue<T> where T : class
{
    private readonly object lockObject = new object();
    private readonly SortedList<int, Queue<T>> list = new SortedList<int, Queue<T>>();

    public int Count
    {
        get
        {
            lock (this.lockObject)
            {
                return list.Sum(keyValuePair => keyValuePair.Value.Count);
            }
        }
    }

    public void Push(int priority, T item)
    {
        lock (this.lockObject)
        {
            if (!this.list.ContainsKey(priority))
                this.list.Add(priority, new Queue<T>());
            this.list[priority].Enqueue(item);
        }
    }
    public T Pop()
    {
        lock (this.lockObject)
        {
            if (this.list.Count > 0)
            {
                T obj = this.list.First().Value.Dequeue();
                if (this.list.First().Value.Count == 0)
                    this.list.Remove(this.list.First().Key);
                return obj;
            }
        }
        return null;
    }
    public T PopPriority(int priority)
    {
        lock (this.lockObject)
        {
            if (this.list.ContainsKey(priority))
            {
                T obj = this.list[priority].Dequeue();
                if (this.list[priority].Count == 0)
                    this.list.Remove(priority);
                return obj;
            }
        }
        return null;
    }
    public IEnumerable<T> PopAllPriority(int priority)
    {
        List<T> ret = new List<T>();
        lock (this.lockObject)
        {
            // The lock is reentrant, so calling PopPriority from inside it is safe.
            while (this.list.ContainsKey(priority) && this.list[priority].Count > 0)
                ret.Add(PopPriority(priority));
        }
        return ret;
    }
    public T Peek()
    {
        lock (this.lockObject)
        {
            if (this.list.Count > 0)
                return this.list.First().Value.Peek();
        }
        return null;
    }
    public IEnumerable<T> PeekAll()
    {
        List<T> ret = new List<T>();
        lock (this.lockObject)
        {
            foreach (KeyValuePair<int, Queue<T>> keyValuePair in list)
                ret.AddRange(keyValuePair.Value.AsEnumerable());
        }
        return ret;
    }
    public IEnumerable<T> PopAll()
    {
        List<T> ret = new List<T>();
        lock (this.lockObject)
        {
            while (this.list.Count > 0)
                ret.Add(Pop());
        }
        return ret;
    }
}
Paw Baltzersen
  • Aside from not following .NET conventions, it looks correct but slow. The slowness comes from locking everything, whereas the .NET 4.0 concurrent queue is lockless. See: http://www.codethinked.com/post/2010/02/04/NET-40-and-System_Collections_Concurrent_ConcurrentQueue.aspx – Steven Sudit Oct 25 '10 at 22:48
  • In any case, +1 for the overall structure. – Steven Sudit Oct 25 '10 at 22:50
  • What do you mean by not following .NET conventions? – Paw Baltzersen Oct 26 '10 at 06:43
  • I was thinking about making the Queue inside the SortedList a ConcurrentQueue and then only lock when adding or deleting a new item in the SortedList, and not when working on the Queues. But I'd still have to lock when checking if a given priority is present in the SortedList, so that didn't help much. – Paw Baltzersen Oct 26 '10 at 06:45
  • What about this implementation http://stackoverflow.com/a/4994931/206730? Which is better? – Kiquenet Jun 08 '13 at 10:30
  • @Kiquenet Depends if you need the generic functionality that my solution offers. – Paw Baltzersen Jun 25 '13 at 12:22

Well, 7 years have passed, but for posterity, I'd like to answer with my own implementation.

Documentation: Optionally awaitable, simple-to-use Concurrent Priority Queue

Source code: GitHub

NuGet package

  • lock-free,
  • highly concurrent,
  • generic in the stored item type,
  • generic in the priority type, but constrained to priorities represented by a .NET enum (strongly typed priorities),
  • descending order of priorities defined explicitly at construction,
  • ability to query the total item count and the per-priority-level item count,
  • ability to dequeue in descending order of priorities,
  • ability to override the dequeue priority level,
  • optionally awaitable,
  • optionally priority-based awaitable.
ipavlu
  • It would probably be better if you linked to the library with the priority queue, and not to a CodeProject article containing downloadable zip files which include a reference to a NuGet package that contains the priority queue. Or you could just add one more level of abstraction and obfuscate it further through bit.ly or goo.gl – Lasse V. Karlsen Dec 30 '17 at 22:00
  • The article explains it all: how you can use it and where it can be downloaded from (GitHub/NuGet). I'm not forcing anybody to download the zip files from the article; I just felt it would be nice to include simple examples with it. At the end of the article there are links to the GitHub source code and NuGet. I'm a little behind on the GitHub documentation, which I'll be working on next year. – ipavlu Dec 30 '17 at 23:50

Since all the current answers are out-of-date or don't offer a viable solution, there's an implementation on MSDN that's usable. Note that lower priorities get processed first in this implementation.

Hameer Abbasi
  • That implementation seems to be badly broken; see http://stackoverflow.com/q/38836447/1149773 – Douglas Aug 08 '16 at 18:51

Check Thread-safe Collections in .NET Framework 4 and Their Performance Characteristics, but AFAIK there is no ready-to-use priority queue. None of the new thread-safe collections maintains order, but you can build your own on top of them. See @Steven's approach.

Nick Martyshchenko

Options:

1) If your queue isn't ever going to become large, use a heap and lock the entire structure for each insertion and deletion.

2) If your queue is going to become large, you could use an algorithm like this:

http://www.research.ibm.com/people/m/michael/ipl-1996.pdf

This algorithm allows multiple threads to work with the heap structure concurrently, without risking corruption or deadlocks, by supporting fine-grained locking of just parts of the tree at a time. You'd have to benchmark to see whether the overhead of the additional locking and unlocking operations costs more than contention over locking the entire heap.

3) If you aim to avoid locks altogether, another algorithm, mentioned in the link above, suggests using a FIFO queue of requests (easily implementable without locks), and a separate thread which is the only thing that touches the heap. You'd have to measure to see how the overhead of switching between threads using synchronization objects compares to the overhead of plain straight-up locking.

Before you even get started, it would be worthwhile seeing just how bad the hit is on a straightforward implementation using locking. It may not be the most efficient implementation, but if it still performs orders of magnitude faster than you'll ever need, then the ease of maintenance (that is, anyone, including yourself a year from now, being able to simply look at the code and understand what it does) may outweigh the tiny fraction of CPU time spent in the queuing mechanism.
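A minimal sketch of option 1 (all names are illustrative; this is a plain binary min-heap with one lock around every operation, where the smallest key dequeues first):

```csharp
using System;
using System.Collections.Generic;

// Option 1 sketch: a binary min-heap guarded by a single lock.
// The smallest key is treated as the highest priority.
public class LockedHeapQueue<TKey, TValue> where TKey : IComparable<TKey>
{
    private readonly object gate = new object();
    private readonly List<KeyValuePair<TKey, TValue>> heap = new List<KeyValuePair<TKey, TValue>>();

    public void Enqueue(TKey key, TValue value)
    {
        lock (gate)
        {
            heap.Add(new KeyValuePair<TKey, TValue>(key, value));
            int i = heap.Count - 1;
            while (i > 0) // sift the new item up toward the root
            {
                int parent = (i - 1) / 2;
                if (heap[parent].Key.CompareTo(heap[i].Key) <= 0) break;
                (heap[parent], heap[i]) = (heap[i], heap[parent]);
                i = parent;
            }
        }
    }

    public bool TryDequeue(out TValue value)
    {
        lock (gate)
        {
            if (heap.Count == 0) { value = default; return false; }
            value = heap[0].Value;
            heap[0] = heap[heap.Count - 1]; // move the last item to the root
            heap.RemoveAt(heap.Count - 1);
            int i = 0;
            while (true) // sift the root down to restore heap order
            {
                int left = 2 * i + 1, right = left + 1, smallest = i;
                if (left < heap.Count && heap[left].Key.CompareTo(heap[smallest].Key) < 0) smallest = left;
                if (right < heap.Count && heap[right].Key.CompareTo(heap[smallest].Key) < 0) smallest = right;
                if (smallest == i) break;
                (heap[smallest], heap[i]) = (heap[i], heap[smallest]);
                i = smallest;
            }
            return true;
        }
    }
}
```

Both operations are O(log n) plus the cost of one uncontended lock; as the answer says, measure this baseline before reaching for the fancier algorithms.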

Hope this helps :-)

Jonathan Gilbert
  • To elaborate slightly -- it is worth noting that the number of lock operations in the algorithm in 2) is proportional to the height of the tree, which is O(lg n). So, each time you want to add one lock/unlock operation, you need to *double* the number of items you're queuing up. – Jonathan Gilbert May 09 '13 at 18:38
  • Also, to clarify on 3), I haven't read the referenced paper directly, but I believe the idea is that the front end presented to consumers who want to enqueue or dequeue items is basically an adapter to the thread running the actual requests. In either case, a request structure is built with something like a ManualResetEvent and put into the FIFO. It then waits on the event. The request processor thread picks it up, does the work, and sets the event before moving on. This serializes all access to the heap without using locks, but the blocking operations and thread switches may be just as bad. – Jonathan Gilbert May 09 '13 at 18:41
  • Worth noting that only consumers would actually have to wait on their requests, though. Producers could "fire and forget". :-) – Jonathan Gilbert May 09 '13 at 18:41

Recently, I was creating a state machine in which I needed time-stamped events. Rather than just a simple clock tick, I needed timed events with their own IDs so that I could distinguish one event from another.

Researching this problem led me to the idea of using a priority queue. I could en-queue the timed events along with their information in any order; the priority queue would take care of ordering the events properly. A timer would periodically check the priority queue to see if it is time for the event at the head of the queue to fire. If so, it de-queues the event and invokes the delegate associated with it. This approach was exactly what I was looking for.
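The timed-event idea can be sketched roughly like this (all names are illustrative; the actual article uses a skip-list-based queue rather than the sorted dictionary shown here):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Sketch of the timed-event approach: events are ordered by due time (with a
// unique id breaking ties), and a periodic check fires the head while it is due.
public class TimedEventQueue
{
    private readonly object gate = new object();
    private readonly SortedDictionary<(DateTime Due, int Id), Action> events
        = new SortedDictionary<(DateTime, int), Action>();
    private int nextId;

    public void Schedule(DateTime due, Action handler)
    {
        lock (gate) events.Add((due, nextId++), handler);
    }

    // Called periodically by a timer: fire every event whose due time has passed.
    public void CheckDue(DateTime now)
    {
        while (true)
        {
            Action handler;
            lock (gate)
            {
                if (events.Count == 0) return;
                var head = events.First(); // earliest due time
                if (head.Key.Due > now) return;
                handler = head.Value;
                events.Remove(head.Key);
            }
            handler(); // invoke outside the lock
        }
    }
}
```

The unique id per event is what lets two events scheduled for the same instant coexist and still be distinguished, which was the requirement described above.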

Searching here at CodeProject

https://www.codeproject.com/Articles/13295/A-Priority-Queue-in-C

I found that a priority queue class had already been written. However, it occurred to me that I could easily write my own using my old friend, the skip list. This would have the advantage that the dequeue operation takes only O(1) time, while the enqueue operation is still O(log n) on average. I thought that using skip lists in this way was novel enough to merit its own article.

So here it is. I hope you find it interesting.

BozoJoe

I've found a great example of a concurrent priority queue here. I hope it helps you a little.

var priorityQueue = new ConcurrentPriorityQueue<TKey, TValue>();

TKey in the context of this queue could be an int value or any other object that implements IComparable.

To consume such a queue you can do the following:

var priorityQueue = new ConcurrentPriorityQueue<int, object>(); 

// Add elements
priorityQueue.Enqueue(2, elementP2); 
priorityQueue.Enqueue(1, elementP1);

// Here you will receive elementP1
bool result = priorityQueue.TryDequeue(out KeyValuePair<int, object> element);