
Does the .NET Framework provide a way to control access to a shared resource such that some Writers trying to access that resource have priority over others?

My problem has the following constraints:
1. Only 1 concurrent write request to the resource can be granted
2. There are many Writers waiting for access to this resource, but some Writers have precedence over others (starvation of low-priority Writers is okay).
3. Thread affinity is a NON-requirement. One thread can set the lock, but another may reset it.
4. All Writer threads are from the same process.

In short, I need a primitive that exposes its wait queue and lets me modify it. If no such thing exists, any tips on how I could build one myself from the classes already available, such as Semaphore?
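To make the shape concrete, something like the following is what I'm after (IPriorityLock is a hypothetical name, not an existing framework type):

//hypothetical shape of the primitive I'm after; not an existing .NET class
public interface IPriorityLock
{
    //blocks until exclusive write access is granted; a lower position means higher priority
    void WaitOne(int position);

    //no thread affinity: may be called by a thread other than the one that acquired the lock
    void Release();
}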

Satyajit
  • What resource? Some kind of Stream? – Jan 15 '11 at 16:01
  • How about having the Writer try to lock and, if it can't, throw itself into a prioritized queue? When the Writer holding the lock is about to release it, it could check the queue and grab the next Writer. – kenny Jan 15 '11 at 16:06
  • @Will - The resource is a COM dll which disallows concurrent access to any method or property exposed by it. It throws an "object is not yet ready" COMException when concurrent accesses occur. @kenny - Let me think about whether I can use your idea; I'll reply tomorrow (it is late at night here). – Satyajit Jan 15 '11 at 16:18
  • Have you looked at ReaderWriterLockSlim? http://msdn.microsoft.com/en-us/library/system.threading.readerwriterlockslim.aspx – zebrabox Jan 16 '11 at 12:57

2 Answers


Here is some quick-and-dirty code I came up with. I will refine it, but as a proof of concept it works...

using System;
using System.Collections.Generic;
using System.Threading;

public class PrioritisedLock
{
    private List<CountdownEvent> waitQueue; //wait queue for the shared resource
    private Semaphore waitQueueSemaphore; //ensure safe access to wait queue itself

    public PrioritisedLock()
    {
        waitQueue = new List<CountdownEvent>();
        waitQueueSemaphore = new Semaphore(1, 1);
    }

    public bool WaitOne(int position = 0)
    {
        //CountdownEvent needs an initial count of at least 1,
        //otherwise it is created in the signaled state
        position++;
        bool containsGrantedRequest = false; //flag to check if wait queue still contains object which owns the lock

        CountdownEvent thisRequest = position<1 ? new CountdownEvent(1) : new CountdownEvent(position);
        int leastPositionMagnitude=Int32.MaxValue;
        waitQueueSemaphore.WaitOne();

        //insert the request at the appropriate position
        foreach (CountdownEvent cdEvent in waitQueue)
        {
            if (cdEvent.CurrentCount > position)
                cdEvent.AddCount();
            else if (cdEvent.CurrentCount == position)
                thisRequest.AddCount();

            if (cdEvent.CurrentCount == 0)
                containsGrantedRequest = true;
        }

        waitQueue.Add(thisRequest);

        foreach (CountdownEvent cdEvent in waitQueue)
            if (cdEvent.CurrentCount < leastPositionMagnitude)
                leastPositionMagnitude = cdEvent.CurrentCount;

        //If nobody holds the lock, grant the lock to the current request
        if (containsGrantedRequest==false && thisRequest.CurrentCount == leastPositionMagnitude)
            thisRequest.Signal(leastPositionMagnitude);

        waitQueueSemaphore.Release();

        //now do the actual wait for this request; if it is already signaled, it ends immediately
        thisRequest.Wait();

        return true;
    }

    public int Release()
    {
        int waitingCount = 0, i = 0, positionLeastMagnitude=Int32.MaxValue;
        int grantedIndex = -1;

        waitQueueSemaphore.WaitOne();

        foreach(CountdownEvent cdEvent in waitQueue)
        {
            if (cdEvent.CurrentCount <= 0)
            {
                grantedIndex = i;
                break;
            }
            i++;
        }

        //remove the request which is already fulfilled
        if (grantedIndex != -1)
            waitQueue.RemoveAt(grantedIndex);

        //find the smallest wait count among the remaining requests
        foreach (CountdownEvent cdEvent in waitQueue)
            if (cdEvent.CurrentCount < positionLeastMagnitude)
                positionLeastMagnitude = cdEvent.CurrentCount;

        //decrement the wait counter for each waiting object, such that the first object in the queue is now signaled
        foreach (CountdownEvent cdEvent in waitQueue)
        {
            waitingCount++;
            cdEvent.Signal(positionLeastMagnitude);
        }

        waitQueueSemaphore.Release();
        return waitingCount;
    }
}

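For what it's worth, here is how I intend to call it (the priority value and the work inside the try block are just placeholders):

PrioritisedLock plock = new PrioritisedLock();

//high-priority writer: position 0 is served before writers queued with larger positions
plock.WaitOne(0);
try
{
    //exclusive access to the shared resource goes here
}
finally
{
    //no thread affinity is assumed; another thread could call Release() instead
    plock.Release();
}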

Satyajit

Use a priority queue to keep the list of pending requests. See here: Priority queue in .Net. Use the standard Monitor functionality to lock and signal what to do and when, as proposed by kenny.
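A minimal sketch of that idea, assuming .NET 4 (the class and member names are made up, and a sorted List stands in for a real priority queue):

using System.Collections.Generic;
using System.Threading;

public class PriorityWriteLock
{
    private readonly object gate = new object();
    //pending requests kept sorted: a lower priority value is served earlier
    private readonly List<KeyValuePair<int, object>> waiters =
        new List<KeyValuePair<int, object>>();
    private bool held;

    public void Acquire(int priority)
    {
        object ticket = new object(); //identifies this request in the queue
        lock (gate)
        {
            //insert so the list stays ordered by priority (FIFO for equal priorities)
            int i = waiters.FindIndex(w => w.Key > priority);
            if (i < 0) i = waiters.Count;
            waiters.Insert(i, new KeyValuePair<int, object>(priority, ticket));

            //wait until the lock is free and this request is at the head of the queue
            while (held || !ReferenceEquals(waiters[0].Value, ticket))
                Monitor.Wait(gate);

            waiters.RemoveAt(0);
            held = true;
        }
    }

    public void Release() //no thread affinity: any thread may call this
    {
        lock (gate)
        {
            held = false;
            Monitor.PulseAll(gate); //wake all waiters; only the head of the queue proceeds
        }
    }
}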

Al Kepp
  • That can work, I will try doing that. It also seems that in .NET Framework 4 there's a new class called System.Threading.CountdownEvent; I was able to use it to create a mechanism of my own. I will do some perf runs to see which implementation (priority queue + mutex/monitor, or my class) is less resource intensive. – Satyajit Jan 16 '11 at 12:35