Since .NET 4.0 the auto-generated add/remove accessors of an event are thread safe (here and here). Clients that subscribe their handlers to an exposed event can therefore do so concurrently from multiple threads without races.
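
For context, those generated accessors boil down to a lock-free compare-and-swap loop over the backing delegate field. The following sketch only approximates the emitted code (the field name and loop shape are illustrative, not the compiler's exact output); it uses Delegate.Combine/Remove with Interlocked.CompareExchange from System.Threading:

// Backing field for the event; the compiler-generated field has a different name.
private EventHandler _myEvent;

public event EventHandler MyEvent
{
    add
    {
        EventHandler current = _myEvent;
        EventHandler before;
        do
        {
            before = current;
            EventHandler combined = (EventHandler)Delegate.Combine(before, value);
            // Publish the new invocation list atomically; retry if another
            // thread modified the field in the meantime.
            current = Interlocked.CompareExchange(ref _myEvent, combined, before);
        }
        while (current != before);
    }
    remove
    {
        EventHandler current = _myEvent;
        EventHandler before;
        do
        {
            before = current;
            EventHandler removed = (EventHandler)Delegate.Remove(before, value);
            current = Interlocked.CompareExchange(ref _myEvent, removed, before);
        }
        while (current != before);
    }
}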

But what if I want to raise the event in a thread-safe manner? The recommended practice seems to be the following (here):

public event EventHandler MyEvent;
protected void OnMyEvent(EventArgs e)
{
    // Copy the delegate reference to a local; even if another thread unsubscribes
    // the last handler concurrently, the local stays non-null for the check and the call.
    EventHandler myEvent = MyEvent;
    if (myEvent != null)
    {
        myEvent(this, e);
    }
}

However, having read a bit about the .NET memory model (e.g. the MSDN Magazine issues 2012-12 and 2013-01), I no longer think this is correct. My concern is that the compiler may introduce extra memory reads, so the above code could be JIT-compiled to something like this:

public event EventHandler MyEvent;
protected void OnMyEvent(EventArgs e)
{
    // JIT removed the local variable and introduced two memory reads instead.
    if (MyEvent != null)
    {
        // A race condition may cause the following line to throw a NullReferenceException.
        MyEvent(this, e);
    }
}

It is legal to remove the local variable and use repeated memory reads, since that does not change the behavior of the method when executed in a single-threaded environment. This is permitted by the ECMA specification (ECMA-335, I.12.6.4). A comprehensible example is also given in the 2013-01 issue of MSDN Magazine.
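
To make the race concrete, here is a hypothetical repro sketch (the Source and RaceDemo names are mine, and since the failure depends on the JIT actually introducing the second read, it is not guaranteed to reproduce): one thread keeps raising the event via the pattern above while another thread removes the last handler.

using System;
using System.Threading.Tasks;

public class Source
{
    public event EventHandler MyEvent;

    // Raises the event using the "copy to a local" pattern from above.
    public void Raise(EventArgs e)
    {
        EventHandler myEvent = MyEvent;
        if (myEvent != null)
        {
            myEvent(this, e);
        }
    }
}

public static class RaceDemo
{
    public static void Main()
    {
        var source = new Source();
        EventHandler handler = (s, e) => { };
        source.MyEvent += handler;

        // One thread keeps raising the event...
        Task raiser = Task.Run(() =>
        {
            for (int i = 0; i < 1000000; i++)
            {
                source.Raise(EventArgs.Empty);
            }
        });

        // ...while this thread removes the last handler. If the JIT replaces the
        // local with two reads of the field, Raise can observe a non-null field
        // at the check and a null field at the call and throw NullReferenceException.
        source.MyEvent -= handler;

        raiser.Wait();
    }
}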

Am I missing something here? If not then please advise a workaround.

Tomas
  • Thank you for pointing this out. I searched SO and the Internet but I obviously missed that post. To add my two cents I actually think that a wrong answer has been chosen for the question you are referring to. The correct one seems to be this - http://stackoverflow.com/a/2365885/385737. The ECMA specification truly allows local variables to be replaced by repeated memory reads. – Tomas Aug 30 '13 at 12:05
  • Adding two more duplicates of my own question ‒ [this](http://stackoverflow.com/questions/16162745/event-raising-and-read-introduction-in-net-4-5) and notably [this](http://stackoverflow.com/questions/14799876/read-introduction-in-c-sharp-how-to-protect-against-it) with a great answer by Eric Lippert. – Tomas Sep 02 '13 at 19:57

1 Answer

You only have to add a single line to make the first snippet correct in a multithreaded environment:

public event EventHandler MyEvent;
protected void OnMyEvent(EventArgs e)
{
    EventHandler myEvent = MyEvent;
    // Thread.MemoryBarrier is a full fence: reads and writes cannot be moved
    // across it by the compiler or the CPU.
    Thread.MemoryBarrier();
    if (myEvent != null)
    {
        myEvent(this, e);
    }
}

A memory barrier prevents both the compiler and the CPU from reordering reads and writes across it. That is also how volatile reads/writes are implemented. You can read more about memory barriers here.
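
A related alternative (assuming .NET 4.5, where System.Threading.Volatile is available; it is not strictly equivalent to the full fence above) is to make the single read of the delegate field explicit with Volatile.Read:

public event EventHandler MyEvent;
protected void OnMyEvent(EventArgs e)
{
    // A single volatile (acquire) read of the backing field; the local is then
    // used for both the null check and the invocation.
    EventHandler myEvent = Volatile.Read(ref MyEvent);
    if (myEvent != null)
    {
        myEvent(this, e);
    }
}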

Jury Soldatenkov
  • My understanding is that memory barriers are fences that forbid reordering of memory reads and writes (e.g. by a compiler or by processor caches). This does not, however, imply that memory barriers also influence the way in which memory reads are introduced/eliminated. Please advise or point me to a reference that would support your point. – Tomas Sep 02 '13 at 15:55
  • The compiler and the processor do not give us any guarantee about how many physical writes and reads this code will produce. It may differ even from call to call. And yes, a memory barrier does not affect that behaviour. This means that without the memory barrier the reference to the event delegate could be taken from any location: memory, cache, or a register. Memory and cache are modified by many threads; that is why an NRE is possible. The memory barrier forces the compiler to store the reference to the delegate on the thread's stack, which other threads cannot modify. This makes it safe to check and call. – Jury Soldatenkov Sep 02 '13 at 18:57
  • Thanks Jury for bearing with me, but moving the value from the heap to the thread's stack does not make sense to me ‒ how would the value then make its way back to the heap? Besides I believe that the ECMA standard does not mention any such operation. – Tomas Sep 02 '13 at 20:04