Does lock() guarantee acquired in order requested?

When multiple threads request a lock on the same object, does the CLR guarantee that the locks will be acquired in the order they were requested?

I wrote up a test to see if this was true, and it seems to indicate yes, but I'm not sure if this is definitive.

using System;
using System.Collections.Generic;
using System.Threading;

class LockSequence
{
    private static readonly object _lock = new object();

    private static DateTime _dueTime;

    public static void Test()
    {
        var states = new List<State>();

        _dueTime = DateTime.Now.AddSeconds(5);
        
        for (int i = 0; i < 10; i++)
        {
            var state = new State {Index = i};
            ThreadPool.QueueUserWorkItem(Go, state);
            states.Add(state);
            Thread.Sleep(100);
        }
        
        states.ForEach(s => s.Sync.WaitOne());
        states.ForEach(s => s.Sync.Close());
    }

    private static void Go(object state)
    {
        var s = (State) state;

        Console.WriteLine("Go entered: " + s.Index);

        lock (_lock)
        {
            Console.WriteLine("{0,2} got lock", s.Index);
            if (_dueTime > DateTime.Now)
            {
                var time = _dueTime - DateTime.Now;
                Console.WriteLine("{0,2} sleeping for {1} ticks", s.Index, time.Ticks);
                Thread.Sleep(time);
            }
            Console.WriteLine("{0,2} exiting lock", s.Index);
        }

        s.Sync.Set();
    }

    private class State
    {
        public int Index;
        public readonly ManualResetEvent Sync = new ManualResetEvent(false);
    }
}

Prints:

Go entered: 0
 0 got lock
 0 sleeping for 49979998 ticks
Go entered: 1
Go entered: 2
Go entered: 3
Go entered: 4
Go entered: 5
Go entered: 6
Go entered: 7
Go entered: 8
Go entered: 9
 0 exiting lock
 1 got lock
 1 sleeping for 5001 ticks
 1 exiting lock
 2 got lock
 2 sleeping for 5001 ticks
 2 exiting lock
 3 got lock
 3 sleeping for 5001 ticks
 3 exiting lock
 4 got lock
 4 sleeping for 5001 ticks
 4 exiting lock
 5 got lock
 5 sleeping for 5001 ticks
 5 exiting lock
 6 got lock
 6 exiting lock
 7 got lock
 7 exiting lock
 8 got lock
 8 exiting lock
 9 got lock
 9 exiting lock

Lucerne answered 19/11, 2010 at 19:57 Comment(0)

IIRC, it's highly likely to be in that order, but it's not guaranteed. I believe there are at least theoretically cases where a thread will be woken spuriously, note that it still doesn't have the lock, and go to the back of the queue. It's possible that's only for Wait/Notify, but I have a sneaking suspicion it's for locking as well.

I definitely wouldn't rely on it - if you need things to occur in a sequence, build up a Queue<T> or something similar.

EDIT: I've just found this within Joe Duffy's Concurrent Programming on Windows which basically agrees:

Because monitors use kernel objects internally, they exhibit the same roughly-FIFO behavior that the OS synchronization mechanisms also exhibit (described in the previous chapter). Monitors are unfair, so if another thread tries to acquire the lock before an awakened waiting thread tries to acquire the lock, the sneaky thread is permitted to acquire a lock.

The "roughly-FIFO" bit is what I was thinking of before, and the "sneaky thread" bit is further evidence that you shouldn't make assumptions about FIFO ordering.
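The "build up a Queue<T> or something similar" advice can be sketched roughly as follows (an illustration, not code from the answer; it assumes .NET 4's BlockingCollection<T> is available). Items are enqueued in the required order and drained by a single consumer, so processing order no longer depends on lock acquisition order:

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;

class OrderedProcessing
{
    public static void Main()
    {
        // Single producer: work items are enqueued in the required order.
        var queue = new BlockingCollection<int>();
        for (int i = 0; i < 10; i++)
            queue.Add(i);
        queue.CompleteAdding();

        // Single consumer: items come out strictly FIFO, regardless of how
        // many threads would otherwise have raced for a lock.
        var results = new List<int>();
        foreach (var item in queue.GetConsumingEnumerable())
            results.Add(item);

        for (int i = 0; i < 10; i++)
            if (results[i] != i) throw new Exception("out of order");
        Console.WriteLine("processed in FIFO order");
    }
}
```

In a real multi-threaded version the producer and consumer would run on separate threads; the ordering guarantee comes from the queue, not from who wins the lock.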

Corin answered 19/11, 2010 at 19:59 Comment(6)
+1 also note that your test app might work fine on your desktop and then work differently on the 64-processor machine your production code runs on. – Solution
Wouldn't adding items to a Queue<T> from multiple threads require a lock anyway, and thus exhibit the exact same behavior, where items might get added to the queue out of order for the exact same reason? – Lucerne
@Sam - use the thread-safe ConcurrentQueue<T> for this use case: msdn.microsoft.com/en-us/library/dd267265.aspx. For guaranteed sequential processing, use a single producer thread and a single consumer thread. – Solution
@Sam: There'll be a race condition between two threads trying to add something at exactly the same time, yes - but the difference is that once they've been added, you can fetch in a guaranteed order... whereas in the case of locking, even if you could somehow tell that one thread had started to obtain the lock way before another one, you wouldn't have any guarantee that it would actually acquire it first. – Corin
@Jon Skeet, I guess the real question I'm asking is: if the code inside the lock is no more complicated or time-consuming than adding something to the queue, is lock the best option, or is there some other better option? – Lucerne
@Sam: If you can use .NET 4, use ConcurrentQueue... otherwise, I'd stick with locking. ReaderWriterLockSlim can be more efficient in some cases, but it's probably not going to help you here. – Corin

Normal CLR locks are not guaranteed to be FIFO.

But, there is a QueuedLock class in this answer which will provide a guaranteed FIFO locking behavior.
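The linked class isn't reproduced here, but a FIFO lock along those lines is typically ticket-based. A minimal sketch of the idea (names and details are illustrative, not the linked answer's code): each waiter takes a ticket, and Monitor.Wait/PulseAll are used to wake waiters until it is their turn.

```csharp
using System.Threading;

public sealed class QueuedLock
{
    private readonly object _innerLock = new object();
    private long _ticketsIssued;   // next ticket number to hand out
    private long _ticketsServed;   // ticket currently allowed to proceed

    public void Enter()
    {
        Monitor.Enter(_innerLock);
        long myTicket = _ticketsIssued++;
        // Monitor.Wait releases _innerLock while waiting and reacquires
        // it on wake-up, so waiters don't block ticket issuance.
        while (myTicket != _ticketsServed)
            Monitor.Wait(_innerLock);
        Monitor.Exit(_innerLock);
    }

    public void Exit()
    {
        lock (_innerLock)
        {
            _ticketsServed++;
            // Wake all waiters; each re-checks whether it is its turn.
            Monitor.PulseAll(_innerLock);
        }
    }
}
```

Callers would pair Enter/Exit in a try/finally, exactly as the lock statement does under the covers.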

Worse answered 1/6, 2015 at 21:46 Comment(1)
Perfect! I don't remember which project I needed it for five years ago, but very glad to have it in my repository for the future. Thanks for posting the link. – Lucerne

The lock statement is documented to use the Monitor class to implement its behavior, and the docs for the Monitor class make no mention (that I can find) of fairness. So you should not rely on requested locks being acquired in the order of request.

In fact, an article by Jeffrey Richter indicates that lock is not fair.

Granted - it's an old article so things may have changed, but given that no promises are made in the contract for the Monitor class about fairness, you need to assume the worst.

Bedbug answered 19/11, 2010 at 20:11 Comment(0)

Slightly tangential to the question, but ThreadPool doesn't even guarantee that it will execute queued work items in the order they are added. If you need sequential execution of asynchronous tasks, one option is using TPL Tasks (also backported to .NET 3.5 via Reactive Extensions). It would look something like this:

public static void Test()
{
    var states = new List<State>();

    _dueTime = DateTime.Now.AddSeconds(5);

    var initialState = new State() { Index = 0 };
    var initialTask = new Task(Go, initialState);
    Task priorTask = initialTask;

    for (int i = 1; i < 10; i++)
    {
        var state = new State { Index = i };
        priorTask = priorTask.ContinueWith(t => Go(state));

        states.Add(state);
        Thread.Sleep(100);
    }
    Task finalTask = priorTask;

    initialTask.Start();
    finalTask.Wait();
}

This has a few advantages:

  1. Execution order is guaranteed.

  2. You no longer require an explicit lock (the TPL takes care of those details).

  3. You no longer need events and no longer need to wait on all events. You can simply say: wait for the last task to complete.

  4. If an exception were thrown in any of the tasks, subsequent tasks would be aborted and the exception would be rethrown by the call to Wait. This may or may not match your desired behavior, but is generally the best behavior for sequential, dependent tasks.

  5. By using the TPL, you have added flexibility for future expansion, such as cancellation support, waiting on parallel tasks for continuation, etc.

Conjecture answered 19/11, 2010 at 20:40 Comment(1)
Thanks. ThreadPool was just for the demo, and I used waits to make sure they all acquired the lock in the right order for the demo. – Lucerne

I use the following approach to get FIFO locking:

public class QueuedActions
{
    private readonly object _internalSynchronizer = new object();
    private readonly ConcurrentQueue<Action> _actionsQueue = new ConcurrentQueue<Action>();


    public void Execute(Action action)
    {
        // ReSharper disable once InconsistentlySynchronizedField
        _actionsQueue.Enqueue(action);

        lock (_internalSynchronizer)
        {
            // Each caller dequeues exactly one action: its own, or an
            // earlier caller's that is still ahead of it in the queue.
            Action nextAction;
            if (_actionsQueue.TryDequeue(out nextAction))
            {
                nextAction.Invoke();
            }
            else
            {
                throw new InvalidOperationException("Something is wrong. How come there is nothing in the queue?");
            }
        }
    }
}

The ConcurrentQueue preserves the enqueue order, so the actions execute FIFO even though the threads waiting on the lock may acquire it in any order.
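A standalone way to exercise the pattern (illustrative; the class body is repeated here only so the snippet compiles on its own):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

public class QueuedActions
{
    private readonly object _internalSynchronizer = new object();
    private readonly ConcurrentQueue<Action> _actionsQueue = new ConcurrentQueue<Action>();

    public void Execute(Action action)
    {
        _actionsQueue.Enqueue(action);
        lock (_internalSynchronizer)
        {
            // Each Execute call runs exactly one queued action, which may
            // belong to an earlier caller still ahead in the queue.
            Action nextAction;
            if (_actionsQueue.TryDequeue(out nextAction))
                nextAction.Invoke();
        }
    }
}

class Demo
{
    public static void Main()
    {
        var queued = new QueuedActions();
        int counter = 0;
        var threads = new Thread[8];

        // Eight threads race to Execute; every enqueued action still runs
        // exactly once, in enqueue order.
        for (int t = 0; t < threads.Length; t++)
            threads[t] = new Thread(() => queued.Execute(
                () => Interlocked.Increment(ref counter)));

        foreach (var th in threads) th.Start();
        foreach (var th in threads) th.Join();

        Console.WriteLine(counter); // 8: all actions ran exactly once
    }
}
```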

Slunk answered 26/3, 2016 at 6:15 Comment(0)
