We've been looking at using a lock-free queue to reduce lock contention between the single producer and the single consumer in our current implementation. There are plenty of queue implementations out there, but I haven't been clear on how best to handle memory management for the nodes.
For example, the producer looks like this:
queue.Add( new WorkUnit(...) );
And the consumer looks like:
WorkUnit* unit = queue.RemoveFront();
unit->Execute();
delete unit;
We currently use a memory pool for allocation. You'll notice that the producer allocates the memory and the consumer deletes it. Since we're using pools, we'd need to add another lock to the memory pool to protect it properly, which seems to negate the performance advantage of using a lock-free queue in the first place.
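To make the contention concrete, here's a rough sketch of the current shape of things. PooledAllocator and its members are illustrative names, not our actual pool API, and the pool internals are stood in by plain operator new/delete:

#include <cstddef>
#include <mutex>

class PooledAllocator {
public:
    void* Allocate(std::size_t size) {
        std::lock_guard<std::mutex> guard(mutex_);  // producer takes the lock
        return ::operator new(size);                // stand-in for the pool lookup
    }

    void Free(void* block) {
        std::lock_guard<std::mutex> guard(mutex_);  // consumer takes the same lock
        ::operator delete(block);                   // stand-in for returning the block
    }

private:
    std::mutex mutex_;
};

Every Add/RemoveFront pair hits that mutex once from each thread, so the two threads still serialize on every work unit even though the queue itself is lock-free.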
So far, I think our options are:
- Implement a lock-free memory pool (see the sketch after this list).
- Dump the memory pool and rely on a threadsafe allocator.
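For option 1, a minimal sketch of what I have in mind, assuming C++11 atomics: a Treiber-style free list where the producer pops blocks to allocate and the consumer pushes them back after Execute(). FreeNode, FreeList, and the fixed block size are illustrative names, not our real pool API. Since only one thread (the producer) ever pops, the classic ABA hazard of this kind of stack shouldn't arise here, but a general multi-producer/multi-consumer pool would need more care:

#include <atomic>
#include <cstddef>

struct FreeNode {
    FreeNode* next;
    // Storage large enough for one WorkUnit (the size here is an assumption).
    alignas(std::max_align_t) unsigned char storage[64];
};

class FreeList {
public:
    // Producer side: pop a free block, or nullptr if the pool is exhausted.
    FreeNode* Acquire() {
        FreeNode* node = head_.load(std::memory_order_acquire);
        while (node != nullptr &&
               !head_.compare_exchange_weak(node, node->next,
                                            std::memory_order_acquire,
                                            std::memory_order_acquire)) {
            // CAS failed: the consumer pushed a block back; retry with the new head.
        }
        return node;
    }

    // Consumer side: return the block once the WorkUnit has been executed.
    void Release(FreeNode* node) {
        FreeNode* old_head = head_.load(std::memory_order_relaxed);
        do {
            node->next = old_head;
        } while (!head_.compare_exchange_weak(old_head, node,
                                              std::memory_order_release,
                                              std::memory_order_relaxed));
    }

private:
    std::atomic<FreeNode*> head_{nullptr};
};

The idea would be to pre-fill the list with nodes at startup, have the producer construct the WorkUnit into a node's storage with placement new, and have the consumer destroy it and call Release() instead of delete.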
Are there any other options we could explore? We're trying to avoid implementing a lock-free memory pool, but we may have to take that path.
Thanks.