Does a 'blocking' queue defeat the very purpose of multithreading?
R

8

5

The ArrayBlockingQueue will block the producer thread if the queue is full and it will block the consumer thread if the queue is empty.

Doesn't this concept of blocking go against the very idea of multithreading? Say I have a 'main' thread and I want to delegate all logging activity to another thread. Inside the main thread, I create a Runnable that logs the output and put that Runnable on an ArrayBlockingQueue. The whole purpose of doing this is to have the main thread return immediately, without wasting any time on an expensive logging operation.

But if the queue is full, the main thread will block and wait until a spot becomes available. So how does that help?
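Roughly the setup I have in mind, as a minimal sketch (the capacity of 1024 and the daemon logger thread are arbitrary choices for illustration):

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class LogHandoff {
    // Arbitrary capacity; the main thread only blocks once 1024 tasks are pending.
    private static final BlockingQueue<Runnable> logQueue = new ArrayBlockingQueue<>(1024);

    public static void main(String[] args) throws InterruptedException {
        // Dedicated logger thread: take() blocks only while the queue is empty.
        Thread logger = new Thread(() -> {
            try {
                while (true) {
                    logQueue.take().run();
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        logger.setDaemon(true);
        logger.start();

        // Main thread: put() returns immediately unless the queue is full.
        logQueue.put(() -> System.out.println("expensive logging happens off the main thread"));
    }
}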

Ricky answered 5/7, 2013 at 18:23 Comment(3)
Thread contention, thread starvation, and unsafe concurrent access are all problems that must be addressed when multithreading. Threads don't play nice with shared resources unless explicitly told to.Tempo
You're only taking into account the case where the queue is full. In every other case the thread won't block.Sun
@Tempo What are you talking about? This has nothing to do with the question. By the way, BlockingQueues are thread-safe... See docs.oracle.com/javase/6/docs/api/java/util/concurrent/…Udometer
A
5

I think this is the designer's decision. If blocking mode is wanted, ArrayBlockingQueue provides it with the put method. If the designer does not want blocking mode, ArrayBlockingQueue has the offer method, which returns false when the queue is full, but then the designer needs to decide what to do with the rejected logging event.
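A rough sketch of the two modes (the capacity and the wrapper class are just for illustration):

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

class LogSubmitter {
    private final BlockingQueue<Runnable> queue = new ArrayBlockingQueue<>(100); // arbitrary capacity

    void submitBlocking(Runnable task) throws InterruptedException {
        queue.put(task);                 // blocking mode: waits for space when the queue is full
    }

    void submitOrDrop(Runnable task) {
        if (!queue.offer(task)) {        // non-blocking mode: returns false when the queue is full
            // rejected logging event: drop it, count it, or log synchronously instead
        }
    }
}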

Arita answered 5/7, 2013 at 18:33 Comment(0)
A
10

The queue doesn't block out of spite; it blocks to introduce an additional quality into the system. In this case, that quality is the prevention of starvation.

Picture a set of threads, one of which produces work units very fast. If the queue were allowed to grow without bound, that "rapid producer" could potentially hog all of the queue's capacity, starving everyone else's work. Sometimes, preventing such side effects is more important than keeping every thread unblocked.

Angeliaangelic answered 5/7, 2013 at 18:33 Comment(0)
W
4

In your example I would consider blocking to be a feature: It prevents an OutOfMemoryError.

Generally speaking, one of your threads is just not fast enough to cope with the assigned load. So the others must slow down somehow in order not to endanger the whole application.

On the other hand, if the load is balanced, the queue will not block.
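For illustration, a sketch of the difference (both queue classes are from java.util.concurrent; the capacity of 10000 is arbitrary):

// Unbounded: a consumer that falls behind lets the backlog grow until the heap is exhausted.
BlockingQueue<Runnable> unbounded = new LinkedBlockingQueue<>();

// Bounded: once 10000 tasks are pending, put() blocks the producer,
// slowing it down instead of letting memory grow without limit.
BlockingQueue<Runnable> bounded = new ArrayBlockingQueue<>(10000);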

Wreak answered 5/7, 2013 at 18:34 Comment(0)
C
2

Blocking is a necessary function of multithreading. You must block to have synchronized access to data. It does not defeat the purpose of multithreading.

I would suggest throwing an exception when the producer attempts to submit an item to a queue which is full. I believe there are also methods to test beforehand whether the queue has space.

This would allow the invoking code to decide how it wants to handle a full queue.
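One way to get that behaviour with ArrayBlockingQueue, as a sketch (the queue and logTask parameters are placeholders): add() throws an IllegalStateException instead of blocking when the queue is full, and remainingCapacity() lets you test for space beforehand.

void submit(BlockingQueue<Runnable> queue, Runnable logTask) {
    // Check for space first; this is only advisory, since another producer
    // may fill the queue between the check and the add.
    if (queue.remainingCapacity() > 0) {
        try {
            queue.add(logTask);              // throws IllegalStateException when full
        } catch (IllegalStateException full) {
            // the invoking code decides what a full queue means: drop, retry, alert...
        }
    }
}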

If execution order when processing items from the queue is unimportant, I recommend using a thread pool (known as an ExecutorService in Java).

Chalmers answered 5/7, 2013 at 18:31 Comment(0)
I
1

It depends on the nature of your multithreading philosophy. For those of us who favour Communicating Sequential Processes, a blocking queue is nearly perfect. In fact, the ideal would be one where no message can be put into the queue at all unless the receiver is ready to receive it.

So no, I don't think that a blocking queue goes against the very purpose of multi-threading. In fact, the scenario that you describe (the main thread eventually getting stalled) is a good illustration of the major problem with the actor-model of multi-threading; you've no idea whether or not it will deadlock / block, and you can't exhaustively test for it either.

In contrast, imagine a blocking queue that is zero messages deep. That way for the system to work at all you'd have to find a way to ensure that the logger is always guaranteed to be able to receive a message from the main thread. That's CSP. It might mean that in your hypothetical logger thread you have to have application defined buffering (as opposed to some framework developer's best guess of how deep a FIFO should be), a fast I/O subsystem, checks for keeping up, ways of dealing with falling behind, etc. In short it doesn't let you get away with it, you're forced to address every aspect of your system's performance.

That is of course harder, but that way you end up with a system that's definitely OK rather than the questionable "maybe" that you have if your blocking queues are an unknown number of messages deep.
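Java, incidentally, ships a queue that is exactly zero messages deep: java.util.concurrent.SynchronousQueue, where put() does not return until another thread is ready to take(). A minimal sketch of that rendezvous (the daemon logger thread is just for illustration):

import java.util.concurrent.SynchronousQueue;

public class CspHandoff {
    public static void main(String[] args) throws InterruptedException {
        // Zero-capacity queue: put() blocks until a consumer is actually taking.
        SynchronousQueue<String> handoff = new SynchronousQueue<>();

        Thread logger = new Thread(() -> {
            try {
                while (true) {
                    System.out.println(handoff.take()); // rendezvous with the producer
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        logger.setDaemon(true);
        logger.start();

        handoff.put("only completes once the logger thread is ready to receive");
    }
}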

Infancy answered 5/7, 2013 at 19:4 Comment(0)
L
0

It sounds like you have the right general idea of why you'd use something like an ArrayBlockingQueue to communicate between threads.

Having a blocking queue gives you the option to do something different in case something goes wrong with your background worker threads, rather than blindly adding more requests to the queue. If there is room in the queue, there is no blocking.

For your specific use case, though, I would use an ExecutorService, which creates a pool of background worker threads, rather than reading and writing queues directly:

http://docs.oracle.com/javase/6/docs/api/java/util/concurrent/ExecutorService.html

ExecutorService pool = Executors.newFixedThreadPool(poolSize); // poolSize worker threads sharing one work queue
pool.submit(myRunnable);                                       // returns immediately; a worker thread runs it later
Lifeboat answered 5/7, 2013 at 18:31 Comment(5)
Using an executor with multiple threads won't guarantee ordering when processing the queue's items. This wasn't a requirement of the question, but it is worth mentioning.Chalmers
True - although if you have only one thread (poolSize = 1) it's equivalent.Lifeboat
Good point. It'd be really nice to get a result from a runnable, queue it up into some priority event processing manager, then notify listeners when the next item was done. That way you could take advantage of multiple cores too. Again, this assumes ordering is important. And that this additional synchronization and ordering doesn't slow things down to the point using a single thread is faster.Chalmers
Sorry, but if I use an ExecutorService, I don't need to put and take items to/from the queue inside my run() method? Can you elaborate a bit?Ricky
Correct, the ExecutorService abstracts the background thread's loop while (not finished) { wait for job on queue; run job; }. You simply submit your Runnable.Lifeboat
M
0

You have to make a choice about what to do when a queue is full. In the case of an ArrayBlockingQueue, that choice is to wait.

Another option would be to just throw away new objects if the queue is full; you can achieve this with offer.

You have to make a trade-off.
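A sketch of the options side by side (the capacity and timeout are arbitrary; in practice you would pick one):

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

class FullQueuePolicy {
    private final BlockingQueue<Runnable> queue = new ArrayBlockingQueue<>(100); // arbitrary capacity

    void choices(Runnable task) throws InterruptedException {
        queue.put(task);                                   // wait as long as it takes
        boolean kept = queue.offer(task);                  // give up immediately when full
        boolean keptInTime = queue.offer(task, 50, TimeUnit.MILLISECONDS); // bounded wait
    }
}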

Millwater answered 5/7, 2013 at 18:31 Comment(2)
The BlockingQueue interface already supports "throwing away when full" if you want. Just use .offer().Writer
@BrunoReis Thanks, I glanced at the API and missed it!Millwater
T
0

A multithreaded program is non-deterministic insofar as you can't say beforehand: n producer actions will take exactly as long as m consumer actions. Therefore, synchronization between n producers and m consumers is necessary in every case.

You'll want to choose the queue size so that the number of active producers and consumers is maximized most of the time. But Java's thread model does not guarantee that any particular consumer will run unless it is the only unblocked thread. (On multi-core CPUs, of course, it is very likely that the consumer will run.)

Titanium answered 5/7, 2013 at 18:39 Comment(0)
