I'm writing an application that involves writing considerably large chunks of data to an OutputStream (belonging to a Socket). What makes this a bit complicated is that multiple threads usually need to write to the same OutputStream. Currently, I have it designed so that the OutputStream is serviced by a dedicated writer thread: the thread owns a queue (a LinkedList) of byte arrays, polls it, and writes each chunk as soon as possible.
    private class OutputStreamWriter implements Runnable {

        private final LinkedList<byte[]> chunkQueue = new LinkedList<byte[]>();

        public void run() {
            OutputStream outputStream = User.this.outputStream;
            while (true) {
                try {
                    if (chunkQueue.isEmpty()) {
                        // Nothing queued yet: sleep briefly, then re-check.
                        Thread.sleep(100);
                        continue;
                    }
                    // Other threads add chunks to chunkQueue; this thread drains it.
                    outputStream.write(chunkQueue.poll());
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }
    }
The problem with this design is that as more writes occur, more data queues up, and it doesn't get written any faster. Initially, data put into the queue is written practically immediately. Then, after about 15 seconds, the data begins to lag behind: a delay develops between the time a chunk is queued and the time it is actually written, and that delay keeps growing as time goes on. It is very noticeable.
One way to fix this would be some sort of ConcurrentOutputStream implementation that allows data to be sent without blocking, so that writes don't start backing up (heck, the queue would then be unnecessary). I don't know whether such an implementation exists -- I have been unable to find one -- and personally I don't think it's even possible to write one.
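For what it's worth, here is one redesign I've sketched (the class and method names, like BlockingWriter and send, are my own placeholders, and I've tested it only against a ByteArrayOutputStream rather than a real Socket): it swaps the polling LinkedList for a LinkedBlockingQueue, so the writer thread blocks on take() instead of sleeping.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Sketch: a single writer thread drains a thread-safe BlockingQueue.
// Producers call send(); the writer blocks on take(), so there is no
// sleep/poll loop and no unsynchronized LinkedList.
public class BlockingWriter implements Runnable {
    private final BlockingQueue<byte[]> queue = new LinkedBlockingQueue<byte[]>();
    private final OutputStream out;
    private static final byte[] POISON = new byte[0]; // shutdown marker

    public BlockingWriter(OutputStream out) {
        this.out = out;
    }

    // Called from any producer thread; BlockingQueue handles the locking.
    public void send(byte[] chunk) throws InterruptedException {
        queue.put(chunk);
    }

    // Enqueue the poison pill so the writer loop exits cleanly.
    public void shutdown() throws InterruptedException {
        queue.put(POISON);
    }

    public void run() {
        try {
            while (true) {
                byte[] chunk = queue.take(); // blocks until data is available
                if (chunk == POISON) {
                    break;
                }
                out.write(chunk);
            }
            out.flush();
        } catch (InterruptedException | IOException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) throws Exception {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        BlockingWriter writer = new BlockingWriter(sink);
        Thread t = new Thread(writer);
        t.start();
        writer.send("hello ".getBytes("UTF-8"));
        writer.send("world".getBytes("UTF-8"));
        writer.shutdown();
        t.join();
        System.out.println(sink.toString("UTF-8")); // prints "hello world"
    }
}
```

This at least removes the 100 ms sleep and the thread-unsafe LinkedList, but I don't know whether it actually addresses the growing delay, which is why I'm asking.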
So, does anybody have any suggestions on how I can redesign this?