When you create a BatchBlock with bounded capacity and call TriggerBatch in parallel with posting new items, Post fails while TriggerBatch is executing.
Calling TriggerBatch periodically (every X milliseconds) ensures that data isn't delayed in the block for too long in cases where the incoming data stream pauses or slows down.
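For context, here is roughly how that periodic trigger looks, as a minimal sketch: it assumes a System.Threading.Timer and a 100 ms interval (the Timer helper and the interval are illustrative choices, not part of the repro below).

using System;
using System.Threading;
using System.Threading.Tasks.Dataflow;

public class PeriodicTrigger
{
    // Flushes whatever has accumulated in the block at a fixed interval,
    // so items don't linger when the producer pauses or slows down.
    public static Timer Start<T>(BatchBlock<T> block, TimeSpan interval)
    {
        return new Timer(_ => block.TriggerBatch(), null, interval, interval);
    }
}

// Usage: var timer = PeriodicTrigger.Start(batchBlock, TimeSpan.FromMilliseconds(100));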
The following code reproduces the problem and prints a number of "Failed to Post" messages:
using System;
using System.Linq;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow;

public class Program
{
    public static void Main(string[] args)
    {
        // The block is bounded; this is the condition under which Post starts failing.
        var batchBlock = new BatchBlock<int>(10, new GroupingDataflowBlockOptions { BoundedCapacity = 10000000 });
        var actionBlock = new ActionBlock<int[]>(x => ProcessBatch(x), new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 1 });
        batchBlock.LinkTo(actionBlock);

        var producerTask = Task.Factory.StartNew(() =>
        {
            // Post 10K items.
            for (int i = 0; i < 10000; i++)
            {
                var postResult = batchBlock.Post(i);
                if (!postResult)
                    Console.WriteLine("Failed to Post");
            }
        });

        var triggerBatchTask = Task.Factory.StartNew(() =>
        {
            // Call TriggerBatch in a tight loop, in parallel with the producer.
            for (int i = 0; i < 1000000; i++)
                batchBlock.TriggerBatch();
        });

        producerTask.Wait();
        triggerBatchTask.Wait();
    }

    public static void ProcessBatch(int[] batch)
    {
        Console.WriteLine("{0} - {1}", batch.First(), batch.Last());
    }
}
*Note that this scenario is reproducible only when the BatchBlock is bounded (i.e., BoundedCapacity is set).
Am I missing something, or is this an issue with BatchBlock?
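For completeness, one mitigation I can think of (a sketch, assuming it is acceptable for the producer to wait for capacity) is to replace Post with SendAsync: Post returns false whenever the block does not accept the item immediately, whereas SendAsync asynchronously waits for the block to accept it and returns false only if the block will never take it (e.g. it has completed).

var producerTask = Task.Run(async () =>
{
    for (int i = 0; i < 10000; i++)
    {
        // SendAsync waits until the block can accept the item instead of
        // failing immediately the way Post does.
        if (!await batchBlock.SendAsync(i))
            Console.WriteLine("Failed to Send");
    }
});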