IProgress<T> how often to report progress?
When using IProgress<T> to report progress, should it be

  • the responsibility of the code reporting progress to limit its progress reports to a frequency that is "reasonable", -or-
  • the responsibility of the specific implementation of IProgress<T> to recognize that progress may be reported at a higher frequency than is reasonable for the way it will be presenting that progress.

The context of the question is that I have some code which uses IProgress<T> to report progress, and it reports progress at a very high rate. I want to display progress with a UI progress bar. If I use the provided Progress<T> implementation (which posts progress to the UI SynchronizationContext), then it causes the UI to be unresponsive (i.e. there are so many messages sent to the message queue that the user can't even click the "Cancel" button on the dialog).

So,

  • I could fix this by reporting less often, but what if I had an IProgress<T> implementation that just wrote progress to a log file (and could handle the high reporting frequency)? -or-
  • I could fix this by creating my own specific IProgress<T> implementation that limited how often progress is processed/reported. Presumably, this implementation would record the latest progress on a non-UI thread, and then (perhaps) the UI would be updated based on a timer.
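The second option can be sketched roughly as follows. This is a minimal illustration, not production code: Report() only records the latest value, and a timer periodically forwards it to a handler. The class and member names here are invented for the sketch, and the UI-thread marshalling is omitted.

```csharp
using System;
using System.Threading;

// Sketch of the second option: Report() just records the latest value;
// a timer forwards at most one value per interval to the handler.
public sealed class SampledProgress<T> : IProgress<T>, IDisposable
{
    private readonly Timer _timer;
    private readonly Action<T> _handler;
    private readonly object _gate = new object();
    private T _latest;
    private bool _hasValue;

    public SampledProgress(Action<T> handler, TimeSpan interval)
    {
        _handler = handler ?? throw new ArgumentNullException(nameof(handler));
        _timer = new Timer(_ => Publish(), null, interval, interval);
    }

    // Called at any frequency, from any thread; only records the value.
    public void Report(T value)
    {
        lock (_gate) { _latest = value; _hasValue = true; }
    }

    // Called by the timer; publishes the latest value, if any.
    private void Publish()
    {
        T value;
        lock (_gate)
        {
            if (!_hasValue) return;
            value = _latest;
            _hasValue = false;
        }
        _handler(value);
    }

    public void Dispose() => _timer.Dispose();
}
```

In a real UI application the handler would still need to marshal to the UI thread (e.g. via a captured SynchronizationContext), which this sketch leaves out.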
Welker answered 29/10, 2013 at 14:35 Comment(3)
Interesting question, but probably off-topic? My personal standpoint: It's up to the consumer to make the best of what the producer gives. So no limiting of the output +1. EDIT: OK, if the producer limits itself for performance reasons, then this might be a good reason.Phlebotomize
A progress bar typically takes an integer value 0..100. If your update doesn't go to the next integer then there is no reason to report it to the UI. But I'm interested in what the experts say.Timeconsuming
@Harrison, I would say progress is typically reported in a way that makes sense to the code reporting the progress. The consumer of the progress is responsible for converting that into how they want to present it (e.g. a percentage complete).Welker
Write a decorator which throttles the calls. This way you separate the logic of throttling and the actual reporting, and you can use it for any other IProgress<T> implementation.

Use this decorator when you want to throttle the progress reporting. Simply wrap your progress reporter in an instance of the class below.

I've left the throttling logic up to you. You can make it time-based, amount-of-calls-based or some other criteria.

public class ProgressThrottler<T>: IProgress<T> {
    private readonly IProgress<T> _progress;

    public ProgressThrottler(IProgress<T> progress) {
        _progress = progress ?? throw new ArgumentNullException(nameof(progress));
    }

    public void Report(T value) {
        // Throttles the amount of calls
        bool reportProgressAfterThrottling = ...;

        if (reportProgressAfterThrottling) {
            _progress.Report(value);
        }
    }
}
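As one possible criterion, a time-based variant of the decorator could look like the sketch below: at most one report is forwarded per interval, and intermediate reports are simply dropped. The class name and interval choice are illustrative, not from the answer above.

```csharp
using System;

// Time-based variant of the throttling decorator: forwards at most one
// report per interval; reports arriving sooner are dropped.
public class TimeBasedProgressThrottler<T> : IProgress<T>
{
    private readonly IProgress<T> _progress;
    private readonly TimeSpan _interval;
    private readonly object _gate = new object();
    private DateTime _lastReportUtc = DateTime.MinValue;

    public TimeBasedProgressThrottler(IProgress<T> progress, TimeSpan interval)
    {
        _progress = progress ?? throw new ArgumentNullException(nameof(progress));
        _interval = interval;
    }

    public void Report(T value)
    {
        lock (_gate)
        {
            DateTime now = DateTime.UtcNow;
            if (now - _lastReportUtc < _interval) return; // drop this report
            _lastReportUtc = now;
        }
        _progress.Report(value);
    }
}
```

Note that a purely dropping throttler can lose the final report of an operation; buffering the latest value and emitting it later (as the timer-based answers on this question do) avoids that.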
Dominick answered 29/10, 2013 at 14:43 Comment(2)
The idea is correct, but there's a minor error in the implementation. Report returns void. Also, I don't see the point of explicitly implementing the interface; an implicit implementation does the job at least as well IMHO.Rainer
@KrisVandermotten explicitly implementing is fairly common among DI users to discourage accidental coupling directly to an implementation.Eubank
Here is a custom Progress<T> implementation, that enforces a minimum interval policy between consecutive progress reports:

/// <summary>
/// A derivative of the <see cref="Progress{T}"/> class, that enforces a minimum
/// interval policy between consecutive callback invocations.
/// </summary>
public class ThrottledProgress<T> : Progress<T>, IDisposable
{
    private readonly System.Threading.Timer _timer;
    private readonly TimeSpan _dueTime;
    private bool _timerEnabled;
    private (T Value, bool HasValue) _latest;

    public ThrottledProgress(Action<T> handler, TimeSpan dueTime) : base(handler)
    {
        if (dueTime < TimeSpan.Zero || dueTime.TotalMilliseconds >= UInt32.MaxValue)
            throw new ArgumentOutOfRangeException(nameof(dueTime));
        _dueTime = dueTime;
        _timer = new(TimerCallback); // Initially the timer is disabled.
    }

    protected override void OnReport(T value)
    {
        lock (_timer)
        {
            if (_timerEnabled)
            {
                // The timer is enabled. Store the value and return.
                // The value will be either picked by the timer, or get overwritten.
                _latest = (value, true);
                return;
            }
            // The timer is disabled. Report the value and enable the timer.
            Debug.Assert(!_latest.HasValue);
            if (!_timer.Change(_dueTime, _dueTime))
                throw new ObjectDisposedException(nameof(ThrottledProgress<T>));
            _timerEnabled = true;
        }
        base.OnReport(value);
    }

    private void TimerCallback(object obj)
    {
        T value;
        lock (_timer)
        {
            if (!_latest.HasValue)
            {
                // Nothing was submitted during the last period.
                // Disable the timer.
                // The Change() doesn't throw when the Timer is disposed.
                _timer.Change(Timeout.Infinite, Timeout.Infinite);
                _timerEnabled = false;
                return;
            }
            // A value was submitted during the last period.
            // Report the value and keep the timer enabled.
            value = _latest.Value;
            _latest = default;
        }
        base.OnReport(value);
    }

    /// <summary>Report immediately any pending value.</summary>
    public void Flush()
    {
        T value;
        lock (_timer)
        {
            if (!_latest.HasValue) return;
            value = _latest.Value;
            _latest = default;
        }
        base.OnReport(value);
    }

    /// <summary>Dispose the component without reporting any pending value.</summary>
    public void Dispose()
    {
        lock (_timer)
        {
            _timer.Dispose();
            _timerEnabled = false;
            _latest = default;
        }
    }
}

When a message is received, it initiates a period of tranquility during which all subsequent received messages are dropped (ignored), except for the last one. The last message of each period is buffered and emitted when the period ends, and then a new tranquil period begins, and so on. The duration of each tranquil period is configured with the dueTime argument.

Usage example:

async void Button_Click(object sender, RoutedEventArgs e)
{
    using ThrottledProgress<string> progress = new(msg => TextBox.Text += msg,
        TimeSpan.FromMilliseconds(50));
    Task[] tasks = Enumerable.Range(1, 10)
        .Select(i => Task.Run(() => Worker(i, progress)))
        .ToArray();
    await Task.WhenAll(tasks);
    progress.Flush();
}

The Flush method can be called after the completion of the asynchronous operation, so that any message still buffered and scheduled for future emission is emitted immediately.

For an almost functionally identical version that uses the PeriodicTimer instead of the System.Threading.Timer class, that might be easier to follow but runs only on .NET 6+, see the 2nd revision of this answer.

Inobservance answered 7/7, 2021 at 20:20 Comment(2)
I would recommend you work on your nesting. My rule of thumb is max 3 levels deep.Absorbent
@Absorbent I updated the answer and fixed the extensive nesting issue.Inobservance

Here is a different approach to throttling. I've already posted an answer that is based on a Timer and works well, but another intriguing idea is to throttle by reporting the progress updates to the UI thread one-by-one. The progress values are stored in a buffer, and when the UI thread has finished processing a value, the next value in the buffer is pushed to the UI thread. This scheme prevents the flooding of the UI message loop, allowing a large number of updates without freezing the UI.

The implementation below has a buffer with a default size of 1. If the buffer is full and the UI thread is busy when a new value is reported, the oldest value stored in the buffer is discarded to make room for the new value.

/// <summary>
/// An <see cref="IProgress{T}"/> implementation that reports progress updates
/// to the captured <see cref="SynchronizationContext"/> one by one.
/// </summary>
public class BufferedProgress<T> : IProgress<T>
{
    private readonly List<Entry> _buffer = new();
    private readonly int _boundedCapacity = 1;
    private readonly Action<T> _handler;
    private readonly TaskScheduler _scheduler;

    private record struct Entry(T Value, TaskCompletionSource TCS);

    public BufferedProgress(Action<T> handler)
    {
        ArgumentNullException.ThrowIfNull(handler);
        _handler = handler;
        if (SynchronizationContext.Current is not null)
            _scheduler = TaskScheduler.FromCurrentSynchronizationContext();
        else
            _scheduler = TaskScheduler.Default;
    }

    public int BoundedCapacity
    {
        get => _boundedCapacity;
        init
        {
            ArgumentOutOfRangeException.ThrowIfNegative(value,
                nameof(BoundedCapacity));
            _boundedCapacity = value;
        }
    }

    public void Report(T value)
    {
        TaskCompletionSource discardedTcs = null;
        bool startNewTask = false;
        lock (_buffer)
        {
            if (_buffer.Count > _boundedCapacity)
            {
                // The maximum size of the buffer has been reached.
                if (_boundedCapacity == 0) return;
                Debug.Assert(_buffer.Count >= 2);
                // Discard the oldest inert entry in the buffer, located in index 1.
                // The _buffer[0] is the currently running entry.
                // The currently running entry removes itself when it completes.
                discardedTcs = _buffer[1].TCS;
                _buffer.RemoveAt(1);
            }
            _buffer.Add(new(value, null));
            if (_buffer.Count == 1) startNewTask = true;
        }
        discardedTcs?.SetCanceled(); // Notify any waiter of the discarded value.
        if (startNewTask) StartNewTask(value);
    }

    private void StartNewTask(T value)
    {
        // The starting of the Task is offloaded to the ThreadPool. This allows the
        // UI thread to take a break, so that the UI remains responsive.

        // The Post method is async void, so it never throws synchronously.
        // The async/await below could be omitted, because the Task will always
        // complete successfully.
        ThreadPool.QueueUserWorkItem(async state => await Task.Factory.StartNew(
            Post, state, CancellationToken.None, TaskCreationOptions.DenyChildAttach,
            _scheduler), value);
    }

#pragma warning disable CS1998
    // Since this method is async void, and is executed by a scheduler connected
    // to the captured SynchronizationContext, any error thrown by the _handler
    // is rethrown on the captured SynchronizationContext.
    private async void Post(object state)
#pragma warning restore CS1998
    {
        try
        {
            T value = (T)state;
            _handler(value);
        }
        finally
        {
            TaskCompletionSource finishedTcs;
            (T Value, bool HasValue) next = default;
            lock (_buffer)
            {
                // Remove the finished value from the buffer, and start the next value.
                Debug.Assert(_buffer.Count > 0);
                Debug.Assert(Equals(_buffer[0].Value, state));
                finishedTcs = _buffer[0].TCS;
                _buffer.RemoveAt(0);
                if (_buffer.Count > 0) next = (_buffer[0].Value, true);
            }
            finishedTcs?.SetResult(); // Notify any waiter of the finished value.
            if (next.HasValue) StartNewTask(next.Value);
        }
    }

    /// <summary>
    /// Returns a Task that will complete successfully when the last value
    /// added in the buffer is processed by the captured SynchronizationContext.
    /// In case more values are reported, causing that value to be discarded because
    /// of lack of empty space in the buffer, the Task will complete as canceled.
    /// </summary>
    public Task WaitToFinish()
    {
        lock (_buffer)
        {
            if (_buffer.Count == 0) return Task.CompletedTask;
            Span<Entry> span = CollectionsMarshal.AsSpan(_buffer);
            // ^1 is the last index in the buffer.
            return (span[^1].TCS ??= new TaskCompletionSource(
                TaskCreationOptions.RunContinuationsAsynchronously)).Task;
        }
    }
}

Usage example:

BufferedProgress<string> progress = new(value =>
{
    lblProgress.Text = $"Progress: {value}";
}) { BoundedCapacity = 10 };

//...

await progress.WaitToFinish();

The WaitToFinish method can be called at the end of a long running operation by the UI thread, to wait until all reporting activity has completed.

I've tested the BufferedProgress<T> in a Windows Forms .NET 8 application, and on my PC the Label is updated about 4,000 times per second, without causing any perceivable non-responsiveness or freezing.

Inobservance answered 31/5, 2024 at 16:54 Comment(0)