With ThrottlingTroll, this can be achieved as follows.
- Configure your HttpClient like this:
static HttpClient myThrottledHttpClient = new HttpClient
(
    new ThrottlingTrollHandler
    (
        // Consider using RedisCounterStore instead, so that rate counters are stored in a distributed cache
        counterStore: new MemoryCacheCounterStore(),

        config: new ThrottlingTrollEgressConfig
        {
            Rules = new[]
            {
                new ThrottlingTrollRule
                {
                    // No more than 10 requests per second
                    LimitMethod = new FixedWindowRateLimitMethod
                    {
                        PermitLimit = 10,
                        IntervalInSeconds = 1,
                    }
                },
            }
        }
    )
);
- Then use it like this:
[Function("QueueTrigger")]
[QueueOutput(QueueName)]
public async Task<string> Run([QueueTrigger(QueueName)] string msg)
{
// Making the call via an HttpClient scaffolded with ThrottlingTroll
var response = await myThrottledHttpClient.GetAsync("https://my-http-endpoint");
// If request rate limit is exceeded, myThrottledHttpClient will return this status by itself, _without_ making the actual call.
if (response.StatusCode == System.Net.HttpStatusCode.TooManyRequests)
{
// Just to reduce the load to the queue
await Task.Delay(1000);
// Re-queueing the same message
return msg;
}
// Do whatever else needed with message and response
return null;
}
This ThrottlingTroll-equipped HttpClient limits itself: once the rate limit is exceeded, it returns 429 TooManyRequests without making the actual call. When that happens, the Function simply puts the message back into the same queue, to be retried later.
If your Function runs on multiple instances, use RedisCounterStore instead of MemoryCacheCounterStore, so that the rate counters are shared, and the limit is enforced across all instances combined, as sketched below.
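Here is a minimal sketch of that distributed setup. It assumes the ThrottlingTroll.CounterStores.Redis package, whose RedisCounterStore is constructed from a StackExchange.Redis ConnectionMultiplexer; the connection string value and the exact using directives shown are placeholders/assumptions, not verbatim from this article.

```cs
using StackExchange.Redis;                  // assumed: StackExchange.Redis package
using ThrottlingTroll;
using ThrottlingTroll.CounterStores.Redis;  // assumed namespace for RedisCounterStore

static HttpClient myThrottledHttpClient = new HttpClient
(
    new ThrottlingTrollHandler
    (
        // Rate counters now live in Redis, so they are shared by all Function instances.
        // "my-redis-connection-string" is a placeholder for your actual Redis connection string.
        counterStore: new RedisCounterStore(ConnectionMultiplexer.Connect("my-redis-connection-string")),

        config: new ThrottlingTrollEgressConfig
        {
            Rules = new[]
            {
                new ThrottlingTrollRule
                {
                    // Still no more than 10 requests per second, but now counted across all instances combined
                    LimitMethod = new FixedWindowRateLimitMethod
                    {
                        PermitLimit = 10,
                        IntervalInSeconds = 1,
                    }
                },
            }
        }
    )
);
```

Only the counter store changes; the rule and the Function code above stay exactly the same.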