Distributed rate limiting

I have multiple servers/workers going through a task queue and making API requests (Django with Memcached, and Celery for the queue). The API is limited to 10 requests a second. How can I rate limit so that the total number of requests across all servers doesn't exceed that limit?
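
For reference, Celery's per-task rate_limit option is enforced per worker instance, so something like the following sketch (task name and API URL are placeholders) only caps each worker on its own rather than the combined total:

    from celery import shared_task
    import requests

    # rate_limit is a per-worker-instance limit in Celery, not a global one,
    # so several workers running this task can still exceed 10 req/s overall.
    @shared_task(rate_limit='10/s')
    def call_api(payload):
        # Placeholder endpoint; the real API is whatever the queue is hitting.
        return requests.post('https://api.example.com/endpoint', json=payload).status_code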

I've looked through some of the related rate-limiting questions, but I'm guessing they are focused on a more linear, non-concurrent scenario. What sort of approach should I take?

Unconditioned answered 17/12, 2012 at 0:17

Have you looked at RateLimiter from the Guava project? They introduced this class in one of the latest releases, and it seems to partially satisfy your needs.

Admittedly it won't calculate the rate limit across multiple nodes in a distributed environment, but what you could do is configure the rate limit dynamically based on the number of nodes that are running (i.e. for 5 nodes you'd have a rate limit of 2 API requests a second per node).
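
For illustration, here is a minimal per-node token bucket in Python along those lines; this is only a sketch of the static split, not Guava's API, and the class name and the 10-per-second/5-node figures are taken from the example above:

    import threading
    import time

    class TokenBucket:
        # In-process token bucket; each node runs one with its share of the
        # global rate (e.g. 10 req/s total / 5 nodes = 2 req/s per node).
        def __init__(self, rate_per_second):
            self.rate = rate_per_second
            self.capacity = rate_per_second
            self.tokens = self.capacity
            self.updated = time.monotonic()
            self.lock = threading.Lock()

        def acquire(self):
            # Block until one token is available, then consume it.
            while True:
                with self.lock:
                    now = time.monotonic()
                    self.tokens = min(self.capacity,
                                      self.tokens + (now - self.updated) * self.rate)
                    self.updated = now
                    if self.tokens >= 1:
                        self.tokens -= 1
                        return
                    wait = (1 - self.tokens) / self.rate
                time.sleep(wait)

    GLOBAL_LIMIT = 10.0
    NUM_NODES = 5          # has to be kept in sync with the real node count
    limiter = TokenBucket(GLOBAL_LIMIT / NUM_NODES)
    # limiter.acquire()    # call before every API request on this node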

Spectator answered 17/12, 2012 at 4:42

I have been working on an open-source project to solve this exact problem, called Limitd. Although I don't have clients for technologies other than Node yet, the protocol and the idea are simple.

Your feedback is very welcome.

Paresthesia answered 19/4, 2015 at 14:39

I solved that problem, though unfortunately not for your technology: bandwidth-throttle/token-bucket

If you want to implement it yourself, here's the idea:

It's a token bucket algorithm that represents the tokens the bucket contains as a timestamp of when the bucket was last completely empty. Every consumption updates this timestamp (under a lock) so that every process shares the same state.
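
A rough Python sketch of that idea (the library itself is in another language; the store interface, key, and rate/capacity values here are hypothetical, and in a real deployment the read-modify-write would have to run under a shared lock or compare-and-swap, as noted above):

    import time

    RATE = 10.0      # tokens (requests) added per second, shared by all processes
    CAPACITY = 10.0  # maximum burst size

    def consume(store, key, tokens=1.0):
        # The bucket is stored as a single number: the timestamp at which it
        # was last completely empty.  The current fill level is derived from
        # how much time has passed since then.
        now = time.time()
        empty_at = float(store.get(key) or (now - CAPACITY / RATE))
        available = min(CAPACITY, (now - empty_at) * RATE)
        if available < tokens:
            return False                  # not enough tokens yet; retry later
        # Consuming tokens moves the "last empty" timestamp forward in time.
        store.set(key, now - (available - tokens) / RATE)
        return True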

Quittor answered 4/5, 2016 at 6:44
