Node.js - Working with an API limit of 5 requests per second
I've got a script that makes thousands of requests to a specific API. The API only allows 5 requests per second (and it probably measures this differently than I do). To make the requests I'm using the request-promise library, and I've replaced the normal request-promise function with this:

const request_promise = require('request-promise')

function waitRetryPromise() {
  var count = 0 // keeps count of requests
  function rp(options) {
    const timedCall = (resolve) => setTimeout(() => resolve(rp(options)), 1000) // recursive call after a second
    count += 1
    if (count % 3 == 0) { // delays every third request by a second
      return new Promise(timedCall)
    } else {
      return request_promise(options)
    }
  }
  return rp
}

const rp = waitRetryPromise()

Once around 300 requests (give or take) are fired off in short succession, they start to interfere with each other. Does anyone have a better solution? I thought the recursive call to this same function would help, and it did, but it didn't solve the problem. Maybe there is a pattern for queueing requests and doing them a few at a time? A library, perhaps?

Thanks!

Didactics answered 12/10, 2017 at 4:2 Comment(6)
these requests start to interfere with each other in what way?Wilkinson
still like to know what you mean by these requests start to interfere with each otherWilkinson
so, no response to a valid question?Wilkinson
Sorry, I asked this right before bed. What I meant is that even when I delay some requests, some inevitably still start at the same time as later, non-delayed ones, causing the same problem as long as there are enough requests. :-)Didactics
All the answers seem to apply an approximate delay to achieve the desired outcome. I like the idea of restricting the number of calls per unit of time to the API limit and will pursue that.Brough
https://mcmap.net/q/267846/-throttle-and-queue-up-api-requests-due-to-per-second-cap seems to be what I'm looking for.Brough
OK, rather than recursing the call to rp, just make sure you delay between requests by an appropriate amount ... for 5 per second, that's 200 ms:

function waitRetryPromise() {
    let promise = Promise.resolve();
    return function rp(options) {
        return promise = promise
        .then(() => new Promise(resolve => setTimeout(resolve, 200)))
        .then(() => request_promise(options));
    }
}
const rp = waitRetryPromise();
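To see the spacing in action, here is a self-contained sketch of the same chaining pattern; `request_promise` is a hypothetical stub standing in for the real library so the snippet runs without an API:

```javascript
// Stub in place of request-promise (hypothetical, for demonstration only).
const request_promise = (options) => Promise.resolve(`response for ${options.uri}`);

function waitRetryPromise() {
  let promise = Promise.resolve();
  return function rp(options) {
    // Each call chains onto the shared promise, so requests are serialized
    // with a 200 ms gap between them.
    return promise = promise
      .then(() => new Promise(resolve => setTimeout(resolve, 200)))
      .then(() => request_promise(options));
  };
}

const rp = waitRetryPromise();

// Fire three "concurrent" calls; each starts roughly 200 ms after the previous.
const results = [];
const all = Promise.all(
  [1, 2, 3].map(i => rp({ uri: `/item/${i}` }).then(body => results.push(body)))
);
```

Note that the 200 ms gap is measured from the completion of the previous request, so the effective rate stays slightly below 5 per second, which gives a small safety margin against the API's own clock.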
Wilkinson answered 12/10, 2017 at 5:9 Comment(3)
I don't think this will have the desired effect, though I'll try. Since many are queued at the same time, won't they all just have a delay and then start at the same time anyway?Didactics
Yes. I keep messing up the code for some reason. Should've seen my first attempt!!! So derpWilkinson
changed return promise.then.... to return promise = promise.then.... to make it workWilkinson

My code will run the TimedQueue as long as there is work to be done. The process() method resolves when all the work is finished:

class Queue {
    constructor() {
        this.queue = [];
    }

    enqueue(obj) {
        return this.queue.push(obj);
    }

    dequeue() {
        return this.queue.shift();
    }

    hasWork() {
        return (this.queue.length > 0);
    }
}

function t_o(delay) {
    return new Promise(function (resolve, reject) {
        setTimeout(function () {
            resolve();
        }, delay);
    });
}

class TimedQueue extends Queue {
    constructor(delay) {
        super();
        this.delay = delay;
    }

    dequeue() {
        return t_o(this.delay).then(() => {
            return super.dequeue();
        });
    }
    
    process(cb) {
        return this.dequeue().then(data => {
            cb(data);
            if (this.hasWork())
                return this.process(cb);
        });
    }
}


var q = new TimedQueue(500);

for (var request = 0; request < 10; ++request)
    q.enqueue(request);

q.process(console.log).then(function () {
    console.log('done');
});
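To drive real API calls with this pattern, you can enqueue request options instead of numbers and make the request inside the callback. Below is a condensed, self-contained sketch of the same queue; `fetchFromApi` is a hypothetical stand-in for the actual request function:

```javascript
// Condensed version of the TimedQueue above, inlined so this snippet runs standalone.
class TimedQueue {
  constructor(delay) { this.delay = delay; this.queue = []; }
  enqueue(obj) { this.queue.push(obj); }
  process(cb) {
    // Wait `delay` ms, handle one item, then recurse while work remains.
    return new Promise(r => setTimeout(r, this.delay))
      .then(() => cb(this.queue.shift()))
      .then(() => (this.queue.length ? this.process(cb) : undefined));
  }
}

// Hypothetical stand-in for the real API call (e.g. request-promise).
const fetchFromApi = (options) => Promise.resolve({ url: options.url, ok: true });

const q = new TimedQueue(200); // 200 ms gap => at most 5 requests per second
const responses = [];
for (let i = 1; i <= 3; i++) q.enqueue({ url: `/things/${i}` });

const done = q.process(opts => fetchFromApi(opts).then(r => responses.push(r)));
```

Because the callback's returned promise is awaited before the next dequeue, each request fully completes before the next delay starts, so the queue can never exceed the rate limit even if individual requests are slow.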
Allonym answered 12/10, 2017 at 4:33 Comment(6)
In your case, change the TimedQueue's delay to 200 ms (1/5 of a second).Allonym
haven't read the code in detail (I like this better than what I was going to post in some ways), but if q.enqueue is called after q.process will it still work? if q.enqueue is called after console.log('done');, do you have to q.process again?Wilkinson
Sorry, yes, you can add work while the queue is processing and it will continue to run, eventually processing the newly enqueued item, without a second call to process.Allonym
no, you made a valid point, I should've done my own test - once the queue is empty, adding to it means you need to process again (see, I tested it :p )Wilkinson
Process is unique: if the TimedQueue is processing, enqueuing an item between delays will not need a second call to process. If the queue finished beforehand, then yes, you will have to re-enqueue and reprocess. The beauty is that you don't have to write any fancy code to determine that: if the promise was resolved, you know you have to reprocess; if it hasn't, then freely enqueue.Allonym
@Raf I like this solution but I like the other one more, it requires less understanding/complexity. However, please leave it here - it's great and may be exactly what some are looking for.Didactics

You can use a mutex to serialize concurrent calls and release it on a timer, so that at most one call starts per interval:

import { Mutex } from 'async-mutex';

const wait = (ms: number): Promise<void> => new Promise((res) => setTimeout(res, ms));

export class RateLimiter {
    private readonly mutex = new Mutex();

    private delay = 1000 / 2;

    public async run<T>(cb: () => Promise<T>): Promise<T> {
        const release = await this.mutex.acquire();

        try {
            return cb();
        } finally {
            // you can add `await` before to delay the result
            wait(this.delay).then(release);
        }
    }
}

An example usage:

// ...NestJS Controller
private readonly rateLimiter = new RateLimiter();

@Get('random')
public async random() {
    const result = await this.rateLimiter.run(async () => Math.random());

    return result;
}

📍 The mutex lib documentation
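For comparison, the same release-on-a-timer behavior can be sketched without any library by using a promise chain as the lock. This is a minimal, dependency-free sketch, not the async-mutex API:

```javascript
// Minimal promise-chain "mutex": each run() waits for the previous caller's
// lock, and the lock is released `delay` ms after each call starts,
// regardless of when the callback itself finishes.
class RateLimiter {
  constructor(delay) {
    this.delay = delay;
    this.lock = Promise.resolve();
  }
  run(cb) {
    const result = this.lock.then(() => cb());
    // The next caller may start `delay` ms after this one did.
    this.lock = this.lock.then(
      () => new Promise(res => setTimeout(res, this.delay)));
    return result;
  }
}

const limiter = new RateLimiter(200); // 5 per second
const order = [];
const calls = [1, 2, 3].map(n =>
  limiter.run(() => { order.push(n); return Promise.resolve(n * 10); }));
```

This mirrors the class above: releasing on a timer rather than on callback completion means slow responses don't shrink your throughput, but it also means several requests can be in flight at once.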

Wetzel answered 4/4, 2023 at 9:27 Comment(0)

I am not sure, but maybe you can get some ideas from the code below:

function placeAnOrder(orderNumber) {
    console.log("customer order:", orderNumber)
    cookAndDeliverFood(function () {
        console.log("food delivered order:", orderNumber);
    });
}

//  1 sec need to cook

function cookAndDeliverFood(callback){
    setTimeout(callback, 1000);
}

//users web request
placeAnOrder(1);
placeAnOrder(2);
placeAnOrder(3);
Cordiacordial answered 12/10, 2017 at 4:13 Comment(1)
Hey Raj, thanks for posting, but your code is in some ways just the inner part of the code I already have. It won't solve the problem: all you're doing is making each request take 1 second, but all requests (if spawned asynchronously) will still start within the same second.Didactics

© 2022 - 2024 — McMap. All rights reserved.