Rate Limiting A Queue Of API Calls And Returning The Results
Solution 1:
The sequential for … of loop looks good to me. You can add a default delay for each iteration to slow it down, but you can also simply retry requests later when they fail because of throttling. Notice that this approach only works well when you have only a single source of requests in your app (not multiple concurrent calls to requestForEach); otherwise you'd probably need global coordination.
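The sequential variant with a fixed pause between iterations might look like this (a minimal sketch; the delay helper and the doSomething parameter are stand-ins for the asker's own code):

```javascript
function delay(time) {
  return new Promise(resolve => setTimeout(resolve, time));
}

// Sequential variant: one request at a time, with a fixed pause between
// iterations. `doSomething` stands in for the actual API call.
async function requestForEach(repos, doSomething) {
  const results = [];
  for (const repo of repos) {
    results.push(await doSomething(repo.value));
    await delay(200); // fixed spacing between calls; tune to the API's limits
  }
  return results;
}
```

Because each call awaits the previous one, this never exceeds one request in flight, at the cost of total runtime growing linearly with the number of repos.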
async doSomething(repoName) {
  while (true) {
    try {
      const data = await codecommit.listBranches({
        repoName
      }).promise();
      return data.branches;
    } catch (err) {
      if (err.code == 'ThrottlingException') { // or: if (err.retryable)
        await delay(err.retryDelay ?? 1000);
        continue;
      } else {
        throw err;
      }
    }
  }
}
function delay(time) {
  return new Promise(resolve => {
    setTimeout(resolve, time);
  });
}
Instead of the while (true) loop, a recursive approach might look nicer. Notice that in production code you'll want a limit on the number of retries so that the loop never runs infinitely.
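A capped recursive variant might look like this (a sketch; MAX_RETRIES and callApi are illustrative names, with callApi standing in for the codecommit.listBranches(...).promise() call above):

```javascript
const MAX_RETRIES = 5;

function delay(time) {
  return new Promise(resolve => setTimeout(resolve, time));
}

// Recursive retry with an upper bound, instead of the open-ended while (true).
async function withRetries(callApi, retries = 0) {
  try {
    return await callApi();
  } catch (err) {
    if (err.code === 'ThrottlingException' && retries < MAX_RETRIES) {
      await delay(err.retryDelay ?? 1000);
      return withRetries(callApi, retries + 1); // try again, counting attempts
    }
    throw err; // not retryable, or out of retries
  }
}
```

After MAX_RETRIES failed attempts the last error is rethrown, so a persistently throttled call surfaces as a rejection instead of spinning forever.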
Solution 2:
Looks like you want parallelLimit.
It takes an optional callback which receives the results.
From the docs.
https://caolan.github.io/async/v3/docs.html#parallelLimit
callback function An optional callback to run once all the functions have completed successfully. This function gets a results array (or object) containing all the result arguments passed to the task callbacks. Invoked with (err, results).
Example:
// run 'my_task' 100 times, with parallel limit of 10
var my_task = function(callback) { ... };
var when_done = function(err, results) { ... };

// create an array of tasks
var async_queue = Array(100).fill(my_task);
async.parallelLimit(async_queue, 10, when_done);
Taken from: how to use async.parallelLimit to maximize the amount of (paralle) running processes?
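If you'd rather not pull in the async library, the same limited-parallelism pattern can be hand-rolled in a few lines of plain promises (a sketch; the helper name parallelLimit here mirrors the library's, but this is not the library's API):

```javascript
// Run promise-returning tasks with at most `limit` in flight at once,
// resolving to an array of results in task order. A hand-rolled
// equivalent of async.parallelLimit for promise-based code.
async function parallelLimit(tasks, limit) {
  const results = new Array(tasks.length);
  let next = 0;
  async function worker() {
    while (next < tasks.length) {
      const i = next++; // claim the next task index (safe: no await in between)
      results[i] = await tasks[i]();
    }
  }
  // Start `limit` workers that pull tasks off the shared queue.
  const workers = Array.from({ length: Math.min(limit, tasks.length) }, worker);
  await Promise.all(workers);
  return results;
}
```

Each worker grabs the next unclaimed task as soon as its current one finishes, so the concurrency cap is maintained without any timers or polling.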
Solution 3:
You can make use of Promise.all as below to reduce the total wait time for your API calls:
async requestForEach(repos) {
  return Promise.all(repos.map(repo => this.doSomething(repo.value)));
}
Since you are hitting the rate limit because of the total number of concurrent calls, you can use a library like es6-promise-pool to manage concurrent requests (5/10, based on your requirement). Then update this.doSomething with recursion and a MAX_RETRIES limit (control MAX_RETRIES from an environment variable) as below:
async doSomething(repoName, retries = 0) {
  try {
    const data = await codecommit.listBranches({
      repoName
    }).promise();
    return data.branches;
  } catch (err) {
    if (err.code == 'ThrottlingException' && retries <= MAX_RETRIES) {
      await delay(err.retryDelay ?? 1000); // as per @Bergi's answer
      return this.doSomething(repoName, retries + 1); // recursive retry; return its result
    } else {
      console.log('Issue with repo: ', repoName);
      throw err; // (or) return ''; based on requirement
    }
  }
}
// Filter out the valid results at the end - applicable only if you use return '';
const results = await requestForEach(repos);
const finalResults = results.filter(Boolean);
This approach should reduce the overall wait time in production compared with looping over every request in sequence.