I have an array of arrays of promises, and each inner array may contain 4k, 2k, or 500 promises. In total there are around 60k promises, and I may test it with other values as well.
I need to execute the first batch of calls/promises in parallel with Promise.all(BigArray[0]). Once the first inner array is done, I need to execute the next batch, Promise.all(BigArray[1]), and so on. Batch-to-batch needs to process serially.
If I try to execute Promise.all(BigArray) directly, it throws:
fatal error call_and_retry_2 allocation failed - process out of memory
So, to prevent running out of memory, I need to execute each batch's contents in parallel, and the batches themselves in series.
Here is an example piece of code:
function getInfoForEveryInnerArgument(InnerArray) {
  const CPTPromises = _.map(InnerArray, (argument) => getDBInfo(argument));
  return Promise.all(CPTPromises)
    .then((results) => doSomethingWithResults(results));
}
function mainFunction() {
  const BigArray = [[argument1, argument2, argument3, argument4], [argument5, argument6, argument7, argument8], ....];
  // the sum of all arguments is over 60k...
  const promiseArrayCombination = _.map(BigArray, (InnerArray, key) => getInfoForEveryInnerArgument(InnerArray));
  return Promise.all(promiseArrayCombination).then((fullResults) => {
    console.log(fullResults);
    return fullResults;
  });
}
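For reference, one way to get the desired parallel-within-a-batch, serial-between-batches behavior is an async loop that awaits each Promise.all before moving to the next batch. This is a minimal sketch, not the original code; `processItem` is a hypothetical stand-in for getDBInfo:

```javascript
// Sketch: process batches serially, items within a batch in parallel.
// `processItem` is a placeholder for the real per-argument async call.
async function processBatches(bigArray, processItem) {
  const allResults = [];
  for (const innerArray of bigArray) {
    // Promise.all runs this batch's items concurrently;
    // `await` pauses the loop until the whole batch settles,
    // so the next batch only starts afterwards.
    const batchResults = await Promise.all(innerArray.map(processItem));
    allResults.push(batchResults);
  }
  return allResults;
}
```

Because only one batch's promises exist at a time, memory usage stays bounded by the largest batch rather than the full 60k set.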