Handling a large number of promises in JavaScript

in Hive Gaming · 3 years ago (edited)

In my bot for Splinterlands, I ran into the problem of how to deal with a large number of node-fetch requests. There are usually around 50 of them, which, when run in parallel, each return an array of battles that I combine later. On a low-bandwidth network, making 50 simultaneous requests becomes difficult and the overall operation gets cancelled. So I scoured the net for a way to run them sequentially instead. It took a while, but the work got done. This post shows how I used both approaches together to get the job done efficiently.
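For context, each of those requests looks roughly like the sketch below. The endpoint URL and the getBattles name are placeholders for illustration only (and it assumes node-fetch v2's CommonJS require), not the actual Splinterlands API:

const fetch = require('node-fetch');

// Hypothetical shape of one of the ~50 requests (placeholder URL, not the real API)
const getBattles = (player) =>
  fetch(`https://example.com/battles?player=${player}`)
    .then((res) => res.json()); // resolves to an array of battles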

An example

Let's define a function that acts as a proxy for node-fetch: it resolves after a random delay, simulating a request.

const fetchProxy = () => new Promise((resolve) => {
  // Resolve after a random delay of up to 3 seconds, simulating a network request
  const randomWait = 3000 * Math.random();
  setTimeout(() => resolve(randomWait), randomWait);
});

Every async fetch produces a Promise object. Let's say we have an array whose elements we need to run the fetchProxy operation on.

const arrayToWork = [...Array(50).keys()]; // [0, 1, 2, ..., 49]

In parallel

To resolve them in parallel, we can map over the array: map calls fetchProxy on every element right away, so all the promises start at roughly the same time, and Promise.all waits until every one of them has resolved.

const startTime = Date.now(); // starting time

// Logs the time taken to resolve the whole array of promises:
// always less than 3000 ms, because every fetchProxy starts immediately
Promise.all(arrayToWork.map(fetchProxy)).then(() => {
  console.log(Date.now() - startTime);
});
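One caveat worth noting (this is an addition, not from the original snippet): Promise.all rejects as soon as any single request fails, which is exactly what tends to happen on a flaky connection. If you want to keep whatever did succeed, Promise.allSettled (Node 12.9+) is an option:

Promise.allSettled(arrayToWork.map(fetchProxy)).then((results) => {
  // Keep only the fulfilled results; rejected entries carry a `reason` instead of a `value`
  const values = results
    .filter((r) => r.status === 'fulfilled')
    .map((r) => r.value);
  console.log(`${values.length} of ${arrayToWork.length} requests succeeded`);
});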

Sequentially

To resolve them sequentially, we can use the Array.prototype.reduce method. The accumulator (memo here) carries the promise chain built up so far, and each step chains the next fetchProxy call onto it with .then, so a call only starts once the previous one has resolved. The initial value is Promise.resolve(), an already-resolved promise to hang the first call on.

arrayToWork.reduce((memo, i) => memo.then(fetchProxy), Promise.resolve())
  .then(() => console.log(Date.now() - startTime)); // time taken to resolve sequentially
// Will be less than 50 * 3000 ms, i.e. 150,000 ms
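For what it's worth, the same sequential behaviour can also be written as a plain async/await loop; this is just an equivalent sketch, not the approach used in the bot:

const runSequentially = async () => {
  const results = [];
  for (const item of arrayToWork) {
    // Each await pauses the loop until the previous request has resolved
    results.push(await fetchProxy(item));
  }
  console.log(Date.now() - startTime);
  return results;
};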

What if we mix them both?

First, we need to split the array into smaller chunks; the chunk helper below does that.
We then process the chunks sequentially, and the items within each chunk in parallel: the best of both worlds.

// Split arr into consecutive sub-arrays of length n (the last one may be shorter)
const chunk = (arr, n) => {
  if (n <= 0) throw new Error('Second argument to chunk must be a positive integer');
  const result = [];
  let idx = 0;
  while (idx < arr.length) result.push(arr.slice(idx, idx += n));
  return result;
};
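For example, splitting five items into chunks of two:

chunk([0, 1, 2, 3, 4], 2); // => [[0, 1], [2, 3], [4]]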

const chunkSize = 25; // number that reflects your bandwidth's capacity

chunk(arrayToWork, chunkSize).reduce((memo, pieceOfChunk) =>
  // Wait for the previous chunk, then fire every request in this chunk in parallel
  memo.then(() => Promise.all(pieceOfChunk.map(fetchProxy))),
  Promise.resolve()
).then(() => console.log(Date.now() - startTime));
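The same chunked pattern also reads nicely with async/await; again, just an alternative sketch using the same chunk, fetchProxy and chunkSize as above:

const runInChunks = async () => {
  const results = [];
  for (const pieceOfChunk of chunk(arrayToWork, chunkSize)) {
    // Wait for the whole chunk to finish before starting the next one
    results.push(...await Promise.all(pieceOfChunk.map(fetchProxy)));
  }
  console.log(Date.now() - startTime);
  return results;
};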

Conclusion

Now it doesn't matter whether the number of calls is 50 or 1500: the array always gets chunked up, and the chunks complete one after the other. Take a look at the file battles-data.js in my repo.

Check out my other posts written while creating my Splinterlands bot.

Here is the complete example, comparing all three approaches:

// Split arr into consecutive sub-arrays of length n (the last one may be shorter)
const chunk = (arr, n) => {
  if (n <= 0) throw new Error('Second argument to chunk must be a positive integer');
  const result = [];
  let idx = 0;
  while (idx < arr.length) result.push(arr.slice(idx, idx += n));
  return result;
};

// Simulates a fetch that resolves after a random delay of up to 300 ms
const fetchProxy = () => new Promise((resolve) => {
  const randomWait = 300 * Math.random();
  setTimeout(() => resolve(randomWait), randomWait);
});

const reqCount = 50;
const arrayToWork = [...Array(reqCount).keys()]; // [0, 1, ..., 49]
const startTime = Date.now();
console.log(startTime);

// Parallel: everything at once
Promise.all(arrayToWork.map(fetchProxy))
  .then(() => console.log({ 'Parallel ~ 300ms': Date.now() - startTime }));

// Sequential: one request after the other
arrayToWork.reduce((memo, i) => memo.then(fetchProxy), Promise.resolve())
  .then(() => console.log({ [`Sequential ~ ${reqCount}*300ms`]: Date.now() - startTime }));

// Chunked: chunks one after the other, items within a chunk in parallel
const chunkSize = 25; // a number that reflects your bandwidth's capacity
chunk(arrayToWork, chunkSize).reduce((memo, pieceOfChunk) =>
  memo.then(() => Promise.all(pieceOfChunk.map(fetchProxy))),
  Promise.resolve()
).then(() => console.log({ [`Parallel and Sequential ~ < ${reqCount}/${chunkSize} * 300ms`]: Date.now() - startTime }));