When processing a very large array, e.g. `Promise.all(A_VERY_LARge_ARRAY_OF_XHR_PROMISE)`, the browser would probably run out of memory trying to send all the requests at the same time.

The solution is to limit the number of concurrent requests and wrap each promise in a thunk:

`Promise.allConcurrent(2)([() => fetch('BLAH1'), () => fetch('BLAH2'), ... () => fetch('BLAHN')])`

With the concurrency set to 2, requests BLAH1 and BLAH2 are sent at the same time. As soon as BLAH1 returns a response and resolves, the request for BLAH3 is sent immediately. This way, the number of requests in flight never exceeds the limit of 2 we just configured.
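`Promise.allConcurrent` is not a built-in API. Here is a minimal sketch of how such a helper could be implemented, assuming the tasks are passed as thunks; the `allConcurrent` name and the worker-pool strategy are my own illustration, not the original poster's code:

```javascript
// Sketch: run thunks with at most `limit` promises in flight at once.
// Tasks must be thunks (functions returning promises), so nothing
// starts until a worker actually invokes it.
function allConcurrent(limit) {
  return (thunks) => {
    const results = new Array(thunks.length);
    let next = 0; // index of the next thunk to start

    // Each "worker" pulls the next thunk as soon as its current one
    // resolves, so at most `limit` promises run at any time.
    async function worker() {
      while (next < thunks.length) {
        const i = next++;
        results[i] = await thunks[i]();
      }
    }

    const workers = Array.from(
      { length: Math.min(limit, thunks.length) },
      worker
    );
    return Promise.all(workers).then(() => results);
  };
}

// Usage example with simulated requests:
const delay = (ms, v) => new Promise((res) => setTimeout(() => res(v), ms));
const tasks = [() => delay(30, 'a'), () => delay(10, 'b'), () => delay(20, 'c')];
allConcurrent(2)(tasks).then((out) => console.log(out)); // logs [ 'a', 'b', 'c' ]
```

Note that results are stored by index, so the output order matches the input order regardless of which request finishes first.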
I know this is four years later, but given it's still a top Google result, I wanted to share my approach / style too. I made a couple of decisions in my implementation:
- `Promise.race` instead of `Promise.all` internally, in order to maximize "worker" usage
- `Promise.all` … native data structures (array for `remainingTasks`, Map for `executingTasks`, and array again for `finishedTasks`) to have native solutions for performance optimizations

If you prefer some of those decisions for your own code then hopefully this solution is useful to you. I did not yet test the function very extensively, so there might be bugs I did not account for yet.