Node.js — Run Async Functions in Batches

When working with arrays of data, you may need to run an asynchronous operation on each item. The Promise.all() method provides a convenient interface to run a list of promises in parallel, but it has some limitations.

Using Promise.all() may overload the available resources, especially when reaching out to the Internet or interacting with a database. Depending on your list size, you may send thousands of requests in parallel into the wild.

This tutorial shows you how to use a promise pool to run a batch of promises in parallel and the batches themselves in sequence.


Run Async Functions/Promises in Batches

Let’s say you have a list of 500 items and you need to perform an asynchronous operation. You don’t want to run the processing for all 500 items in parallel. At the same time, you also don’t want to waste resources by running all 500 items in sequence.

The idea: chunk the list with 500 items into smaller lists with 20 items each and run 20 items in parallel.
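This chunking idea can be sketched with plain JavaScript and Promise.all alone. The helper names (chunk, processInBatches) are my own, purely for illustration:

```javascript
// Split a list into chunks of the given size.
function chunk (items, size) {
  const chunks = []

  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size))
  }

  return chunks
}

// Run each chunk's items in parallel, and the chunks themselves in sequence.
async function processInBatches (items, size, handler) {
  const results = []

  for (const batch of chunk(items, size)) {
    const processed = await Promise.all(batch.map(handler))
    results.push(...processed)
  }

  return results
}
```

Notice the shortcoming baked into this sketch: every batch must fully finish before the next one starts, so a single slow item stalls the other 19 slots. A promise pool avoids exactly that.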

Use a Promise Pool

We recommend a promise pool to run batches of items concurrently. A promise pool runs the asynchronous operation for at most X items in parallel. Precisely, it handles the async pooling for you and always keeps the maximum allowed number of operations running.

The @supercharge/promise-pool package provides map-like, concurrent promise processing.

Here’s an example using a promise pool to process a list of 50,000 users. The pool runs at most 20 operations concurrently:

const { PromisePool } = require('@supercharge/promise-pool')

const users = [
  { id: 1, name: 'Marcus' },
  { id: 2, name: 'Norman' },
  { id: 3, name: 'Christian' },
  { id: 50000, name: 'Future Studio' }
]

/**
 * Process the list of 50,000 users with a concurrency of 20 items.
 * The promise pool takes the next task from the list as soon
 * as one of the active tasks in the pool finishes.
 */
const { results, errors } = await PromisePool
  .withConcurrency(20)
  .for(users)
  .process(async data => {
    const user = await User.createIfNotExisting(data)

    return user
  })
The promise pool iterates through all items in the list, even if an operation throws an error. The pool collects all errors and results and returns them in an object. You may use destructuring to access the results and errors properties directly.

Try to Avoid Promise.all()

JavaScript’s global Promise class provides the static all method which accepts an array of promises. The downsides of this approach: it starts all promises in the list simultaneously, and a single rejection causes the whole Promise.all call to reject, discarding the results of every other item.
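A small sketch makes both downsides visible. The task values and error message here are illustrative:

```javascript
const tasks = [1, 2, 3, 4, 5]

// Every mapped promise starts right away — with 5,000 items,
// that would be 5,000 simultaneous operations.
const all = tasks.map(async n => {
  if (n === 3) throw new Error('item 3 failed')

  return n * 10
})

Promise.all(all)
  .then(results => console.log(results))
  .catch(error => {
    // Promise.all rejects as soon as one promise rejects —
    // the results of the other items are lost.
    console.log(error.message) // 'item 3 failed'
  })
```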

You may chunk your original list of items into smaller lists and use Promise.all on these smaller lists. This leads to the shortcoming that all promises in a batch must resolve before processing the next batch. This is where the promise pool shines: it takes the next task from the list of operations as soon as one in the batch finishes.
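To make that difference concrete, here’s a minimal hand-rolled pool. This is an illustrative sketch of the technique, not the package’s actual implementation: a fixed number of workers pull items from a shared index, so a slow item never holds up an entire batch:

```javascript
// Run `handler` for every item with at most `concurrency` operations
// in flight. Each worker grabs the next index as soon as its
// current item finishes — no batch boundaries.
async function promisePool (items, concurrency, handler) {
  const results = new Array(items.length)
  let nextIndex = 0

  async function worker () {
    while (nextIndex < items.length) {
      const index = nextIndex++
      results[index] = await handler(items[index])
    }
  }

  // Start the workers and wait until all items are processed.
  const workers = Array.from(
    { length: Math.min(concurrency, items.length) },
    () => worker()
  )
  await Promise.all(workers)

  return results
}
```

Because the index is claimed synchronously before each await, JavaScript’s single-threaded event loop guarantees no two workers process the same item.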
