JavaScript array .reduce with async/await
Seem to be having some issues incorporating async/await with .reduce(), like so:

const data = await bodies.reduce(async(accum, current, index) => {
  const methodName = methods[index]
  const method = this[methodName]
  if (methodName == 'foo') {
    current.cover = await this.store(current.cover, id)
    console.log(current)
    return {
      ...accum,
      ...current
    }
  }
  return {
    ...accum,
    ...method(current.data)
  }
}, {})
console.log(data)

The data object is logged before this.store completes...

I know you can utilise Promise.all with async loops, but does that apply to .reduce()?

Orjonikidze answered 20/12, 2016 at 13:26 Comment(0)
275

The problem is that your accumulator values are promises - they're the return values of async functions. To get sequential evaluation (and to have all but the last iteration awaited at all), you need to use

const data = await array.reduce(async (accumP, current, index) => {
  const accum = await accumP;
  …
}, Promise.resolve(…));

That said, for async/await I would in general recommend using plain loops instead of array iteration methods; they're more performant and often simpler.
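
For instance, a minimal sketch of that plain-loop approach applied to the snippet from the question (assuming it runs inside the same async method, so that bodies, methods, id and this.store are in scope) could look like this:

// Sketch only: the reduce from the question rewritten as a sequential loop.
let data = {};
for (const [index, current] of bodies.entries()) {
  const methodName = methods[index];
  if (methodName === 'foo') {
    // Wait for the store call before moving on to the next element.
    current.cover = await this.store(current.cover, id);
    data = { ...data, ...current };
  } else {
    data = { ...data, ...this[methodName](current.data) };
  }
}
console.log(data);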

Intravenous answered 20/12, 2016 at 13:31 Comment(21)
Thanks for your advice at the end. I ended up using just a plain for loop for what I was doing, and it was the same lines of code, but far easier to read...Sprag
The initialValue of the reduce does not need to be a Promise; it will, however, in most cases clarify the intent.Celloidin
@Celloidin It should be, though. I really dislike await having to cast a plain value into a promiseIntravenous
@Celloidin And when using TypeScript, the initial value needs to be a promise, because the return type of the callback must always match the type of the accumulator.Devonadevondra
@Devonadevondra I think you mean the initial value (if not I probably misunderstand what you are saying). You could pass null couldn't you?Celloidin
@Celloidin no you can't just pass null. In this case you will have to add a null check into the reducer function body. Something like const accum = await (accumP ?? Promise.resolve(someInitialValue)) which looks disgusting.Jeanettajeanette
@ViacheslavDobromyslov you can. await null simply returns null. A larger example: function test(a) { return a.reduce(async (result, x) => ({ ...(await result), ...(await x) }), null) }, test with: test([Promise.resolve({ x: 1 }), Promise.resolve({ y: 2 })]).then(x => console.log(x)). Please note that I am in favor of using Promise.resolve(...) as initial value.Celloidin
You need to ensure that this await occurs after any expensive task in the reducer, to avoid expensive tasks being needlessly executed in parallel. Unless the task would depend on the accumulated value, but most of the time it wouldn't. Try adding a function that sleeps for 2s before await acc, then after it, and compare . Thought I'd mention this as the accepted answer puts it as the first thing, which you probably don't want to do most of the time.Modification
@Modification The whole question is built around the premise that you want the expensive tasks (in the … part of the reducer) to run in sequence. If you wanted to run them in parallel, you shouldn't use reduce, and you definitely shouldn't do anything before await accumP as otherwise you risk unhandled promise rejections. Just use map+Promise.all for parallel execution.Intravenous
"The data object is logged before the this.store completes... I know you can utilise Promise.all with async loops, but does that apply to .reduce()?" OP was asking why the return value of reduce was logged first and how he could await the complete result, when the whole array was ready. He didn't say anything about the internal order of execution of the reducer.Modification
Do you have more detail (or a link) on the unhandled promise rejections issue? I meant "expensive" from a performance POV btw, so rather slow, but not too expensive to run many in parallel. For example an API call which just involves waiting for a response most of the time. Just wanted to point out that if you await accumulator as the first thing, it's going to force it to be sequential where it might not need to be for everyone who lands on this answer. Demo: jsfiddle.net/t8sr26g4Modification
@Modification See #46889790 and #45285629. If you don't want to do it sequentially, you must use Promise.all. (Even if it's inside the reduce, as const [previous, data] = await Promise.all([accumP, apiCall(item)]);)Intravenous
I did just post a working example showing you how awaiting the accumulator late will cause everything before it to run in parallel. Or do you mean I must use Promise.all instead because there's an inherent problem with async reducers? Can you explain what that problem is and why it means you should await the accumulator before any task?Modification
Neither of those SO's mention implications for reduce, so pardon me for asking to elaborate.Modification
@Modification Did you understand the posts I linked? In your example you've used try/catch so that accum normally never becomes a rejected promise, but if it did (say console.log(value) threw an exception for some reason) and it would reject before the following iteration would await it, you'd get an unhandled rejection warning. Don't use this approach. If you want them to run in parallel, just use the standard Promise.all+map idiom.Intravenous
I mostly understood that they did not mention async reducers at all, which is what we're discussing atm.Modification
If you're using an implementation of console.log that can throw an error, you should indeed always also wrap it with a try-catch block... I didn't wrap it because it's just a console log to improve the output of the example. Any code that can result in exceptions needs to have a try-catch block. That's not any different than with Promise.all. If the whole reducer can fail, it can also be wrapped in a try-catch block. How is this any different for an async reducer?Modification
@Modification In a synchronous reduce, you could just wrap the entire call with try/catch and the exceptions would propagate as expected. You should not need to wrap each part of the code that can throw exceptions on its own - it also might be the wrong place to handle them. But with an asynchronous reducer, writing try { await array.reduce(…) } catch … only works if you properly (immediately) await every promise you deal with - including the accumulator promise.Intravenous
Let us continue this discussion in chat.Modification
too many TS errors with this codeGrapple
@Grapple Well for one it's JS code, not TS, and two the answer says you shouldn't use that style of code anyway. But if you need help converting this to valid TS code, feel free to ask a new question where you share the exact code of your attempt and the compiler error messages you receiveIntravenous
10

The currently accepted answer advises using Promise.all() instead of an async reduce. However, this does not have the same behavior as an async reduce and is only relevant for the case where you want an exception to stop all iterations immediately, which is not always the case.

Additionally, in the comments of that answer it's suggested that you should always await the accumulator as the first statement in the reducer, because otherwise you might risk unhandled promise rejections. The poster also says that this is what the OP was asking for, which is not the case. Instead, the OP just wants to know when everything is done. To know that you do indeed need to await acc, but this could happen at any point in the reducer.

const reducer = async (acc, key) => {
  const response = await api(key);

  return {
    ...await acc, // <-- this would work just as well for OP
    [key]: response,
  }
}
const result = await ['a', 'b', 'c', 'd'].reduce(reducer, {});
console.log(result); // <-- Will be the final result

How to safely use async reduce

That being said, using a reducer this way does mean that you need to guarantee it does not throw, else you will get "unhandled promise rejections". It's perfectly possible to ensure this by using a try-catch, with the catch block returning the accumulator (optionally with a record for the failed API call).

const reducer = async (acc, key) => {
    try {
        const data = await doSlowTask(key);
        return {...await acc, [key]: data};
    } catch (error) {
        return {...await acc, [key]: {error}};
    }
}
const result = await ['a', 'b', 'c', 'd'].reduce(reducer, {});

Difference with Promise.allSettled

You can get close to the behavior of an async reduce (with error catching) by using Promise.allSettled. However this is clunky to use: you need to add another synchronous reduce after it if you want to reduce to an object.

The theoretical time complexity is also higher for Promise.allSettled + a regular reduce, though there are probably very few use cases where this will make a difference. An async reduce can start accumulating from the moment the first item is done, whereas a reduce after Promise.allSettled is blocked until all promises have settled. This could make a difference when looping over a very large number of elements.

const responseTime = 200; //ms
function sleep(ms) {
    return new Promise(resolve => setTimeout(resolve, ms));
}

const api = async (key) => {
    console.log(`Calling API for ${ key }`);
    // Boz is a slow endpoint.
    await sleep(key === 'boz' ? 800 : responseTime);
    console.log(`Got response for ${ key }`);

    if (key === 'bar') throw new Error(`It doesn't work for ${ key }`);

    return {
        [key]: `API says ${ key }`,
    };
};

const keys = ['foo', 'bar', 'baz', 'buz', 'boz'];

const reducer = async (acc, key) => {
    let data;
    try {
        const response = await api(key);
        data = {
            apiData: response
        };
    } catch (e) {
        data = {
            error: e.message
        };
    }

    // OP doesn't care how this works; he only wants to know when the whole thing is ready.
    const previous = await acc;
    console.log(`Got previous for ${ key }`);

    return {
        ...previous,
        [key]: {
            ...data
        },
    };
};
(async () => {
    const start = performance.now();
    const result = await keys.reduce(reducer, {});
    console.log(`After ${ performance.now() - start }ms`, result); // <-- OP wants to execute things when it's ready.
})();

Check the order of execution with Promise.allSettled:

const responseTime = 200; //ms
function sleep(ms) {
    return new Promise(resolve => setTimeout(resolve, ms));
}

const api = async (key) => {
    console.log(`Calling API for ${ key }`);
    // Boz is a slow endpoint.
    await sleep(key === 'boz' ? 800 : responseTime);
    console.log(`Got response for ${ key }`);

    if (key === 'bar') throw new Error(`It doesn't work for ${ key }`);

    return {
        key,
        data: `API says ${ key }`,
    };
};

const keys = ['foo', 'bar', 'baz', 'buz', 'boz'];

(async () => {
    const start = performance.now();
    const apiResponses = await Promise.allSettled(keys.map(api));
    const result = apiResponses.reduce((acc, {status, reason, value}, idx) => {
        // For rejected promises `value` is undefined, so take the key from the original array.
        const key = keys[idx];
        console.log(`Got previous for ${ key }`);
        return {
            ...acc,
            [key]: status === 'fulfilled' ? {apiData: value.data} : {error: reason.message},
        };
    }, {});
    console.log(`After ${ performance.now() - start }ms`, result); // <-- OP wants to execute things when it's ready.
})();
Modification answered 20/8, 2021 at 12:52 Comment(0)
9

I like Bergi's answer, I think it's the right way to go.

I'd also like to mention a library of mine, called Awaity.js, which lets you effortlessly use functions like reduce, map & filter with async/await:

import reduce from 'awaity/reduce';

const posts = await reduce([1,2,3], async (posts, id) => {

  const res = await fetch('/api/posts/' + id);
  const post = await res.json();

  return {
    ...posts,
    [id]: post
  };
}, {})

posts // { 1: { ... }, 2: { ... }, 3: { ... } }
Catrinacatriona answered 9/3, 2018 at 16:45 Comment(2)
Is each pass going to be sequential? Or calls all of those await functions in a batch?Carin
Sequential, since each iteration depends on the return value of the previous oneCatrinacatriona
9

[Not addressing the OP's exact problem; focused on others who land here.]

Reduce is commonly used when you need the result of the previous steps before you can process the next. In that case, you can string promises together a la:

const promise = elts.reduce(
    async (promise, elt) => {
        return promise.then(async last => {
            return await f(last, elt)
        })
    }, Promise.resolve(0)) // or "" or [] or ...

Here's an example which uses fs.promises.mkdir() (sure, it's much simpler to use mkdirSync, but in my case, it's across a network):

const Path = require('path')
const Fs = require('fs')

async function mkdirs (path) {
    return path.split(/\//).filter(d => !!d).reduce(
        async (promise, dir) => {
            return promise.then(async parent => {
                const ret = Path.join(parent, dir);
                try {
                    await Fs.promises.lstat(ret)
                } catch (e) {
                    console.log(`mkdir(${ret})`)
                    await Fs.promises.mkdir(ret)
                }
                return ret
            })
        }, Promise.resolve(""))
}

mkdirs('dir1/dir2/dir3')

Below is another example which adds 100 + 200 + ... + 500 and waits around a bit:

async function slowCounter () {
    const ret = await ([100, 200, 300, 400, 500]).reduce(
        async (promise, wait, idx) => {
            return promise.then(async last => {
                const ret = last + wait
                console.log(`${idx}: waiting ${wait}ms to return ${ret}`)
                await new Promise((res, rej) => setTimeout(res, wait))
                return ret
            })
        }, Promise.resolve(0))
    console.log(ret)
}

slowCounter ()
Sc answered 29/3, 2020 at 23:43 Comment(0)
4

Sometimes the best thing to do is simply put both code versions side by side, sync and async:

Sync version:

const arr = [1, 2, 3, 4, 5];

const syncRev = arr.reduce((acc, i) => [i, ...acc], []); // [5, 4, 3, 2, 1] 

Async one:

(async () => { 
   const asyncRev = await arr.reduce(async (promisedAcc, i) => {
      const id = await asyncIdentity(i); // could be id = i, just stubbing async op.
      const acc = await promisedAcc;
      return [id, ...acc];
   }, Promise.resolve([]));   // [5, 4, 3, 2, 1] 
})();

//async stuff
async function asyncIdentity(id) {
   return Promise.resolve(id);
}

const arr = [1, 2, 3, 4, 5];
(async () => {
    const asyncRev = await arr.reduce(async (promisedAcc, i) => {
        const id = await asyncIdentity(i);
        const acc = await promisedAcc;
        return [id, ...acc];
    }, Promise.resolve([]));

    console.log('asyncRev :>> ', asyncRev);
})();

const syncRev = arr.reduce((acc, i) => [i, ...acc], []);

console.log('syncRev :>> ', syncRev);

async function asyncIdentity(id) {
    return Promise.resolve(id);
}
Pasture answered 2/4, 2021 at 23:2 Comment(4)
This fails to properly handle errors, see #46889790 and #45285629. Absolutely never use this pattern!Intravenous
You can absolutely use this pattern and also properly handle errors, if you wrap your reducer body with a try catch block, so that it always is able to return the accumulated value.Modification
I ran this without "Promise.resolve" as the second argument to reduce. It seems to work fine. Could you please explain the purpose of adding Promise.resolve and why it works even without it?Stheno
I actually program using typescript, so not using Promise.resolve(...) as the initial value is not possible, since the type of acc (or anything returned by an async function) is a promise (and Promise.resolve is a way to "box" the initial value). About the second question, I suppose it works (in js) because the await inside the function is for "unboxing" the promise. It turns out that it (await) works for "already unboxed" values too.Pasture
2

For TypeScript, the previous value and the initial value need to have the same type.

const data = await array.reduce(async (accumP: Promise<Tout>, curr: Tin) => {
    const accum: Tout = await accumP;

    // ...do some stuff...

    return accum;
}, Promise.resolve({} as Tout));
Woodwaxen answered 8/3, 2022 at 0:47 Comment(0)
1

You can wrap your entire map/reduce iterator blocks into their own Promise.resolve and await on that to complete. The issue, though, is that the accumulator doesn't contain the resulting data/object you'd expect on each iteration. Due to the internal async/await/Promise chain, the accumulator values will be actual Promises themselves that likely have yet to resolve, despite the await keyword before your call to the store (which might lead you to believe that the iteration won't actually return until that call completes and the accumulator is updated).

While this is not the most elegant solution, one option you have is to move your data object variable into the enclosing scope and declare it with let so that proper binding and mutation can occur. Then update this data object from inside your iterator as the async/await/Promise calls resolve.

/* allow the result object to be initialized outside of scope 
   rather than trying to spread results into your accumulator on iterations, 
   else your results will not be maintained as expected within the 
   internal async/await/Promise chain.
*/    
let data = {}; 

await Promise.resolve(bodies.reduce(async(accum, current, index) => {
  const methodName = methods[index]
  const method = this[methodName];
  if (methodName == 'foo') {
    // note: this extra Promise.resolve may not be entirely necessary
    const cover = await Promise.resolve(this.store(current.cover, id));
    current.cover = cover;
    console.log(current);
    data = {
      ...data,
      ...current,
    };
    return data;
  }
  data = {
    ...data,
    ...method(current.data)
  };
  return data;
}, {}));
console.log(data);
Dedal answered 10/8, 2017 at 21:48 Comment(1)
"the accumulator will be actual Promises themselves" - yes, and your solution never waits for them. It only waits for the promise returned from the last iteration, but if that resolves faster than the previous ones, your console.log(data) will be incomplete. This solution does not work. You should just use Promise.all.Intravenous
0

export const addMultiTextData = async (data) => {
  // Sequentially validate each entry and collect the results keyed by object id.
  const textData = await data.reduce(async (a, {
    currentObject,
    selectedValue
  }) => {
    const {
      error,
      errorMessage
    } = await validate(selectedValue, currentObject);
    return {
      ...await a,
      [currentObject.id]: {
        text: selectedValue,
        error,
        errorMessage
      }
    };
  }, {});

  return textData;
};
Centurion answered 17/4, 2019 at 10:22 Comment(3)
While this code snippet may solve the question, including an explanation really helps to improve the quality of your post. Remember that you are answering the question for readers in the future, and those people might not know the reasons for your code suggestion.Goatfish
Not to mention I wouldn't even recommend this approach, since using spread operators in loops is very performance-heavy.Furtek
This fails to properly handle errors, see #46889790 and #45285629. Absolutely never use this pattern!Intravenous
0

Here's how to make an async reduce:

async function asyncReduce(arr, fn, initialValue) {
  let temp = initialValue;

  for (let idx = 0; idx < arr.length; idx += 1) {
    const cur = arr[idx];

    temp = await fn(temp, cur, idx);
  }

  return temp;
}
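
For illustration, here's a hypothetical usage of this helper inside an async function (fetchUser is a made-up async function standing in for whatever async work you need per item):

// Hypothetical usage of asyncReduce; fetchUser is an assumed async function.
const ids = [1, 2, 3];
const users = await asyncReduce(
  ids,
  async (acc, id) => ({ ...acc, [id]: await fetchUser(id) }),
  {}
);
// users -> { 1: {...}, 2: {...}, 3: {...} }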
Superelevation answered 19/11, 2020 at 10:46 Comment(0)
0

Another classic option with Bluebird

const promise = require('bluebird');

promise.reduce([1, 2, 3], (agg, x) => Promise.resolve(agg + x), 0).then(console.log);

// Expected to produce the sum 6
Suspensor answered 15/7, 2021 at 8:3 Comment(0)
0

My solution for .reduce in TypeScript

Thanks to this person https://dev.to/arnaudcourtecuisse/comment/1el22

const userOrders = await existUsersWithName.reduce(
      async (promise, existUserAndName) => {
        const acc = await promise;

        const {user, name} = existUserAndName;

        // My async function
        acc[user] = await this.users.getOrders(name);

        return acc;
      },
      <Promise<Record<string, string[] | undefined>>>{}
    );
Aenneea answered 5/11, 2022 at 11:51 Comment(0)
