Using async/await with a forEach loop

Are there any issues with using async/await in a forEach loop? I'm trying to loop through an array of files and await on the contents of each file.



import fs from 'fs-promise'

async function printFiles () {
  const files = await getFilePaths() // Assume this works fine

  files.forEach(async (file) => {
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  })
}

printFiles()



This code does work, but could something go wrong with it? I had someone tell me that you're not supposed to use async/await in a higher-order function like this, so I wanted to ask whether there is any issue with doing so.






12 Answers



Sure the code does work, but I'm pretty sure it doesn't do what you expect it to do. It just fires off multiple asynchronous calls, and the printFiles function returns immediately after that.
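To make that concrete, here's a minimal sketch (using Promise.resolve() in place of real file I/O, purely for illustration): the surrounding function pushes 'done' before any of the forEach callbacks get to run.

```javascript
const order = [];

function demo () {
  [1, 2].forEach(async (n) => {
    await Promise.resolve(); // simulate an async operation
    order.push(n);           // runs later, as a microtask
  });
  order.push('done');        // runs immediately — forEach didn't wait
}

demo();
console.log(order); // 'done' is first; 1 and 2 haven't been pushed yet
```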





If you want to read the files in sequence, you cannot use forEach indeed. Just use a modern for … of loop instead, in which await will work as expected:




async function printFiles () {
  const files = await getFilePaths();

  for (const file of files) {
    const contents = await fs.readFile(file, 'utf8');
    console.log(contents);
  }
}



If you want to read the files in parallel, you cannot use forEach either. Each of the async callback calls does return a promise, but you're throwing them away instead of awaiting them. Just use map instead, and you can await the array of promises that you'll get with Promise.all:




async function printFiles () {
  const files = await getFilePaths();

  await Promise.all(files.map(async (file) => {
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  }));
}





Could you please explain why for ... of ... works?
– Demonbane
Aug 15 '16 at 18:04







Ok, I know why... Using Babel will transform async/await to generator functions, and using forEach means that each iteration has an individual generator function, which has nothing to do with the others, so they will be executed independently and have no next() context linking them. Actually, a simple for() loop also works, because all the iterations are in one single generator function.
– Demonbane
Aug 15 '16 at 19:21








@Demonbane: In short, because it was designed to work :-) await suspends the current function evaluation, including all control structures. Yes, it is quite similar to generators in that regard (which is why they are used to polyfill async/await).
– Bergi
Aug 15 '16 at 23:28







@arve0 Not really, an async function is quite different from a Promise executor callback, but yes the map callback returns a promise in both cases.
– Bergi
Mar 29 '17 at 16:25







When you come to learn about JS promises, but instead spend half an hour translating Latin. Hope you're proud @Bergi ;)
– Félix Gagnon-Grenier
May 16 '17 at 21:04




To me, using Promise.all() with map() is a bit difficult to understand and verbose, but if you want to do it in plain JS, that's your best shot I guess.





If you don't mind adding a module, I implemented the Array iteration methods so they can be used in a very straightforward way with async/await.



An example with your case:


const { forEach } = require('p-iteration');
const fs = require('fs-promise');

async function printFiles () {
  const files = await getFilePaths();

  await forEach(files, async (file) => {
    const contents = await fs.readFile(file, 'utf8');
    console.log(contents);
  });
}

printFiles()








Wow, p-iteration is so smooth. Saved my day!
– Antonio Torres
Dec 7 '17 at 0:52



Here are some forEach async prototypes:


Array.prototype.forEachAsync = async function (fn) {
  for (let t of this) { await fn(t) }
}

Array.prototype.forEachAsyncParallel = async function (fn) {
  await Promise.all(this.map(fn));
}





Although I'd hesitate to add things directly to the prototype, this is a nice async forEach implementation
– DaniOcean
Mar 28 at 13:55





As long as the name is unique in the future (like I'd use _forEachAsync) this is reasonable. I also think it's the nicest answer as it saves a lot of boilerplate code.
– mikemaccana
Apr 3 at 13:29





Instead of Promise.all in conjunction with Array.prototype.map (which does not guarantee the order in which the Promises are resolved), I use Array.prototype.reduce, starting with a resolved Promise:




async function printFiles () {
  const files = await getFilePaths();

  await files.reduce(async (promise, file) => {
    // This line will wait for the last async function to finish.
    // The first iteration uses an already resolved Promise
    // so, it will immediately continue.
    await promise;
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  }, Promise.resolve());
}





This works perfectly, thank you so much. Could you explain what is happening here with Promise.resolve() and await promise;?
– parrker9
Mar 28 at 20:48







This is pretty cool. Am I right in thinking the files will be read in order and not all at once?
– GollyJer
Jun 9 at 0:24





This is very clever! Thank you!
– Micah Henning
Jun 15 at 17:13





@parrker9 Promise.resolve() returns an already resolved Promise object, so that reduce has a Promise to start with. await promise; will wait for the last Promise in the chain to resolve. @GollyJer The files will be processed sequentially, one at a time.
– Timothy Zorn
Jun 17 at 15:00






It's pretty painless to pop a couple of methods into a file that will handle asynchronous data in a serialized order and give a more conventional flavour to your code. For example:


module.exports = function () {
  var self = this;

  this.each = async (items, fn) => {
    if (items && items.length) {
      await Promise.all(
        items.map(async (item) => {
          await fn(item);
        }));
    }
  };

  this.reduce = async (items, fn, initialValue) => {
    await self.each(
      items, async (item) => {
        initialValue = await fn(initialValue, item);
      });
    return initialValue;
  };
};



now, assuming that's saved at './myAsync.js' you can do something similar to the below in an adjacent file:


...
/* your server setup here */
...
var MyAsync = require('./myAsync');
var Cat = require('./models/Cat');
var Doje = require('./models/Doje');
var example = async () => {
  var myAsync = new MyAsync();
  var doje = await Doje.findOne({ name: 'Doje', noises: [] }).save();
  var cleanParams = [];

  // FOR EACH EXAMPLE
  await myAsync.each(['bork', 'concern', 'heck'],
    async (elem) => {
      if (elem !== 'heck') {
        await doje.update({ $push: { 'noises': elem }});
      }
    });

  var cat = await Cat.findOne({ name: 'Nyan' });

  // REDUCE EXAMPLE
  var friendsOfNyanCat = await myAsync.reduce(cat.friends,
    async (catArray, friendId) => {
      var friend = await Friend.findById(friendId);
      if (friend.name !== 'Long cat') {
        catArray.push(friend.name);
      }
      return catArray; // the accumulator must be returned for the next iteration
    }, []);
  // Assuming Long Cat was a friend of Nyan Cat...
  assert(friendsOfNyanCat.length === (cat.friends.length - 1));
}





Minor addendum, don't forget to wrap your await/asyncs in try/catch blocks!!
– Jay Edwards
Sep 26 '17 at 9:08



With ES2018, you are able to greatly simplify all of the above answers to:


async function printFiles () {
  const files = await getFilePaths()

  for await (const contents of files.map(file => fs.readFile(file, 'utf8'))) {
    console.log(contents)
  }
}



See spec: https://github.com/tc39/proposal-async-iteration





Upvoted, would be great if you could put a link to the spec in your answer for anyone who wants to know more about async iteration.
– saadq
Jun 15 at 16:40



Both of the solutions above work; however, Antonio's does the job with less code. Here is how it helped me resolve data from my database, from several different child refs, pushing it all into an array and resolving it in a promise after all is done:


const allItems = []; // collects results from every pack

Promise.all(PacksList.map((pack) => {
  return fireBaseRef.child(pack.folderPath).once('value', (snap) => {
    snap.forEach(childSnap => {
      const file = childSnap.val()
      file.id = childSnap.key;
      allItems.push(file)
    })
  })
})).then(() => store.dispatch(actions.allMockupItems(allItems)))



Using Task, futurize, and a traversable List, you can simply do


async function printFiles() {
  const files = await getFiles();

  List(files).traverse( Task.of, f => readFile( f, 'utf-8'))
    .fork( console.error, console.log)
}



Here is how you'd set this up


import fs from 'fs';
import { futurize } from 'futurize';
import Task from 'data.task';
import { List } from 'immutable-ext';

const future = futurize(Task)
const readFile = future(fs.readFile)



Another way to have structured the desired code would be


const printFiles = files =>
  List(files).traverse( Task.of, fn => readFile( fn, 'utf-8'))
    .fork( console.error, console.log)



Or perhaps even more functionally oriented


// 90% of encodings are utf-8, making that use case super easy is prudent

// handy-library.js
export const readFile = f =>
  future(fs.readFile)( f, 'utf-8' )

export const arrayToTaskList = list => taskFn =>
  List(list).traverse( Task.of, taskFn )

export const readFiles = files =>
  arrayToTaskList(files)(readFile)

export const printFiles = files =>
  readFiles(files).fork( console.error, console.log)



Then from the parent function


async function main() {
  /* awesome code with side-effects before */
  printFiles( await getFiles() );
  /* awesome code with side-effects after */
}



If you really wanted more flexibility in encoding, you could just do this (for fun, I'm using the proposed Pipe Forward operator):


import { curry, flip } from 'ramda'

export const readFile = fs.readFile
  |> future
  |> curry
  |> flip

export const readFileUtf8 = readFile('utf-8')



PS - I didn't try this code on the console, might have some typos... "straight freestyle, off the top of the dome!" as the 90s kids would say. :-p





FWIW, ++1 on this. It's an elegant implementation.
– Donald E. Foss
Apr 3 at 18:53



In addition to @Bergi’s answer, I’d like to offer a third alternative. It's very similar to @Bergi’s 2nd example, but instead of awaiting each readFile individually, you create an array of promises, all of which you await at the end.




import fs from 'fs-promise';

async function printFiles () {
  const files = await getFilePaths();

  const promises = files.map((file) => fs.readFile(file, 'utf8'))

  const contents = await Promise.all(promises)

  contents.forEach(console.log);
}



Note that the function passed to .map() does not need to be async, since fs.readFile returns a Promise object anyway. Therefore promises is an array of Promise objects, which can be sent to Promise.all().
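As a small illustration of that point (the numeric doubling here is made up for the example), a plain callback that returns a promise and an async callback are interchangeable as far as Promise.all() is concerned:

```javascript
// Both produce an array of promises; Promise.all() treats them identically.
const plain   = [1, 2, 3].map((n) => Promise.resolve(n * 2));
const wrapped = [1, 2, 3].map(async (n) => n * 2);

Promise.all(plain).then((r) => console.log(r));   // [ 2, 4, 6 ]
Promise.all(wrapped).then((r) => console.log(r)); // [ 2, 4, 6 ]
```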


.map()


async


fs.readFile


promises


Promise.all()



In @Bergi’s answer, the console may log file contents out of order. For example if a really small file finishes reading before a really large file, it will be logged first, even if the small file comes after the large file in the files array. However, in my method above, you are guaranteed the console will log the files in the same order as they are read.
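A quick sketch of that guarantee (using a timer in place of a large file, so the delay is illustrative): even though the second promise settles first, Promise.all() reports results in input order.

```javascript
const slow = new Promise((resolve) => setTimeout(() => resolve('slow'), 50));
const fast = Promise.resolve('fast');

Promise.all([slow, fast]).then((results) => {
  // Results follow input order, not completion order.
  console.log(results); // [ 'slow', 'fast' ]
});
```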





One important caveat is: the await + for .. of method and the forEach + async way actually have different effects.





Having await inside a real for loop will make sure all async calls are executed one by one. The forEach + async way fires off all the promises at the same time, which is faster but can sometimes be overwhelming (if you do some DB queries or visit some web services with volume restrictions and do not want to fire 100,000 calls at a time).
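If you need something between fully sequential and fully parallel, one hedged sketch (the function name and batch size here are made up for illustration) is to process the array in fixed-size batches, so at most batchSize calls are in flight at once:

```javascript
// Processes `items` in chunks of `batchSize`, awaiting each chunk before
// starting the next; results keep the original input order.
async function processInBatches (items, batchSize, fn) {
  const results = [];
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);
    results.push(...await Promise.all(batch.map(fn)));
  }
  return results;
}

// e.g. at most 2 "reads" at a time:
processInBatches([1, 2, 3, 4, 5], 2, async (n) => n * 10)
  .then((r) => console.log(r)); // [ 10, 20, 30, 40, 50 ]
```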





You can also use reduce + promise (less elegant) if you do not use async/await and want to make sure files are read one after another.




files.reduce((lastPromise, file) =>
  lastPromise.then(() =>
    fs.readFile(file, 'utf8')
  ), Promise.resolve()
)



Or you can create a forEachAsync helper, which basically uses the same underlying for loop:


Array.prototype.forEachAsync = async function(cb){
  for(let x of this){
    await cb(x);
  }
}





Have a look at How to define method in javascript on Array.prototype and Object.prototype so that it doesn't appear in for in loop. Also you probably should use the same iteration as native forEach - accessing indices instead of relying on iterability - and pass the index to the callback.
– Bergi
Nov 16 '17 at 13:57







You can use Array.prototype.reduce in a way that uses an async function. I've shown an example in my answer: stackoverflow.com/a/49499491/2537258
– Timothy Zorn
Mar 26 at 19:54





Similar to Antonio Val's p-iteration, an alternative npm module is async-af:




const AsyncAF = require('async-af');
const fs = require('fs-promise');

function printFiles() {
  // since AsyncAF accepts promises or non-promises, there's no need to await here
  const files = getFilePaths();

  AsyncAF(files).forEach(async file => {
    const contents = await fs.readFile(file, 'utf8');
    console.log(contents);
  });
}

printFiles();



Alternatively, async-af has a static method (log/logAF) that logs the results of promises:




const AsyncAF = require('async-af');
const fs = require('fs-promise');

function printFiles() {
  const files = getFilePaths();

  AsyncAF(files).forEach(file => {
    AsyncAF.log(fs.readFile(file, 'utf8'));
  });
}

printFiles();



However, the main advantage of the library is that you can chain asynchronous methods to do something like:


const aaf = require('async-af');
const fs = require('fs-promise');

const printFiles = () => aaf(getFilePaths())
  .map(file => fs.readFile(file, 'utf8'))
  .forEach(file => aaf.log(file));

printFiles();






I would use the well-tested (millions of downloads per week) pify and async modules. If you are unfamiliar with the async module, I highly recommend you check out its docs. I've seen multiple devs waste time recreating its methods, or worse, making difficult-to-maintain async code when higher-order async methods would simplify code.




const async = require('async')
const fs = require('fs-promise')
const pify = require('pify')

async function getFilePaths() {
  return Promise.resolve([
    './package.json',
    './package-lock.json',
  ]);
}

async function printFiles () {
  const files = await getFilePaths()

  await pify(async.eachSeries)(files, async (file) => { // <-- run in series
  // await pify(async.each)(files, async (file) => { // <-- run in parallel
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  })
  console.log('HAMBONE')
}

printFiles().then(() => {
  console.log('HAMBUNNY')
})
// ORDER OF LOGS:
// package.json contents
// package-lock.json contents
// HAMBONE
// HAMBUNNY





This is a step in the wrong direction. Here's a mapping guide I created to help get folks stuck in callback hell into the modern JS era: github.com/jmjpro/async-package-to-async-await/blob/master/….
– jbustamovej
Feb 20 at 6:24





as you can see here, I am interested in and open to using async/await instead of the async lib. Right now, I think that each has a time and place. I'm not convinced that the async lib == "callback hell" and async/await == "the modern JS era". imo, when async lib > async/await: 1. complex flow (eg, queue, cargo, even auto when things get complicated) 2. concurrency 3. supporting arrays/objects/iterables 4. err handling
– Zachary Ryan Smith
Feb 21 at 1:54





