How to Add a Delay in a JavaScript Loop
It’s easy to add delays in for loops when using versions of JavaScript that support async/await.
One common reason to sleep in JavaScript is to throttle the requests you make to websites when web scraping.
I’ll explain two ways to do it.
Method 1: Regular JavaScript Promises
Here’s some example JavaScript code that will create a delay:
const sleep = (milliseconds) =>
  new Promise((resolve) => setTimeout(resolve, milliseconds));
You can use that function in an async function and await the sleep timer.
It works both in Node.js and in client-side JavaScript.
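For instance, here is a minimal sketch of awaiting the sleep timer inside an async function (the `demo` function and the 500ms delay are just illustrative choices):

```javascript
// The sleep helper from above.
const sleep = (milliseconds) =>
  new Promise((resolve) => setTimeout(resolve, milliseconds));

// A hypothetical async function that pauses and reports the elapsed time.
async function demo() {
  const start = Date.now();
  await sleep(500); // execution pauses here for about half a second
  return Date.now() - start;
}

demo().then((elapsed) => console.log(`slept for about ${elapsed}ms`));
```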
Full Example of a Sleep Timer in Node.js
Here’s a working example for Node.js. It will scrape some JSON data from a couple of APIs and save the data to a file. There will be a 4-second delay between each request made in the for loop.
Put the code below into a file named scraper.mjs, read through it, and then run it with node scraper.mjs.
// This line is only needed if you are using a version of Node.js that is older
// than 17.5.
import fetch from "node-fetch";
import fs from "fs";

// Here's the sleep timer.
const sleep = (milliseconds) =>
  new Promise((resolve) => setTimeout(resolve, milliseconds));

async function fetchData(urls) {
  // For the purpose of the example, we'll sleep for 4 seconds to clearly show
  // that it's working.
  const delay = 4000;
  const output = [];
  for (let i = 0; i < urls.length; i++) {
    console.log("fetching", urls[i]);
    // Await the response and extract the JSON data as a JavaScript object.
    const response = await fetch(urls[i]);
    const data = await response.json();
    // Push the data into the output array.
    output.push(data);
    // This displays the data to show that the delay is really working.
    console.log("got this data:", data);
    // Await the sleep timer before going back to the beginning of the loop.
    console.log(`sleeping for ${delay}ms`);
    await sleep(delay);
  }
  return output;
}

function writeDataToFile(data) {
  // This creates a filename with a timestamp in it to avoid overwriting files
  // from previous runs of the scraper.
  const filename = Date.now() + "-data.json";
  const formattedJSON = JSON.stringify(data, null, 4);
  fs.writeFileSync(filename, formattedJSON);
  console.log("wrote file", filename);
}

async function main() {
  // Any URL that returns JSON will work here.
  const urls = [
    "https://api.github.com/",
    "https://api.coinbase.com/v2/currencies",
  ];
  const data = await fetchData(urls);
  console.log(`You scraped data from ${data.length} URLs`);
  // Do whatever you want with the data here.
  // In this case it saves it to a file.
  writeDataToFile(data);
}

// Run the scraper.
main();
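One caveat worth knowing: this pattern only delays the loop because a regular for (or for...of) loop actually awaits each iteration. Array methods like forEach do not wait for an async callback, so the delays would all run concurrently. Here is a sketch illustrating the difference (the `withForOf` helper and its parameters are illustrative, not from the example above):

```javascript
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// A for...of loop genuinely pauses between items.
async function withForOf(items, ms) {
  for (const item of items) {
    console.log("processing", item);
    await sleep(ms); // the loop waits here before moving to the next item
  }
}

withForOf(["a", "b", "c"], 1000);

// By contrast, forEach fires every callback immediately, so the awaits
// inside them run concurrently and nothing is throttled:
// items.forEach(async (item) => { await sleep(1000); /* no delay between items */ });
```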
Method 2: Node.js Timers Promises API
In newer versions of Node.js, you can import setTimeout from node:timers/promises and await the result.
Here’s an example:
import { setTimeout } from "node:timers/promises";
const delay = 2000; // 2 seconds
console.log(`sleeping for ${delay} milliseconds`);
const message = await setTimeout(delay, "finished");
console.log(message);
console.log("exiting");
Put that code in a file named sleep.mjs and run it with node sleep.mjs. It will print the sleeping message, pause for two seconds (2000ms), and then print “finished” followed by “exiting”.
For more information, see the Node.js documentation for the Timers Promises API.
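Method 2 also works inside a loop, which brings it back to the topic of this article. Here is a sketch, assuming Node.js 15 or newer and an .mjs file so that top-level await is allowed (the item list and one-second delay are illustrative):

```javascript
// Import the promisified setTimeout and rename it for readability.
import { setTimeout as sleep } from "node:timers/promises";

const items = ["a", "b", "c"];
for (const item of items) {
  console.log("processing", item);
  await sleep(1000); // pause one second between iterations
}
```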
If you have suggestions, tips, or questions, leave a comment below.