I am trying to figure out how to wait for a promise to resolve before starting the next iteration of a for loop. Someone suggested using the setInterval() function instead of a for loop, which works if you can guess how long the promise will take to resolve, but it is obviously not ideal.
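For clarity, the behaviour I am after looks roughly like this (a minimal sketch, not my actual code; doWork() is a hypothetical placeholder for the real async task):

// Sketch: awaiting inside a loop in an async function, so each
// iteration only starts once the previous promise has resolved.
// doWork() is a hypothetical placeholder for the real async task.
async function processAll(items) {
    for (const item of items) {
        await doWork(item); // pauses here until doWork(item) resolves
    }
}

Here is my experimental code so far: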
const puppeteer = require('puppeteer-extra')
const StealthPlugin = require('puppeteer-extra-plugin-stealth')
puppeteer.use(StealthPlugin())
let arrayOfUrls = [
    "https://google.com",
    "https://facebook.com",
    "https://youtube.com",
];
let initialIndex = 0;
let finalIndex = 0;
// Launches a fresh browser, screenshots the page, then closes the browser.
async function scraper(url) {
    const browser = await puppeteer.launch({headless: false});
    const page = await browser.newPage();
    await page.goto(url);
    await page.screenshot({path: 'example' + initialIndex.toString() + '.png'});
    console.log(url + "  screenshot complete!")
    await browser.close();
}
// Fire off one scraper() per tick. Note that scraper() is not awaited,
// so calls overlap whenever an iteration takes longer than 300ms.
const interval = setInterval(() => {
    if (initialIndex < arrayOfUrls.length) {
        scraper(arrayOfUrls[initialIndex]);
        initialIndex += 1;
    } else {
        clearInterval(interval);
        console.log("All complete!")
        loopy()
    }
}, 300)
// Cycle through the URLs indefinitely, again without awaiting scraper().
function loopy() {
    setInterval(() => {
        if (finalIndex === arrayOfUrls.length) {
            finalIndex = 0;
        }
        scraper(arrayOfUrls[finalIndex]);
        finalIndex += 1;
    }, 300)
}
The code above is just experimental at the moment. What I am ultimately trying to achieve is to make a series of API requests using URLs read from a text file, and then build an array containing an object for each URL. This is what the const interval = setInterval(() => { part of my code stands in for.
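As a rough sketch of that first pass (assuming Node 18+ with the built-in fetch, and a hypothetical urls.txt containing one URL per line), I am imagining something like:

const fs = require('fs');

// Sketch: read URLs from a text file, request each one sequentially,
// and collect a result object per URL. urls.txt is hypothetical.
async function buildInitialResults() {
    const urls = fs.readFileSync('urls.txt', 'utf8')
        .split('\n')
        .map(line => line.trim())
        .filter(Boolean);
    const results = [];
    for (const url of urls) {
        const response = await fetch(url); // next request starts only after this one
        const body = await response.text();
        results.push({ url, body });
    }
    return results;
}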
Then I want to periodically re-issue each request, check whether the API response has changed, and have this run indefinitely; this is the loopy() function in my experimental code. If a response has changed, I want to send myself a notification.
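For the re-check loop, I picture something like this sketch (again assuming the built-in fetch; notifyMe() is a hypothetical placeholder for whatever notification mechanism I end up using):

// Sketch: indefinitely re-request each URL and compare the body
// against the stored one, notifying on any change.
async function watchForChanges(results) {
    while (true) {
        for (const entry of results) {
            const response = await fetch(entry.url);
            const body = await response.text();
            if (body !== entry.body) {
                notifyMe(entry.url); // hypothetical notification hook
                entry.body = body;   // remember the new state
            }
        }
        // brief pause between passes so the API is not hammered
        await new Promise(resolve => setTimeout(resolve, 1000));
    }
}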
My current implementation works fine if I set the setInterval() delay to something high like 5000ms, but if it is something low like 300ms the promises cannot be fulfilled quickly enough and scraper() calls pile up, each one launching its own browser; judging by the warning, every concurrent launch adds another exit listener to the process. I end up getting this error:
(node:9652) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 exit listeners added to [process]. Use emitter.setMaxListeners() to increase limit 
What would be the best way to implement the logic for such a program?
Edit:
Following WSC's suggestion in the comments, I attempted the following and it seems to work.
const puppeteer = require('puppeteer-extra')
const StealthPlugin = require('puppeteer-extra-plugin-stealth')
puppeteer.use(StealthPlugin())
let arrayOfUrls = [
    "https://google.com",
    "https://facebook.com",
    "https://youtube.com",
];
let initialIndex = 0;
let finalIndex = 0;
// Same as before, but with an artificial delay before navigating.
// (Note: page.waitFor() is deprecated in newer Puppeteer versions in
// favour of page.waitForTimeout().)
async function scraper(url) {
    const browser = await puppeteer.launch({headless: false});
    const page = await browser.newPage();
    await page.waitFor(5000)
    await page.goto(url);
    await page.screenshot({path: 'example' + initialIndex.toString() + '.png'});
    console.log(url + "  screenshot complete!")
    await browser.close();
}
// Scrape each URL in order; because scraper() is awaited, the next
// URL only starts once the previous one has finished.
async function initialScrape() {
    if (initialIndex < arrayOfUrls.length) {
        await scraper(arrayOfUrls[initialIndex]);
        initialIndex += 1;
        initialScrape()
    } else {
        console.log("All complete!")
        loopy()
    }
}
// Once the initial pass is done, keep cycling through the URLs
// indefinitely, one at a time.
async function loopy() {
    if (finalIndex === arrayOfUrls.length) {
        finalIndex = 0;
    }
    await scraper(arrayOfUrls[finalIndex]);
    finalIndex += 1;
    loopy()
}
initialScrape()
I have moved the artificial delay into the scraper() function itself, in the form of await page.waitFor(5000). However, I am not entirely sure whether this particular implementation is recommended for the program I am trying to build.
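An alternative I am considering (just a sketch, using a simple promise-based sleep helper rather than any Puppeteer API) is to keep scraper() free of the delay and pause between iterations instead:

// Hypothetical helper: resolves after ms milliseconds.
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

async function loopy() {
    if (finalIndex === arrayOfUrls.length) {
        finalIndex = 0;
    }
    await scraper(arrayOfUrls[finalIndex]);
    finalIndex += 1;
    await sleep(5000); // pacing lives in the loop, not in scraper()
    loopy()
}

That would keep the delay as a pacing concern of the loop rather than part of the scraping logic itself, but I am not sure it is any better in practice.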