I have a small crawler written in JavaScript that fetches the content of HTML pages and then processes them. The problem is that `request` executes asynchronously. I tried Promises and async/await but I still get out-of-order execution; I want to finish crawling all of the pages before moving on to the next step. Here is code similar to what I have:
const request = require('request');
const log = console.log;
const rootlink = 'https://jsonplaceholder.typicode.com/posts/';

async function f() {
    await f1();
    f3();
}

async function f1() {
    return new Promise(async (resolve, reject) => {
        log('f1 start');
        for (let i = 1; i < 11; i++) {
            await request(rootlink + i, (err, res, html) => {
                if (!err && res.statusCode == 200) {
                    log('link ' + i + ' done');
                    resolve();
                } else reject();
            });
        }
    });
}

function f3() {
    log('f3');
}

f();
The result should be: f1 start link 1 done link 2 done link 3 done link 4 done link 5 done link 6 done link 7 done link 8 done link 9 done link 10 done f3
But instead I get: f1 start link 1 done f3 link 2 done link 3 done link 4 done link 5 done link 6 done link 7 done link 8 done link 9 done link 10 done
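For what it's worth, here is a sketch of the sequential behavior I am after, with each request wrapped in its own Promise so that `await` actually waits for the response before the loop continues. `fetchPage` below is a hypothetical stand-in that simulates an asynchronous HTTP call with a timer instead of hitting the network:

```javascript
// Stand-in for a real HTTP request: resolves asynchronously with the page body.
function fetchPage(url) {
    return new Promise((resolve) => {
        // Simulate network latency; a real version would wrap request()'s
        // callback in resolve/reject instead.
        setTimeout(() => resolve('body of ' + url), 10);
    });
}

async function crawlAll() {
    const order = [];
    order.push('f1 start');
    for (let i = 1; i < 11; i++) {
        // Each iteration awaits a Promise that resolves only when the
        // "response" arrives, so the loop runs strictly in sequence.
        await fetchPage('https://jsonplaceholder.typicode.com/posts/' + i);
        order.push('link ' + i + ' done');
    }
    order.push('f3'); // runs only after all ten pages are done
    return order;
}

crawlAll().then((order) => console.log(order.join(' ')));
```

If the pages don't need to finish in order, collecting the ten Promises in an array and awaiting `Promise.all` on them would fetch them concurrently and still only run the final step once every page is done.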