Consider these two pieces of code:
const test = async () => {
  const start = new Date().getTime();
  // both timers start here, when the promises are created
  const promise1 = new Promise(resolve => setTimeout(resolve, 1000));
  const promise2 = new Promise(resolve => setTimeout(resolve, 4000));
  // the awaits happen after both timers are already running
  await promise1;
  await promise2;
  const time = new Date().getTime() - start;
  console.log(`${time} ms passed`);
};
test();
// console shows 4001 ms passed
In the above code, the total time matches the longer of the two delays (roughly 4000 ms).
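To see when the timers actually start, I tried adding a small log helper (introduced here just for illustration); the exact millisecond values will vary, but the output should look roughly like the comments below:
const test = async () => {
  const start = new Date().getTime();
  // helper for illustration: prints elapsed time plus a message
  const log = (msg) => console.log(`${new Date().getTime() - start} ms: ${msg}`);
  const promise1 = new Promise(resolve => { log('timer 1 starts'); setTimeout(resolve, 1000); });
  const promise2 = new Promise(resolve => { log('timer 2 starts'); setTimeout(resolve, 4000); });
  await promise1;
  log('promise1 resolved');
  await promise2;
  log('promise2 resolved');
};
test();
// ~0 ms: timer 1 starts
// ~0 ms: timer 2 starts
// ~1000 ms: promise1 resolved
// ~4000 ms: promise2 resolved
Both executors run immediately, so the two timers overlap.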
const test = async () => {
  const start = new Date().getTime();
  // each timer starts only after the previous one has finished,
  // because await pauses the function before the next promise is created
  await new Promise(resolve => setTimeout(resolve, 1000));
  await new Promise(resolve => setTimeout(resolve, 4000));
  const time = new Date().getTime() - start;
  console.log(`${time} ms passed`);
};
test();
// console shows 5010 ms passed
In the above code, the total time is the sum of the two delays (roughly 5000 ms).
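I assume the first version is effectively the same as wrapping both promises in Promise.all, which also finishes after the longest delay; here is a minimal sketch of that understanding:
const test = async () => {
  const start = new Date().getTime();
  // both timers start immediately; Promise.all resolves when the slower one does
  await Promise.all([
    new Promise(resolve => setTimeout(resolve, 1000)),
    new Promise(resolve => setTimeout(resolve, 4000)),
  ]);
  console.log(`${new Date().getTime() - start} ms passed`); // ~4000 ms
};
test();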
Can anyone explain what's happening under the hood step by step?
