I'm trying to download a list of files generated by an internal processing system via an HTTP GET request in Node.js. For a single file, or a few files, it works fine, and there is already an answer for that here on Stack Overflow. The problem occurs when you try to download a huge list of files with async requests: the system simply times out and throws an error.
So it's really a scalability issue. The best approach would probably be to download the files one by one, or a few at a time, and then move on to the next batch, but I'm not sure how to do that. Here is the code I have so far. It works fine for a few files, but in this case I have ~850 files (a few MBs each) and it fails:
const http = require('http');
const fs = require('fs');

// list of files
var file_list = [];
file_list.push('http://www.sample.com/file1');
file_list.push('http://www.sample.com/file2');
file_list.push('http://www.sample.com/file3');
.
.
.
file_list.push('http://www.sample.com/file850');

file_list.forEach(single_file => {
    // save under the files folder, using the last URL segment as the file name
    const file = fs.createWriteStream('files/' + single_file.split('/').pop());
    http.get(single_file, response => {
        response.pipe(file);
        file.on('finish', function () {
            console.log('done');
        });
    });
});
It downloads the first few files fine, creates a lot of empty files in the files folder, and then throws this error:
events.js:288
      throw er; // Unhandled 'error' event
      ^

Error: connect ETIMEDOUT 192.168.76.86:80
    at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1137:16)
Emitted 'error' event on ClientRequest instance at:
    at Socket.socketErrorListener (_http_client.js:426:9)
    at Socket.emit (events.js:311:20)
    at emitErrorNT (internal/streams/destroy.js:92:8)
    at emitErrorAndCloseNT (internal/streams/destroy.js:60:3)
    at processTicksAndRejections (internal/process/task_queues.js:84:21) {
  errno: 'ETIMEDOUT',
  code: 'ETIMEDOUT',
  syscall: 'connect',
  address: '192.168.76.86',
  port: 80
}
It seems that opening all ~850 connections at once puts a huge load on the network; downloading the files one by one, or in small batches, would probably work. Please suggest a scalable solution if possible. Thanks.