I'm writing a script to scan the /config route on some ports on a host.
I first wrote the script in node.js and am now porting it to bash to reduce dependencies.
Why is the bash script more than 300 times slower when scanning localhost? What am I missing?
I guess there are some optimizations built into node-fetch. How can I achieve the same in bash?
- node.js: 10 ports -> 79 ms
- bash (0.0.0.0): 10 ports -> 2149 ms
- bash (localhost): 10 ports -> 25156 ms
 
I found that in bash, using 0.0.0.0 instead of localhost makes it only 27 times slower, but still... (In node.js, using 0.0.0.0 makes no significant difference.)
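One likely cause of the localhost-specific slowdown (a guess, not verified on your machine): `localhost` usually resolves to both `::1` and `127.0.0.1`, and curl may try the IPv6 address first, so every port pays for a failed `::1` connection attempt before falling back to IPv4. A minimal sketch that sidesteps name resolution entirely by forcing IPv4 and a numeric address — the host, port range, and `/config` path here are placeholders, not taken from your actual setup:

```shell
#!/usr/bin/env bash
# Sketch: scan a port range over IPv4 only, avoiding localhost name resolution.
# Port range 3000-3009 is a placeholder.
port_from=3000
port_to=3009
for ((port=port_from; port<=port_to; port++)); do
  # -4 forces IPv4; --connect-timeout caps the time spent on closed/filtered ports
  curl -4 -s --connect-timeout 1 "http://127.0.0.1:${port}/config"
done
```

You can get a similar effect by keeping `localhost` and passing `-4`, or simply by writing `127.0.0.1` in your existing loop, which your 0.0.0.0 measurement already hints at.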
node.js (IIFE omitted for readability)
import fetch from 'node-fetch';
for (let port = portFrom; port <= portTo; port++) {
  try {
    const res = await fetch('http://localhost:' + port + '/config');
    const json = await res.json();
    console.log(json);
  } catch {/*no-op*/}
}
bash
for ((port=port_from; port<=port_to; port++)); do
  json="$(curl -s "http://localhost:$port/config")"
  echo "$json"
done
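Independent of the localhost resolution issue, the bash loop also forks a fresh `curl` process per port, while node-fetch reuses one process (and one resolver cache) for all requests. curl can cover a whole range in a single invocation via its URL globbing syntax, and `--parallel` (curl 7.66+) issues the requests concurrently — a sketch, with the port range and path again as placeholders:

```shell
#!/usr/bin/env bash
# Sketch: one curl process for the whole range, using URL globbing.
# [3000-3009] is expanded by curl itself; --parallel runs requests concurrently.
# curl exits nonzero when any port is closed, hence the trailing `|| true`.
curl -s --parallel --connect-timeout 1 "http://127.0.0.1:[3000-3009]/config" || true
```

This keeps the script dependency-free beyond curl itself, which was the point of the bash port.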