GNU parallel and xargs also process one line at a time (tested)
Can you give an example of this? If you use -j, you should be able to run far more than one process at a time.
I would write it like this:
doit() {
  url="$1"
  urlstatus=$(curl -o /dev/null --silent --head --write-out '%{http_code}' "${url}" --max-time 5)
  echo "$url $urlstatus"
}
export -f doit
cat input.txt | parallel -j0 -k doit
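Since xargs was mentioned too: a rough xargs equivalent is sketched below. It assumes an xargs that supports -P (GNU and BSD xargs both do); the demo input.txt is built from the domains shown further down. Note that, unlike parallel -k, xargs does not keep the output in input order.

```shell
# Demo input built from the domains shown below (an assumption for the sketch).
printf '%s\n' www.google.com pi.dk > input.txt

# -n1: one URL per child process; -P10: up to 10 curl checks in flight.
# Each child echoes the URL followed by the HTTP code curl reports
# (000 if no HTTP response was received at all).
xargs -n1 -P10 sh -c \
  'echo "$0 $(curl -o /dev/null --silent --head --write-out "%{http_code}" "$0" --max-time 5)"' \
  < input.txt
```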
Given this input.txt:
Input file is txt file and lines are separated as
ABC.Com
Bcd.Com
Any.Google.Com
Something like this
www.google.com
pi.dk
I get the output:
Input file is txt file and lines are separated as 000
ABC.Com 301
Bcd.Com 301
Any.Google.Com 000
Something like this 000
www.google.com 302
pi.dk 200
Which looks about right:
000 if domain does not exist
301/302 for redirection
200 for success
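To see where the 000 comes from: curl prints 000 for %{http_code} when it never received an HTTP status line, and its exit code says why. A quick sketch, using the reserved .invalid TLD (the host name is made up, but the TLD is guaranteed never to resolve):

```shell
# 000 means curl got no HTTP status line at all; the exit code says why
# (typically 6 = could not resolve host, 28 = timed out).
code=$(curl -o /dev/null --silent --head --write-out '%{http_code}' \
       --max-time 5 no.such.host.invalid)
echo "http_code=$code curl_exit=$?"
```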
I must say I am a bit surprised if the input lines you have provided really are part of the input you actually use. None of these domains exist, and domain names containing spaces probably never will - ever:
Input file is txt file and lines are separated as
Any.Google.Com
Something like this
If you have not given input from your actual input file, you really should do that instead of making things up - especially if the made-up data does not resemble the real data.
Edit
Debugging why it does not work for you.
Please do not put this in a script, but run it directly in the terminal:
bash # press enter here to make sure you are running this in bash
doit() {
  url="$1"
  urlstatus=$(curl -o /dev/null --silent --head --write-out '%{http_code}' "${url}" --max-time 5)
  echo "$url $urlstatus"
}
export -f doit
echo pi.dk | parallel -j0 -k doit
This should give:
pi.dk 200
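If it does not, one thing worth checking is whether export -f actually makes doit visible to a child shell, since parallel runs doit in a child bash. A self-contained sketch of that check (all wrapped in an explicit bash, since export -f is a bash feature):

```shell
# Sketch: verify that an exported function reaches a child bash,
# which is how parallel is able to call doit at all.
bash -c '
  doit() {
    url="$1"
    urlstatus=$(curl -o /dev/null --silent --head --write-out "%{http_code}" "$url" --max-time 5)
    echo "$url $urlstatus"
  }
  export -f doit
  # If the export worked, a child bash can call the function directly:
  bash -c "doit pi.dk"
'
```

A "doit: command not found" from the inner shell would mean the function was not exported, e.g. because the outer shell is not actually bash.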