
I have 100,000 URLs of small files to download. I'd like to use 10 threads, and pipelining is a must. I concatenate the results into one file. My current approach is:

cat URLS | xargs -P5 -- curl >> OUTPUT

Is there a better option that will show progress of the whole operation? Must work from the command line.

1 Answer

cat URLS | parallel -k -P10 curl >> OUTPUT

or if progress is more important:

cat URLS | parallel -k -P10 --eta curl >> OUTPUT

or:

cat URLS | parallel -k -P10 --progress curl >> OUTPUT
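
curl writes its per-file progress meter to stderr, so it will not end up in OUTPUT, but it can clutter the terminal alongside parallel's own progress display. One way to quiet it (assuming the same URLS file and OUTPUT target as above) is to add -s, plus -S so real errors are still reported:

cat URLS | parallel -k -P10 --eta curl -sS >> OUTPUT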

The 10-second installation will try to do a full installation; if that fails, a personal installation; if that fails, a minimal installation.

wget -O - pi.dk/3 | sh
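
Once the installer finishes, a quick sanity check that the parallel command is on your PATH, for example:

parallel --version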

Watch the intro video for a quick introduction: https://www.youtube.com/playlist?list=PL284C9FF2488BC6D1

Ole Tange