Today I wanted to create a backup of a folder on a server with 16 CPUs. I started looking for an option that can use my hardware to archive a folder better than tar does, something with multi-thread support. I did some research and found tools like pbzip2 and pigz, but they only compress; they don't archive. Does anyone have an elegant solution for this?
2 Answers
Archiving itself is I/O-intensive, and won't benefit from multiple cores. Feed the uncompressed output of tar to one of the programs you found.
Edit:
tar -cO /directory/path | whizbang -compress --ultra-brute --cpus=16
where whizbang is replaced with your favorite compressor, depending on your speed-versus-size preference.
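As a concrete sketch, with pbzip2 (one of the tools mentioned in the question) standing in for whizbang, the pipeline could look like this (assuming GNU tar and pbzip2 are installed; the path and output name are placeholders):

```shell
# Stream an uncompressed tar archive to stdout, then let pbzip2 spread
# the bzip2 compression across 16 worker threads (-p16).
# -c tells pbzip2 to write the compressed result to stdout.
tar -cf - /directory/path | pbzip2 -p16 -c > backup.tar.bz2
```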
technosaurus
Ignacio Vazquez-Abrams
What Ignacio said... but -O is for extracting files to stdout, not for writing an archive there...
So, my suggestion:
tar cf - /directory/path | whizbang -compress --cpus=16 > archive.tar.whizbang
The next step, of course, would be badgering the maintainers of tar to include whizbang support in their next release. ;-)
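For what it's worth, GNU tar already has a generic hook for this: -I (long form --use-compress-program) makes tar spawn an arbitrary compressor itself and pipe the archive through it. A sketch using pigz from the question (assuming GNU tar and pigz are installed; path and archive name are placeholders):

```shell
# GNU tar runs the given command as the compression filter;
# pigz -p 16 gzips the archive stream with 16 threads.
tar -I 'pigz -p 16' -cf archive.tar.gz /directory/path
```

The result is an ordinary .tar.gz that any gzip-aware tool can unpack.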
DevSolar