
I know this can be done for single files, e.g.

gunzip -c my.gz > somedir/my

Can it be done for multiple files?

[UPDATE] I have a directory with a large number of .gz files (not .tar.gz), and I want to gunzip them into another directory while leaving the original files untouched.

Nifle
Bio_X2Y

6 Answers


Try something like:

for a in *.gz; do gunzip -c "$a" > somedir/"$(echo "$a" | sed 's/\.gz$//')"; done

Quoting "$a" keeps file names with spaces intact, and anchoring the sed pattern (`\.gz$`) makes sure only the trailing extension is stripped, not an arbitrary character followed by "gz" elsewhere in the name.
Nifle

This will work in bash:

for FILE in *.gz
do
    echo -n "File $FILE... "
    gunzip -c "$FILE" > somedir/"${FILE%.gz}"
    echo "Done"
done

Note it must be `gunzip -c` (or `gzip -dc`), not `gzip -c`, which would compress the file a second time; the output is redirected into the target directory so the originals stay untouched.

Building on Andreja's answer, with a correction for the file names.

Rich Homolka

This also works; it's just another way of doing it, and it won't garble your extracted files. I have tested it.

for f in *.gz; do
  STEM=$(basename "${f}" .gz)
  gunzip -c "${f}" > /somedir/"${STEM}";
done
Andy N

Assuming that each file has a name like

foo.bar.gz

basename (GNU coreutils 8.4) can give you the original uncompressed file's name, so I'd do:

for f in /source/dir/*.gz; do t=$(basename "$f" .gz); gunzip -c "$f" > /target/dir/"$t"; done
sphakka

Sounds like a job for a quick Perl/Python/Ruby/etc. script.

Just adjust the code in this example to apply the necessary gunzip command.
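As a sketch of that approach, here is a minimal Python version. The function name and the directory paths are assumptions, not from any linked example:

```python
import gzip
import shutil
from pathlib import Path

def gunzip_all(src: str, dst: str) -> None:
    """Decompress every *.gz file in src into dst, leaving the originals untouched."""
    src_dir, dst_dir = Path(src), Path(dst)
    dst_dir.mkdir(parents=True, exist_ok=True)
    for gz in src_dir.glob("*.gz"):
        out = dst_dir / gz.name[:-3]            # strip the trailing ".gz"
        with gzip.open(gz, "rb") as fin, open(out, "wb") as fout:
            shutil.copyfileobj(fin, fout)       # stream the decompressed bytes

# Example call (paths are placeholders):
# gunzip_all("source_dir", "target_dir")
```

Unlike the shell loops, this needs no quoting care for odd file names, since no word splitting is involved.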


This version preserves the original file times, as in-place gzip does, and is space-safe:

IFS=$'\n'; for a in *.gz; do d=$(basename "$a" .gz); gunzip -c "$a" > somedir/"$d"; touch somedir/"$d" -r "$a"; done

Setting IFS=$'\n' stops bash from splitting unquoted expansions on spaces, but quoting "$a" and "$d" is still the safer habit, since it also protects against glob characters in file names. (The earlier version of this one-liner had a stray `echo` in front of `gunzip`, a dry-run leftover that would have written the command text into the output file instead of running it.)