
Similar to this question and this question, I'd like to make a single compressed tar file (files.tar.gz) that contains all the php files in my webroot and all its subdirectories, maintaining folder structure. My webroot is called 'public_html'.

I have not tried anything yet because I'm not sure where to start. The second question that I linked actually comes pretty close to what I'm looking for, but if I read it right, it makes each found file a separate .gz.

Actually, the monolithic nature of the tar is more important to me than the compression of gzip. I just want to pull this file onto my windows machine where I can search through it with the tools I am used to.

TecBrat

1 Answer


find -iname '*.php' -print0 | xargs -0 tar -rf php_backup.tar

This uses the command from the second answer you linked to, but replaces the -c with -r.

From man tar:

-c      Create a new archive containing the specified items.
-r      Like -c, but new entries are appended to the archive.  Note that
        this only works on uncompressed archives stored in regular files.
        The -f option is required.

Another option would be to replace the -print0 | xargs -0 part with find's -exec. I think that will be more efficient, but it might break if you have thousands or tens of thousands of files.
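A sketch of that -exec variant, using a hypothetical demo tree (demo_html and php_backup.tar are placeholder names; substitute your real webroot and archive):

```shell
# Hypothetical demo tree standing in for the real webroot.
mkdir -p demo_html/inc
echo '<?php ?>' > demo_html/a.php
echo '<?php ?>' > demo_html/inc/b.php

# -exec ... {} + batches many filenames into each tar invocation
# (similar to xargs) and handles names with spaces without -print0.
# With -r, repeated invocations append to the same archive.
find demo_html -iname '*.php' -exec tar -rf php_backup.tar {} +
```

Afterwards, tar -tf php_backup.tar should list both files with their directory paths intact.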

[Edit]

1) The -print0 and the -0 after xargs are only needed if you have inconvenient separators in your filenames (read: names with spaces). You might not need those.

2) I checked the command again and tested it. find /public_www -name '*.php' -print | xargs feeds all the found names to xargs on a single command line. Thus the example you linked to translates to something like tar -cf docs.tar *.php and should also work.
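To end up with the files.tar.gz you asked for, a minimal sketch is to build the uncompressed tar first and compress it as a second step, since -r only works on uncompressed archives. The public_html tree here is a stand-in created for the demo:

```shell
# Hypothetical demo webroot; substitute your real public_html.
mkdir -p public_html/sub
echo '<?php ?>' > public_html/index.php
echo '<?php ?>' > public_html/sub/lib.php

# Step 1: append all .php files into one tar (-r cannot write to a .gz).
find public_html -iname '*.php' -print0 | xargs -0 tar -rf files.tar

# Step 2: compress the finished archive; this produces files.tar.gz.
gzip files.tar
```

You can then pull files.tar.gz onto the Windows machine; the folder structure under public_html is preserved inside the archive.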

Hennes