
I have a bunch of huge files (4GB+ each) with the same extension in a directory structure on a network drive, but I'm only interested in the first few bytes of each, so I'd like to copy them all to my local drive with the same filenames, maintaining the same folder structure.

I've tried:

$ find . -type d -exec mkdir -p ~/Desktop/heads/{} \;
$ find . -type f -name "*.ext" -exec head -c 128 {} > ~/Desktop/heads/{} \;

But it doesn't work: everything ends up in a single file called '{}', probably because the '>' operator is interpreted by the shell rather than being part of the -exec argument. And if I escape it as '\>', it gets passed down as an argument to head, which doesn't work either.
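A minimal reproduction of what happens (the filenames here are made up for the demonstration): the shell strips the redirect out of the command line before find ever runs, so find's entire stdout goes to one literal file named '{}'.

```shell
# The shell parses '> {}' first, so find never sees the redirect at all;
# all output from every head invocation lands in the single file '{}'.
d=$(mktemp -d) && cd "$d"
printf 'hello' > a.ext
printf 'world' > b.ext
find . -type f -name '*.ext' -exec head -c 2 {} \; > '{}'
ls   # shows a.ext, b.ext, and one file literally named {}
```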

user2986

2 Answers


You've correctly observed that the problem is the > redirect. Two solutions:

Either use something like this:

find . -type f -name "*.ext" -exec sh -c "head -c 128 '{}' > ~/Desktop/heads/'{}'" \;

Or pass '{}' as an argument to the subshell, like so:

find . -type f -name "*.ext" -exec sh -c 'head -c 128 "$1" > ~/Desktop/heads/"$1"' -- {} \;

The latter works because -- {} passes extra arguments to sh -c: the -- lands in $0 and the filename in $1, which the inner script then reads as "$1". Note that you now have to use single quotes ' instead of double quotes, so that the outer shell doesn't expand $1 before the inner shell sees it.
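A quick way to see how the positional parameters line up (the filename here is hypothetical):

```shell
# The first word after the sh -c script becomes $0, the next $1, and so on;
# '--' is just a placeholder occupying $0 so the filename lands in $1.
sh -c 'echo "zeroth: $0, first: $1"' -- 'my file.ext'
# prints: zeroth: --, first: my file.ext
```

Quoting "$1" inside the script also keeps filenames with spaces intact.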


Update: I've actually found a Stack Overflow question that covers the underlying problem of yours, namely using > within xargs or similar commands:

How to use > in an xargs command?

I want to find a bash command that will let me grep every file in a directory and write the output of that grep to a separate file. [...]

slhck

Your redirect is breaking it. Change to this:

$ find . -type d -exec mkdir -p ~/Desktop/heads/{} \;
$ find . -type f -name "*.ext" -exec sh -c "head -c 128 '{}' > ~/Desktop/heads/'{}'" \;
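For completeness, the two steps can also be combined into a single pass that creates each destination directory on demand. This is only a sketch, assuming a POSIX sh; the sample tree and filenames below are made up for the demonstration:

```shell
# Self-contained sketch: build a sample tree in a temp dir, then copy only
# the first 128 bytes of every *.ext file into $dest, preserving the layout.
src=$(mktemp -d) && dest=$(mktemp -d)
mkdir -p "$src/sub/dir"
head -c 4096 /dev/zero > "$src/sub/dir/big.ext"   # stand-in for a huge file
cd "$src"
find . -type f -name '*.ext' -exec sh -c '
  mkdir -p "$2/$(dirname "$1")"   # recreate the folder structure on demand
  head -c 128 "$1" > "$2/$1"      # take only the first 128 bytes
' -- {} "$dest" \;
```

This avoids having to mirror the whole directory tree up front, since only directories that actually contain matching files get created.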
bahamat