
Had an issue several hours ago with a client mail account, which got compromised. As a result, he had hundreds of thousands of spam messages queued in Postfix, which led to several issues. Everything's fixed and security's tightened up, apart from one "small" issue: the client now has close to 100k returned spam mails in his inbox. Obviously, I'm looking for a bulk operation with some filtering, as not everything is junk. Postfix runs on Ubuntu Server 10.x, with Maildir.

I tried this command on a backed-up folder containing the same files:

grep -l -r 'Undelivered' | xargs rm

But it doesn't seem to do anything apart from running.

Could this be because all the "mails" are stored in files named like this:

1395063807.V902Ib2081dM533672.ip.ip.ip:2,

1 Answer


Depending on your version of grep, this might wait forever because you haven't given a file (or directory) name as an argument; grep's behaviour of searching the current working directory when -r is specified is a rather new feature. So if your version is an older one, your call might simply be sitting there waiting for input on stdin. Just add . as the last argument to grep to avoid this.
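
So the command from the question, unchanged except for the added path (run it from inside the backed-up maildir folder), would be:

grep -l -r 'Undelivered' . | xargs rm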

To avoid issues with file names (which shouldn't be a problem in this case), it would be safest to call

grep -Zl -r 'Undelivered' . | xargs -0 rm --

This way, grep outputs the matching file names separated by zero bytes ('\0'), which avoids trouble with spaces and the like in file names. The -- tells rm not to treat the following arguments as options, so a file name starting with a - won't break the command.
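
As a quick illustration (the file name here is just made up, not one of your maildir files), a file whose name starts with a dash would otherwise be mistaken for options:

rm '-queued-mail'      # fails: rm parses the name as options
rm -- '-queued-mail'   # works: the name is taken literally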

To see whether this command does anything at all, you can add the -v option to rm (in front of the --, of course), so you can watch rm actually removing files.
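
Combining both suggestions, the verbose version of the command would be:

grep -Zl -r 'Undelivered' . | xargs -0 rm -v --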