Given the complexity of some of the answers, I'd say Perl or Python are the right tools for the job of walking a directory tree, getting a list of directory entries, and checking if it's exactly equal to a 1-element list containing .directory (and that it's a regular file).
Having find fork/exec a complex sh -c command for every subdirectory seems like a waste, and my eyes are glazing over trying to follow the logic in Kamil's answer. Assuming a recursive directory listing of the whole tree will easily fit in RAM on your system, simple tools are the way to go.
Perl has File::Find which recurses depth-first, which is good, but it wants to call your callback function on each directory entry. So if you wanted a directory listing you'd have to do it yourself on top of that. Not a showstopper but not ideal, so I looked at Python.
Python's directory-walk standard library function is os.walk, which walks top-down by default (each directory before its subdirectories), not depth-first bottom-up. But interestingly, it produces an iterator of root, dirs, files (usage example I copied), where the latter two are lists of the contents of the directory root, split into directory and non-directory files. That's exactly what we want; or would be in a depth-first, bottom-up walk.
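To make that concrete, here's a tiny self-contained illustration (the temporary tree and file names are just for demonstration):

```python
#!/usr/bin/python3
# Demo: os.walk yields one (root, dirs, files) tuple per directory,
# with the contents split into subdirectories and non-directories.
import os
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    os.makedirs(os.path.join(tmp, "sub"))
    open(os.path.join(tmp, "file.txt"), "w").close()
    for root, dirs, files in os.walk(tmp):
        print(root, dirs, files)
    # first tuple: (tmp, ['sub'], ['file.txt']), then (tmp/sub, [], [])
```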
For your use-case, it's probably fine to remove the .directory file in a directory containing only other directories? If so, we can do that with Python, then use find -depth -type d -empty -delete (or -print -delete to see what dirs we'll kill).
#!/usr/bin/python
import os
import sys

if len(sys.argv) != 2:
    print("usage: dir-remove path")
    sys.exit(1)  # not strictly needed; sys.argv[1] raises IndexError on its own,
                 # and I haven't implemented looping over multiple args. But be explicit.

for root, dirs, files in os.walk(sys.argv[1]):
    print(root, dirs, files)
    if files == [".directory"]:
        # only one non-directory file and it's called .directory
        print("os.unlink " + os.path.join(root, ".directory"))  # debug
        #os.unlink(os.path.join(root, ".directory"))  # for real
Test setup:
$ mkdir -p foo/bar/baz
$ touch foo/bar/baz/.directory foo/bar/.directory foo/{a.txt,.directory}
$ find foo -exec ls -dn {} +
drwxr-xr-x 3 1000 1000 100 Dec 7 00:35 foo
-rw-r--r-- 1 1000 1000 0 Dec 7 00:05 foo/a.txt
drwxr-xr-x 3 1000 1000 80 Dec 7 00:35 foo/bar
drwxr-xr-x 2 1000 1000 60 Dec 7 00:35 foo/bar/baz
-rw-r--r-- 1 1000 1000 0 Dec 7 00:35 foo/bar/baz/.directory
-rw-r--r-- 1 1000 1000 0 Dec 7 00:35 foo/bar/.directory
-rw-r--r-- 1 1000 1000 0 Dec 7 00:05 foo/.directory
Usage, debug-print version:
$ ./dir-remove.py /tmp/foo
/tmp/foo ['bar'] ['a.txt', '.directory']
/tmp/foo/bar ['baz'] ['.directory']
os.unlink /tmp/foo/bar/.directory
/tmp/foo/bar/baz [] ['.directory']
os.unlink /tmp/foo/bar/baz/.directory
Usage, for real: run the Python script to remove .directory files, then find ... -delete to remove empty directories depth-first. I could have had Python exec find after the loop if I wanted to build more of a reusable solution.
$ ./dir-remove.py foo # with both prints commented, os.unlink uncommented
$ find foo
foo/a.txt
foo/.directory
foo/bar
foo/bar/baz
$ find foo -depth -type d -empty -print -delete
foo/bar/baz
foo/bar
All that remains is foo/a.txt and foo/.directory (and the directory foo itself of course).
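If I did want the script to finish the job itself, the "have Python exec find" idea could be a small subprocess call after the loop. A sketch (delete_empty_dirs is my name for it; it assumes a find(1) on PATH that supports -depth/-empty/-delete):

```python
#!/usr/bin/python3
# Sketch: run the same find command from Python instead of manually.
import subprocess

def delete_empty_dirs(top):
    # -depth makes find visit children before parents, so a directory
    # that becomes empty when its subdirs are deleted is deleted too.
    subprocess.run(
        ["find", top, "-depth", "-type", "d", "-empty", "-delete"],
        check=True,
    )
```

Called after the os.walk loop, this replaces the manual find invocation above.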
Limitation: foo/bar/.directory would have been removed even if foo/bar/baz were non-empty and didn't end up getting removed. I don't see a trivial fix for that within this top-down walk, short of doing my own depth-first bookkeeping in Python, or going back to Perl's File::Find and doing the directory listing myself.
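That said, os.walk also accepts topdown=False, which yields children before their parents, so one pass could both remove the .directory files and prune directories safely. A sketch (prune is my name; it re-checks os.listdir because the walk's own dirs/files lists were computed before deeper directories were removed):

```python
#!/usr/bin/python3
# Sketch: bottom-up walk that removes .directory only when it is the
# directory's sole remaining entry, then removes the directory itself.
import os

def prune(top):
    for root, dirs, files in os.walk(top, topdown=False):
        # Re-check the live contents: subdirectories may already have
        # been removed on an earlier (deeper) iteration.
        if os.listdir(root) == [".directory"]:
            os.unlink(os.path.join(root, ".directory"))
            if root != top:  # leave the top-level directory in place
                os.rmdir(root)
```

Hook it up to sys.argv[1] the same way as the script above; with this variant, foo/bar/.directory survives whenever anything under foo/bar does.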
I only know a little bit of Python, but this was still very easy to cook up, and I'm relying on standard libraries for the heavy lifting so hopefully it should be robust.