5/30 Let's say I have a directory full of subdirectories, each of which
has a number of files in it. How can I delete all subdirectories that
have, say, less than 10 files in them?
\_ fewer
\_ There's no single standard command that will do this. You'll
need to write something that will count files per directory,
make a list of targets and pass that to rm.
\_ might the "find" command help?
\_ That is one part of one possible solution. What I would do
is write the whole thing in perl. Perl has a built in
find-like function, can opendir() and readdir(), and allows
you to easily maintain state and lists of target dirs. It's
a simple two phase process: 1) make target list, 2) kill
targets. Obviously, you'll want to test your code in a
test directory or you risk removing the wrong files.
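That two-phase approach can also be sketched in plain shell (a sketch only: assumes GNU find, no whitespace or newlines in directory names, and the demo names "small"/"big" are made up; the actual rm stays behind an echo as a safety net):

```shell
# Demo setup: a scratch dir with one sparse and one full subdirectory.
# (Hypothetical names, just for illustration.)
cd "$(mktemp -d)"
mkdir small big
touch small/a small/b                          # 2 files  -> target
for i in $(seq 1 12); do touch big/f$i; done   # 12 files -> survivor

# Phase 1: build the target list -- subdirectories with < 10 files.
targets=
for d in */; do
    count=$(find "$d" -maxdepth 1 -type f | wc -l)
    [ "$count" -lt 10 ] && targets="$targets ${d%/}"
done

# Phase 2: kill the targets.  Echoing first is cheap insurance;
# drop the echo once the list looks right.
for d in $targets; do
    echo rm -r "$d"
done
# prints: rm -r small
```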
\_ mosquito. SLEDGEHAMMER! <BABAMMMM!!!!!!>
\_ Uhm, yeah, and are the following any different?
\_ Exactly 80 chars:
for(grep{!/^\.\.?$/&&-d}<*>){opendir D,$_;unlink if@{[readdir D]}<12;closedir D}
--dbushong
\_ More than 80 chars:
sub z{my @a=`find $_[0] -maxdepth 1 -mindepth 1 -type $_[1]`;chomp @a;@a};
my $n=10; for (z(".","d")) { print "$_\n" if scalar(z($_,"f"))<$n; } # --darin
\_ A bit shorter and more correct:
use File::Path;
for(grep{-d}<*>){rmtree $_ if <$_/*> < 10}
--dbushong
\_ I don't think this works. scalar(<$_/*>) does not return
the file count. In any event, the script deleted stuff it
wasn't supposed to and didn't delete stuff it should. Good
thing I backed up. -- op
\_ Yeah, it needs @{[ ]} around it like mconst's... but use his.
And yeah, anytime there's a "shortest perl" contest on the
motd, Caveat Executor. --dbushong
\_ @{[<$_/{.,}*>]}<12&&`rm -r \Q$_`for<*> # --mconst
\- hmm, there are some interesting ways to do this ...
the exact details vary slightly based on things like, are all
the dirs only one deep, do you count sub-subdirs as files, etc.
i think it is better to do this modularly rather than trying to
save characters. here is one different approach:
find -type f | xargs -n 1 dirname | sort | uniq -c |
egrep -v '[0-9][0-9]' | grep /
there are some obvious shortcuts or changes to make more
robust [e.g. if there are spaces in name], if you are going
to do this once or multiple times, performance etc. [yes i know
the dirname call is expensive ... that can be replaced, but if
this is a one shot thing, debugging time is more expensive
than machine cycles]. --psb
\- and it admittedly relied on a hack to deal with the "less
than 10" part ... but all this is easily remedied. --psb
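One way to remedy both points (a sketch, GNU find assumed): the -printf '%h' directive prints each file's parent directory, which drops the per-file dirname fork, and awk does a real "fewer than 10" comparison instead of the two-digit egrep hack. Like the original pipeline, directories containing zero files never appear at all, and filenames with embedded newlines will still confuse it:

```shell
# Demo setup in a scratch dir (hypothetical names).
cd "$(mktemp -d)"
mkdir sparse dense
touch sparse/a                                  # 1 file
for i in $(seq 1 10); do touch dense/f$i; done  # 10 files

# %h = parent dir of each file found, so no dirname call per file;
# awk does the actual numeric "< 10" test on the uniq -c count.
find . -mindepth 2 -type f -printf '%h\n' |
    sort | uniq -c |
    awk '$1 < 10 { print $2 }'
# prints: ./sparse
```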