Pls. help with script to remove a million files

Hi,

On one of our servers, the log directory was never cleaned up, so we have a huge number of files. I want to remove all the files that start with dfr* but I get an error message when I use the *.

 
rm qfr*
bash: /usr/bin/rm: Arg list too long

I am trying to write this script but it is not working.

 
find .  -type -f -exec rm -f {} 

Please help.
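For context: the error comes from the shell itself, which expands qfr* into the full list of filenames before rm ever runs; if that list exceeds the kernel's ARG_MAX limit, the exec fails with "Arg list too long". A quick way to see the limit (assuming a POSIX system with getconf):

```shell
# The shell expands qfr* before rm runs; when the expanded argument
# list exceeds the kernel's ARG_MAX limit, exec fails with
# "Arg list too long". The limit (in bytes) is visible via getconf:
getconf ARG_MAX
```

The workarounds below all share one idea: never hand the shell the whole expanded list at once.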

Did you specify the trailing \; (quoted semicolon)? Also note that -type takes a plain f, not -f. As in...

find .  -type f -exec rm -f {} \;

I suggest you use the command below so that your system does not run out of memory building one huge argument list.

find . -type f -name 'dfr*' -mtime +5 -exec rm -f {} \;
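If your find supports the "+" terminator (POSIX does, as do GNU and BSD find), it is usually much faster than "\;" because it batches many filenames into each rm invocation instead of forking one rm per file. A small sketch in a throwaway directory (the dfr*.log names are just demo files, and -mtime +5 is dropped so freshly created files match):

```shell
# Demo of find's "+" terminator: rm is invoked once per batch of
# filenames rather than once per file, which matters with millions
# of files. Uses a temp directory so nothing real is touched.
demo=$(mktemp -d)
cd "$demo"
for i in 1 2 3 4 5; do touch "dfr$i.log"; done

# The glob is quoted so the shell passes it to find unexpanded.
find . -type f -name 'dfr*' -exec rm -f {} +

ls "$demo" | wc -l   # prints 0 - all demo files removed
```

The "+" form stays under ARG_MAX automatically, so it sidesteps the original error while still being nearly as fast as a single rm.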

Alternatively, you can run this in a for loop if you know the oldest dfr* file.

hope it helps

By default, ls sorts by name, which can be painful from a performance point of view when you have a lot of files.
You can use the -f option to disable sorting, then filter the names you want and remove them (in the example below, 5 at a time; don't take too many at once, or you may exceed the maximum command-line length).

ls -f | egrep -e "^qfr" | xargs -n5 rm
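One caveat with piping ls into xargs: filenames containing spaces or newlines will be split incorrectly. If your find and xargs support -print0 and -0 (GNU and modern BSD tools do; this is not strict POSIX), NUL-delimited names are safe, and xargs packs as many names as fit per rm invocation on its own. A sketch under those assumptions:

```shell
# Safer variant: NUL-delimited filenames survive spaces/newlines.
# Demonstrated in a temp directory with one awkward filename.
demo=$(mktemp -d)
touch "$demo/qfr one.log" "$demo/qfr2.log" "$demo/other.log"

# -maxdepth 1 keeps this to the top-level directory, matching ls.
find "$demo" -maxdepth 1 -type f -name 'qfr*' -print0 | xargs -0 rm -f

ls "$demo"   # prints other.log - only the non-matching file remains
```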

The belt-and-braces method is to not let the shell see the list.

This assumes that you want to remove qfr* (not dfr*) and that the directory does not contain subdirectories with files which you do not want to delete. If your "find" does not have "-print", then omit that parameter.

find . -type f -name qfr\* -mtime +5 -print | while read -r FILENAME
do
       # Remove echo after testing
       echo rm "${FILENAME}"
done

Run it with the "echo" first to check that the list of commands is sensible.
The "-mtime +5" parameter avoids deleting anything modified in the last 5 days.
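The while-read loop above, exercised end to end in a throwaway directory (assumptions: -mtime +5 is dropped so freshly created test files match, and the echo safety net has already been removed after testing):

```shell
# The pipe-into-while approach: the shell never expands the glob,
# so ARG_MAX is never an issue; read -r keeps backslashes intact.
demo=$(mktemp -d)
touch "$demo/qfr1" "$demo/qfr2" "$demo/keep.log"

find "$demo" -type f -name 'qfr*' -print | while read -r FILENAME
do
        rm "${FILENAME}"
done

ls "$demo"   # prints keep.log - only the non-matching file survives
```

Note this form still breaks on filenames containing newlines; for a log directory with machine-generated names that is normally not a concern.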