rm long list of files in a directory

Hi,

I used the command rm *.txt in a directory that has about 2000 txt files,

but it is throwing the following error:

error: args list too long.

Any ideas, please?

find /path/to/dir/ -iname '*.txt' -exec echo rm '{}' ';'

Run it once to be sure it picks the right files, then remove the 'echo' and run it again to do it for real.
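For example, once the echoed list looks right, the live version (my restatement of that step; this one really deletes, so double-check the path first) would be:

find /path/to/dir/ -iname '*.txt' -exec rm '{}' ';'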

rm `find -name "*.txt" `

The list generated is limited in size. Try breaking your request down to a size it can handle, like:
rm a*.txt
rm b*.txt
and so forth. If one of the subgroups is too large you will need to break it down further.
I think the limit is 256, but it has been a while since I tried to check it.
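If you would rather check the actual limit on your system than guess, and assuming a POSIX getconf is available, you can ask for it directly:

getconf ARG_MAX    # limit, in bytes, on exec() arguments plus environment

Note that the limit is measured in bytes of arguments plus environment, not in a fixed number of file names.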

If it's too many arguments with globbing, how do you expect backticks to help cram them into one argument list? That's not going to work!


This is not an effective or general-purpose solution, especially since the size of a command line may be limited by the total length of the names and not just their number. The proper way to deal with it is to use pipes or tools, not to keep trying different globs at random until you find one small enough.

find /path/to/dir/ -iname '*.pdf' -exec echo rm '{}' ';'

After running this command, I get the following error:

find: bad option -iname

Try '-name' instead. '-iname' is the case-insensitive variant supported by the GNU find I usually use. And you are replacing /path/to/dir with the path to your dir, right?
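In other words, and assuming the pattern you really want is the *.txt one from your first post, something like:

find /path/to/dir/ -name '*.txt' -exec echo rm '{}' ';'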

corona688 has the right idea by getting "find" to expand the list, not the shell.

This alternative method allows you to test the script without deleting anything (like corona688's version), works for any number of files in the directory, and works for filenames containing space characters:

#!/bin/ksh
ls -1 *\.txt | while read filename
do
      echo "Deleting: ${filename}"
      # Uncomment the next line if you are happy
      # rm "${filename}"
done

The "." in the filename is escaped as "\." to make the point that this is not MSDOS and "." can be part of a regular expression.

We don't know whether you have subdirectories containing files you don't want to delete. Otherwise you could generate the file list more efficiently with:

find . -type f -name \*\.txt -print
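As a variation on the same idea (my own sketch, not part of the post above): feeding the loop from find instead of from a shell glob means no single huge argument list is ever built, so the original "args list too long" failure cannot occur:

#!/bin/ksh
# Sketch only: find prints one name per line, so the shell never expands the glob.
find . -type f -name '*.txt' -print | while IFS= read -r filename
do
      echo "Deleting: ${filename}"
      # Uncomment the next line once the echoed list looks right
      # rm "${filename}"
done

Like the script above, this copes with spaces in names, though not with embedded newlines.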

Try the following:

find . -name '*.txt' | xargs rm

This way xargs takes care of splitting the list of files into acceptably sized chunks, so you need not worry about how many names get passed to rm, and it runs much faster than executing rm once for every single file found (which is what happens with the -exec option).

Or, to stay in the current directory without descending into subdirectories:

find -maxdepth 1 -name '*.txt' | xargs rm
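If your find and xargs support the null-delimiter options (a GNU/BSD extension, so this is an assumption about your tools), a variant of the same pipeline that also survives spaces or newlines in file names:

find . -maxdepth 1 -name '*.txt' -print0 | xargs -0 rm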


It didn't work.

Which command isn't working?

Leppie's version is good. It will execute "rm" for each file like all sensible solutions. I can't see a way of testing it without deleting files.

In a modern unix, frequently executed programs are cached in memory and don't give the performance issues of 1980s unix. Similarly, issuing a continuous stream of "exec" calls no longer crashes the kernel.

There are many methods to process large numbers of files, but to my mind the most important aspect is to be able to test what will happen in advance.

There is much variation in the "find" command depending on your version and vintage of unix or Linux. It always helps to state your Operating System and preferred shell.

For example the "-maxdepth" parameter to "find" is not found in many unixes.
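For what it is worth, a rough portable equivalent that avoids -maxdepth by pruning subdirectories by hand (a sketch; the echo keeps it as a dry run until you are sure of the file list):

find . ! -name . -prune -name '*.txt' -print | xargs echo rm

Drop the echo once the listed files are the ones you expect.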

find /path/to/dir/ -iname '*.pdf' -exec echo rm '{}' ';'

This worked!!

But I bet the files are still there (tongue in cheek).

find /path/to/dir/ -iname '*.pdf' -exec echo rm '{}' ';'

Funny, because you just posted that this same command gave you "find: bad option -iname".

What changed???


And he's searching for PDFs now...

Yep, that command line had a number of typos, which meant that it didn't delete files.