Argument list too long - Shell error

Trying to tar specific files from a directory fails when the number of matching files is too large.

ls ~/logs | wc -l
5928

In the logs directory, I have 5928 files.

If I want to include all files with today's date, I run the following command:

tar cf ~/archive/LoadLogs_20060302.tar ~/logs/LoadLog_20060302_*.log

However, I get the following error:

/usr/bin/tar: Argument list too long

Any suggestions on how to correct this would be appreciated.

Thanks,

Rick

Your tar command itself looks fine; the problem is that the shell expands the wildcard into more file names than fit on one command line. You can work around it with an "include file", usually via a -I option. To use it, write a directory listing into the include file:

ls -1 ~/logs/LoadLog_20060302_*.log > /tmp/tar.include

Then run the tar command:

tar -cf ~/archive/LoadLogs_20060302.tar -I /tmp/tar.include
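
If you're on Linux with GNU tar, the same idea is spelled -T (or --files-from) rather than -I:

tar -cf ~/archive/LoadLogs_20060302.tar -T /tmp/tar.include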

Thanks for the suggestions - however, when I use an asterisk in my matching criteria (even for the "ls" command), I get the same error.

I guess the limit on the size of an argument list applies to ALL commands.
Commands like "ls", "gzip", "tar", "cp", "mv", and "cat" all complain when I try to narrow the file list with a wildcard.

I was hoping there was some shell configuration setting that would increase the allowable size - but I'm beginning to think it's a problem with the programs, not with the shell environment.

Rick

Quite right, I should have seen that coming. Use the find command to generate the include file:

find ~/logs/ -name "LoadLog_20060302_*.log" > /tmp/tar.include
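
This works because the quoted pattern reaches find unexpanded - find matches the names itself and prints them one per line, so no giant argument list is ever handed to a new process. The "Argument list too long" error actually comes from the kernel's limit on exec arguments (ARG_MAX), not from tar, ls, or the shell, which is why every external command fails at the same point (shell built-ins like echo are exempt, since no new process is started). You can check the limit in bytes with:

getconf ARG_MAX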

This isn't AIX by any chance, is it? (On AIX, if memory serves, the argument-list limit is a system tunable - ncargs - so it can be raised.)

Not C shell by any chance?

Not C shell.
Running ksh on Unix and Linux - same results on both.

Thanks

The 'ls' command fails with the same error too? That's pretty bizarre. I would go with hegemaro's suggestion of generating a file list.
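
For commands that have no include-file option, xargs is the general-purpose workaround: it reads file names from stdin and runs the command in batches small enough to fit under the limit. A sketch, assuming the names contain no whitespace:

find ~/logs -name 'LoadLog_20060302_*.log' | xargs gzip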

For the call to tar, I think I will.

The problem shows up in areas where we would not have expected it.

Try removing 5000 files that match *.old:
rm *.old fails.

Or try moving 5000 files that match *.dat

I've had to resort to a multi-stage pipeline to do a job that should work with a single command.

ls | grep '\.dat$' | awk '{ print "mv " $0 " /home/mydir/dat_files" }' | sh

is a long way around what should be:
mv *.dat /home/mydir/dat_files

And I'll bet it isn't as efficient.
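
For what it's worth, find can also build those batches itself with the -exec ... {} + form, which packs as many names into each command as the limit allows. A sketch, assuming GNU find and coreutils (-maxdepth and mv -t are extensions there):

find . -maxdepth 1 -name '*.old' -exec rm {} +
find . -maxdepth 1 -name '*.dat' -exec mv -t /home/mydir/dat_files {} +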

Production scripts that have worked for years are now failing because they've hit that magic number of files.

I just wish there were a global solution - like setting a system parameter - that would solve it.

Oh well -

Thank you all for your suggestions.

Rick - aka dad5119