Trying to find files equal to and greater than

Hi Guys and Gals,

I'm having some difficulty putting this check into a shell script. I would like to search a particular directory for a number of files. The logic I have is pretty simple:

Find files named *.txt that are newer than <this file> and count them
If the number of files is equal to or greater than 1 echo "all is good"
Otherwise echo "all is not good"
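
A minimal sketch of that logic, where `dir` and `reference.file` are stand-ins for the real directory and <this file> (find's -newer test matches files modified more recently than the reference file):

```shell
#!/bin/sh
# dir and reference.file are placeholders for the real directory and <this file>.
dir=/dir1/dir2
# -newer matches files whose modification time is later than reference.file's.
count=$(find "$dir" -name '*.txt' -newer "$dir/reference.file" 2>/dev/null | wc -l)
if [ "$count" -ge 1 ]; then
    echo "all is good"
else
    echo "all is not good"
fi
```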

I'm having difficulty using -eq | -gt at the same time.

I have a variable

filecount=`ls -l | wc -l`
if [ $filecount -eq 1 ]; then
    echo "all is good"
fi

How do I get it to do equal and greater than in the same line?

if [ $filecount -ge 1 ]

Easy as that huh?

Thanks. I just realized it would've also worked if I had just said greater than 0. :smiley:

Thanks

Only that condition will always be true, because "ls -l" prints at least the "total" line. If you use plain "ls" with no options, you'll get the correct file count (when not writing to a TTY, ls lists one file per line).
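
Putting that together, a corrected version of the check (counting only *.txt files, without the "total" line that ls -l adds):

```shell
#!/bin/sh
# Plain ls (no -l) prints one name per line when piped, so wc -l
# counts exactly the matching files; 2>/dev/null hides the error
# ls prints when no *.txt files exist.
filecount=$(ls *.txt 2>/dev/null | wc -l)
if [ "$filecount" -ge 1 ]; then
    echo "all is good"
else
    echo "all is not good"
fi
```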

In one of my directories there are about 1300 files and the script is giving me the arg list too long error. I tried to put it in a for statement but it's telling me there are 0 files.

What's wrong with this command?

for file in *.txt; do find /dir1/dir2/ -prune -type f -mtime +1 -exec ls -l $file {} \;; done | wc -l

There are in fact files older than 1 day.

---------- Post updated 07-06-10 at 10:59 AM ---------- Previous update was 07-05-10 at 03:38 PM ----------

What am I doing wrong with this command please?

for file in *.txt; do find /dir1/dir2/$file -prune -type f -mtime +1 -exec ls -l {} \;; done | wc -l

Keeps telling me arg list too long.

---------- Post updated at 11:10 AM ---------- Previous update was at 10:59 AM ----------

I got it to work.

for file in *.txt; do find /dir1/dir2/$file -prune -mtime +3 -exec echo {} \;; done | wc -w

---------- Post updated at 11:34 AM ---------- Previous update was at 11:10 AM ----------

scratch that ... still giving the same error:

bash: /usr/bin/find: The parameter or environment lists are too long
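
One way to sidestep that error entirely is to let find do the name matching itself, so the shell never expands *.txt into an argument list. This sketch assumes GNU find, whose -maxdepth option keeps the search non-recursive in place of the -prune idiom:

```shell
#!/bin/sh
# find matches *.txt internally (note the quoted pattern), so the shell
# builds no long argument list regardless of how many files exist.
# -maxdepth 1 (GNU find) restricts the search to /dir1/dir2 itself.
count=$(find /dir1/dir2 -maxdepth 1 -type f -name '*.txt' -mtime +1 2>/dev/null | wc -l)
echo "$count"
```

On a find without -maxdepth, the equivalent restriction is `find /dir1/dir2/. \( -name . -o -prune \) ...`.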