This script does exactly what I need; it just doesn't seem as efficient as it could be.
#!/bin/sh
baseDirVar="/opt/user/"
cd "${baseDirVar}"

# First field of `ls -s` is the size in blocks; cutting at the first '.'
# (the start of each ./ path from find) leaves just the sizes.
sizeVar=`ls -s $(find . -name 'nohup*.out') | cut -d'.' -f 1`

for i in $sizeVar
do
    if [ "$i" -ge 5000 ]
    then
        # Look the path back up by its size, then split off file and directory.
        dirVar=`ls -s $(find . -name 'nohup*.out') | grep "$i" | cut -b 6-100`
        curFile=`basename "${dirVar}"`
        curDir=`dirname "${dirVar}"`
        cd "${curDir}"
        ./rotate*
        pwd
        cd "${baseDirVar}"
        pwd
        curFile=''
        curDir=''
    else
        curFile=''
        curDir=''
        echo "size not big enough"
        cd "${baseDirVar}"
    fi
done
exit
As you can see, it runs through the loop, and the if/then/else runs for every file matching nohup*.out, which can be up to 30 or so, but usually only one or two files actually meet the requirement to take the if branch. Is there a way to change this to a while loop so that it only runs through the ones that are greater than 5000?
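Something along these lines is what I had in mind, though it's untested and I'm not sure the units line up on every box I admin (find's plain -size counts 512-byte blocks, while ls -s may report 512-byte or 1K blocks depending on the system), so the threshold below is a guess:

#!/bin/sh
baseDirVar="/opt/user/"
cd "${baseDirVar}" || exit 1

# Let find do the size test, so the loop only ever sees files
# that are already over the threshold. Plain -size counts 512-byte
# blocks; if ls -s here reports 1K blocks, this should be +10000.
find . -name 'nohup*.out' -size +5000 |
while read -r logPath
do
    logDir=`dirname "${logPath}"`
    # Run the rotate script in a subshell so there's no need to cd back.
    ( cd "${logDir}" && ./rotate* )
done

That would drop the per-size grep lookup entirely, since the loop gets the path directly instead of the size.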
The two pwd's were just there while I was testing; I have since deleted them.
Something else I just noticed: I admin quite a few different servers, and one of them had a log file of around 100 MB, which threw off my cut range. Is there a better, more dynamic way to extract the path than the static byte range of 6-100? It's not a huge issue, since crontab runs this every half-hour and I doubt a log will grow past that digit count in thirty minutes, but on the initial run on a new server I have to adjust the range to match the biggest log file there.
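Would something like awk be more robust here? Since awk splits fields on whitespace, the width of the size column shouldn't matter (assuming none of my paths contain spaces, which they don't):

# Same pipeline as before, but awk picks the second whitespace-separated
# field (the path) no matter how many digits the size occupies.
dirVar=`ls -s $(find . -name 'nohup*.out') | grep "$i" | awk '{ print $2 }'`

Or does the find -size approach above make this moot, since the size column and the grep would go away entirely?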