Turn For Loop into While Loop

This script does exactly what I need; it just doesn't seem as efficient as it could be.

#!/bin/sh
baseDirVar="/opt/user/"
cd ${baseDirVar}
sizeVar=`ls -s $(find . -name nohup*.out) | cut -d'.' -f 1`
for i in $sizeVar
do
        if [ "$i" -ge "5000" ];
                then
                        dirVar=`ls -s $(find . -name nohup*.out) | grep $i | cut -b 6-100`
                        curFile=`basename ${dirVar}`
                        curDir=`dirname ${dirVar}`
                        cd ${curDir}
                        ./rotate*
                        pwd
                        cd ${baseDirVar}
                        pwd
                        curFile=''
                        curDir=''
                else
                        curFile=''
                        curDir=''
                        echo "size not big enough"
                        cd ${baseDirVar}
        fi
done
exit

As you can see, it runs through the loop, which runs through the if then else for every file found matching nohup*.out, which can be up to 30 or so, but usually only 1 or two files will match the requirement to actually go through the if statement. Is there a way I can change this to a while loop to where it will only run through on the ones that are greater than 5000?

The two pwd's were just there while I was testing it; I have since deleted those two pwd commands.

---------- Post updated at 01:00 PM ---------- Previous update was at 12:44 PM ----------

Something else I just noticed. I admin quite a few different servers, and one of them had a log file that was around 100 MB, which threw off my cut line. Is there a better, more dynamic way to do that, rather than giving it a static range of 6-100? It's not a huge issue, b/c I have this set to run from crontab every half-hour, so I doubt a log will grow enough in that half hour to get past that digit count and cause a problem; but on the initial run of the script on new servers, I have to modify it to match the biggest log file on those servers.
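One dynamic alternative (my own suggestion, not something from the thread's scripts) is to skip byte positions entirely and let awk split on whitespace, so the size field can be any width without shifting the filename:

```shell
# Let awk split each "SIZE ./path" line on whitespace instead of cutting
# fixed byte columns: $1 is the size, $2 is the path, no matter whether
# the size is "92" or "166066". This assumes paths contain no spaces.
ls -s $(find . -name "nohup*.out") | awk '$1 >= 5000 {print $2}'
```

The same `awk '$1 >= 5000'` test also replaces the separate size loop, since awk filters and extracts in one pass.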

how about -

find . -name "nohup*.out" -exec ls -s {} \; | awk '$1>=5000 {print substr($0,6,100)}' |
while read dirVar
do
   curFile=`basename ${dirVar}`  # this variable is not used
   curDir=`dirname ${dirVar}`   
   ${curDir}/rotate*
done

Looks a lot more efficient. Can you see my updated comment in the original post about the cut piece?

Also, I'd need to modify yours just a bit to cd to ${curDir} first, b/c whoever wrote the rotate script wrote it in a relative manner, so I have to be in that dir to run it. As for curFile: yeah, that was a var I used in some of my test runs to make sure I was finding the correct file.
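Folding that cd into the suggested pipeline might look like this sketch (my adaptation: it uses awk field splitting rather than substr, assumes paths without spaces, and runs each cd in a subshell so there's nothing to cd back from):

```shell
# Sketch: filter by size with awk, then cd into each log's directory
# before running rotate, since rotate uses relative paths.
# The ( ... ) subshell discards the directory change after each file.
find . -name "nohup*.out" -exec ls -s {} \; |
awk '$1 >= 5000 {print $2}' |
while read dirVar
do
    curDir=`dirname "${dirVar}"`
    ( cd "${curDir}" && ./rotate* )
done
```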

Also, you didn't define dirVar.

If you need to start at pos 6 and then go on to the end of the line, change the awk to

  ... | awk '$1>=5000 {print substr($0,6)}'

is that what you mean?

What I'm wondering is whether there's a way to make it more dynamic. Say a log file decides to start growing like crazy at 01, and my crontab just ran but won't run again for 29 minutes. If it grows enough that the size field becomes longer than the 4 characters it is now, the variable set by the ls command will be off: it will either have an extra space or a stray digit in it, causing the basename/dirname commands to fail.

here's an example from when the script was run and the size had too many digits:

size not big enough
size not big enough
dirname: too many arguments
Try `dirname --help' for more information.
./sizeTest.sh: line 13: ./rotate*: No such file or directory
dirname: too many arguments
Try `dirname --help' for more information.
./sizeTest.sh: line 13: ./rotate*: No such file or directory
size not big enough

As you can see, the size of the log file had too many characters and screwed up the dirname/basename commands:

ls -s $(find . -name nohup*.out) | cut -d'.' -f 1
    92
    32
166066
 10314
   104
 ls -s $(find . -name nohup*.out) | cut -b 6-100
2 ./path/to/nohupFile.out
2 ./path/to/nohupFile.out
6 ./path/to/nohupFile.out
4 ./path/to/nohupFile.out
4 ./path/to/nohupFile.out

I tried using a '.' as the delimiter, but that screws it up b/c then it delimits the .out as well, so it doesn't pick up the entire filename.
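Another way around the delimiter problem (my own suggestion) is to not pick a delimiter at all: delete the leading blanks and size digits with sed, leaving the whole path, dots included, untouched:

```shell
# Strip leading spaces, the run of size digits, and the single space
# after them. The dots in "./" and ".out" are never split on, so the
# full path survives no matter how wide the size field is.
ls -s $(find . -name "nohup*.out") | sed 's/^ *[0-9][0-9]* //'
```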

---------- Post updated at 01:25 PM ---------- Previous update was at 01:18 PM ----------

Played with it a bit more, I think I got the dynamic part down:

#!/bin/sh
baseDirVar="/opt/user/"
cd ${baseDirVar}
sizeVar=`ls -s $(find . -name nohup*.out) | cut -d'.' -f 1`
for i in $sizeVar
do
        if [ "$i" -ge "1000" ];
                then
                        dirVar=`ls -s $(find . -name nohup*.out) | grep $i | cut -d'.' -f 2`
                        curFile=`basename .${dirVar}.out`
                        curDir=`dirname .${dirVar}.out`
                        cd ${curDir}
#                       ./rotate*
                        pwd;echo "${curDir}/${curFile}"
                        cd ${baseDirVar}
                        curFile=''
                        curDir=''
                else
                        curFile=''
                        curDir=''
                        echo "size not big enough"
                        cd ${baseDirVar}
        fi
done

I changed the cut line to split on the period, but then for curFile and curDir I just added the period in front of and the .out behind the variable, and the dirname and basename commands ran fine. I had to change the size to 1000, rather than 5000, so that I'd get some results.

Now, this one ran even a bit slower than before, but that is probably b/c I had like 6 results instead of just 1 or 2.

Anyone got anything on this? I've got my script working exactly the way it should, but I'd like to be able to use a while loop or something else so it doesn't run so long.

Fix:

find . -name "nohup*.out"

Why? On the command line, do you want the shell to do filename generation, or do you want find to search for all nohup*.out files? Without the quotes, the shell may expand the pattern against the current directory before find ever sees it.

Test it in some directory where you have those files:
echo "nohup*.out"
echo nohup*.out
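To make that test concrete, here is a small demo (my illustration, run in a scratch directory containing nohup1.out and nohup2.out):

```shell
# Quoted: the shell passes the literal pattern through untouched.
echo "nohup*.out"    # prints: nohup*.out
# Unquoted: the shell expands the glob against the current directory
# first, so find would receive the expanded filenames -- or, with no
# match, behavior depends on the shell -- instead of the pattern it
# needs for a recursive search.
echo nohup*.out
```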

Final Solution:

#!/bin/sh
baseDirVar="/opt/user/"
cd ${baseDirVar}
sizeVar=`find . -name "nohup*.out" -and -size +5120k`
for i in $sizeVar
do
                dirVar=`dirname ${i}`
                cd ${dirVar}
                ./rotate*
                cd ${baseDirVar}
done
date "+Script complete - %c"
exit

Instead of using a while loop, I just modified my find command to only find files over a certain size, and then I can run my loop through that variable and perform the tasks needed.