awk to monitor size in directory

I'm trying to find the most efficient way to monitor specific files in several directories in one go.

/var/log/
/var/app/
/var/db/

Each of these directories can have subdirectories that need to be searched as well.

I want to find any file under these directories that has ".gmei" in its filename, and check whether the file's size is at least 9000MB. If it is, print the file names.

My initial approach was something like this:

find /directory -name "*.gmei*" | while read -r line
do
    du -s "$line"
    # calculate the size and alert
done

The problem is I'm almost certain this method won't be efficient.

Is there any other way to accomplish this quickly?

Doesn't your find version have the -size test?


Your specification of 9000MB is strange. Assuming that you mean 9GB (as in 9*1024**3 bytes), try:

find /var/log /var/app /var/db -name '*.gmei*' -size +9G

If the version of find on your system doesn't accept -size +9G, try changing it to -size +9663676416c.
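
That byte count is just 9*1024**3 written out; in a shell whose arithmetic supports ** (bash or ksh93), you can verify it:

echo $((9 * 1024 ** 3))    # prints 9663676416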

If you meant files with sizes greater than 9,000,000,000 bytes, try:

find /var/log /var/app /var/db -name '*.gmei*' -size +9000000000c

Worked perfectly. Thanks, guys!

One last question on this:

find /var/log -name "*.test*" -exec du -m {} \;

How do I add a second command to the above so it also includes the number of seconds ago that each file was last modified?

Something like:

find /var/log -name "*.test*" -exec du -m {} \; -exec perl -le 'print ((stat $ARGV[0])[9])' {} \;
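
Note that field [9] of stat is the raw modification time in epoch seconds, not seconds ago; getting "seconds ago" means subtracting it from the current time. A sketch, assuming perl is installed on the box:

find /var/log -name "*.test*" -exec du -m {} \; -exec perl -le 'print time - (stat $ARGV[0])[9]' {} \;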

Another issue I'm having: I need to catch files that breach the warning and critical thresholds with a single find command. Currently I'm running two separate find invocations, one for warning and one for critical, and I'd like to combine them into one.

What I'm currently doing is:

warning=`find /var/log -name "*test.log*" -size +20000c -amin -1440 -exec du -m {} \; 2>/dev/null`
critical=`find /var/log -name "*test.log*" -size +60000c -amin -1440 -exec du -m {} \; 2>/dev/null`

Any ideas?

You'll need to write a script that accepts filenames and deals with them as you require. Then just call the script from your find command for all files over the warning size; the script can decide whether each file is a warning or a critical and act accordingly.
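
A minimal sketch of such a script (the name check_size.sh and both thresholds are placeholders for illustration):

#!/bin/sh
# check_size.sh - classify each file argument by size
warn=20000   # warning threshold in bytes (placeholder)
crit=60000   # critical threshold in bytes (placeholder)
for f in "$@"
do
    size=`wc -c < "$f"`
    if [ $size -gt $crit ]; then
        echo "CRITICAL: $f ($size bytes)"
    elif [ $size -gt $warn ]; then
        echo "WARNING: $f ($size bytes)"
    fi
done

A single find then filters at the lower threshold and lets the script sort out the level:

find /var/log -name "*test.log*" -size +20000c -amin -1440 -exec /path/to/check_size.sh {} \; 2>/dev/null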

You don't mention your OS, so I can't get too specific about how the script should check times and sizes.
However, on many Linux systems you can use something like this:

read size mtime <<EOF
$(stat -c "%s %Y" filename)
EOF
printf "file: %s is %d bytes, modified %d seconds ago\n" filename $size $(( $(date +%s) - mtime))


The OSes I'm going to be using this on are AIX (5.2) and Linux.

Things work fine on Linux, but AIX is proving to be quite challenging.

find /tmp -name "*.gmei*" -perm +o=r -size +5000000c -mtime -1 -exec du -m {} \;

/tmp runs deep, with several subdirectories; as a result, the above command can take a long time to complete, which is why I want to be as specific as I can about the type of file I want find to look for.

Can anyone suggest a better idea? What I want to do is find all files that are under 12 hours old.

For 12 hours, use -mmin -720 (12 × 60 minutes) instead of -mtime. And don't use -exec du {}; use -ls or -printf (if available) instead.
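
With GNU find, for instance, something like this prints size and name without forking du once per file:

find /tmp -name "*.gmei*" -size +5000000c -mmin -720 -printf "%s\t%p\n"

%s (size in bytes) and %p (path) are GNU extensions, so on AIX -ls is the likelier of the two to exist.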
