I'm trying to find the most efficient way to monitor specific files across several directories in one go:
/var/log/
/var/app/
/var/db/
Each of these directories can have subdirectories, which need to be searched as well.
I want to find any file under these directories that has ".gmei" in its filename, check whether the file is at least 9000MB, and if it is, print the file name.
My initial approach was something like this:
find /directory -name "*.gmei*" | while read -r line
do
    du -s "$line"
    # calculate the size and alert
done
The problem is I'm almost certain this method won't be efficient (it spawns a du process per file).
Is there another way to accomplish this quickly?
Another issue I'm having: I need to catch files that breach the warning and critical thresholds by issuing only one find command. Currently I'm running two separate find sessions, one for warning and the other for critical, and I'd like to combine them into one.
You'll need to write a script that accepts filenames and deals with them as you require.
Then just call the script from your find command for all files over the warning size; the script can then decide whether each file is a warning or a critical and deal with it as appropriate.
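A sketch of that idea, assuming GNU stat, GNU find, and bash; the script name and the critical threshold are illustrative, since the question only gives the 9000MB figure:

```shell
#!/bin/bash
# Illustrative classifier: one script decides warning vs. critical per file.
# The 12000MB critical threshold is an assumed example, not from the question.
WARN=$((9000 * 1024 * 1024))    # 9000MB warning threshold, in bytes
CRIT=$((12000 * 1024 * 1024))   # assumed critical threshold, in bytes

classify() {
    local f size
    for f in "$@"; do
        size=$(stat -c %s "$f")           # file size in bytes (GNU stat)
        if [ "$size" -ge "$CRIT" ]; then
            echo "CRITICAL: $f"
        elif [ "$size" -ge "$WARN" ]; then
            echo "WARNING: $f"
        fi
    done
}

classify "$@"
```

Saved as, say, check_gmei.sh, it can be driven by one find that discards everything below the warning size before the script ever runs (GNU find's M suffix means mebibytes): find /var/log /var/app /var/db -type f -name "*.gmei*" -size +9000M -exec ./check_gmei.sh {} +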
You don't mention your OS, so I can't get too specific about how the script will check times and sizes on your system.
However, on many Linux systems you can use something like this:
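A minimal sketch of such a per-file check, assuming GNU stat and bash; the 12-hour and 9000MB figures are from the question, the output labels are illustrative:

```shell
#!/bin/bash
# Per-file check on Linux: GNU stat reads size and mtime straight from the
# inode, so there is no need to sum blocks with du for each file.
check_file() {
    local f=$1 size mtime age
    size=$(stat -c %s "$f")            # size in bytes
    mtime=$(stat -c %Y "$f")           # last modification, seconds since epoch
    age=$(( $(date +%s) - mtime ))     # age in seconds
    if [ "$age" -lt $((12 * 3600)) ]; then
        echo "recent: $f"              # under 12 hours old
    fi
    if [ "$size" -ge $((9000 * 1024 * 1024)) ]; then
        echo "oversize: $f"            # at least 9000MB
    fi
}
```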
The /tmp tree runs deep, with several subdirectories; as a result, the above command can take a long time to complete. That's why I want to be as specific as I can about the type of file I want find to look for.
Can anyone suggest a better approach? What I want to do is find all files that are under 12 hours old.
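For the age test specifically, GNU find can do the whole check itself with -mmin, so no external command is spawned per file; the /tmp path mirrors the question, and a -name pattern can be added to narrow the walk further:

```shell
# Files under /tmp modified less than 12 hours (720 minutes) ago.
# -mmin -N matches files whose mtime is under N minutes; -type f skips directories.
find /tmp -type f -mmin -720
```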