I am writing a shell script that checks Catalina logs on a production system and mails me if it detects errors.
It greps the logs for known error strings, which I have defined as variables.
The problem is that the logs are huge, approximately 30,000 lines before they rotate.
So I am forced to grep the whole file instead of tailing the last 30,000 lines.
But while testing the grep approach, I keep getting re-notified of old errors that are still present in the logs.
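For reference, a minimal sketch of the kind of script described above (the pattern variables, log name, and alert step are illustrative, not the actual script): it greps the entire log on every run, so any error already in the file keeps matching and re-triggering the notification.

```shell
#!/bin/sh
# Hypothetical sketch of the whole-file approach. It writes a tiny
# sample log so the demo is self-contained; a real run would point
# at the live catalina.out instead.
printf 'INFO startup ok\nSEVERE: something broke\n' > catalina.out

ERR1='SEVERE'              # example error patterns held in variables
ERR2='OutOfMemoryError'

# Scan the WHOLE file every iteration -- old errors match again too.
matches=$(grep -e "$ERR1" -e "$ERR2" catalina.out)
if [ -n "$matches" ]; then
    # a real script would pipe $matches to mail(1) here
    printf 'ALERT:\n%s\n' "$matches"
fi
```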
That is quite an expensive approach, since it makes sed read the entire file twice per iteration. You can instead take $now from the file's size in bytes, and use tail -c to jump straight to that offset:
start=$(cat wherewasi 2>/dev/null || echo 0)  # offset reached last run; 0 on the first run
now=$(ls -l moxy.log | awk '{print $5}')      # current size of the log in bytes
if [[ $now -lt $start ]] ; then
    start=0                                   # file shrank: it was rotated, rescan from the top
fi
echo "$now" > wherewasi
tail -c +$((start + 1)) moxy.log | grep 'stuff i want to find'
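To see the offset trick in isolation, here is a small self-contained demonstration on a throwaway file (the file name and pattern are just for the demo): the size recorded before the append becomes the offset, and tail -c +N skips everything up to and including that byte, so only lines added afterwards reach grep.

```shell
#!/bin/sh
# Demonstrate that tail -c resumes exactly where we left off.
printf 'old ERROR line\n' > demo.log          # 15 bytes
start=$(ls -l demo.log | awk '{print $5}')    # remember current size: 15
printf 'new ERROR line\n' >> demo.log         # log grows past the offset
# Only the bytes appended after the recorded offset are scanned,
# so the old error is not reported again:
tail -c +$((start + 1)) demo.log | grep 'ERROR'   # prints: new ERROR line
```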