Advice on monitoring gzipped files

So, every 5 minutes, I monitor some data files based on their line counts.

If I check now and there are 100 lines in the file, I scan those 100 lines for specific errors.

If I check again 5 minutes later and there are 150 lines in the file, I scan only lines 101 to 150, since only 50 lines were added since I last checked.
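For what it's worth, that incremental check can be sketched as a small shell function. This is only a sketch: the function name, the state file, and the paths are my own assumptions, and the error pattern (FAIL, as used later in the thread) stands in for whatever you actually scan for.

```shell
#!/bin/sh
# Sketch of the 5-minute incremental check.
# $1 = log file to monitor
# $2 = state file holding the line count from the previous run
scan_new_lines() {
    log=$1 state=$2
    last=0
    [ -f "$state" ] && last=$(cat "$state")
    now=$(wc -l < "$log")
    # scan only the lines added since the last check
    awk -v n="$last" 'NR > n' "$log" | grep FAIL
    # remember how far we got for next time
    echo "$now" > "$state"
}
```

Run from cron every 5 minutes as, say, `scan_new_lines /path/to/dataccess.log /var/tmp/dataccess.offset`; each invocation only looks at lines added since the previous one.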

Now, let's say the file being monitored is called dataccess.log.

When this file gets rotated, it is renamed to dataccess.log.gz, and a new dataccess.log is created.

I know Linux has commands such as zgrep and zcat, but these commands don't exist on some other UNIX OSes, e.g. Solaris.

So my question is: how would you scan the gzipped version of the file to make sure you didn't miss anything, then scan the new dataccess.log as usual? Keep in mind I already have the last line number on record, so maybe that could be of use to you?

If you look inside gzcat on many systems, you'll find that it's a shell script containing this:

gunzip -cd "$@"

-c for "print to stdout", -d for "decompress".

So:

( gunzip -cd /path/to/file.log.gz ; cat /path/to/logfile ) | grep ...

Thank you so much!!! I had no idea it could be done this way.

The only problem I have is that I don't know how big the gzipped log file could be, so "cat"-ing it could be disastrous if the gzip file is huge.

The only alternative is to write the uncompressed output to disk, but that's a bad idea.

If there's absolutely no other solution, would the following be the most efficient way to go about what I'm trying to do? Since the recorded line number (100 here) refers to the rotated file, the `NR>100` filter has to apply to the gunzip output, not the new log:

( gunzip -cd /path/to/file.log.gz | awk 'NR>100' ; cat /path/to/logfile ) | grep FAIL
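A rotation-aware version of that pipeline could be wrapped up like this. This is only a sketch: the function name and the idea of passing the recorded line number as an argument are my own assumptions, not something from the thread.

```shell
#!/bin/sh
# Sketch: finish scanning the rotated copy, then the new file.
# $1 = current log file
# $2 = rotated gzip copy
# $3 = last recorded line number (how far the previous check got)
scan_after_rotation() {
    log=$1 gz=$2 last=$3
    {
        # pick up the rotated copy from where the last check stopped
        gunzip -cd "$gz" | awk -v n="$last" 'NR > n'
        # then the new file from the beginning
        cat "$log"
    } | grep FAIL
}
```

Note that `gunzip -cd`, `awk`, `cat`, and `grep` here only read their inputs and stream them through the pipe, so nothing is written to disk and the new dataccess.log should not be touched.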

Also, I worry that the command could/would overwrite the new copy (dataccess.log) that was created when the data file was rotated.