Copy last 30 minutes' contents from a log file

Hi Guys,

I am writing a bash script to capture the last 30 minutes' contents from a log file into a new file. The job is scheduled to run every 30 minutes. The log file is db2diag.log from DB2. I am having difficulty copying the last 30 minutes' contents. Can someone please help me? Thanks

The sample log file looks like this:

2019-10-22-00.02.53.593412-300 E2310A623            LEVEL: Warning
PID     : 17432874             TID : 8134           PROC : db2sysc 0
INSTANCE: db2pdtrg             NODE : 000           DB   : GWDITRGP
APPHDL  : 0-39400              APPID: 10.33.39.144.51753.191022050231
AUTHID  : DSMMONDB             HOSTNAME: lcofndbp1
EDUID   : 8134                 EDUNAME: db2agent (GWDITRGP) 0
FUNCTION: DB2 UDB, RAS/PD component, PANotifLogColl::paGetNotifyLogFiles, probe:50
MESSAGE : ECF=0x9000001A=-1879048166=ECF_FILE_DOESNT_EXIST
          File doesn't exist
DATA #1 : String, 37 bytes
No valid notification log file found.

2019-10-22-00.02.53.630281-300 E2934A623            LEVEL: Warning
PID     : 17432874             TID : 8134           PROC : db2sysc 0
INSTANCE: db2pdtrg             NODE : 000           DB   : GWDITRGP
APPHDL  : 0-39400              APPID: 10.33.39.144.51753.191022050231
AUTHID  : DSMMONDB             HOSTNAME: lcofndbp1
EDUID   : 8134                 EDUNAME: db2agent (GWDITRGP) 0
FUNCTION: DB2 UDB, RAS/PD component, PANotifLogColl::paGetNotifyLogFiles, probe:50
MESSAGE : ECF=0x9000001A=-1879048166=ECF_FILE_DOESNT_EXIST
          File doesn't exist
DATA #1 : String, 37 bytes
No valid notification log file found.

2019-10-22-00.04.17.390138-300 E3558A576            LEVEL: Error
PID     : 17432874             TID : 152830         PROC : db2sysc 0
INSTANCE: db2pdtrg             NODE : 000           DB   : GWDITRGP
APPHDL  : 0-46568              APPID: 10.33.39.144.51779.191022050420
AUTHID  : DSMMONDB             HOSTNAME: lcofndbp1
EDUID   : 152830               EDUNAME: db2agent (GWDITRGP) 0
FUNCTION: DB2 UDB, catalog services, sqlrl_evmon_eventtable, probe:2540
MESSAGE : ADM4002W  The Event Monitor target table "LOCK_EVENT" (table schema 
          "IBM_RTMON" ) already exists.

How I've tackled this type of problem in the past is to have the script "remember" in a state file how far through the log it got, and start from there on the next run.

Example:

LOGFILE=/DB2/db2diag.log 
NEWFILE=${LOGFILE%.log}_$(date +%Y%m%d%H%M).log
CHKFILE=/var/scriptdata/db2diagline.dat

EndLine=$(awk 'END{print NR}' "$LOGFILE")
if [ -f "$CHKFILE" ]
then
    read StartLine < "$CHKFILE"
    StartLine=${StartLine:-0}
    # Test for log file truncated
    [ "$EndLine" -lt "$StartLine" ] && StartLine=0
else
    StartLine=0
fi

awk -v S="$StartLine" -v E="$EndLine" 'NR>E{exit} NR>S' "$LOGFILE" > "$NEWFILE"

#Save EndLine for StartLine next time
echo "$EndLine" > $CHKFILE

Thanks a lot for your help

If you have log rotation in place you may want to know whether a log switch happened. I did this by checking whether the first line of the file is still the same. (Hopefully it's not an empty line :wink: ) ... like this:

LOGFILE=/DB2/db2diag.log 
NEWFILE=${LOGFILE%.log}_$(date +%Y%m%d%H%M).log
CHKFILE=/var/scriptdata/db2diagline.dat
CHKFILE2=/var/scriptdata/db2firstline.dat
FIRSTLINE_SAVED=""
[ -f "$CHKFILE2" ] && read FIRSTLINE_SAVED < "$CHKFILE2"
FIRSTLINE="$(head -n1 "$LOGFILE")"

# Determine EndLine up front so it is set no matter which branch runs
EndLine=$(awk 'END{print NR}' "$LOGFILE")

if [ "$FIRSTLINE" == "$FIRSTLINE_SAVED" ] ; then
   if [ -f "$CHKFILE" ] ; then
       read StartLine < "$CHKFILE"
       StartLine=${StartLine:-0}
       # Test for log file truncated
       [ "$EndLine" -lt "$StartLine" ] && StartLine=0
   else
      StartLine=0
   fi
else
   # First line changed: the log was rotated, so start from the top
   StartLine=0
   head -n1 "$LOGFILE" > "$CHKFILE2"
fi

awk -v S="$StartLine" -v E="$EndLine" 'NR>E{exit} NR>S' "$LOGFILE" > "$NEWFILE"

#Save EndLine for StartLine next time
echo "$EndLine" > $CHKFILE

The assumption is that if a log rotation occurs, the resulting file will be smaller than it was at the last run.
This is detected by [ $EndLine -lt $StartLine ] && StartLine=0

On systems I have worked with, log files grow at a fairly consistent rate and log rotation only occurs once a week.

Not to say it's impossible, but I've never encountered a file that got truncated and then grew bigger than its previous size within 30 minutes.
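Another way to attack the original question is to filter by the timestamps themselves instead of remembering a line count: db2diag.log records begin with a timestamp like 2019-10-22-00.02.53.593412-300, which sorts lexically, so you can compare each record's first field against a cutoff string. A minimal sketch with made-up sample data (in real use the cutoff would come from GNU date's -d '30 minutes ago', which isn't available on every Unix; the paths and records below are illustrative only):

```shell
#!/bin/bash
# Sketch: keep only db2diag.log records newer than a cutoff timestamp.
# Sample data stands in for the real log here.
LOGFILE=$(mktemp)
NEWFILE=$(mktemp)

# Two sample records in db2diag.log's format
cat > "$LOGFILE" <<'EOF'
2019-10-22-00.02.53.593412-300 E2310A623            LEVEL: Warning
MESSAGE : old record
2019-10-22-00.04.17.390138-300 E3558A576            LEVEL: Error
MESSAGE : new record
EOF

# In real use (GNU date): CUTOFF=$(date -d '30 minutes ago' +%Y-%m-%d-%H.%M.%S)
CUTOFF=2019-10-22-00.03.00

# The timestamps sort lexically, so a plain string comparison works.
# A line starting with a timestamp opens a new record; the whole record
# is kept or dropped based on that first field.
awk -v cutoff="$CUTOFF" '
    /^[0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9]-/ { keep = ($1 >= cutoff) }
    keep
' "$LOGFILE" > "$NEWFILE"

cat "$NEWFILE"
```

This drops the "old record" block and keeps the "new record" block. The trade-off versus the line-counting approach above: no state file is needed and truncation/rotation is a non-issue, but you depend on the clock and on being able to build the cutoff string on your platform.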