Retrieve logs for the last 4 hours

Our log file looks like this:

2010-11-18 00:57:01,698                  ERROR
Shipment Error Log:Error Details - Account Id:3962, PO:2710015, Line:2, File:221112901, Version:V1, Desc:Order cannot not be shipped as there are no line items in New state
2010-11-18 14:59:39,960                  ERROR
Shipment Error Log:Error Details - Account Id:3962, PO:2573485, Line:2, File:221115901, Version:V1, Desc:Order cannot not be shipped as there are no line items in New state

We need to get the entries from the last 4 hours. When we try an awk command, it treats the "Shipment Error Log..." line as a second record and fails on the date comparison.

What is the command to select everything from the first line matching the condition (logtime > current time - 4 hours) through to the end of the file?

Hey, try this:

Thanks for your quick response.

We tried your code, but it still copies all the logs in the file without checking the condition :(

try this:
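Something along these lines; it is shown below against a small generated sample log so it is self-contained, and it needs GNU awk for systime()/mktime():

```shell
# Build a sample log: one old entry, one stamped with the current
# time so it falls inside the 4-hour window.
log=$(mktemp)
printf '2010-11-18 00:57:01,698 ERROR\nShipment Error Log: old entry\n' > "$log"
printf '%s,000 ERROR\nShipment Error Log: recent entry\n' "$(date '+%Y-%m-%d %H:%M:%S')" >> "$log"

# Stay quiet (f=0) until the first timestamp within the last 4 h
# (14400 s); from that line onward, print everything (f=1).
awk 'BEGIN{f=0}
     !f { s = $0
          sub(/,.+/, "", s)      # ",698 ERROR" and after -> gone
          gsub(/[-: ]/, " ", s)  # "2010 11 18 00 57 01" for mktime()
          t = systime() - mktime(s)
          if (t <= 14400) f = 1 }
     f' "$log"
rm -f "$log"
```

Lines whose first field is not a timestamp (the "Shipment Error Log" detail lines) make mktime() return -1, so t stays huge and they never trip the flag on their own; once the flag is set they are printed along with everything else.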


I think it should check the date condition first. Here is the output file I get:

2010-11-20 14:59:39,960                  ERROR
Shipment Error Log:Error Details - Account Id:3962, PO:2573485, Line:2, File:221115901, Version:V1, Desc:Order cannot not be shipped as there are no line items in New s
2010-11-18 15:00:06,236                  ERROR
Shipment Error Log:Error Details - Account Id:7238, PO:2591168, Line:2, File:221116801, Version:V2, Desc:Order cannot not be shipped as there are no line items in New s
2010-11-19 15:05:06,250 ERROR
Shipment Error Log:Error Details - Account Id:7238, PO:2591168, Line:2, File:221116801, Version:V2, Desc:Order cannot not be shipped as there are no line items in New s
2010-11-18 20:20:25,250 ERROR
Shipment Error Log:Error Details - Account Id:7238, PO:2591168, Line:2, File:221116801, Version:V2, Desc:Order cannot not be shipped as there are no line items in New s
 

As you can see, it is not checking the date, only the time. Is it possible to first locate the first record satisfying the condition (log time within sysdate - 4 hours) and then copy the logs from there to the end of the file?

---------- Post updated at 05:16 AM ---------- Previous update was at 05:00 AM ----------

Thanks Kevintse, your second command saved us! It works wonders for us now.
I just noticed that my dummy log file had the dates jumbled up, which is why it was not fetching the correct records. With a real log file it worked!

Can you please explain how this works?

BEGIN{f=0} !f { s=$0; sub(/,.+/, "", s); gsub(/[-: ]/, " ", s); t=(systime()-mktime(s)); if(t<=14400) f=1 } f

The code checks the date and time together (like this: 2010 11 19 14 57 01).
But your log file looks odd: "2010-11-20 14:59:39,960" comes before "2010-11-18 15:00:06,236"...
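To make the parsing concrete, here is the same transformation applied to one timestamp line, with TZ pinned to UTC so the epoch value is reproducible (GNU awk):

```shell
echo '2010-11-18 00:57:01,698                  ERROR' |
TZ=UTC awk '{ s = $0
              sub(/,.+/, "", s)      # strip from the comma onward
              print s                # 2010-11-18 00:57:01
              gsub(/[-: ]/, " ", s)  # separators -> spaces
              print s                # 2010 11 18 00 57 01
              print mktime(s) }'     # 1290041821 (epoch seconds, UTC)
```

systime() minus that epoch value gives the entry's age in seconds, which is compared against 14400 (4 hours).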

---------- Post updated at 05:26 AM ---------- Previous update was at 05:17 AM ----------

In our script, when we use the awk command with ssh like this:

ssh user@IP awk 'BEGIN{f=0} !f { s=$0; sub(/,.+/,\"\", s); gsub(/[-: ]/,\" \", s);t=(systime()-mktime(s)); if(t<=14400) f=1 } f' /home/user/log.txt >> erroroutput.txt

it gives this error:

bash: -c: line 0: syntax error near unexpected token `/,.+/,'
bash: -c: line 0: `awk BEGIN{f=0} !f { s=$0; sub(/,.+/, \"\", s); gsub(/[-: ]/, \" \", s);t=(systime()-mktime(s)); if(t<=14400) f=1 } f /home/user/log.txt'

Any help?


The code below should work (NOT tested). The problem is that ssh just joins its arguments with spaces and hands them to the remote shell: your local shell consumes the single quotes, so the remote shell sees the awk program unquoted and chokes on `/,.+/,`. Feeding the commands to the remote bash through a here-document avoids the re-quoting entirely.

ssh user@IP /bin/bash >> erroroutput.txt <<\EOF
	awk ' BEGIN{f=0} !f { s=$0; sub(/,.+/, "", s); gsub(/[-: ]/, " ", s); t=(systime()-mktime(s)); if(t<=14400) f=1 } f ' /home/user/log.txt
EOF

Thanks Kevin, it worked with some modifications.

ssh user@IP /bin/bash  <<\EOF
awk ' BEGIN{f=0} !f { s=$0; sub(/,.+/, "", s); gsub(/[-: ]/, " ", s); t=(systime()-mktime(s)); if(t<=14400) f=1 } f ' /home/user/log.txt >  erroroutput.txt
EOF

But erroroutput.txt is getting created on the remote server, not on our local machine.

 erroroutput.txt" inside">
You put "> erroroutput.txt" inside the EOF pair, so erroroutput.txt is created on the remote server.
Instead, save the following code in a file; let's name it fetch_log.sh:

ssh user@IP /bin/bash <<\EOF 
	awk ' BEGIN{f=0} !f { s=$0; sub(/,.+/, "", s); gsub(/[-: ]/, " ", s); t=(systime()-mktime(s)); if(t<=14400) f=1 } f ' /home/user/log.txt
EOF

Make it executable with chmod +x fetch_log.sh, then run ./fetch_log.sh > erroroutput.txt. I think this should satisfy your requirements.

We are remoting into six different servers, so the error file is getting created on all six of them. We need to append everything into a single file on our local machine so that we can mail it. We use a single shell script with a for loop to log in to the six servers, so your redirection on the call of the script might not work here.

That is not an issue, just use ">>" instead of ">".
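Putting the pieces together, the whole thing can be sketched as one local driver script. The host names and the mail address below are placeholders (substitute your six servers), and it is wrapped in a function here rather than run directly:

```shell
# Sketch of a local driver: pull the last 4 hours of errors from six
# servers into one local file, then mail it. Host names and the mail
# address are placeholders.
fetch_and_mail() {
    : > erroroutput.txt                  # start from an empty local file
    for host in server1 server2 server3 server4 server5 server6; do
        # ">> erroroutput.txt" sits outside the heredoc, so each
        # server's output is appended on the LOCAL machine
        ssh user@"$host" /bin/bash >> erroroutput.txt <<\EOF
awk 'BEGIN{f=0} !f { s=$0; sub(/,.+/, "", s); gsub(/[-: ]/, " ", s); t=(systime()-mktime(s)); if(t<=14400) f=1 } f' /home/user/log.txt
EOF
    done
    mailx -s "Shipment errors, last 4 hours" you@example.com < erroroutput.txt
}
```

Because each ssh appends (>>) rather than truncates, the six servers' output accumulates in one file, in loop order.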