File name from a List of files

Hi,

Greetings!!

I'm grepping a string from a series of files using the code below (however, the awk is not filtering between the 'from' and 'to' times!)

awk '$0>=$from&&$0<=$to' from=$START_TIME to=$STOP_TIME $logpath/UL_`date +%Y%m%d`_Scheduler*.log.csv > temp-grep.txt
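For context, awk already knows which file each line came from: the built-in FILENAME variable. Below is a minimal sketch with made-up file names and times (not the real Scheduler logs). Note that the range test uses the bare variables from/to; `$from` would instead dereference the *field* numbered by from's numeric value, which may be why the range filter above seems to be ignored.

```shell
# Sketch only, with made-up files and times (not the real Scheduler logs).
# awk's built-in FILENAME holds the file currently being read, so each
# matching line can be tagged with its origin in the same pass.
mkdir -p /tmp/fngrep && cd /tmp/fngrep
printf '06:50:00 entry one\n08:00:00 entry two\n' > a.log
printf '07:10:00 entry three\n' > b.log
# bare from/to, not $from/$to: $from would select a field, not the variable
awk '$0 >= from && $0 <= to {print FILENAME": "$0}' \
    from="06:49:00" to="07:49:00" a.log b.log > temp-grep.txt
cat temp-grep.txt
# a.log: 06:50:00 entry one
# b.log: 07:10:00 entry three
```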

Out of 50 files, wherever the string has occurred, it'll be redirected to a temp file.

My query is: how can I find out which file each entry in the temp file came from?

Please help me.

Thank you in advance...

<<A Correction in the query!!! Posted at 4th reply... Sorry!!!>>

$ for i in $(find /your/path/to/50files -type f -print); do echo "$i"; awk '/start/,/end/' "$i" >> tempfile; done

Thanks Jayan.

This was useful; however, it does the same thing as the code I'm using now (see the initial post).

What I was expecting is like below

/opt/app/bharg

file1.txt
file2.txt
file3.txt

Now, I'm taking the log entries generated during one hour from /opt/app/bharg/file*.txt and redirecting them to a temp file.

awk '$0>=$from&&$0<=$to' from=$START_TIME to=$STOP_TIME $logpath/file*.txt > temp-grep.txt

grep 'ORA' temp-grep.txt > error_found.txt

Now, I'm grepping the temp file for a string; if the string is found, it's redirected to the error_found file.

This much is working. My query is: how do I find out which of the file*.txt files each string redirected to the error_found file came from?

Can you post the contents of temp-grep.txt?

Validating address (getting geocode) of subscriber in OTSM... ",56241257,America/Chicago,false
INSERT,p3wtg1z2.edc.cingular.net,1318483203,131,2011/10/13 00:20:03.0131 (America/Chicago),Scheduler0,1620,257019,DEBUG2,,DEBUG_LEVEL2,"Lb_OTSMCreateUserPasswordHeader::UserPasswordHeaderAVS->AVS::OTSM_UserPasswordHeader
UserName(STRING) -> m70991
Password(STRING) -> ykGus2jG
",56241257,America/Chicago,false
INSERT,p3wtg1z2.edc.cingular.net,1318483203,131,2011/10/13 00:20:03.0131 (America/Chicago),Scheduler0,1620,257020,DEBUG1,,DEBUG_LEVEL1,"Lb_OTSMSendValidateAddressRequest::
Auto Renew Purchase scheduler started with entityId... 26359230_3",578918291,America/Chicago,false
INSERT,p3wtg1z2.edc.cingular.net,1318485495,26,2011/10/13 00:58:15.0026 (America/Chicago),Scheduler0,1620,257374,DEBUG1,,DEBUG_LEVEL1,"Lb_CPMGetOrderPlan::Getting order plan (26359230)",,America/Chicago,false

When I grep for INSERT, entries like the above are redirected from multiple log files under a directory.

But, out of 50 files, I can't tell which files these entries in the temp file came from.

---------- Post updated at 05:06 AM ---------- Previous update was at 04:40 AM ----------

I realize that I was trying to get the file name after the line had already been redirected:

  1. Redirect the log between start and end time to a temp file
  2. Grep the temp file for string and redirect to error file

I was trying to capture the file name at step 2, where the string is grepped, whereas it should be captured at step 1 itself.
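The two steps above can be sketched like this (file names, times, and contents are illustrative, not the real logs): capture the file name at step 1 via awk's FILENAME, and step 2 inherits it for free.

```shell
# Illustrative sketch: tag each line with its source file at step 1,
# so the ORA grep at step 2 automatically reports the originating file.
mkdir -p /tmp/twostep && cd /tmp/twostep
printf '07:00:00 ORA-00942 table or view does not exist\n' > file1.txt
printf '07:05:00 all good\n' > file2.txt

# Step 1: time-range filter, prefixing each line with its source file
awk '$0 >= from && $0 <= to {print FILENAME": "$0}' \
    from="06:49:00" to="07:49:00" file*.txt > temp-grep.txt

# Step 2: each hit now carries the originating file name
grep 'ORA' temp-grep.txt > error_found.txt
cat error_found.txt
# file1.txt: 07:00:00 ORA-00942 table or view does not exist
```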

You have got it .. try modifying your awk code accordingly ..

Ok, Let me try to modify and post the result...

I'm getting just the entries, not the file each entry came from! :confused:

then modify it as below and check ..

for i in file*.txt        # no "ls" needed; the glob expands by itself
do
     # print the file name wherever the start time appears
     # ($0 ~ start is a dynamic match; /start/ would match the literal word "start")
     nawk -v start="$START_TIME" '$0 ~ start {print FILENAME}' "$i" >> temp-grep.txt
     awk '$0 >= from && $0 <= to' from="$START_TIME" to="$STOP_TIME" "$logpath/$i" >> temp-grep.txt
done

Thanks Jayan. However, the code repeats the file name in the output file, and the file contains the whole log entry along with the repeated file name. Below are the head and tail of the generated output file.

> tail temp-grep.txt
 ORDER_PLAN_OID->(28884387) AUTO_RENEWED->(Y) STATUS->(PC) END_DATE->() PLAN_CODE->(120) ORDER_OID->(28879928) PLAN_GROUP->(SBP1) USAGE_ALLOWANCE->(250) START_DATE->(20111018132102) PLAN_TYPE->(D) PLAN_PRICE->(14.99) PAYMENT_METHOD->(C) SPEED_TIER_POLICY_ID->() HOT_SOC->(SBNP004)",451211265,America/Chicago,false
INSERT,p3wtg1z2.edc.cingular.net,1318942262,44,2011/10/18 07:51:02.0044 (America/Chicago),Scheduler9,1602,2089,DEBUG3,,DEBUG_LEVEL3," ""entering Br_CreateNextThresholdNotificationScheduler::"" \
        ""br_processThresholdNotification.dsd:"" ",451211265,America/Chicago,false
INSERT,p3wtg1z2.edc.cingular.net,1318942262,44,2011/10/18 07:51:02.0044 (America/Chicago),Scheduler9,1602,2090,DEBUG3,,DEBUG_LEVEL3,"Br_CreateNextThresholdNotificationScheduler::dumpmap pm_subscriber:
 USAGE_INDICATOR->(N) UPD_USR->(1) ADD_USR->(1) IMSI->(310410350798855) SUBSCRIBER_ACTVN_END_DATE->() SUBSCRIBER_PREV_STATUS->(R) IMEI->(012328002750678) SUBSCRIBER_ACTVN_STR_DATE->(20101224163948) SUBSCRIBER_STATUS_DATE->(20101224163959) UPD_USR_TMS->(20110920044924) ADD_USR_TMS->(20101224163948) BILLING_ACCOUNT_NUMBER->(346000977621) SUBSCRIBER_STATUS->(A) ACCOUNT_OID->(346000977621) PREV_SUBSCRIBER_NUMBER->() SIM->(89014104243507988554) EQUIPMENT_TYPE->(G) SERVICE_AREA->(009847003716) SUBSCRIBER_NUMBER->(9563731369) TECHNOLOGY_TYPE->(GSM) HOMING_INDICATOR->(1) SUBSCRIBER_OID->(2223596)",451211265,America/Chicago,false
INSERT,p3wtg1z2.edc.cingular.net,1318942262,44,2011/10/18 07:51:02.0044 (America/Chicago),Scheduler9,1602,2091,DEBUG3,,DEBUG_LEVEL3,"Lb_GetNextThresholdValue entering::workspace/libraries/lb_utilities.dsd:pm_currValue->30",451211265,America/Chicago,false
INSERT,p3wtg1z2.edc.cingular.net,1318942262,44,2011/10/18 07:51:02.0044 (America/Chicago),Scheduler9,1602,2092,DEBUG3,,DEBUG_LEVEL3,"Br_CreateNextThresholdNotificationScheduler::nextThreshold (absolute): ",451211265,America/Chicago,false
INSERT,p3wtg1z2.edc.cingular.net,1318942262,69,2011/10/18 07:51:02.0069 (America/Chicago),Scheduler9,1602,2093,DEBUG1,,DEBUG_LEVEL1,"Component Scheduler9: finished executing event rule ""ExpiryRule"" with no errors",,America/Chicago,false
INSERT,p3wtg1z2.edc.cingular.net,1318942274,469,2011/10/18 07:51:14.0469 (America/Chicago),Scheduler9,2576,491,DEBUG3,,DEBUG_LEVEL3,"findComponent: incrementing ref count on Scheduler9 to 238",,America/Chicago,false
INSERT,p3wtg1z2.edc.cingular.net,1318942291,569,2011/10/18 07:51:31.0569 (America/Chicago),Scheduler9,1775,5400,DEBUG3,,DEBUG_LEVEL3,"findComponent: incrementing ref count on Scheduler9 to 239",,America/Chicago,false
usprod41:/opt/app/p3wtg1z2/sbpprod1/kb521q/grep
> head temp-grep.txt
/opt/app/p3wtg1z2/sbpprod1/FW/home/UnifiedLogging/UL_20111018_Scheduler0.log.csv
/opt/app/p3wtg1z2/sbpprod1/FW/home/UnifiedLogging/UL_20111018_Scheduler0.log.csv
/opt/app/p3wtg1z2/sbpprod1/FW/home/UnifiedLogging/UL_20111018_Scheduler0.log.csv
/opt/app/p3wtg1z2/sbpprod1/FW/home/UnifiedLogging/UL_20111018_Scheduler0.log.csv
/opt/app/p3wtg1z2/sbpprod1/FW/home/UnifiedLogging/UL_20111018_Scheduler0.log.csv
/opt/app/p3wtg1z2/sbpprod1/FW/home/UnifiedLogging/UL_20111018_Scheduler0.log.csv
/opt/app/p3wtg1z2/sbpprod1/FW/home/UnifiedLogging/UL_20111018_Scheduler0.log.csv
/opt/app/p3wtg1z2/sbpprod1/FW/home/UnifiedLogging/UL_20111018_Scheduler0.log.csv
/opt/app/p3wtg1z2/sbpprod1/FW/home/UnifiedLogging/UL_20111018_Scheduler0.log.csv
/opt/app/p3wtg1z2/sbpprod1/FW/home/UnifiedLogging/UL_20111018_Scheduler0.log.csv

And here is the debug-level run of the code. Let me try to modify the awk, which doesn't seem to be taking the start and end values while grepping.

 bash -x jayan.sh
++ TZ=UTZ+6
++ date
+ x='Tue Oct 18 06:49:00 UTZ 2011'
++ date
+ y='Tue Oct 18 07:49:00 CDT 2011'
++ echo Tue Oct 18 06:49:00 UTZ 2011
++ awk '{print $4}'
+ START_TIME=06:49:00
++ echo Tue Oct 18 07:49:00 CDT 2011
++ cut -b12-19
+ STOP_TIME=07:49:00
++ date +%Y%m%d
+ for i in ls '/opt/app/p3wtg1z2/sbpprod1/FW/home/UnifiedLogging/UL_`date +%Y%m%d`_Scheduler*.log.csv'
+ nawk -v start=06:49:00 '/start/ {print FILENAME}' ls
nawk: can't open file ls
 source line number 1
+ awk '$0>=$from&&$0<=$to' from=06:49:00 to=07:49:00 /ls
awk: can't open /ls
+ for i in ls '/opt/app/p3wtg1z2/sbpprod1/FW/home/UnifiedLogging/UL_`date +%Y%m%d`_Scheduler*.log.csv'
+ nawk -v start=06:49:00 '/start/ {print FILENAME}' /opt/app/p3wtg1z2/sbpprod1/FW/home/UnifiedLogging/UL_20111018_Scheduler0.log.csv
+ awk '$0>=$from&&$0<=$to' from=06:49:00 to=07:49:00 //opt/app/p3wtg1z2/sbpprod1/FW/home/UnifiedLogging/UL_20111018_Scheduler0.log.csv
+ for i in ls '/opt/app/p3wtg1z2/sbpprod1/FW/home/UnifiedLogging/UL_`date +%Y%m%d`_Scheduler*.log.csv'
+ nawk -v start=06:49:00 '/start/ {print FILENAME}' /opt/app/p3wtg1z2/sbpprod1/FW/home/UnifiedLogging/UL_20111018_Scheduler1.log.csv
+ awk '$0>=$from&&$0<=$to' from=06:49:00 to=07:49:00 //opt/app/p3wtg1z2/sbpprod1/FW/home/UnifiedLogging/UL_20111018_Scheduler1.log.csv
//And it went on till the last file...//
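The trace shows two separable problems: the word `ls` was left in the `for` list (the glob expands by itself, so the literal word `ls` was iterated as a file name, hence "can't open file ls"), and the range test uses `$from`/`$to`, which in awk dereference a *field* numbered by the variable's numeric value rather than the variable itself. A corrected sketch, run here against demo files that only mimic the real $logpath and UL_*_Scheduler*.log.csv names:

```shell
# Corrected sketch of the traced script, against demo files (the real
# $logpath and UL_*_Scheduler*.log.csv layout is only mimicked here).
logpath=/tmp/fixdemo
mkdir -p "$logpath"
printf '06:50:00 entry A\n' > "$logpath/UL_$(date +%Y%m%d)_Scheduler0.log.csv"
printf '07:50:00 entry B\n' > "$logpath/UL_$(date +%Y%m%d)_Scheduler1.log.csv"

START_TIME=06:49:00
STOP_TIME=07:49:00
: > "$logpath/temp-grep.txt"
# no "ls" in the list: the glob alone produces the file names
for i in "$logpath"/UL_$(date +%Y%m%d)_Scheduler*.log.csv
do
    # single pass: range test plus FILENAME tag, no separate nawk needed;
    # bare from/to, since $from would select a field instead of the variable
    awk '$0 >= from && $0 <= to {print FILENAME": "$0}' \
        from="$START_TIME" to="$STOP_TIME" "$i" >> "$logpath/temp-grep.txt"
done
```

With this, every line in temp-grep.txt already carries its source file, so the later ORA grep needs no extra bookkeeping.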