@Corona688, you asked what I want. I want AWK code that parses the XML messages and writes a line to an output file every minute:
Message example
[date+time], message=[DATA= "<?xml version="1.0"?><data changeMsg><NAME="John Smith"><Age="23"><D.O.B="11-10-1988"> <Gender="Male">"
Output example (the time shown is the current time):
8:30,Male,23,1
8:31,Female,23,1
8:32,Female,30,4
8:33,Male,50,10
I have AWK code that parses the XML message, does the counts, and writes to the output file. The only problem with this is that the data sits in a temporary file, and I need to capture rapidly changing data from that temporary file.
AWK code
awk '
{
    gsub( ">", "" );                # strip unneeded junk and make "foo bar" easy to capture
    gsub( " ", "~" );
    gsub( "<", " " );

    for( i = 1; i <= NF; i++ )      # snarf up each name=value pair
    {
        if( split( $(i), a, "=" ) == 2 )
        {
            gsub( "\"", "", a[2] );
            gsub( "~", " ", a[2] );
            values[a[1]] = a[2];
        }
    }

    #gcount[values["Gender"]]++;    # collect counts
    #acount[values["Age"]]++;
    agcount[values["Gender"] "," values["Age"]]++;

    printf( "%s %s %s %s\n", values["NAME"], values["Age"], values["D.O.B"], values["Gender"] );
}
END {
    printf( "\nSummary\n" );
    for( x in agcount )
        printf( "%s,%d\n", x, agcount[x] ) | "sort";
}
' input-file
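As a sanity check, the same parsing logic can be fed one sample message on stdin. This is a minimal sketch using only the field names from the example above, not the full message wrapper:

```shell
# Feed one hypothetical message to the parser and print the extracted
# fields plus the Gender,Age summary count.
printf '%s\n' '<NAME="John Smith"><Age="23"><D.O.B="11-10-1988"><Gender="Male">' |
awk '
{
    gsub( ">", "" );                # strip unneeded junk
    gsub( " ", "~" );               # protect embedded spaces
    gsub( "<", " " );               # turn each <...> chunk into a field
    for( i = 1; i <= NF; i++ )
        if( split( $(i), a, "=" ) == 2 )
        {
            gsub( "\"", "", a[2] );
            gsub( "~", " ", a[2] );
            values[a[1]] = a[2];
        }
    agcount[values["Gender"] "," values["Age"]]++;
    printf( "%s %s %s %s\n", values["NAME"], values["Age"], values["D.O.B"], values["Gender"] );
}
END {
    for( x in agcount )
        printf( "%s,%d\n", x, agcount[x] ) | "sort";
}
'
# prints: John Smith 23 11-10-1988 Male
#         Male,23,1
```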
If I understood the fifo approach right, then this is what I did.
Create a backup.tar
tar -cpf /path/to/backup.tar /path/to/outputfile
Then create outputfile
mkfifo /path/to/outputfile
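For what it's worth, a fifo keeps nothing on disk: a writer blocks until a reader opens the other end, and the reader sees each batch exactly once. A quick demonstration with a throwaway fifo under /tmp (hypothetical path, not your real output file):

```shell
# Demonstrate fifo semantics with a throwaway fifo.
fifo=/tmp/demo.$$.fifo
mkfifo "$fifo"
printf 'one batch of messages\n' > "$fifo" &   # writer blocks until cat opens the fifo
cat "$fifo"                                    # prints the batch, then cat exits when the writer closes
wait                                           # reap the background writer
rm -f "$fifo"                                  # the fifo itself never stored the data
```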
Then process the messages in a while loop:
while true
do
    awk '
    {
        gsub( ">", "" );                # strip unneeded junk and make "foo bar" easy to capture
        gsub( " ", "~" );
        gsub( "<", " " );

        for( i = 1; i <= NF; i++ )      # snarf up each name=value pair
        {
            if( split( $(i), a, "=" ) == 2 )
            {
                gsub( "\"", "", a[2] );
                gsub( "~", " ", a[2] );
                values[a[1]] = a[2];
            }
        }

        #gcount[values["Gender"]]++;    # collect counts
        #acount[values["Age"]]++;
        agcount[values["Gender"] "," values["Age"]]++;

        printf( "%s %s %s %s\n", values["NAME"], values["Age"], values["D.O.B"], values["Gender"] );
    }
    END {
        printf( "\nSummary\n" );
        for( x in agcount )
            printf( "%s,%d\n", x, agcount[x] ) | "sort";
    }
    ' /path/to/outputfile   # read each batch from the fifo, not the original input-file
done
But I am not sure about this part:
while true
do
cat /path/to/output > /tmp/$$
echo "Received data from /path/to/output"
# process /tmp/$$ as you please since it won't vanish from under you
done
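Putting the two pieces together, there is no need for a new file per batch: drain the fifo into one temp file, parse it, append the timestamped summary, and delete the temp file on every pass. A sketch, assuming the fifo is /path/to/outputfile and the results land in a hypothetical results.csv:

```shell
#!/bin/sh
# Sketch: one reusable temp file per pass, deleted after parsing, so no
# per-batch files accumulate.  The fifo and results.csv paths are assumptions.
fifo=/path/to/outputfile
results=/path/to/results.csv
tmp=/tmp/$$

while true
do
    cat "$fifo" > "$tmp"            # returns when the writer closes the fifo
    awk -v ts="$(date +%H:%M)" '
    {
        gsub( ">", "" );
        gsub( " ", "~" );
        gsub( "<", " " );
        for( i = 1; i <= NF; i++ )
            if( split( $(i), a, "=" ) == 2 )
            {
                gsub( "\"", "", a[2] );
                gsub( "~", " ", a[2] );
                values[a[1]] = a[2];
            }
        agcount[values["Gender"] "," values["Age"]]++;
    }
    END {
        for( x in agcount )         # e.g. 8:30,Male,23,1
            printf( "%s,%s,%d\n", ts, x, agcount[x] );
    }
    ' "$tmp" >> "$results"
    rm -f "$tmp"                    # nothing kept between batches
done
```

Passing the time in with `-v ts="$(date +%H:%M)"` avoids gawk-only strftime(), so the summary lines come out in the Time,Gender,Age,Count shape of the output example.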
Should I write the data from the feed to the backup file, then use the AWK code to read from the backup file and write to the output file, then sleep 10 seconds, and repeat the same process again and again?
My only problem with this is writing to different files, which means I will end up with all this data spread across files that I don't need to keep.
If I got this process wrong, I would be grateful if you could help me with the right process.
Thank you once again and Happy New Year