Stuck with awk !!!

Can someone help me with getting the desired output?

Source:

Bank1.sss    
     63  Result code [0] 
      2  Result code [106] 
      4  Result code [108] 
 Bank2.sss
    474  Result code [0] 
      1  Result code [103] 
      4  Result code [106] 
      2  Result code [187] 
      1  Result code [580]

Desired output (it should show the percentage of each error code within each bank):

Bank1.sss
91.30     63  Result code [0] 
2.89      2  Result code [106] 
5.79      4  Result code [108] 
 Bank2.sss
97.92    474  Result code [0] 
0.20      1  Result code [103] 
0.82      4  Result code [106] 
0.41      2  Result code [187] 
0.20      1  Result code [580]

I did the following:

for i in `cat FI_list`; do echo $i >> err_result; grep $i CCScrape130903.log | grep "Result code"| cut -d~ -f2 | sort | uniq -c | awk '{ s += $1 } END { print "sum is", s "\n"} {printf "%f", $1/s*100} {print $0}' >> err_result; done

It is generating the output as:

Bank1.sss
100.000000     63  Result code [0] 
3.076923      2  Result code [106] 
5.797101      4  Result code [108] 
sum is 69

Bank2.sss
100.000000    474  Result code [0] 
0.210526      1  Result code [103] 
0.835073      4  Result code [106] 
0.415800      2  Result code [187] 
0.207469      1  Result code [580] 
sum is 482

Can someone help me with getting the desired output?

TIA

Use printf "%.2f" to round to 2 decimal places (assuming that's your only problem, which I can see from the output).
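For example, just to illustrate the format string on its own (not your whole pipeline):

$ awk 'BEGIN { printf "%.2f\n", 63/69*100 }'
91.30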

If you look at the output carefully, you will see that it shows 100.000000 for Result code [0] even though it should not be 100; it should be 91.30 for Bank1.sss.

The first column is the percentage share of each result code (error) for a particular bank.

Eg:
For Bank 1,

total entries : 63+2+4 = 69
63/69*100 = 91.30
2/69*100 = 2.89 ... and so on.

Without any precision, it will never know how to round correctly.

$ echo "63/69*100" | bc
0
$ echo "scale=2; 63/69*100" | bc
91.00
$ echo "scale=3; 63/69*100" | bc
91.300

You get higher precision with

$1*100/s

and even higher by enforcing floating point

$1*100.0/s
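In bc, for instance, the same idea looks like this; multiplying before dividing keeps the digits that the fixed-scale division would otherwise drop:

$ echo "scale=2; 63/69*100" | bc
91.00
$ echo "scale=2; 63*100/69" | bc
91.30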

hmm, I am aware of that, but since I am new to awk, it's getting difficult for me to generate the required output.

I need to know what needs to be changed in the following command

 awk '{ s += $1 } END { print "sum is", s "\n"} {printf "%f", $1/s*100} {print $0}'

so that I can get the output as

Bank1.sss
91.30     63  Result code [0] 
2.89      2  Result code [106] 
5.79      4  Result code [108] 
 Bank2.sss
97.92    474  Result code [0] 
0.20      1  Result code [103] 
0.82      4  Result code [106] 
0.41      2  Result code [187] 
0.20      1  Result code [580]

The problem with the above awk command is:

At first, the value of s is 63, so it computes the percentage as 63/63*100 = 100%.
Then the value of s is 63+2, i.e. 65, so it computes 2/65*100 = 3.07%, and so on... I hope you got the point.
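You can see the running denominator by printing s next to each line (just a diagnostic, not the fix; here bank1_counts stands for the `uniq -c` output for Bank1):

awk '{ s += $1; printf "s=%d  %.2f  %s\n", s, $1/s*100, $0 }' bank1_counts

For the Bank1 data it prints s=63, then s=65, then s=69, so the denominator keeps growing as awk reads the lines.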

How can I change the command so that awk computes the sum only for the entries under one ".sss" header, then the percentages from that sum, and then moves on to the entries under the next ".sss" header?

I am getting the proper sum for every set, but not the correct percentages.

awk '
  NR == FNR { if( !$2 ) { B++; next }      # first pass: a header line (only one field) starts a new bank
              BANK[B] += $1                # add the count to that bank total
              next
            }
  ! $2 { T = BANK[++C]; print $0 " (total " T ")"; next }   # second pass: header line, recall its total
  {
    printf( "%.2f ", $1 / T * 100 )        # percentage of the bank total
  } 1                                      # then reprint the original line
' file file
Bank1.sss (total 69)
91.30      63  Result code [0] 
2.90       2  Result code [106] 
5.80       4  Result code [108] 
Bank2.sss (total 482)
98.34     474  Result code [0] 
0.21       1  Result code [103] 
0.83       4  Result code [106] 
0.41       2  Result code [187] 
0.21       1  Result code [580]

Wow, this is what I needed, but I doubt that I would have been able to create an awk script like that...

Is there any training material that will help me code like you did?

I wouldn't wish that on you :slight_smile: I've taken a somewhat lazy approach by reading the same file twice, so if the file is very large, you might want to rewrite that to do everything in a single pass. An exercise for the reader!
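(Though if you get stuck, a rough, untested single-pass sketch could look something like this; it just buffers one bank at a time and prints the block when the next header or the end of the file arrives:)

awk '
  function dump(   i, f) {
    if (hdr == "") return
    print hdr " (total " sum ")"
    for (i = 1; i <= n; i++) {
      split(line[i], f)                        # f[1] is the count column
      printf "%.2f %s\n", f[1] / sum * 100, line[i]
    }
  }
  ! $2  { dump(); hdr = $0; sum = 0; n = 0; next }   # a header line starts a new bank
        { sum += $1; line[++n] = $0 }                # buffer the detail lines
  END   { dump() }
' file

Same idea, just without reading the file twice.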

The web has a ton of handy awk tutorials. Just search for "awk tutorial" and you'll find a whole bunch of 'em.

The first one on the list is as good as any:

Awk - A Tutorial and Introduction - by Bruce Barnett

Thanks for the suggestion.

I came up with the following script and it is working fine on my local Ubuntu system. But when I tried to execute it on one of the old Debian servers, it prompted me with an error message indicating readarray was not found, or something similar.

It seems the bash version on the server is older than required. What change can I make to simulate the readarray command?

I am trying the following, but it isn't working. Help me out.

count=$(wc $file | awk {'print $1 -1'})

for i in `seq 0 $count`;
do
        for z in `cat $file`;
        do
        array[$i]= $z
        done   
        echo "$array[@]"
done

$file contents :

38.33% accounts for bank1.scr script failed with error code [130]
27.37% accounts for bank2.scr script failed with error code [102]
25.00% accounts for bank3.scr script failed with error code [101]
28.57% accounts for bank4.scr script failed with error code [102]
100.00% accounts for bank5.scr script failed with error code [185]
 

Actual script:

#!/bin/bash
#
# Nagios plugin to monitor 

###--Nagios Exit Status--###
STATE_OK=0
STATE_CRITICAL=1


#cc_scrape_log=/CustomerCentral/logs/.....
export cc_scrape_log=/home/zshaikh/Documents/CCScrape13091111.log
#export cc_scrape_log=/home/zshaikh/Documents/CCScrape130903.log
export fi_list=/home/zshaikh/Documents/FI_list
export fi_error_details_file=/tmp/fi_error_details
export final_op_file=/tmp/final_result

if [ -f $fi_error_details_file ] ; then
        cat /dev/null > $fi_error_details_file
fi

if [ ! -f $fi_error_details_file ] ; then
        touch $fi_error_details_file
fi

###--Start initial Checks--###
if [ ! -f $cc_scrape_log ] ; then
        echo "CRITICAL - Scraper logs not found for today's date."
        exit $STATE_CRITICAL
fi


#Generate error report for each FI

for i in `cat $fi_list`; do echo $i >> $fi_error_details_file; grep $i $cc_scrape_log | grep "Result code"| cut -d~ -f2 | sort | uniq -c >> $fi_error_details_file; done


#Generate failure rate (in %) for each FI

Output=$(awk 'NR == FNR { if( !$2 ) { B++; next } BANK[B]+= $1;next} ! $2 { T = BANK[++C]; print $0 " (total " T ")"; next } {  printf( "%.2f ", $1 / T * 100 )} 1' $fi_error_details_file $fi_error_details_file | awk '$1 > 20' | grep -v "Result code \[0\]" | grep -B 1 "Result code")

check_op=$(echo $Output | grep -c [0-9])

if [ $check_op -lt 1 ]; then

echo "OK - No New error found"
        STATE=$STATE_OK
else

        echo $Output | awk 'BEGIN {RS="--"} {print $4 "% accounts for "$1" script failed with error code " $8}' > $final_op_file

        COUNT=$(grep -c scr $final_op_file)
        if [ $COUNT -gt 0 ] ; then

                readarray -t testarray < <(cat $final_op_file)
                for i in "${testarray[@]}"
                do
                        echo "CRITICAL ::: $i"
                done
                echo "Total $COUNT scripts failed"
                STATE=$STATE_CRITICAL
        fi
fi

I guess you could try to simulate that bash 4 command with something like this:

readarray() {
  local i
  if [ "$1" = "-t" ] && [ $# = 2 ]
  then
    while read "$2[$((i++))]";
    do
      :
    done
  fi
}
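With that function defined near the top of the script, the existing call should then work on the old bash too, e.g.:

readarray -t testarray < "$final_op_file"
for i in "${testarray[@]}"; do echo "CRITICAL ::: $i"; done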

Or just

oldIFS=$IFS
IFS=$'\n'
testarray=( $(<"$final_op_file") )
IFS=$oldIFS

But what is the use of reading a file into an array if it is only used for enumeration?
Try something like:

COUNT=0
while read i
do
  echo "CRITICAL ::: $i"
  ((COUNT++))
done < "$final_op_file"

echo "Total $COUNT scripts failed ";
STATE=$STATE_CRITICAL

Thanks... I will try that and will let you know...

I want to read the file into an array because printing the variable contents directly does not produce the expected output.

I get the following output if I don't use an array:

38.33% accounts for bank1.scr script failed with error code [130] 27.37% accounts for bank2.scr script failed with error code [102] 25.00% accounts for bank3.scr script failed with error code [101] 28.57% accounts for bank4.scr script failed with error code [102] 100.00% accounts for bank5.scr script failed with error code [185]
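For what it is worth, the newlines only disappear because the expansion is unquoted; quoting it keeps them, e.g.:

Output=$'line one\nline two'
echo $Output      # word splitting collapses it onto one line
echo "$Output"    # the quotes preserve the embedded newlines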

Yes, there was a mistake with single parentheses where they should have been double. I corrected it in my post.