Summing a number column

Hi all,

I have a file that contains only one column:

test.txt

-900.01
-900.02
-900.03
-900.04
-900.05
-900.06
-900.07
-900.08
-900.09
900.01
900.02
900.03
900.04
900.05
900.06
900.07
900.08
900.09

I want to sum the values and get the final total.

Currently I am using the following commands:

Detail_Amount_Sum=0;
for i in `cat IFs_temp1.txt`
do
Detail_Amount_Sum=`echo "scale=2; $Detail_Amount_Sum + $i" | bc -l`
done
 
final_sum=$Detail_Amount_Sum;

But if I have one lakh (100,000) records in the file, it takes about an hour just to finish the for loop.

I need a command that completes very fast but still gives an accurate result.

My output should have a scale of 2, e.g. 900.00, 900.01, 0.01, or 3.27.

Please send me the commands.

It is slow because you are running bc once per number: a 100,000-line file means 100,000 separate bc processes. Instead, transform the entire stream into something you can feed to bc directly, so that one bc run does all the work.

For example, if you turn each number into a line like a+=(-5) and follow up with a final line containing just a, bc will sum them all and print the resulting value.

awk 'BEGIN { print "scale=2" } { print "a+=("$1")" } END { print "a" }' inputfile | bc
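With the sample test.txt above, the program that awk generates and pipes into bc looks like this:

scale=2
a+=(-900.01)
a+=(-900.02)
...
a+=(900.09)
a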

Ordinarily I would use awk to sum the numbers too, but if you need arbitrary precision, bc is for you.
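For reference, a plain awk version (ordinary double-precision floating point, which is usually fine for two-decimal amounts like these) would be something like:

awk '{ sum += $1 } END { printf "%.2f\n", sum }' inputfile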

Also -- basically, you should never do for X in `cat file`; that is a dangerous use of backticks, because every line is subject to word splitting and wildcard expansion.
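If you really do need to loop over lines in the shell, the safer pattern is a read loop redirected from the file, roughly:

while IFS= read -r line
do
    printf '%s\n' "$line"    # do the per-line work here
done < file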


For a shell solution that should be faster, try something like:

{ printf "scale=2; 0"
  while read i; do
     printf "+%s " "$i"
  done
  printf "\n" 
} < IFs_temp1.txt | bc
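With the sample data this hands bc a single line, roughly:

scale=2; 0+-900.01 +-900.02 +-900.03 ... +900.08 +900.09

so bc is started only once, no matter how many lines the file has.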

Of course, if you have access to a 1993 or later version of the Korn shell, you can do this entirely with shell built-ins:

#!/bin/ksh
Detail_Amount_Sum=0
while read x
do      ((Detail_Amount_Sum+=x))
done < test.txt
printf "%.2f\n" "$Detail_Amount_Sum"

which with your sample input prints:

0.00

You may use paste and bc.

paste -sd+ test.txt | bc
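With the sample test.txt, this joins all the lines into one expression for bc to evaluate (the +- pairs are fine, since bc reads the second sign as a unary minus):

-900.01+-900.02+-900.03+-900.04+-900.05+-900.06+-900.07+-900.08+-900.09+900.01+900.02+900.03+900.04+900.05+900.06+900.07+900.08+900.09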

This assumes you don't have any extra lines and that every line is numeric.
