Generate the sum of a particular column, date-wise

Hi All,
I have a file with below content

01/22/2014,23:43:00,1742.8,
01/22/2014,23:43:00,1742.8,
01/22/2014,23:44:00,1749.06666666667,
01/25/2014,23:45:00,2046.45,
01/25/2014,23:43:00,1742.8,
01/25/2014,23:44:00,1749.06666666667,
01/25/2014,23:45:00,2046.45,
01/25/2014,23:43:00,1742.8,
01/25/2014,23:44:00,1749.06666666667,
01/29/2014,23:45:00,2046.45,
01/29/2014,23:43:00,1742.8,
01/29/2014,23:44:00,1749.06666666667,
01/29/2014,23:45:00,2046.45,

I want a command that sums up the 3rd column for each date and gives me the result,

say, for example, like below (rough calculation, not the actual sums):

01/22/2014  4567
01/25/2014  8965
01/29/2014  9852

Hello,

The following may help:

awk -F"," 'NR==FNR{a[$1]+=$3;next} a[$1]{print $1" "a[$1]}' get_total_3rd get_total_3rd | uniq

The output will be as follows:

01/22/2014 5234.67
01/25/2014 11076.6
01/29/2014 7584.77

Where get_total_3rd is the input file name.
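For readers wondering why the file name appears twice: the `NR==FNR` idiom makes awk read the same file in two passes. In pass one (`NR==FNR` is true only for the first file), it accumulates the sums and `next` skips the rest of the script; in pass two, `FNR` has reset, so the print block fires for each line, emitting the total in the file's original date order, and `uniq` collapses the repeated lines. A minimal sketch with a two-line sample (the file name `get_total_3rd` is just the assumed name from above):

```shell
# Build a tiny sample input (illustrative data only)
printf '%s\n' \
  '01/22/2014,23:43:00,1742.8,' \
  '01/22/2014,23:44:00,1749.06666666667,' > get_total_3rd

# Pass 1 (NR==FNR): a[date] += column 3, then "next" skips the print.
# Pass 2: the print block runs once per line; uniq removes the duplicates.
awk -F"," 'NR==FNR{a[$1]+=$3;next} a[$1]{print $1" "a[$1]}' get_total_3rd get_total_3rd | uniq
```

With only one file argument, pass two never happens, which is why a single file name produces no output at all.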

Thanks,
R. Singh

Thanks a lot for your response. I tried the below:

Hostname:~ 905>cat two.txt
01/22/2014,23:43:00,1742.8,
01/22/2014,23:43:00,1742.8,
01/22/2014,23:44:00,1749.06666666667,
01/25/2014,23:45:00,2046.45,
01/25/2014,23:43:00,1742.8,
01/25/2014,23:44:00,1749.06666666667,
01/25/2014,23:45:00,2046.45,
01/25/2014,23:43:00,1742.8,
01/25/2014,23:44:00,1749.06666666667,
01/29/2014,23:45:00,2046.45,
01/29/2014,23:43:00,1742.8,
01/29/2014,23:44:00,1749.06666666667,
01/29/2014,23:45:00,2046.45,
Hostname:~ 906>
Hostname:~ 907> awk -F"," 'NR==FNR{a[$1]+=$3;next} a[$1] {print $1" "a[$1]}' two.txt | uniq
Hostname:~ 908>

I am not getting any error, but just a blank prompt comes up.
Am I doing anything wrong here? Please help me.

---------- Post updated at 03:12 AM ---------- Previous update was at 03:06 AM ----------

It's working now.
I used the file name twice this time.
I initially thought the file name was duplicated in your command by mistake.
Now I have tried your command as written, and it's 100% working. Thanks a lot again for your help.

---------- Post updated at 03:19 AM ---------- Previous update was at 03:12 AM ----------

Hi,
The code is working great.
Instead of the sum in exponential (scientific) notation, can I get the plain integer number, please?

01/23/2014 5.04681e+06
01/24/2014 4.9229e+06
01/25/2014 4.26656e+06
01/26/2014 4.94703e+06
01/27/2014 6.0064e+06
01/28/2014 5.4583e+06
01/29/2014 5.22587e+06
01/30/2014 3931
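The scientific notation comes from awk's default output format (`OFMT`, `%.6g`), which switches to exponential form for large non-integer values. One way around it (not shown in the thread) is to print the sum with `printf "%.0f"`, which always gives a plain rounded number. A minimal sketch with made-up data (the file name `sums.txt` is assumed):

```shell
# Illustrative sample whose total is large enough to trigger %.6g's
# scientific notation under plain print
printf '%s\n' \
  '01/23/2014,23:43:00,2523405.5,' \
  '01/23/2014,23:44:00,2523405.5,' > sums.txt

# %.0f prints the rounded sum in plain decimal notation instead of
# awk's default %.6g, which would show 5.04681e+06 here
awk -F"," '{a[$1]+=$3} END{for(d in a) printf "%s %.0f\n", d, a[d]}' sums.txt
```

Use `%.2f` or similar instead of `%.0f` if you want to keep decimal places rather than round to a whole number.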

If the output order doesn't matter, then use this:

$ awk '{A[$1]+=$3}END{for(i in A)print i,A[i]}' FS="," file

@Ravinder, you can also do it like this, provided the sum is not zero:

$ awk -F"," 'NR==FNR{a[$1]+=$3;next} a[$1]{print $1" "a[$1];a[$1]=""}' file file

For example:

[akshay@aix tmp]$ cat file
01/22/2014,23:43:00,1742.8,
01/22/2014,23:43:00,1742.8,
01/22/2014,23:44:00,1749.06666666667,
01/25/2014,23:45:00,2046.45,
01/25/2014,23:43:00,1742.8,
01/25/2014,23:44:00,1749.06666666667,
01/25/2014,23:45:00,2046.45,
01/25/2014,23:43:00,1742.8,
01/25/2014,23:44:00,1749.06666666667,
01/29/2014,23:45:00,0,
01/29/2014,23:43:00,0,
01/29/2014,23:44:00,0,
01/29/2014,23:45:00,0,

[akshay@aix tmp]$ awk -F"," 'NR==FNR{a[$1]+=$3;next} a[$1]{print $1" "a[$1];a[$1]=""}' file file
01/22/2014 5234.67
01/25/2014 11076.6

This could be a solution for the above problem:

[akshay@aix tmp]$ awk -F"," 'NR==FNR{a[$1]+=$3;next}($1 in a){print $1" "a[$1];delete a[$1]}' file file
01/22/2014 5234.67
01/25/2014 11076.6
01/29/2014 0

Thanks a lot for the timely help guys. I love this forum.

---------- Post updated 01-31-14 at 01:39 AM ---------- Previous update was 01-30-14 at 08:59 PM ----------

Guys,
I am stuck again. Probably the reason is that I did not understand the command completely.
Now I have the scenario below, the same as the initial one, but with a new column added.

This time I would like to understand it completely, so that whenever a new column is added I can still use the same command with a small alteration.
Can you please explain what modification to the command is required whenever a new column is added, when I still want date-wise totals for each column?

When there is only one value column per date, the below command works as mentioned above.

awk -F"," 'NR==FNR{a[$1]+=$3;next}($1 in a){print $1" "a[$1];delete a[$1]}' ff2.txt ff2.txt
[akshay@aix tmp]$ cat ff2.txt
01/22/2014,23:43:00,1742.8,199.8,
01/22/2014,23:43:00,1742.8,287.9
01/22/2014,23:44:00,1749.06666666667,764.8
01/25/2014,23:45:00,2046.45,656.3
01/25/2014,23:43:00,1742.8,786.1
01/25/2014,23:44:00,1749.06666666667,875.9
01/25/2014,23:45:00,2046.45,890.91
01/25/2014,23:43:00,1742.8,765.11
01/25/2014,23:44:00,1749.06666666667,678.98
01/29/2014,23:45:00,0,651.11
01/29/2014,23:43:00,0,781.13
01/29/2014,23:44:00,0,871.23
01/29/2014,23:45:00,0,893.19
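The thread ends without an answer, but the extension is mechanical: keep one array per summed column. A sketch (not from the thread) using the `delete`-based variant above, with a short subset of the `ff2.txt` data:

```shell
# Sample rows in the new four-column layout (assumed file name: ff2.txt)
printf '%s\n' \
  '01/22/2014,23:43:00,1742.8,199.8,' \
  '01/22/2014,23:43:00,1742.8,287.9' \
  '01/25/2014,23:45:00,2046.45,656.3' \
  '01/25/2014,23:43:00,1742.8,786.1' > ff2.txt

# One extra array per extra column: a[] sums column 3, b[] sums column 4.
# To handle a fifth column later, add c[$1]+=$5 in pass one and print
# c[$1] alongside the others in pass two.
awk -F"," 'NR==FNR{a[$1]+=$3; b[$1]+=$4; next}
           ($1 in a){print $1" "a[$1]" "b[$1]; delete a[$1]}' ff2.txt ff2.txt
```

The output keeps one line per date with both column totals, still in the file's original date order.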