Removing duplicates on a single "column" (delimited file)

Hello !

I'm quite new to Linux and haven't found a script for this task; unfortunately, my knowledge of shell scripts is quite limited...

Could you guys help me remove the duplicate lines of a file, based only on a single "column"?

For example:

M202034357;01/2008;J30RJ021;Ciclo 01 de Faturamento;4000029579;01F010800017;270500591331;175130959;000074873-AB;9.9;RIO DE JANEIRO
M202034357;01/2008;J30AP096;Ciclo 01 de Faturamento;4000029579;01F010800017;270500589332;175123672;000001842-AB;9.9;MACAPA
M202034357;01/2008;J30RJ021;Ciclo 01 de Faturamento;4000043657;01F010800002;118000613348;175138146;000161122-AA;9.9;RIO DE JANEIRO
M202034357;01/2008;J30DF061;Ciclo 06 de Faturamento;4000034956;06F010800020;269800607228;173691920;000030011-AA;9.9;GUARA
M202034357;01/2008;J30RJ021;Ciclo 01 de Faturamento;4000029579;01F010800017;270500588743;175121705;000188224-AA;9.9;NITEROI
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000029579;01F010800017;270500589299;175123639;000241055-AB;9.9;SAO PAULO
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000029579;01F010800017;270500589787;175125437;000256241-AB;9.9;SAO PAULO
M202034357;01/2008;J30AM097;Ciclo 01 de Faturamento;4000043657;01F010800002;118000614870;175142866;000026153-AA;4.99;MANAUS
M202034357;01/2008;J30PA091;Ciclo 01 de Faturamento;4000043657;01F010800002;118000614087;175140485;000023707-AA;9.9;BELEM
M202034357;01/2008;J30PA091;Ciclo 01 de Faturamento;4000043785;01F010800027;270200624370;175114167;000011219-AB;9.9;BELÉM
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000029579;01F010800017;270500591956;175132948;000441734-AA;9.9;SAO BERNARDO DO CAMPO
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000029579;01F010800017;270500590036;175126399;000458131-AA;9.9;SAO CAETANO DO SUL
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000029579;01F010800017;270500591958;175132950;000441735-AA;9.9;SAO PAULO
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000043657;01F010800002;118000612017;175130959;000469327-AA;9.9;GUARULHOS

So the eighth field is duplicated on a few lines, like the first and last ones (175130959), even though the rest of the data on those lines often differs.

It doesn't matter for my purpose to keep the occurrence twice, even if the info before and after differs... So what I need is a script (maybe awk or cut) that recognizes the same string at position 8 and, if it has already been found before, deletes that whole line, but keeps every other line that does not contain a repeated string at position 8.

Ideas?

Try:

awk -F ';'  '!arr[$8]++' oldfile > newfile
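In case the idiom is unclear: `arr[$8]++` evaluates to the counter's previous value, which is 0 (false) the first time a key appears, so the negated expression is true only for the first line carrying each key. A minimal sketch with made-up lines (the field values here are illustrative only):

```shell
# arr[$8]++ yields the counter's old value: 0 (false) the first time a
# key is seen, so !arr[$8]++ is true only for that first line.
# Tiny illustrative input with keys K1, K2, K1 in field 8:
printf 'a;b;c;d;e;f;g;K1;x\na;b;c;d;e;f;g;K2;y\na;b;c;d;e;f;g;K1;z\n' |
awk -F ';' '!arr[$8]++'
# Prints the K1;x and K2;y lines; the second K1 line is dropped.
```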

Thank you very much, it worked... I shall be studying this command from now on... It is proving to be very useful.

The following variant saves some memory (it stores no integer counter per key):

awk -F ';'  '!($8 in A) {A[$8]; print}' oldfile > newfile
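For what it's worth, the difference is that `($8 in A)` tests key membership without storing a counter; the bare reference `A[$8]` in the action then creates the key with an empty value. A quick sketch on a toy two-field input (field 1 plays the role of the key here):

```shell
# ($1 in seen) tests membership without creating the key; the bare
# reference seen[$1] in the action then records it with an empty value.
printf 'x;1\nx;2\ny;3\n' | awk -F ';' '!($1 in seen) {seen[$1]; print}'
# Prints "x;1" and "y;3"; the second x line is skipped.
```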

I prefer the awk solutions suggested by Jim McNamara and MadeInGermany for your stated problem, but you could also consider this alternative for cases where you want the output sorted on the field you're using to select records:

sort -t';' -u -k8,8 oldfile > newfile
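One caveat worth noting: unlike the awk versions, `sort -u` does not promise to keep the first occurrence from the input; when several lines share a key, which one survives depends on the implementation's tie-breaking. A small sketch (toy data, keys K2, K1, K1 in field 8):

```shell
# De-duplicate on field 8 while sorting by it; the output is ordered by
# key, not by input order, and which duplicate survives may vary.
printf 'a;b;c;d;e;f;g;K2;x\na;b;c;d;e;f;g;K1;y\na;b;c;d;e;f;g;K1;z\n' |
sort -t';' -u -k8,8
# Two lines come out: one of the K1 lines first, then the K2 line.
```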

which, with your sample input in oldfile, produces the output:

M202034357;01/2008;J30DF061;Ciclo 06 de Faturamento;4000034956;06F010800020;269800607228;173691920;000030011-AA;9.9;GUARA
M202034357;01/2008;J30PA091;Ciclo 01 de Faturamento;4000043785;01F010800027;270200624370;175114167;000011219-AB;9.9;BELÉM
M202034357;01/2008;J30RJ021;Ciclo 01 de Faturamento;4000029579;01F010800017;270500588743;175121705;000188224-AA;9.9;NITEROI
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000029579;01F010800017;270500589299;175123639;000241055-AB;9.9;SAO PAULO
M202034357;01/2008;J30AP096;Ciclo 01 de Faturamento;4000029579;01F010800017;270500589332;175123672;000001842-AB;9.9;MACAPA
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000029579;01F010800017;270500589787;175125437;000256241-AB;9.9;SAO PAULO
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000029579;01F010800017;270500590036;175126399;000458131-AA;9.9;SAO CAETANO DO SUL
M202034357;01/2008;J30RJ021;Ciclo 01 de Faturamento;4000029579;01F010800017;270500591331;175130959;000074873-AB;9.9;RIO DE JANEIRO
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000029579;01F010800017;270500591956;175132948;000441734-AA;9.9;SAO BERNARDO DO CAMPO
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000029579;01F010800017;270500591958;175132950;000441735-AA;9.9;SAO PAULO
M202034357;01/2008;J30RJ021;Ciclo 01 de Faturamento;4000043657;01F010800002;118000613348;175138146;000161122-AA;9.9;RIO DE JANEIRO
M202034357;01/2008;J30PA091;Ciclo 01 de Faturamento;4000043657;01F010800002;118000614087;175140485;000023707-AA;9.9;BELEM
M202034357;01/2008;J30AM097;Ciclo 01 de Faturamento;4000043657;01F010800002;118000614870;175142866;000026153-AA;4.99;MANAUS

in newfile.