How to check duplicate entries in a file? (Solaris 9)

Hi,
There are duplicate entries in the file, but uniq will not catch them because the first field is different. How can I find all lines that have duplicate IPs?

bash-2.05# cat db.file | grep 172.30.133.11
dsrq-ctrl1-prod         A       172.30.133.11
e911q-db1-nxge0         A       172.30.133.11
bash-2.05#
bash-2.05# cat db.file | grep 172.30.133.12
dsrq-ctrl2-prod         A       172.30.133.12
e911q-icp1-nxge0        A       172.30.133.12

Thanks

awk 'NR==FNR {if ($NF ~ /[0-9][.][0-9]/) a[$NF]++; next}; {if (a[$NF] > 1) print $0}' db.file db.file

Try also either /usr/xpg4/bin/awk or nawk on Solaris.
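
For reference, here is the same two-pass idea spread out with comments (nothing new, just the command above rearranged):

# Pass 1: NR==FNR is only true while the first db.file is being read;
#         count every last field ($NF) that looks like an IP address.
# Pass 2: while reading db.file a second time, print each line whose
#         last field was counted more than once.
nawk 'NR==FNR { if ($NF ~ /[0-9][.][0-9]/) a[$NF]++; next }
      a[$NF] > 1' db.file db.file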

This command ran and came back to the prompt without any output:

bash-2.05$ cat db.file | wc -l
   11354
bash-2.05$ awk 'NR==FNR {if ($NF ~ /[0-9][.][0-9]/) a[$NF]++; next} a[$NF] > 1' db.file db.file
bash-2.05$ cat db.file | wc -l
   11354
bash-2.05$

It works here.
Make sure db.file db.file are both given as arguments (the file needs to be read twice),
and use /usr/xpg4/bin/awk or nawk!
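
If reading the file twice is inconvenient, a rough single-pass alternative is possible (just a sketch, assuming nawk or /usr/xpg4/bin/awk and that db.file fits in memory; the duplicate groups come out in no particular order):

# Buffer every IP's lines while counting them; at the end,
# print only the lines of IPs that were seen more than once.
nawk '$NF ~ /[0-9][.][0-9]/ { cnt[$NF]++; buf[$NF] = buf[$NF] $0 "\n" }
      END { for (ip in cnt) if (cnt[ip] > 1) printf "%s", buf[ip] }' db.file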


Thanks. I didn't know that it is a different awk.