Awk - more than 99 fields?

Just written a great little script that analyses a flat ASCII CSV file and reports any discrepancies in field counts per record, etc.

Works fine on all the flat files except one, which has 113 fields per line. awk can seemingly only cope with up to 99 fields.

Is there any way around this? It will be a pain to check manually as there are more than 100,000 records in this file!
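For reference, a minimal sketch of the kind of check being described (hypothetical: assumes plain comma-delimited records with no quoted, embedded commas, and an awk without the field limit in question):

```shell
# Flag any record whose comma-separated field count differs from the
# expected 113. "want" is passed in so the same check works on any file.
awk -F, -v want=113 'NF != want { printf "line %d: %d fields\n", NR, NF }' input_file
```

On an awk that silently caps the field count, a check like this would of course mis-report every wide record, which is exactly the problem here.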



It probably depends on which version of awk you are using. The version distributed with HP-UX supports up to 200
($0 .. $199). You might be able to do some subtraction using $NF if you can't access a field directly.
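The NF subtraction idea looks like this (a sketch, not from the thread): awk lets you compute a field number, so you can address fields relative to the end of the record rather than by absolute position.

```shell
# Print the 5th-from-last field of each record: $(NF-4) is a computed
# field reference, evaluated per record.
awk -F, '{ print $(NF-4) }' input_file
```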

An alternative would be to use the 'cut' command or, even better, perl.

Thanks for that! I eventually used:

cut -d, -f50- input_file > output_file

and then ran my awk command on the output file, remembering to subtract 49 from any field number used thereafter (field 50 becomes field 1 after the cut).
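To make the offset bookkeeping concrete (a sketch; the field numbers are the ones from this thread): `cut -d, -f50-` keeps fields 50 onwards, so original field n becomes field n-49 in the output.

```shell
# Drop the first 49 fields so the remaining 64 are within awk's limit.
cut -d, -f50- input_file > output_file
# Original field 113 is now field 113 - 49 = 64:
awk -F, '{ print $64 }' output_file
```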

That's what I like about UNIX... there is a way around almost anything :slight_smile: