Check valid records in a really big file with one command

Hi,
I have a 5 GB file with no record terminators; the field terminator is a newline. The record length is 768 bytes, and I would like to check that every 768th byte is a newline and print the byte position if it isn't. I'd like to do this going either forward or backward through the file, with one command if possible. I've read that awk has a 3000-byte limit on a record, so maybe a complex grep?

Thanks,

Victor

Many egreps will probably choke on this, but it's worth a try. If the newline is the 768th byte of each record, every line should be 767 characters plus its newline:

egrep -v '^.{767}$' file
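
If that egrep is GNU grep, you could also try adding -b and -n; as far as I know GNU grep will then print the line number and the byte offset (counted from zero) at which each non-matching line starts, which gets you closer to the byte position you asked for. Just a sketch, assuming GNU grep:

grep -Ebnv '^.{767}$' file

The printed offset is where the bad line begins, so adding 768 to it gives the 1-based position where the newline should have been.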

Maybe awk will help. This prints the record number and length of any line that isn't exactly 767 characters (768 minus the newline):

awk '{len = length($0); if (len != 767) {print NR, len}}' data.file
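
Since you want the byte position, you could also let awk keep a running byte count. A rough sketch along the same lines, assuming the file is data.file as above (gawk, at least, shouldn't have trouble with records of this size):

awk '
  length($0) != 767 {            # 768-byte record minus the trailing newline
    print "bad record", NR, "starts at byte", pos + 1, "length", length($0)
  }
  { pos += length($0) + 1 }      # advance past this line and its newline
' data.file

If every line is the right length this prints nothing; otherwise the starting byte plus 767 is where the newline should have been.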

Thanks for the suggestions, I'll try them tomorrow.