sed and awk not working on a large record file

Hi All,

I have a very large single record file.

abc;date||bcd;efg|......... pqr;stu||record_count;date

When I do wc -l on this file it gives me "0", because of the missing line feed.
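For anyone following along: wc -l counts newline characters, not records, so a file with no trailing line feed reports 0 lines. A minimal sketch with hypothetical data of the same shape:

```shell
# Hypothetical sample: one long record, no trailing newline.
printf 'abc;date||bcd;efg||record_count;date' > sample
# wc -l counts \n characters, so this prints 0:
wc -l < sample
```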

My problem is that there is an extra pipe at the end of this record, like this:

abc;date||bcd;efg|......... pqr;stu||record_count;date|

I want to take that extra pipe away. But awk and sed are not working for me.

awk - gives me an error that the record is too long.
sed - I tried a few flavours of sed and none of them worked.

I need this issue to be fixed ASAP. Can someone help me?

What have you tried with sed and awk?

 
awk '{sub("\\|$","")}1' file
 
sed 's/|$//' file

Yes Pamu, I tried both of the commands that you shared and they didn't work.

That file is not a valid Unix text file. You could try this awk anyway and see if it works:

awk 'NR>1{print p}{p=$0} END{printf "%s\n", p}' RS=\| ORS=\| file

If it works, it should add a linefeed at the end..
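The trick in the command above: with RS=\| awk reads one pipe-delimited field at a time, so no single record exceeds awk's record-length limit, and input that ends in the record separator produces no extra empty record, so the trailing | simply disappears. A small demo on hypothetical data:

```shell
# Hypothetical miniature of the file (trailing pipe, no newline):
printf 'abc;date||bcd;efg||record_count;date|' > sample
# Each pipe-separated record stays short; the END block prints the
# last record followed by a newline instead of a trailing pipe:
awk 'NR>1{print p}{p=$0} END{printf "%s\n", p}' RS=\| ORS=\| sample
# Prints: abc;date||bcd;efg||record_count;date  (with a final newline)
```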

This command worked, but with an issue.

End of the original file (the shell prompt lands on the same line, since there is no line feed):

||13455;2013-01-04 15:06:49|abc@ser01:/tmp/WrkSpc/Archive

End of the new file (the trailing pipe is gone and the prompt starts on its own line, because of the added line feed):

||13455;2013-01-04 15:06:49
abc@ser01:/tmp/WrkSpc/Archive

wc -l on orig file returns

0 origfile

wc -l on output file returns

1 newfile

My job won't pick up this newfile if it has a line feed. How do I remove the line feed after this command?

You could try leaving out the line feed:

awk 'NR>1{print p}{p=$0} END{printf "%s", p}' RS=\| ORS=\| infile

If it works that would produce another invalid text file..
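To confirm which variant you ended up with, od -c (or tail -c 1) shows whether the file's last byte is a line feed. A quick demo with hypothetical data:

```shell
printf 'abc;def||x;y|' > sample   # hypothetical miniature of the file
# Variant with printf "%s" in END: no trailing newline is written.
awk 'NR>1{print p}{p=$0} END{printf "%s", p}' RS=\| ORS=\| sample > out
# Show the final bytes; there should be no \n at the end:
od -c out | tail -2
```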

Hi Scrutinizer,
this seems to work:

cat newfile | tr -d "\n" | tr -d "\r" > newfile1


I ran this on top of your first command. Do you see any issues using this?
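A side note on that pipeline: the cat is redundant, and the two tr calls can be merged into one, since tr -d accepts a set of characters. A sketch, using a hypothetical newfile:

```shell
# Hypothetical "newfile" as produced by the earlier awk command:
printf 'abc;date||record_count;date\n' > newfile
# One tr pass deletes both carriage returns and line feeds:
tr -d '\r\n' < newfile > newfile1
```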