Extract data from a large file (80+ million records)

Hello,

I have a file with more than 120 million records (35 GB in size). I need to extract some relevant records from it, based on a parameter, and generate a new output file.

What would be the best and fastest way to generate the new file?

sample file format:

++++++7777jjjjjjj0000000000    (header record)
2098 POCG 0000 KKKK
2097 KOLL 0F00 KLLL
2095 LKJH 0L99 L0IU
...

********66666666666****    (trailer record)

Now suppose I enter the key 2098 (the first field is the key); all records with 2098 as the first field should be written to a new file.
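For example, a minimal awk sketch of the idea (assuming the fields are whitespace-separated; the file names here are placeholders):

# print every line whose first field equals the key
awk -v key=2098 '$1 == key' bigfile > bigfile.2098

The header and trailer lines won't match, since their first field is never a valid key.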

**********************************************

I tried using grep, but it took a long time, nearly 45 minutes, to give me the output file.

Hmm, 35 GB? Try awk, or maybe sed, but either will put a considerable load on the CPU.
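With sed, the equivalent would be roughly this (assuming the key always sits at the start of the line, followed by a space):

# -n suppresses default output; p prints only the matching lines
sed -n '/^2098 /p' bigfile > bigfile.2098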

With a file that size, anything is going to take a long time. There's not going to be anything faster than grep, with the possible exception of a filter written in C that does nothing but what you want.
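Two cheap tweaks often speed grep up considerably: anchor the pattern to the start of the line, and force the C locale so grep skips multibyte character handling. A sketch:

# anchored pattern, byte-oriented C locale
LC_ALL=C grep '^2098 ' bigfile > bigfile.2098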

With that much data, you might want to look at using a DBMS, e.g., PostgreSQL.
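If the file will be queried repeatedly with different keys, loading it once and indexing the key column pays off. A rough sketch with psql (the table and column names are invented for the example; the header, trailer, and blank lines must be stripped first or COPY will reject them):

# drop the first and last line (header/trailer), then remove blank lines
sed '1d;$d' bigfile | grep -v '^$' > records.txt
psql -c "CREATE TABLE records (key text, f2 text, f3 text, f4 text)"
psql -c "\copy records FROM 'records.txt' WITH (FORMAT text, DELIMITER ' ')"
psql -c "CREATE INDEX ON records (key)"
psql -c "\copy (SELECT * FROM records WHERE key = '2098') TO 'newfile'"

After that, each key extraction is an indexed lookup instead of a full 35 GB scan.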