Shell Programming in Unix

Hi,

I want to read a full file.

If I split the file and read each piece in parallel, I can save time.

Can anybody give me a suggestion?

I am using this function to read a file, and for each line I grep in another file. Since file1 is huge, it is taking a lot of time.

I am planning to split it and run the pieces in parallel.

Suggestions, please.

read_file1()
{
    # Read file1.txt line by line; grep each line in file2.txt.
    # 'read -r' keeps backslashes literal; quoting "$line" stops
    # spaces and glob characters from breaking the pattern, and
    # redirecting into the loop avoids the useless 'cat'.
    while read -r line
    do
        grep -- "$line" file2.txt >> outputfile.dat
    done < file1.txt
}

Am I reading this correctly:

You have a very large file and it takes a lot of time to process it. So you are going to split up the file and have each piece processed separately.

I am not sure exactly where parallel processing comes into play in your example, or how this would be faster than reading the entire file (unless your program loads the entire file into memory for some reason).

Have you looked at the 'split' command? It will split a file into separate files by size or by line count. So you could break the large file into separate 1000-line files and then process them accordingly.

I am using the split command, and I want to know how many files it has created.

e.g.:

big file: 1 lakh (100,000) lines.

split -l 5000 bigfilename.txt

Then I want to read these split files one by one and run the grep command as above.

non-parallel processing ...

split -l 5000 bigfilename.txt tmp_    # -l is the portable form of --lines
for PART in tmp_*; do                 # glob directly; don't parse ls output
    grep something "$PART"
done