Splitting a large file when the split command will not do.

Hello Everyone,

I have a large file that needs to be split into many separate files, but the text between the blank lines needs to stay intact. The file looks like:

SomeText
SomeText
SomeText

SomeOtherText
SomeOtherText

....

Since the number of lines of text differs for each entry, my only real marker is a blank line. I have tried the following (split.awk is the file I pass to awk with -f):

cat largetxtfile.txt | awk -f split.awk

where split.awk contains:

BEGIN { i = 0 }
{
    if ($0 == "")
    {
        ++linecount;
    }
    if (linecount % 500 != 0)
    {
        print $0 >> i".txt"
    }
    else
    {
        ++i
    }
}

This should split the file into a separate file at every 500 entries. It sort of works, but it doubles up the files; I'm not sure if my logic is wrong.

Please Help.

J

awk 'BEGIN{RS=""} {print > "file" NR} ' largetxtfile.txt
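For reference, here is how that paragraph-mode one-liner behaves on a small stand-in input (the sample file contents here are illustrative):

```shell
# Stand-in input: two entries separated by a blank line
printf 'SomeText\nSomeText\n\nSomeOtherText\n' > largetxtfile.txt

# RS="" puts awk in paragraph mode: each blank-line-separated block
# becomes one record, so record NR goes to file1, file2, ...
awk 'BEGIN{RS=""} {print > "file" NR}' largetxtfile.txt
```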

That would work fine if I wanted every blank-line-separated entry in its own file, but I want a certain number of entries to go into each file. I call this variable howmany, and it's defined on the command line. This slight modification seems to work:

BEGIN { i = 0; linecount = 0 }
{
    if ($0 == "")
    {
        ++linecount;
    }
    if (linecount != howmany)
    {
        print $0 >> i".test"
    }
    else
    {
        linecount = 0;
        ++i;
    }
}

You would have to rerun the command: since the script appends with >>, the output files from the first run are still there. Try removing the old output files and then run the command again.
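Putting the two ideas together, paragraph mode can also do the grouping itself, which avoids counting blank lines by hand. This is only a sketch: the sample input, the howmany value, and the .test file names are illustrative, and it writes with > rather than >>, so a rerun overwrites old output instead of appending to it:

```shell
# Stand-in input: three entries separated by blank lines
printf 'a\nb\n\nc\n\nd\ne\n' > largetxtfile.txt

# RS="" reads one blank-line-separated entry per record; ORS="\n\n"
# restores the blank line between entries on output. Every howmany
# records share one output file: records 1..howmany go to 0.test,
# the next howmany to 1.test, and so on.
awk -v howmany=2 'BEGIN { RS = ""; ORS = "\n\n" }
  { print > (int((NR - 1) / howmany) ".test") }' largetxtfile.txt
```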