Multi-threaded shell programming

I have a Unix directory where around a million small text files accumulate every week.

As of now there is a shell batch program in place which merges all the files in this directory into a single file and FTPs it to another system.

Previously the volume was around 1 lakh (100,000) files and we managed with a single program.

Now this batch program takes so much time that it runs for days to merge the files.

Is there a better approach to merge the files faster?

My thought: split the files based on some criterion and run the merges in parallel.

My file names always start with a digit, so something like:

MergeFilesNameStartwith0 & MergeFilesNameStartwith1 & MergeFilesNameStartwith2 .... MergeFilesNameStartwith9

Is this a good idea? Any suggestions appreciated.
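The split-by-first-digit idea could be sketched like this (a runnable sketch, not production code: it builds throw-away demo data, assumes bash, and uses `find ... -exec cat {} +` as my own substitute for globbing, since a glob over ~100k names per digit would hit the shell's argument-length limit):

```shell
workdir=$(mktemp -d)            # throw-away demo data so the sketch runs
cd "$workdir"
for i in 0 1 2 3 4 5 6 7 8 9; do
    echo "data $i" > "${i}sample.txt"
done

for d in 0 1 2 3 4 5 6 7 8 9; do
    # one background merge job per leading digit; find streams the
    # names, so "argument list too long" never happens
    ( find . -maxdepth 1 -name "${d}*" -type f -exec cat {} + \
        > "merged_$d.out" ) &
done
wait                            # block until all ten merges finish

cat merged_[0-9].out > concatFile.out   # final single file for the FTP
```

The `wait` is the important part: without it the final `cat` could run while the ten jobs are still writing.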

Since the directory holds about a million files, I cannot even list 20k of them at a time,

so as of now I am using the expression below to pick up each file to merge.

for file in $(ls | head -1500)
do
    cat "$file" >> "$concatFile"
done
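A side note on that loop, hedged since I have not timed it at your volumes: it forks one `cat` process per file, which at a million files is a lot of process-creation overhead. Feeding many names to a single `cat` through `xargs` is usually much faster; a small runnable sketch with its own demo data (the `find`/`xargs -0` pattern is my assumption, adapt the name pattern to your files):

```shell
workdir=$(mktemp -d)            # throw-away demo data
cd "$workdir"
for i in $(seq 1 50); do echo "line $i" > "file$i.txt"; done

# -print0 / -0 keep unusual file names safe; one cat handles
# thousands of names per exec instead of forking once per file
find . -maxdepth 1 -name '*.txt' -type f -print0 \
    | xargs -0 cat >> concatFile
```

Note the output order is whatever `find` produces, so this fits only if the merge order does not matter.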

Your suggestions are welcome.

Look at GNU parallel. Their man page has good examples.

Try cpio and check if it is faster

ls server.xml* | head -500 | cpio -ocv > $concatFile