sort split merge -u unique

Hi, this is about sorting a very large file (around 10 GB) to keep only the lines that are unique across SOME of the columns.

The line originally looked like this:

sort -u -k2,2 -k3,3n -k4,4n -k5,5n -k6,6n file_unsorted > file_sorted

Please note the -u flag.

The problem is that this single command is taking more than 12 hours with 10 GB of memory, and I am looking for a way to speed things up.

I have heard that splitting a large file into subfiles, sorting each subfile, and then merging them back together with the sort command can work, but I imagine this will not work here because I'm using the -u flag to keep only unique rows.

Maybe split, then sort each subfile first WITHOUT the -u flag, and then merge with sort -m -u -k2,2 -k3,3n -k4,4n -k5,5n -k6,6n subfiles?
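
In other words, something like this (just an untested sketch -- the chunk size and file names are placeholders):

       # split into ~1 GB pieces without breaking lines (chunk_aa, chunk_ab, ...)
       split -C 1G file_unsorted chunk_

       # sort each piece on the same keys, WITHOUT -u, so duplicates survive for now
       for f in chunk_??; do
           sort -k2,2 -k3,3n -k4,4n -k5,5n -k6,6n "$f" > "$f.sorted"
       done

       # merge the sorted pieces; -u drops duplicate keys at this final stage
       sort -m -u -k2,2 -k3,3n -k4,4n -k5,5n -k6,6n chunk_??.sorted > file_sorted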

I'm thinking of ways to test that, but please let me know if you have any ideas for me!

Thank you kindly.

Jonathan

You need a database, not shell scripting, to manipulate such an amount of data ...

Thanks for the tip; we can consider using a database in the future.

Do any readers have an idea how to accomplish this task within the Unix / bash framework I am accustomed to?

Thanks.

Hi.

Most file sorting programs will follow the design noted in External sorting - Wikipedia, the free encyclopedia -- i.e. they will create smaller temporary sorted files, then do as many merges as necessary to accomplish a final sort. If you think about it, this is very reasonable: you sort a memory-load of data (as an array, using any of a number of algorithms) and write it out to a temporary file, repeating until all input has been read; then you read chunks from each sorted temporary file to produce the final file. If there are too many temporary files, you might need to do an intermediate merge, etc. For example, a 10 GB file sorted in roughly 1 GB memory-loads produces about ten temporary runs, which can usually be merged in a single pass.

I just glanced through a few thousand lines of GNU sort (part of coreutils, coreutils-8.5/src/sort.c) and it does seem to do its sorting into temporary files.

I think the most you could do to help a system sort is to provide one or more directories for the temporary files on devices different from the one holding the input file:

       -T, --temporary-directory=DIR
              use DIR for temporaries, not $TMPDIR or /tmp; multiple options
              specify multiple directories

-- excerpt from Linux man sort 
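
For example (the paths here are made up -- the idea is to pick directories on drives other than the one holding the input):

       sort -u -k2,2 -k3,3n -k4,4n -k5,5n -k6,6n \
            -T /mnt/disk1/tmp -T /mnt/disk2/tmp \
            file_unsorted > file_sorted

GNU sort also accepts -S, --buffer-size=SIZE, which controls how much memory it uses before spilling to those temporaries; a larger buffer means fewer temporary files to merge.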

You could experiment with your own splitting by splitting a sample of your data with the split utility and then sorting those files. I'd guess that would be slower than sort's internal splitting, but it might be worth a try; at least it would be educational.
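
For example, a rough test harness (the sample size and piece size are arbitrary; -s keeps equal-key lines in input order so the merge should keep the same representative line as the one-shot sort):

       # take a sample to experiment on
       head -n 1000000 file_unsorted > sample

       # time the one-shot sort
       time sort -u -k2,2 -k3,3n -k4,4n -k5,5n -k6,6n sample > out1

       # time the split / sort / merge pipeline (bash syntax)
       time {
           split -l 100000 sample piece_
           for f in piece_??; do
               sort -s -k2,2 -k3,3n -k4,4n -k5,5n -k6,6n "$f" > "$f.s"
           done
           sort -m -u -k2,2 -k3,3n -k4,4n -k5,5n -k6,6n piece_??.s > out2
       }

       # the two results should be identical
       cmp out1 out2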

You could also use a trace facility to watch the sort while it is running. That would probably show scratch files being opened and child processes being created to help with the overall process: strace - trace system calls and signals, for example.
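
For instance (writing the trace to a file so it does not get mixed into sort's output; -f follows child processes, and -e trace=file,process limits the noise to file and process activity):

       strace -f -e trace=file,process -o sort.trace \
           sort -u -k2,2 -k3,3n -k4,4n -k5,5n -k6,6n file_unsorted > file_sorted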

Best wishes ... cheers, drl

Make sure you are sorting in the C locale. Other locales can be 10 times slower.
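
That is, prefix the command so comparisons are done on plain bytes instead of locale collation rules (the text key will then sort in byte order rather than locale order):

       LC_ALL=C sort -u -k2,2 -k3,3n -k4,4n -k5,5n -k6,6n file_unsorted > file_sorted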