General compression question

I am looking for an alternative to gzip or bzip2 for compressing files that are 3 to 4 GB each, with hundreds of them arriving per day. Aside from simply adding storage, has anybody found a good tool?

You have to be a bit more specific about what you need, and in what way gzip/bzip2 don't meet your expectations.

compression of large (3-4 GB each) data sets
speed: able to handle 300-400 files at once
resource usage: compression must not cripple the machine

bzip2 works, but we need to look at other possibilities.

Data compression requires a lot of CPU power. To compress several hundred files simultaneously you're simply going to need a lot of horsepower. Something like a 16-core server with 64 GB of memory is probably the minimum configuration for this task. bzip2 is the best compression program I know. There are a few alternatives, but they are not as good. There is no magic software that can compress hundreds of 4 GB files on a workstation in the twinkling of an eye.
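One way to reconcile "hundreds of files" with "must not cripple the machine" is a bounded worker pool: you pick how many files compress at once and queue the rest. A minimal Python sketch, using the stdlib bz2 module (the `workers` count is an assumed tuning knob; threads work here because Python's bz2 releases the GIL while compressing large buffers):

```python
import bz2
import shutil
from concurrent.futures import ThreadPoolExecutor

def compress_file(src: str) -> str:
    """Compress one file to <src>.bz2, streaming in 1 MiB chunks
    so memory use stays flat even for multi-GB inputs."""
    dst = src + ".bz2"
    with open(src, "rb") as fin, bz2.open(dst, "wb", compresslevel=9) as fout:
        shutil.copyfileobj(fin, fout, length=1 << 20)
    return dst

def compress_all(paths, workers=4):
    """Compress many files concurrently; `workers` caps how many
    run at once so the box stays responsive for other work."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(compress_file, paths))
```

Raising `workers` toward the core count finishes the batch sooner; lowering it trades throughput for a more responsive machine.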

The vczip component of the Vcodex software from AT&T Labs Research might be worth looking into.

vcodex data transformation package

Depending on what your starting data looks like, the algorithms in Vcodex might generate better compression than what you've used so far.

7zip looks like it might suit your needs.

http://www.7-zip.org/download.html

7zip uses multiple cores very well, and it can also be capped at a maximum number of threads, which will help keep your system from slowing to a crawl. It might still put a strain on your disk subsystem, however.