Copy large dump file

Hi Experts,

Could anyone please let me know an easier way to copy a large set of dump files from one server to another? I am trying to copy a set of dump files between two servers in different geographic locations. Though other factors such as latency are slowing the process down, I am curious to know whether there is any other method to copy them faster. At present I am using scp.

Thanks in advance !!

Not sure what OS you're using, but you could try using the 'split' command to break the files into smaller chunks, then use the 'cat' command on the other side to join them back together:

For example:
split -b 1m largefile FILE_

then to rejoin:

cat FILE_* > largefile
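To make sure nothing got corrupted along the way, it's worth comparing checksums of the original and the rejoined file; 'cksum' is POSIX and available on AIX. Here is a self-contained sketch of the whole split/rejoin/verify cycle (the actual scp step is shown as a comment, and the chunks are simply copied locally so the example runs anywhere; all paths are just placeholders for the demo):

```shell
# Sketch: split a large dump, "transfer" the chunks, rejoin, and verify.
mkdir -p /tmp/split_demo/src /tmp/split_demo/dest
cd /tmp/split_demo/src

# create a sample 2 MB "dump" file for the demo
dd if=/dev/zero of=largefile bs=1024 count=2048 2>/dev/null

split -b 1m largefile FILE_        # 1 MB chunks: FILE_aa, FILE_ab, ...

# the real transfer would be something like:
#   scp FILE_* user@destserver:/dest/path
cp FILE_* /tmp/split_demo/dest/

cd /tmp/split_demo/dest
cat FILE_* > largefile             # rejoin on the destination side

# compare checksums of original and rejoined file
cksum /tmp/split_demo/src/largefile
cksum largefile
```

Splitting also lets you re-send just one failed chunk instead of the whole dump if a transfer dies partway through.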

Hope this helps.

Thanks for your reply. We are using AIX.

If you are not using scp compression, add that.

Maybe look at rsync as well. It can use scp as the channel, but add some intelligence to avoid needless transfers.

You could look at rsync+ssh, and give ssh the "blowfish" encryption option as it is supposed to be faster than the default.

Like so:

rsync -avz --ignore-existing -e 'ssh -oConnectTimeout=10 -c blowfish -ax' source-file destserver:/dest/path

Advantage of rsync is that it can resume a broken transfer.

You could also try regular scp with:

 -c blowfish -C 

[ -C is for compression ]

You could also try piping the large file through tar+gzip over ssh.
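The idea is to stream the data through tar and gzip in one pipe so no intermediate archive is written on either side. The remote form is shown as a comment ("user", "destserver", and the paths are placeholders); the runnable part below does the same pipe locally so the example is self-contained:

```shell
# Remote form (sketch, hostname and paths are placeholders):
#   tar cf - /path/to/dumps | gzip -c | ssh user@destserver 'cd /dest/path && gzip -dc | tar xf -'

# Local demonstration of the same pipe, with the ssh step omitted:
mkdir -p /tmp/tar_demo/src /tmp/tar_demo/dest
echo "sample dump data" > /tmp/tar_demo/src/dump1.dmp

( cd /tmp/tar_demo/src && tar cf - . ) \
  | gzip -c \
  | ( cd /tmp/tar_demo/dest && gzip -dc | tar xf - )
```

This gives you compression on the wire like scp -C, but with only one ssh session for the whole set of files, which helps when you have many small dumps.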

If you can download using HTTP, aget is another option.

All this may not make much of an impact if your file cannot be compressed, or if your network is too slow.