FTP a very large file

Dear All,

Good Evening!!

I have a requirement to FTP a 220GB backup file to a remote backup server.
I wrote a script for this purpose, but it takes more than 8 hours to transfer the file.
Is there any other method to do it in less time?

Thanks in Advance!!!


I cannot compress or split it, as the integrity of the file is critical and I cannot take even a 0.1% risk. If the system crashes, this file is what I will use to restore the entire system.

FTP does well with individual large files, but rsync is also a good option. Search for comparisons of FTP vs rsync.
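
For example, here is a minimal sketch of an rsync transfer over SSH with resumption and in-transit compression (the hostnames and paths are only placeholders):

source_box $ rsync --partial --progress -z -e ssh /backup/mybigfile.tar dest_box:/backup/mybigfile.tar

--partial keeps a partially transferred file on the destination so a re-run can pick up where it left off, and -z compresses only the data in transit, so the file on disk is never modified.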

Why not do an md5sum of the file before you send it, and check it on the receiving end:

source_box $ md5sum mybigfile.tar
81836bd568b30ec974bff32af98458d1 *mybigfile.tar
source_box $ gzip mybigfile.tar
source_box $ rsync ./mybigfile.tar.gz dest_box:mybigfile.tar.gz
 
dest_box $ gzip -d mybigfile.tar.gz
dest_box $ md5sum mybigfile.tar
81836bd568b30ec974bff32af98458d1 *mybigfile.tar
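
As a small follow-up, the comparison can be automated with md5sum -c, which reads a stored checksum file and prints OK or FAILED (same hypothetical filenames as above):

source_box $ md5sum mybigfile.tar > mybigfile.tar.md5
source_box $ rsync mybigfile.tar.md5 dest_box:
dest_box $ md5sum -c mybigfile.tar.md5
mybigfile.tar: OK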

If there were a special 'go faster' data transfer method, most things would be using it already.

The main risk of corruption when transferring something that huge is network blips and timeouts, which can be reduced by shrinking what you send. In that regard, compression may actually reduce your risk of corruption.
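
For instance, a minimal sketch that streams the file through gzip and SSH, so only the data in transit is compressed and the file on disk is never modified (hostnames and paths are placeholders, and it assumes SSH access between the boxes):

source_box $ gzip -c /backup/mybigfile.tar | ssh dest_box 'gzip -d > /backup/mybigfile.tar'
source_box $ md5sum /backup/mybigfile.tar
dest_box $ md5sum /backup/mybigfile.tar

Comparing the two md5sum outputs confirms the copy is bit-for-bit identical.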

Agreed. Also, if this is for backup purposes, why not do what most other people do in this situation: write the data to tape or disk and store it off-site but close by. This also has the advantage that recovery is much quicker, because you don't have another 8-hour delay to transfer the data back over the network.
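
For completeness, a minimal sketch of writing the backup to a tape device with tar and then listing the archive to confirm it was written (the device name /dev/st0 and the path are assumptions; yours may differ):

source_box $ tar -cvf /dev/st0 /backup/mybigfile.tar
source_box $ tar -tvf /dev/st0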

That option is also available; we have built a dedicated backup server for the security audit.