Large file FTP problem

We are experiencing a problem with lengthy FTP data transfers through a firewall. An FTP session uses two connections (control and data), and during a long transfer the control connection sits idle while the data connection does the work. The firewall times out the idle control connection, so the client never learns that the transfer completed successfully, even when it did.
We are running Solaris 9 (SunOS 5.9).
How can we solve this?

Thanks in advance

I believe there is an ftp option, -A, that forces active mode by default; you might try that. If that doesn't work... this is totally off the cuff and a real solution depends greatly on your exact situation, but here are a few quick workarounds:

1) Use scp (over ssh), which you should probably do anyway.

2) Split the file into chunks before sending (you can do this a variety of ways, from zip archivers that support splitting to the Unix split command), then reassemble on the other side with cat pieces* > original_file. A sketch follows below.
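
A minimal sketch of the split-and-reassemble approach (the file names and the 100 MB chunk size are just placeholders; pick a size that transfers well inside your firewall's idle timeout):

    # On the sending side: split into 100 MB pieces named pieces_aa, pieces_ab, ...
    split -b 100m original_file pieces_
    # Transfer the pieces one at a time, then on the receiving side:
    cat pieces_* > original_file
    # Optionally compare checksums on both ends to verify the reassembly
    cksum original_file

Since each individual transfer finishes quickly, the control connection never sits idle long enough for the firewall to drop it.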

Look into the TCP keep-alive parameters and try lowering the interval. Sending keep-alive packets on the idle control connection may keep the firewall from dropping it.
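
On Solaris the keep-alive interval is tunable with ndd. A sketch, assuming your firewall's idle timeout is longer than five minutes (300000 ms is just an example value; this is a global setting and requires root):

    # Show the current interval (milliseconds; the default is 7200000, i.e. 2 hours)
    ndd /dev/tcp tcp_keepalive_interval
    # Lower it to 5 minutes so probes go out before the firewall drops the idle connection
    ndd -set /dev/tcp tcp_keepalive_interval 300000

Note that keep-alive probes are only sent on sockets that enable SO_KEEPALIVE, so whether this helps depends on your ftp client actually setting that option.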

Using scp is the best and simplest method.
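
For example (hostname and paths are placeholders):

    # Single encrypted connection; no separate control channel to time out
    scp /path/to/original_file user@remotehost:/destination/

Because scp runs over one ssh connection that carries data the whole time, there is no idle control channel for the firewall to kill.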