Retry upon FTP failure

I am using the following code in a C Shell script to transfer files to a remote server:

 
ftp -n logxx.xxxx.xxx.xxx.com <<DO_FTP1
quote user $user_name
quote pass $password
ascii
put $js_file_name
bin
put $FinalZipFile
quit
DO_FTP1

This code works great except on those rare occasions when the remote server is down or, for whatever reason, fails. This causes my code to fail and makes my already tense users angry. This code runs on both Unix and Linux. Is there a way to force a retry after a specified time period if there is a failure of the ftp command? Also, is there a way to make the hash mark print at something other than every 8192 bytes (something more like every GB)?

Thanks,
Paul Hudgens
Denver

Save the output of the script into a log file, and examine the log file afterwards.
ftp abc.com <<DO_FTP1 >abc.log

Sorry - that didn't work. The script stalls at the re-direction and just sits there. I have to ctrl-c to end it. Thanks anyway.

Add a line just before the quit statement in the ftp input to get a file from the remote machine. If the file does not exist on your local system after the ftp job has completed, then you should investigate.

rm job.complete
ftp -n xxxxx <<EOF
blah
blah
get job.complete
quit
EOF
if [ -r job.complete ]
then
exit 0
else
exit 1
fi

How does job.complete get on the remote server from which you are trying to retrieve it in the first place?

Thanks,
Paul H.

Either put it there yourself, or ask the admin there to do it for you, or choose any file that will always exist on the remote server.
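To make the round trip concrete, here is a minimal sketch of the sentinel-file idea. Everything here is an assumption for illustration: `remote.example.com` stands in for your server, `do_transfer` and `transfer_ok` are hypothetical names, and the `$user_name`/`$password` variables are the ones from the original post. Only the local file test at the end is the actual success check; it needs no network.

```shell
# Hypothetical wrapper around the sentinel-file trick: upload a marker,
# fetch it back, and treat its presence locally as proof the session worked.
do_transfer() {
    rm -f job.complete
    touch job.marker                   # create the sentinel locally first
    ftp -n remote.example.com <<EOF
quote user $user_name
quote pass $password
put job.marker job.complete
get job.complete
quit
EOF
}

# The decision logic itself is plain file testing:
transfer_ok() {
    [ -r job.complete ]                # readable sentinel => session worked
}
```

If you control the remote account, uploading the marker yourself (as above) avoids having to ask the remote admin for anything.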

Is there not a way to have ftp give an exit status that can be retrieved to determine if it succeeded or failed?

Thanks,
Paul H.

I tried that also, but as soon as you give it the quit command it terminates normally, and $? is equal to zero.
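Since ftp itself exits 0 once you reach quit, one workaround (my suggestion, not something ftp provides directly) is to save the session transcript and scan it for server error replies, which begin with a 4xx (transient) or 5xx (permanent) code. The log name and function name below are illustrative; the transcript would be captured with something like `ftp -nv host <<EOF ... EOF >session.log 2>&1`.

```shell
# Sketch only: returns 0 (true) if the transcript in $1 contains
# any FTP error reply (4xx or 5xx status line from the server).
ftp_log_failed() {
    grep -Eq '^[45][0-9][0-9][ -]' "$1"
}
```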

Hi

This is based on ksh, but the same principles should apply. The key is that a successful FTP transfer normally yields a 226 reply message for each transferred file.

So, this code counts the number of 226 messages in the ftp log ($TEMP) against the number of files to be transferred ($NUMFILES).

There is a weakness here: if one file transfer in a group fails, the whole group is resent on the next iteration/retry. Depending on the behaviour of the target machine, it is possible to end up with multiple copies of the same file at the target, suffixed (probably) by ".1", ".2", etc.

Assume $DELAY = num of seconds to wait before FTP retry
Assume $NUMFILES = number of files being transferred
Assume $TEMP = log file from ftp command (e.g. ftp <<END >$TEMP)
Assume $LOOP = retry count (decrementing within process)
Assume $TRIES = retry count (fixed within process)
Assume $RC = general return code variable

   TRIES=$LOOP
   RC=1    # must be non-zero initially, or the while test fails and the loop never runs

   while [[ $LOOP -gt 0 ]] && [[ $RC -ne 0 ]]
   do
     ftp <<END >$TEMP
     open yourtarget
     user yourlogon
     mput yourfiles
     quit
END

     LOOP=$((LOOP-1))

   # calculate RC:  0 = success; non-0 = failure

     RC=$(($NUMFILES-$(egrep -ic "^(226 Transfer complete\.)|^(226 File receive OK\.)" $TEMP)))
     if [[ $RC -ne 0 ]]
     then
        sleep $DELAY
        rm $TEMP >/dev/null 2>&1
     fi
   done


 if [ $RC -ne 0 ]
 then
    echo "\nFTP failed after $TRIES attempts"
 else
    echo "\nFTP successful"
 fi
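The counting step above can also be isolated into a small helper, which makes it easy to test against a saved log without running ftp at all. This is a sketch; the function name is my own, and the 226 message texts are the same ones matched in the loop above.

```shell
# Compare the number of "226 ..." completion replies in an ftp log ($1)
# against the number of files we expected to send ($2).
# Returns 0 (success) only when every file drew a 226 reply.
ftp_all_complete() {
    log=$1
    expected=$2
    got=$(grep -Eic '^226 (Transfer complete|File receive OK)' "$log")
    [ "$got" -eq "$expected" ]
}
```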

Rgds
J