Hi guys,
I am having a problem determining whether the FTP request I made was successful or not... Here is what I do:
In one shell script I call another shell script to do the FTP, like:
#!/bin/ksh
echo "Hello..."
...
# call do_ftp.sh
do_ftp.sh $SERVER $USR $PASS $FILE....
status=$?
if [ $status -ne 0 ]; then
echo "FTP error"
else
echo "FTP OK"
fi
and do_ftp.sh looks like this:
exec 4>&1
ftp -inv >&4 2>&4 |&
print -p open $FTPSERVER
print -p user $USER $PASSWD
print -p lcd $OUTPUTDIR
print -p binary
print -p put $ZIPFILE
print -p quit
but even if I pass wrong parameters, status always ends up equal to 0.
I've tried inline redirection, capturing the output with value=$(do_ftp.sh ...),
and so on, but I am not getting anywhere...
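Part of what's biting you: a script's exit status is just the exit status of the last command it ran, and do_ftp.sh ends with `print -p quit`, which succeeds no matter what ftp did. A tiny illustration:

```shell
#!/bin/sh
# A script's exit status is the status of its LAST command only:
false            # this command fails...
echo "cleanup"   # ...but this one succeeds,
echo $?          # so this prints 0, and the script exits 0 as well
```

On top of that, the ftp client itself usually exits 0 even when a transfer fails, so `$?` alone can't tell you whether the put worked.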
I had the same sort of issue. For FTP, I used a function within the script (matter of preference, I guess). The program was dependent on a file that was retrieved nightly from an FTP server. Instead of trying to resolve the correct return code from FTP, I focused on evaluating the result of the FTP: the file itself. If it was there, then everything was cool. If not, well, fail. I also checked for an empty file and exited if it was.
Just noticed you're doing a "put" and not a "get"... oh well, maybe this will still spark an idea or two.
ftp_func () {
ftp -v -n $TO_HOST << ENDFTP
user $USERNAME $PASSWORD
ascii
prompt off
hash on
lcd $FILE_DIR
get $INPUT_FILE_NAME
bye
ENDFTP
}
if [ ! -r $FILE_DIR/$INPUT_FILE_NAME ]
then
echo "ERROR-APP-->: FTP Process Has Failed or FTP Server Unavailable." | tee -a $INLOG
else
echo "INFO-->: FTP Process Completed Successfully!" >> $INLOG
fi
test -s $FILE_DIR/$INPUT_FILE_NAME
FILE_TEST=$?
if [ $FILE_TEST -gt 0 ]
then
echo "INFO-->: $INPUT_FILE_NAME Is An Empty File. Skipping SQL*LOADER Process!" | tee -a $INLOG
echo "INFO-->: FTP Completed Successfully" >> $INLOG
exit $SUCCESS
else
echo "INFO-->: Completed File Check - Starting SQL*LOADER Process." | tee -a $INLOG
fi
Hi google,
Thanks for the reply...
The reason I'm using a separate shell script (not a function) is that it gets called by a number of other scripts...
Unfortunately I am sending the files across to other servers, not getting them, so I don't know how to find out whether I had a successful transfer...
ftp -v -n $1 << ENDFTP > /tmp/do_ftp_report_$4.log
user $2 $3
ascii
prompt off
put $4
bye
ENDFTP
Then have your calling script(s) examine the log file created, searching for specific keywords that indicate success or failure. In binary mode you can also compare the "bytes sent" line against the original size of the file, track transfer-time statistics, etc.
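A sketch of that keyword check, assuming the server's transfer-complete reply starts with the usual "226" code (the exact wording varies between servers, so verify against a real log first; the helper name and log path are made up):

```shell
#!/bin/sh
# ftp_ok LOGFILE -- hypothetical helper; succeeds if the session log
# contains a "226" (transfer complete) reply from the server.
ftp_ok() {
    grep -q '^226' "$1"
}

# usage sketch in the calling script:
#   do_ftp.sh "$SERVER" "$USR" "$PASS" "$FILE"
#   if ftp_ok "/tmp/do_ftp_report_$FILE.log"; then
#       echo "FTP OK"
#   else
#       echo "FTP error"
#   fi
```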
To see if a file exists on a remote ftp server, use the ftp program in a script and do this command:
dir file.in.question local.out
You will need to turn off interactive mode before you issue that command. After the ftp job finishes, look at the file local.out: if the remote file did not exist, local.out will be empty; otherwise it will have some contents.
At the end of the FTP session I "dir" the remote file into a local file, like below:
....
exec 4>&1
ftp -inv >&4 2>&4 |&
print -p open $FTPSERVER
print -p user $USER $PASSWD
print -p lcd $OUTPUTDIR
print -p binary
print -p put $ZIPFILE
print -p dir $ZIPFILE local.out
print -p quit
....
but this file does not seem to be available to the first script, the one that called do_ftp.sh:
if [ -s $OUTPUTDIR/local.out ]; then
echo "Yes"
else
echo "No"
fi
It always says No... but the file is there, and I can test it (-s) from another script... It's weird... I think it might be related to the open pipe!
exec 4>&1
ftp -inv >&4 2>&4 |&
After the "print -p quit" line, you need to have a line that just says "wait".
You are spitting lines at your ftp client very fast. That's the equivalent of typing ahead. You need to let the ftp job finish. The wait command will reap the zombie that results when the ftp job finishes. At that point the file will exist.
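Putting the thread's suggestions together, here is one way to sketch a do_ftp that both lets the client finish and grades the session from its log rather than from `$?`. It uses the here-document style from earlier in the thread instead of the co-process; the `FTP_CMD` override, the log path, and the "226" success reply are assumptions for illustration:

```shell
#!/bin/sh
# do_ftp SERVER USER PASS FILE -- sketch, not a drop-in replacement.
# FTP_CMD lets you substitute a different client; defaults to ftp.
do_ftp() {
    log=/tmp/do_ftp_report_$$.log
    "${FTP_CMD:-ftp}" -v -n "$1" > "$log" 2>&1 << ENDFTP
user $2 $3
binary
put $4
bye
ENDFTP
    # ftp exits 0 even when the put fails, so inspect the log instead:
    grep -q '^226' "$log"    # 226 = server's transfer-complete reply
}
```

A caller can then rely on the function's return value: `do_ftp "$SERVER" "$USR" "$PASS" "$FILE" && echo "FTP OK" || echo "FTP error"`.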