FTP Cron Issue

I have an automated FTP script that is giving me grief. The script runs an FTP sweep for one of our accounts. It gets about 90 files and then hangs. When I run ftp manually from the CLI, everything runs great and I pick up the thousands of files waiting.

I used to run lftp, but it has been behaving even worse. Here is a snippet from my shell script that runs the automated pickup.

# OLD WAY!
#/usr/local/bin/lftp -d -f $SESSION_CMDS

# this way works from CLI, but stops short when in this script
ftp -f -d -i -n $SERVER <<END_SCRIPT
quote USER $USERNAME
quote PASS $PASSWORD
mget *
quit
END_SCRIPT

I've run this on both FreeBSD 6.2 and Gentoo 2007.0, both with the same results: the transfer terminates and loops back to trying to log in again after 90 files or so, but then I get the "User already logged in" message.

I thought maybe this was firewall-related, but since I can run ftp from bash with no problems, I'm lost.

Please, if anyone knows any special voodoo, I would greatly appreciate it :) Does FTP have a file limit (count or size) when running from cron? If anyone can help and you need more info, please let me know. THANKS!

Have you thought of using $HOME/.netrc for the username and password?
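For reference, a .netrc entry looks roughly like this (hostname and credentials below are placeholders). ftp reads it during auto-login, so you would drop the -n flag from your invocation, and most ftp implementations refuse to use a .netrc containing a password if it is readable by anyone else:

# $HOME/.netrc (placeholder values)
machine ftp.example.com
  login myuser
  password mysecret

# keep it private or ftp will ignore the password
chmod 600 $HOME/.netrc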

Good pointer. I should have used a .netrc file for my cron job, but I don't think this will help with my problem. It's almost like there is a file limit on count or size when running via cron, but I'm not sure that's true either; it's just what I'm observing.

  1. Are you setting the current directory?

  2. Can you get any output from ulimit?

     limit, ulimit, unlimit - set or get limitations on the system
     resources available to the current shell and its descendents
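If you want to see what the cron environment itself allows, a quick sketch is to dump the limits into the job's output near the top of the cron script and compare them with what an interactive shell reports:

# temporary diagnostic: compare these numbers with an interactive `ulimit -a`
echo "Limits under cron:"
ulimit -a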
  1. Yes, I'm setting the current dir ($WORKING_DIR):
# change the working directory and print it
cd $WORKING_DIR
echo "Chaning to FTP download directory: "
pwd
echo ""

# Start output for ftp session
echo "FTP Session:"

# Call LFTP and pass in the commands as a parameter
#/usr/local/bin/lftp -d -f $SESSION_CMDS
#lftp -d -f $SESSION_CMDS
ftp -d -i -n $SERVER <<END_SCRIPT
quote USER $USERNAME
quote PASS $PASSWORD
mget *
quit
END_SCRIPT

# Put in newlines for next entry
echo " "
echo " "

# Terminate the program
exit 0
  2. ulimit = unlimited

Maybe this doesn't answer your question, but have you considered trying to use wget or curl or some other powerful "getter"?
e.g.:
wget ftp://${USERNAME}:${PASSWORD}@${SERVER}/*
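If you go that route, something along these lines might be safer to rerun from cron (a sketch using GNU wget options; $WORKING_DIR is the same variable as in your script, and the URL is quoted so the shell doesn't expand the *):

# -nc skips files already downloaded, -nd keeps everything flat in one directory
wget -nc -nd -P "$WORKING_DIR" "ftp://${USERNAME}:${PASSWORD}@${SERVER}/*"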

More of a suggestion than a solution ...
The USERNAME & PASSWORD should be hidden in an environment file and sourced at the start of the script. Just a thought.
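A minimal sketch of what I mean, assuming a file readable only by the cron user (the path and values are placeholders):

# contents of e.g. $HOME/.ftp_env  (chmod 600)
USERNAME=myuser
PASSWORD=mysecret
SERVER=ftp.example.com

# then at the top of the pickup script
. $HOME/.ftp_env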

ftp -inv ${CFS_DNS} <${FTP_BATCH} >${FTP_LOG}

Would humbly suggest:
a) Create an actual batch job ($FTP_BATCH); a sketch follows below
b) drop the `quote user` for `user`
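A minimal sketch of what that batch file could look like (credentials are placeholders; since -i is already on the command line above, no `prompt` toggle is needed):

# contents of $FTP_BATCH
user myuser mysecret
mget *
bye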

Don't know what OS you're using; `-d` isn't used on HP-UX 11.23.

Cheers,
Cameron

I'm using FreeBSD 6.2 and Gentoo 2007.0. man ftp says the -d flag is for debugging output.

I plan on moving the config information into the .netrc file per everyone's suggestion.