I have a Fedora FTP server (let's call it FTP-SERVER1), and files are constantly being uploaded to it by a third party.
Once every 15 minutes I need to "move" the files from the FTP
server to another server (let's call it SVR-WINDOWS), where they will be processed and then deleted.
I'm thinking the process should be something like this:
From SVR-WINDOWS, open ftp connection to FTP-SERVER1
list the contents of the "ftp home" directory (e.g. dir = file1, file2, file3)
get each file listed (get file1, get file2, get file3)
delete each file that was copied to SVR-WINDOWS (delete file1, delete file2, delete file3)
close ftp connection
While the above takes place, more files are uploaded to the FTP server, so the next time I execute the script it will move the newly uploaded files.
Does anyone know if it's possible to do the above?
If yes, how do I go about listing the files and then getting and deleting only the files listed?
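Yes, it's certainly possible; your steps map directly onto a standard ftp session. Typed by hand it would look something like this (host, login, and file names are placeholders):

ftp -i -n FTP-SERVER1
ftp> user myuser mypass
ftp> ls
ftp> get file1
ftp> delete file1
ftp> get file2
ftp> delete file2
ftp> bye

The trick is just scripting it so it runs unattended.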
After you have retrieved the first batch, create a text file and edit it so that it contains:
rm file1
rm file2
etc.
At the end of the file add the lines
mget *
quit
Use this newly created file as input to your next ftp session.
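For example, once file1 and file2 have been retrieved, input.file would contain (the user line assumes ftp is run with -n; the credentials are placeholders):

user myuser mypass
rm file1
rm file2
mget *
quit

and the next session is simply:

ftp -i -n FTP-SERVER1 <input.file

Each run thus deletes what the previous run fetched, then pulls down whatever has arrived since.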
Depending on the size of the files being uploaded by the third party, you might retrieve files that are still in the process of being uploaded.
Can you have the third party send a second file of minimal size indicating the primary upload is complete for that file?
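A rough sketch of how the receiving side could use such marker files, assuming the sender uploads file.dat followed by an empty file.done once the data file is complete (both suffixes are made up; agree on a convention with the third party):

# Fetch the names of the completed uploads only
ftp -i -n FTP-SERVER1 <<EOF > /tmp/markers.txt
user myuser mypass
ls *.done
bye
EOF

# For each marker, fetch and remove the data file, then the marker
while read marker
do
    data=${marker%.done}.dat
    ftp -i -n FTP-SERVER1 <<EOF
user myuser mypass
get $data
delete $data
delete $marker
bye
EOF
done < /tmp/markers.txt

(This assumes "ls *.done" returns bare file names; some servers return a long listing, in which case the names have to be cut out first.)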
# Get the list of files currently on the FTP server
ftp -i -n <<EOF > /tmp/filelist.txt
open <hostname>
user <userid> <passwd>
ls
bye
EOF

# Run through the list and get, then delete, each file
# (assumes the file names contain no spaces)
for i in `cat /tmp/filelist.txt`
do
ftp -i -n <<EOF
open <hostname>
user <userid> <passwd>
get $i
delete $i
bye
EOF
done
Do you think this will work?
It doesn't solve the problem of copying partially uploaded files (thanks for bringing it up).
---------- Post updated at 02:45 PM ---------- Previous update was at 02:21 PM ----------
Okay, I tried my script, but it didn't work.
It gets the list of file names, but it's not happy with the second part of the script: it fails at the line with "ftp -i -n <<EOF" (inside the "do" loop).
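Two things to check. First, the terminating EOF of a here-document must start in column one; if the loop body was indented in your script, the shell never finds the terminator and chokes on the ftp line. Second, the loop only works if /tmp/filelist.txt contains bare file names; if your server's "ls" returned a long dir-style listing, strip it down to the last field first, along these lines:

# Reduce a long listing to bare file names (a sketch; adjust to the
# format your server actually returns)
awk '{print $NF}' /tmp/filelist.txt > /tmp/names.txt
mv /tmp/names.txt /tmp/filelist.txt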
If there is a problem processing the retrieved file, you have already deleted it from the source.
#!/bin/ksh
# Exit if a previous run is still going
if [ -r lock.file ]
then
    exit 1
fi
echo $$ >lock.file

# First run: the input file just fetches everything
if [ ! -r input.file ]
then
    echo "mget *" >input.file
    echo "quit" >>input.file
fi

# Delete last run's files on the server, then fetch the new batch
# (assumes auto-login to "site" via .netrc)
ftp site <input.file
cat /dev/null >input.file

# Process each received file and queue its deletion for the next run
list=`ls`              # list the files received by the ftp session
#list=`ls *.job`       # if there are two files, file.data and file.job
for file in $list
do
    process $file      # placeholder for your processing step
    echo "rm $file" >>input.file
    mv $file ../done
done
echo "mget *" >>input.file
echo "quit" >>input.file
rm lock.file
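To run this every 15 minutes, as described in the original post, a crontab entry along these lines would do (the script path is a placeholder):

*/15 * * * * /path/to/ftpmove.ksh

The lock.file check at the top keeps a slow run from overlapping the next scheduled one.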