ftp file verification

I need a simple method to verify that an automated ftp script was successful. The ftp command can exit without error even though the file was not successfully sent. It's rare, but it happens.

I could write a script that uses md5sum before and after sending the file. But what I really want is an open source ftp program that does this for me! It would exit non-zero if the files are not the same. Does such a beast exist???

If not, what is the best way to accomplish ftp file verification?
Thanks,

The problem with your plan is that you can't compute a checksum on a remote file via the ftp protocol. It would be nice to add a checksum option to the protocol, perhaps via the SITE command, but until everyone agrees on that and modifies the servers and clients, we are stuck with what we have.
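That said, some servers do implement nonstandard checksum commands (e.g. XMD5 or XCRC); whether yours does is entirely server-dependent, so treat the sketch below as an assumption, not a standard. The server name, credentials, and filename are placeholders, and the reply format varies by server implementation:

```shell
#!/bin/sh
# Hypothetical sketch -- only useful if the server implements a
# nonstandard checksum command such as XMD5 (server-dependent).
# SERVER, USER, PASS, and FILE are placeholders, not real values.
FILE=report.dat

# Local checksum, for comparison against the server's reply.
md5sum "$FILE"

# "quote" sends the raw command to the server; compare (or grep)
# the logged reply against the local md5sum value above.
ftp -vn <<EOF
open $SERVER
user $USER $PASS
quote XMD5 $FILE
bye
EOF
```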

You could lcd to another directory and retrieve the file again, then compare the original file with the one that made the round trip. If they are identical, it's pretty certain that the remote copy is also OK.

Why not check the sizes of the two files to verify they are the same? Use the SIZE command.

Definitely not as precise as MD5, but it still might come in handy.
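A sketch of the size comparison, assuming the server answers SIZE with the standard "213 &lt;bytes&gt;" reply (placeholder server name, credentials, and filename; note that SIZE is only meaningful for binary-mode transfers):

```shell
#!/bin/sh
# Compare the local byte count with the server's SIZE reply.
# SERVER, USER, PASS, and FILE are placeholders.
FILE=report.dat

# "quote SIZE" sends the raw command; a successful reply looks
# like "213 12345", so take the second field of that line.
remote=`ftp -vn <<EOF | awk '/^213 /{print $2}'
open $SERVER
user $USER $PASS
binary
quote SIZE $FILE
bye
EOF`

local_size=`wc -c < "$FILE" | awk '{print $1}'`

if [ "$remote" = "$local_size" ]; then
    echo "sizes match"
else
    echo "size mismatch: local=$local_size remote=$remote" >&2
    exit 1
fi
```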

We currently retrieve the file via ftp to verify that the first ftp session was valid. Needless to say it multiplies network traffic and increases the complexity of the entire process.

Having said that, it would seem that ftp should be able to internally compute the checksum before and after the transfer, then ftp "get" the checksum from the remote server to verify the comparison? Just wishful thinking. I am not expecting this to happen and I don't have time to build it myself...

I find it surprising that this issue has not frustrated enough people to make an ftp verification process.

We don't have the SIZE command on our Solaris boxes. I will play with it on my home Linux box...
Thanks and best regards!

Why don't you just keep a log of your ftp activity and check it after ftp'ing the file?

ie.
ftp -vn -T 5 << endftp2 > ftplog
open <server>
user <username> <passwd>

put filename
bye
endftp2

Then you can grep the logfile for errors etc.

Our ftp processes have very good log files. This still does not capture some network issues. We can have good log files yet the ftp process somehow failed.

I need a process to verify that the files are the same. Consider this: these files may be overwritten because they have the same file name (where a timestamp is not allowed). This is where a checksum is a good tool.

A long time ago, an ftp process on the Sun servers at my workplace sent the map database file to other servers and then sent a check file. The check file contained the output of ls -s for the map database file, and it was checked against the ftp'd file. (The second file with the size also let us know when the first file was completely sent and not still in transit.)

# size recorded in the check file that was sent after the data file
set knownsize = `cat $ftphome/ftpOVready | awk '{print $1}'`
# size (in blocks, from ls -s) of the file as received
set thissize = `ls -s $ftphome/openview-$today.tar | awk '{print $1}'`
if ("$thissize" != "$knownsize") goto files-missing

This was part of the code (in csh).

I don't see why you couldn't do the same, unless it's going to a non-UNIX system.

Thanks for the code example.
We have a batch process ("file watcher") that will indicate when the file has finished transferring.

I think the solution would be to build on your concept. Perhaps the remote server should have a file with the output of the cksum command for each file to be sent. This file would be ftp'd with the other files, and a similar cksum process would compare the results; the process would fail if there is a difference.

The nice thing about this method is that it would still work even if the filenames differed (e.g., if files were timestamped). It would simply check whether all the checksum values match and fail if they do not.
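The manifest idea above could be sketched like this (the data filenames are hypothetical; the manifest travels over ftp along with the data files):

```shell
#!/bin/sh
# --- sender side: record a cksum line for each file to be shipped ---
cksum data1.tar data2.tar > MANIFEST
# ... ftp MANIFEST along with data1.tar and data2.tar ...

# --- receiver side: recompute each checksum and compare ---
# cksum output fields are: <checksum> <byte count> <filename>
status=0
while read sum size name; do
    actual=`cksum "$name" | awk '{print $1}'`
    if [ "$actual" != "$sum" ]; then
        echo "checksum mismatch: $name" >&2
        status=1
    fi
done < MANIFEST
exit $status
```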

Thanks for everyone's contribution.