Hello,
I need to download multiple files from an FTP server, but occasionally some arrive corrupted, so I need to perform an integrity check. I've been trying to write a bash script that does the following:
- Download all files, including those in sub-directories
- Verify each file with md5sum against stored hash values
- Re-download any files that fail the integrity check
- Repeat steps 2-3 until all files have passed the integrity check
I'm stuck on step 3. I don't know how to download an individual file so that it ends up in its corresponding sub-directory. The relevant parts of my files and script are as follows:
checksums.txt
abcdef123456 /a/one.txt
ghijklm67890 /a/b/two.txt
file=$(awk -v n="$i" 'NR == n {print $2}' "$checksumFile")
curl "ftp://www.test.com$file" --user user:password
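For step 3, curl's `--create-dirs` option combined with `-o` makes it write the download into a matching local sub-directory, creating it if needed. A minimal sketch of the whole download/verify/retry loop, assuming the checksum file holds `<hash> <path>` pairs as shown above and that the FTP tree mirrors those paths (the host and credentials are the placeholders from the question):

```shell
#!/bin/sh
# Sketch only: host, credentials, and layout are placeholders from the
# question, not a tested configuration.
FTP_BASE="ftp://www.test.com"

verify_and_fetch() {
    checksumFile=$1
    # Read "<hash> <path>" pairs, e.g. "abcdef123456 /a/one.txt"
    while read -r hash file; do
        # Retry until the local copy's md5 matches the stored hash.
        # Note: this loops forever if the server keeps sending bad
        # data; a real script would cap the number of retries.
        until [ "$(md5sum ".$file" 2>/dev/null | awk '{print $1}')" = "$hash" ]; do
            # --create-dirs makes curl create ./a/b/ before writing two.txt
            curl --create-dirs -o ".$file" "$FTP_BASE$file" --user user:password
        done
    done < "$checksumFile"
}
```

Because the paths in checksums.txt start with `/`, prefixing them with `.` turns them into paths relative to the working directory, e.g. `./a/b/two.txt`.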
This, of course, stores the file in the working directory. For instance, I'd like two.txt to be saved to /working directory/a/b/. Any ideas? Also, is there a better way of doing this?
Thanks