#!/bin/bash
sftp user@domain.com <<EOF
cd somedir
mget *.csv
quit
EOF
but when run from cron I only want to pull files that are newer than what I already have locally, so I want to do something like:
# pseudocode: compare the remote ls output against the local files
for each remote file not already present locally
    mget that new file
but I'm not sure of the syntax to get sftp to check a local variable and see whether a remote file matches the local one. I would really like to use rsync -auv -e ssh, but they won't give me a system account to do it, only sftp. Is there some way I can do this?
Can you write Perl code? If not, search these forums for a question like this, but with FTP. Then use the SFTP module instead. The code should be nearly identical.
Well, I wound up writing one. I know it's probably a terrific hack, but here it is in case it helps someone else; this took me a while to figure out.
#!/bin/bash
# pipe ls of remote folder to file
# to compare to figure out which files
# are newer
b=$(sftp user@ftp.domain.com <<EOF
cd folder_with_hourly_dataset
ls
quit
EOF
)
# quote $b so the listing's newlines survive
echo "$b" > remotelist.txt
# sanitize that file, so that all we
# see are the .csv entries
list=$(cat remotelist.txt)
for i in $list
do
echo "$i" | grep '\.csv$' >> sortedremotelist.txt
done
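The per-word loop above can be collapsed into a single pipeline. A minimal sketch, using a sample line in place of the real sftp listing (same flattened-output assumption as above):

```shell
# Split the space-flattened listing on whitespace and keep only .csv names.
# The printf line is sample data standing in for the real remotelist.txt.
printf 'data1.csv readme.txt data2.csv\n' > remotelist.txt
tr -s ' \t' '\n' < remotelist.txt | grep '\.csv$' > sortedremotelist.txt
cat sortedremotelist.txt
```

This avoids spawning a grep process per word, though the loop version works too.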
# create local list of files
ls *.csv > locallist.txt
# compare local/remote list and then
# sanitize the output file into a file
# with one .csv per line so we can
# use a for loop to sftp get it
diff locallist.txt sortedremotelist.txt > get
# keep only the '>' lines (remote-only files), then strip the prefix
grep '^>' get | cut -c3- > get2
grep '\.csv$' get2 > get3
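An alternative to the diff/cut/grep chain is comm, which is built for exactly this: given two sorted files, `comm -13` prints only the lines unique to the second one. A sketch with sample lists standing in for the real locallist.txt and sortedremotelist.txt:

```shell
# comm -13 suppresses lines unique to file 1 and lines common to both,
# leaving only remote-only files. Both inputs must be sorted.
# The printf lines are sample data, not the real listings.
printf 'old1.csv\nold2.csv\n' | sort > locallist.txt
printf 'old1.csv\nold2.csv\nnew3.csv\n' | sort > sortedremotelist.txt
comm -13 locallist.txt sortedremotelist.txt > get3
cat get3
```

This skips the intermediate get/get2 files entirely.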
# now go get that list of files
c=$(cat get3)
for i in $c
do
d=$(sftp user@domain.com <<EOF
cd directory_with_hourly_data
mget $i
quit
EOF
)
echo "-------------"
echo "getting file: $i"
done
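Opening one sftp session per file is slow over many files. sftp has a batch mode (`sftp -b batchfile user@host`) that runs a list of commands in a single session, so you can generate the batch file from get3 and fetch everything in one connection. A sketch, with sample file names standing in for the real get3 contents:

```shell
# Build one sftp batch file from the wanted-file list, then run it once:
#   sftp -b batch.sftp user@domain.com
# The printf line is sample data standing in for the real get3.
printf 'new1.csv\nnew2.csv\n' > get3
{
  echo "cd directory_with_hourly_data"
  while read -r f; do
    echo "get $f"
  done < get3
  echo "quit"
} > batch.sftp
cat batch.sftp
```

The actual transfer step (`sftp -b batch.sftp user@domain.com`) is left out here since it needs the real server.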
# clean up files for next time
cat /dev/null > remotelist.txt
cat /dev/null > sortedremotelist.txt
cat /dev/null > get
cat /dev/null > get2
cat /dev/null > get3
Hope that helps someone else. Of course I could've done the whole thing with a single `rsync -auv -e ssh user@example.com:/folder_with_updates/ ./`, but I couldn't get a system user account on the box to do it.