Use cURL in shell script and get most recent file from remote server using SFTP

I have been trying to implement the following shell script --

sftp to remote server
get list of files in the directory
sftp get the most recent listed file
exit

This script will be scheduled to be executed every day using cron or Control-M, and it must run with absolutely no human interaction. It should execute without anyone having to answer system prompts (e.g. enter a username and password, accept a certificate, etc.).

I was told that this could be achieved easily using cURL in the sh script. Can someone confirm whether this can be done without cURL? And can someone help me write this script?

If you are running this script without human interaction, then you need to establish some criteria for what files to get.
Are you always retrieving files from the same directory on the remote server?
Are you always fetching the same number of files? Or all files modified since the last run of the script?

Are you retrieving files from a public server? That is, do you need to supply a username and password, or can anyone retrieve the files?

If it's not a public server, and I assume it's not, do you have secure shell (SSH) configured? I would do that first. If you copy your public key from your PC to the server, you can use ssh, scp and sftp without a password. In a script, scp will be more convenient than sftp.
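As a sketch, assuming OpenSSH on both machines (the hostname and username below are placeholders, not from the thread), the one-time key setup looks like this:

```shell
# Generate a key pair with no passphrase, suitable for unattended cron jobs.
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa

# Install the public key on the server (prompts for the password once).
ssh-copy-id user@example.com

# From then on, ssh/scp/sftp connect without prompting:
sftp user@example.com
```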

I would also check out rsync. It's probably what you really want.
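For context, a typical rsync pull might look like the command below (assuming the server allowed rsync over SSH, which it may not; host and paths are placeholders):

```shell
# Mirror the remote directory locally; rsync only transfers new or changed
# files, so the newest file always ends up in the local copy.
rsync -avz user@example.com:some_directory/ ./local_copy/
```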

Hi Ken, thanks for the response. Let me elaborate on my situation:

  1. Are you always fetching the same number of files? -- Always getting one file.
  2. Criteria for what files to get -- the most recent one, i.e. the one with the most recent timestamp. The file name keeps changing in an unpredictable pattern.
  3. Are you retrieving files from a public server? -- No, I need to provide a username and password.
  4. ssh, scp and sftp without a password; in a script, scp will be more convenient than sftp. -- The server limits itself to sftp.

I will check out rsync. Thanks. Based on details above, any suggestions?

Maybe something like this:

#!/bin/sh
SERVER=example.com
DIR=some_directory
FILE="$(echo ls -t | sftp -b- $SERVER:$DIR | head -n1)"
echo get "$FILE" | sftp -b- $SERVER:$DIR 

These individual commands worked for me with no username or password because I have SSH set up to the server. If you don't, I assume you'll have to add user@. I'm not sure how the password is passed.

Command ls -t lists the files in time order, so the most recent file is first. Note that if there are subdirectories that might be modified, one could end up as more recent than any file, which isn't what you want.
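A quick local illustration of that ordering (the file names and timestamps here are made up):

```shell
# Create two files with different modification times and confirm that
# ls -t lists the newer one first.
dir=$(mktemp -d)
cd "$dir"
touch -t 202001010000 old.txt   # older timestamp
touch -t 202101010000 new.txt   # newer timestamp
ls -t | head -n1                # prints new.txt
```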

The -b- switch tells sftp to work in batch mode and read commands from stdin, which here is the ls command we just piped to it.

The head -n1 command runs on the local machine to capture just the name of the first file.

The syntax FILE="$(...)" runs a command on the local machine and captures the output into a variable named FILE.
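For example, with contrived input in place of the sftp output:

```shell
# Capture the first line of a command's output into a variable.
FIRST="$(printf 'b.txt\na.txt\n' | head -n1)"
echo "$FIRST"    # prints b.txt
```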

Even if the server really doesn't allow ssh logins, you may still be able to set up SSH keys so that sftp works without a password. It's secure because the key can only be used by someone logged into your local machine.

Thank you very much. I will try this out and let you know how it went!

Ken,

Since I am changing to a directory when I do the sftp, the FILE variable is getting populated with the "Changing directory..." message. Can you help me with a workaround, please?

Amit

Maybe...

FILE="$(echo ls -t | sftp -b- $SERVER:$DIR | head -n2 | tail -n1)"
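The idea is that the first line of sftp's output is the "Changing to:" banner, so skip it and take the second line. A local illustration with made-up output:

```shell
# head -n2 keeps the first two lines; tail -n1 then keeps only the second,
# which is the file name rather than the banner.
printf 'Changing to: /home/user/XYZ\nmost_recent.txt\n' | head -n2 | tail -n1
# prints most_recent.txt
```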

Thanks Girish.

Here is the script

#!/bin/sh
SERVER=<my-server-ip-address>
DIR=XYZ
FILE="$(echo ls -t | sftp -b- $SERVER:$DIR | head -n2 | tail -n1)"
echo get "$FILE" | sftp -b- $SERVER:$DIR

Output:

Changing to: /home/username/XYZ
sftp> get sftp> ls -t
Couldn't stat remote file: No such file or directory
File "/home/username/XYZ/sftp>" not found.