SFTP - accidentally removing unprocessed records

I have an automated sftp process running on a Linux server that securely connects to an insurance company's Windows server (running VanDyke VShell). The insurance company places work assignments into a directory on that server.

My unattended (background) process runs every minute. Private/public key authentication is in place and working fine.

Its goal is to retrieve any new records from the Assignments directory and, after retrieving them, post them back to a directory called ProcessedAssignments. After sending them back, the process clears the Assignments directory.

This process works perfectly 99.9% of the time. Once in a great while, though, I accidentally remove a new record that was placed in the Assignments directory in the window between my 'get'/'put' and my 'rm *' on the Assignments directory.

I have searched everywhere for best practices for this type of recurring file transfer, with no results.

I'm trying to find a snippet of code somewhere that will allow me to create a list of the records I got/put, and only remove those records from the Assignments directory.

Here is the code in my BASIC program that generates the sftp script:

SCRIPT = ""
SCRIPT<-1> = "cd /Assignments" ;* Change external directory
SCRIPT<-1> = "get * ":CUSTOMER.INBOUND.DIRECTORY
SCRIPT<-1> = "cd /ProcessedAssignments":
SCRIPT<-1> = "put ":CUSTOMER.INBOUND.DIRECTORY:"/* ":
SCRIPT<-1> = "cd /Assignments"
SCRIPT<-1> = "rm * "
SCRIPT<-1> = "quit"
SCRIPT<-1> = "eof"

My problem is with the rm * statement.

So, I need to somehow pass a list to my script generator after performing the get, or I need to somehow tag the records I 'got' and only delete those...
Any help would be greatly appreciated.

Is that how you generate the code? (Just curious.) Maybe we can do something about customizing how you delete things. It looks like mainframe code....

If you have keys over there, can you ssh?

rm * is dangerous, and get * is not helpful in this situation either.

First, here is the strategy:

  1. sftp to the remote
  2. ls the assignments directory to a file
  3. use the file to do individual get and rm commands

Thanks for the quick response. And thanks for fixing my code tags.

The best practice would be for the receiving system itself to remove, or at least move, the records it has processed. After all, only it knows which records have and haven't been processed.

I have tried unsuccessfully to ls to a file. I have searched this forum and can't find my answer.

I am trying to redirect the listing output to a file that I touched on the unix system, called assignmentslog.txt.

I have tried

 
sftp> ls .  assignmentslog.txt

and

 
sftp> ls . 2>&1 | tee assignmentslog.txt

and several other attempts at redirecting the output of the ls to this file, and they all go to the screen.
What am I doing wrong?

sftp remotesystem.com  <<EOF > dir.lis
 cd ./Assignments
 ls 
 exit
EOF

dir.lis will now contain the listing of available files.
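
One caveat, based on how OpenSSH's sftp usually behaves when its input is not a terminal: the client tends to echo each command back prefixed with "sftp> ", so dir.lis may contain those command lines in addition to the file names, and a plain ls may print the names in columns. If your client supports ls -1 (one name per line), using that in the here document plus a small filter should give a clean list; files.lis below is just an illustrative name:

# Keep only real file names: drop echoed "sftp>" command lines and blank lines.
# files.lis is a hypothetical cleaned-up copy of dir.lis.
grep -v '^sftp>' dir.lis | grep -v '^[[:space:]]*$' > files.lis

If you do that, point the loop below at files.lis instead of dir.lis.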

move_file()
{
   # fetch one file, remove it from Assignments, then push a copy
   # into ./otherdirectory on the remote box
   sftp remotesystem.com <<EOF
   cd ./Assignments
   get $1
   rm $1
   cd ./otherdirectory
   put $1
   exit
EOF
}

while read fname
do
    move_file "$fname"
done < dir.lis

This is purely on the UNIX side. You will end up with local copies of the files, and copies of the files in ./otherdirectory on the remote box. Note that EOF goes in the leftmost column. EOF can be any set of nonsense syllables, as long as it does not appear in the body; it is sort of traditional to use EOF. This construct is a here document.

I see what you are trying to show me, but I don't understand where to put the move_file code or how to execute the while/do loop. Are these embedded in a script, or set up as executables somehow?

I have let this project go and have been handling the premature deletes manually. It has now reached critical mass, and I must solve it.
I am still using the rm * method, which is prone to problems. How do I reference the move_file script from within my script?
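
Not speaking for jim mcnamara, but here is a minimal sketch of how his two pieces could live together in one executable shell script. The script name get_assignments.sh, the host remotesystem.com, the /Assignments and /ProcessedAssignments paths, and the files.lis cleanup step are all assumptions carried over from the earlier posts:

#!/bin/sh
# get_assignments.sh - hypothetical wrapper that lists the remote
# Assignments directory, then handles each file individually instead
# of finishing with a blanket rm *.

REMOTE=remotesystem.com                # assumed host name

# 1. Capture the remote directory listing, one name per line.
sftp "$REMOTE" <<EOF > dir.lis
cd /Assignments
ls -1
exit
EOF

# 2. Keep only file names (drop any echoed "sftp>" lines and blanks).
grep -v '^sftp>' dir.lis | grep -v '^[[:space:]]*$' > files.lis

# 3. For each file: download it, post a copy to ProcessedAssignments,
#    and only then remove that one file from Assignments.
move_file()
{
   sftp "$REMOTE" <<EOF
   get /Assignments/$1
   put $1 /ProcessedAssignments/$1
   rm /Assignments/$1
   exit
EOF
}

while read fname
do
    move_file "$fname"
done < files.lis

Make it executable (chmod +x get_assignments.sh), and your existing scheduled job could then call this one script in place of the generated rm * sequence.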

Why don't you show the entire scenario - your script(s), directory structures, etc.?
You should abstain from the rm * - no good!
jim mcnamara has shown you a way to handle the files one by one, and, which might be even safer, I'd consider using the sftp rename command to move the files to the new remote path without losing any of them.
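
To illustrate that rename idea, here is a minimal sketch built on the same per-file loop. Whether the VShell server permits rename between those two directories is an assumption, and archive_file and files.lis are hypothetical names:

# Hypothetical per-file handler: download the file, then move it on the
# remote side with sftp's rename command instead of deleting it.
# Note: rename will typically fail if a file of the same name already
# exists in /ProcessedAssignments.
archive_file()
{
   sftp remotesystem.com <<EOF
   get /Assignments/$1
   rename /Assignments/$1 /ProcessedAssignments/$1
   exit
EOF
}

while read fname
do
    archive_file "$fname"
done < files.lis

Nothing is ever removed outright, so a record that arrives mid-run simply stays in Assignments until the next pass picks it up.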