Shell scripts to Pull and Push files from SFTP location

I need shell script code to pull and push files from an SFTP location and send an email on success or when the error level is 1.

@Yogesh_Shinde , Welcome.

The forum is a collaboration: you post your request along with your attempts and the team responds. We will not simply write a solution for you without your participation.

rgds

This is what ChatGPT generated, based exactly on the text of your post:

#!/bin/bash

# SFTP details
SFTP_HOST="sftp.example.com"
SFTP_USER="your_username"
SFTP_PASSWORD="your_password"
REMOTE_DIR="/remote/directory"
LOCAL_DIR="/local/directory"

# Email details
TO_EMAIL="recipient@example.com"
FROM_EMAIL="sender@example.com"
SMTP_SERVER="smtp.example.com"
SMTP_PORT="587"
SMTP_USER="smtp_username"
SMTP_PASSWORD="smtp_password"

# Pull files from SFTP
echo "Pulling files from SFTP..."
sftp_command="get -r $REMOTE_DIR $LOCAL_DIR"
sftppass -d -v -r $SFTP_USER:$SFTP_PASSWORD@$SFTP_HOST << EOF
$sftp_command
bye
EOF

# Check SFTP operation result
if [ $? -eq 0 ]; then
    echo "SFTP pull successful."
    EMAIL_SUBJECT="SFTP Pull Success"
    EMAIL_BODY="The SFTP pull operation was successful."
    ERROR_LEVEL=0
else
    echo "SFTP pull failed."
    EMAIL_SUBJECT="SFTP Pull Error"
    EMAIL_BODY="The SFTP pull operation encountered an error."
    ERROR_LEVEL=1
fi

# Push files to SFTP (example)
# Uncomment and modify this section if needed
# echo "Pushing files to SFTP..."
# sshpass -p "$SFTP_PASSWORD" sftp -oBatchMode=no "$SFTP_USER@$SFTP_HOST" << EOF
# put -r $LOCAL_DIR $REMOTE_DIR
# bye
# EOF
# if [ $? -eq 0 ]; then
#     echo "SFTP push successful."
# else
#     echo "SFTP push failed."
#     ERROR_LEVEL=1
# fi

# Sending email
# ssmtp reads the SMTP server (mailhub) from /etc/ssmtp/ssmtp.conf;
# the SMTP_SERVER/SMTP_PORT variables above are placeholders for that configuration.
echo "Sending email..."
printf 'To: %s\nFrom: %s\nSubject: %s\n\n%s\n' \
    "$TO_EMAIL" "$FROM_EMAIL" "$EMAIL_SUBJECT" "$EMAIL_BODY" | \
    ssmtp -v -au "$SMTP_USER" -ap "$SMTP_PASSWORD" "$TO_EMAIL"

exit $ERROR_LEVEL

And also some comments:

Make sure to replace the placeholders (your_username, your_password, etc.) with your actual SFTP and email server details. The script uses the sftp command for file transfer and the ssmtp command for sending emails. Please note that using plaintext passwords in scripts is not recommended for security reasons. It's better to use SSH keys for SFTP authentication.

Remember to set execute permissions for the script using chmod +x scriptname.sh.

This script is just a basic example and might need adjustments based on your specific requirements and environment. Additionally, you may want to consider using more advanced error handling and logging mechanisms for production use.
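Following up on the SSH-key recommendation: here is a minimal sketch of the same pull, assuming a key pair is already installed on the SFTP server so the script needs no password at all. The host, user, and directories are placeholders, not values from the original post:

#!/bin/bash
# Key-based, non-interactive SFTP pull: no passwords stored in the script.
SFTP_HOST="sftp.example.com"
SFTP_USER="your_username"
REMOTE_DIR="/remote/directory"
LOCAL_DIR="/local/directory"

# BatchMode=yes makes sftp fail instead of prompting if the key is not accepted.
sftp -i ~/.ssh/id_rsa -oBatchMode=yes "$SFTP_USER@$SFTP_HOST" << EOF
get -r $REMOTE_DIR $LOCAL_DIR
bye
EOF

if [ $? -ne 0 ]; then
    echo "SFTP pull failed." >&2
    exit 1
fi
echo "SFTP pull successful."
exit 0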

I was curious what it would generate for Bash; usually I use it only for short Python snippet requests.

I am looking for help with a query on how to recursively pull files from a remote host to a local host. I believe this is a popular topic; I browsed through the forum but could not find a suitable thread. Let me explain what I am planning to do.
Both the local and remote hosts run Red Hat Linux. I want to write a bash script that recursively does an sftp get of a specific set of files (more than 100 in some cases) from the remote host. The file names are random, so I cannot use '*' in the file name to select them. The only criterion I can select on is the timestamp of when the files were created.
Can someone please guide me? If a relevant discussion already exists, I will be happy to study it. Thank you.

This implies that you'd most probably need to check every file's timestamp before downloading it (or pushing it from a remote machine to your local machine), or do a "two-pass": first create a list of relevant files (based on their timestamps), then download only the files meeting the criteria. Why sftp? Won't scp or rsync do what you need?
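To illustrate the two-pass idea, here is a minimal sketch assuming the remote host allows ssh and rsync (which is still an open question at this point); the host, directories, and cutoff timestamp are hypothetical:

#!/bin/bash
# Pass 1: build a list of files created/modified after a cutoff on the remote host.
REMOTE="user@remote.example.com"
REMOTE_DIR="/data/outgoing"
LOCAL_DIR="/data/incoming"
CUTOFF="2024-01-01 00:00:00"

ssh "$REMOTE" "cd '$REMOTE_DIR' && find . -type f -newermt '$CUTOFF'" > /tmp/filelist.txt

# Pass 2: pull only the listed files, keeping the directory structure.
rsync -av --files-from=/tmp/filelist.txt "$REMOTE:$REMOTE_DIR/" "$LOCAL_DIR/"

Because rsync is given an explicit file list, it copies only those files rather than dumping the whole remote folder.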


Or ssh?
ssh can run remote commands like find and cpio and transfer files through the stdout stream.
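A rough sketch of that approach, with a hypothetical host, directories, and cutoff date, and assuming cpio is available on both machines:

# Run find remotely, archive the matching files with cpio, and unpack the
# archive locally; everything travels through the ssh stdout stream.
ssh user@remote.example.com \
    "cd /data/outgoing && find . -type f -newermt '2024-01-01' | cpio -o" \
    | ( cd /data/incoming && cpio -idmv )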


@asutoshch please clarify whether you want files newer than, on, or older than the time stamp.

I wonder if rsync might have an option to help you.

You may need to run something like find with proper options to select the file(s) that you're interested in on the remote system and use that as a list to generate a command to pull files.

Please provide some hypothetical examples to better demonstrate what you're trying to do.

Thank you, DrScriptt. I will reply with details tomorrow; I was away on personal work. Thank you for your help.

Hello DrScriptt,
I am late since my VPN access had issues; it is OK now. Let me describe.
I transfer some files (not fixed in number) from one location to a temporary Linux server. This transfer is managed by a tool through a one-click operation. The Linux server has restricted access, with only sftp enabled. After transferring to the temporary server, I need to move the files to a third server (Linux), which is the permanent storage for them. I can reach the temp server from the third server using sftp and pull files, and this is where I was looking for some automation. The file names and their number are not known, but I can tell the date/time range. I will run the script on the third server; it should connect to the temp server and pull the files I want. rsync, I guess, is not possible because the script should not dump the entire content of the folder but should copy selectively.

It allows sftp only. When I run ssh, I see:
This service allows sftp connections only.

I can create a list of relevant files and put it in a text file. I was trying that, but somehow sftp was not working from a script on the third server.

Thank you for the clarification @asutoshch. 🙂

I wonder if something like the following might give you a list of files:

echo "ls -l" | sftp ...

N.B. you might need to put the command in a file and reference it via the -b (batch) command line option, particularly if you're already messing with redirection for sshpass.

Once you have a list of files, then you should be able to pull them from the remote server without much trouble.
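For instance, one way to turn such a list (one remote path per line) into a non-interactive transfer is to generate an sftp batch file of get commands. The file names, paths, and host below are hypothetical, and key-based authentication is assumed:

# Build an sftp batch file from the list of wanted remote files.
batchfile=/tmp/sftp_batch.txt
: > "$batchfile"
while IFS= read -r f; do
    echo "get \"$f\" /local/directory/" >> "$batchfile"
done < /tmp/filelist.txt
echo "bye" >> "$batchfile"

# Run the whole batch in one sftp session.
sftp -b "$batchfile" user@tempserver.example.com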

N.B. Edited to correct a missing command; echo "ls -l" vs just ls -l, after @Matt-Kita pointed out my mistake.

Absolutely not! This command lists the files (in a long listing format!) in your current working directory and pipes that output to the sftp command. This is definitely NOT how you invoke remote commands (neither with sftp nor with other, similar tools).

You are correct @Matt-Kita.

I had intended to put echo "ls -l" | sftp ....

I've edited the post with a note about the edit. The command above is a copy and paste from testing it against multiple different remote systems and verifying that the listing returned differs from the local system's.