Script to archive logs and SFTP them to another archive server

Requirement:

Under our Fuse application we have instances called containers.

Every container keeps its logs under:

<container1>/data/log/fuse.log
<container1>/data/log/fuse.log.1
<container1>/data/log/fuse.log.XX
   
<container2>/data/log/fuse.log
<container2>/data/log/fuse.log.1
<container2>/data/log/fuse.log.XX

Our requirement is to compress these logs and then move them off each server to a separate archive server using SFTP.

We have nine servers, and each server's logs will be stored under a different path on the archive server.

Let's say the 1st node has 3 containers. The archiver script will create the matching directories on the archive server and store the compressed files as below:

/<app>/Prod/Cluster1/<Node1>/<Container1>/<compressed logs>
/<app>/Prod/Cluster1/<Node1>/<Container2>/<compressed logs>
/<app>/Prod/Cluster1/<Node1>/<Container3>/<compressed logs>

Similarly, for the 2nd node with 3 containers:

/<app>/Prod/Cluster1/<Node2>/<Container1>/<compressed logs>
/<app>/Prod/Cluster1/<Node2>/<Container2>/<compressed logs>
/<app>/Prod/Cluster1/<Node2>/<Container3>/<compressed logs>

Any suggestion on this would be of great help.
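For concreteness, here is a minimal sketch of the kind of script we have in mind. The host, user, and local paths below are made up, and it assumes key-based SSH authentication and that /app/Prod/Cluster1 already exists on the archive server:

#!/bin/bash
# Sketch only: host, user and local layout are assumptions, not real values.
ARCHIVE="archiveuser@archiver.example.com"
REMOTE_BASE="/app/Prod/Cluster1/$(hostname -s)"   # e.g. /app/Prod/Cluster1/Node1

for logdir in /opt/fuse/container*/data/log; do   # hypothetical container layout
    container=$(basename "${logdir%/data/log}")   # -> container1, container2, ...
    # sftp's mkdir is not recursive; the leading "-" makes sftp ignore
    # "directory already exists" errors on repeated runs.
    sftp "$ARCHIVE" <<EOF
-mkdir $REMOTE_BASE
-mkdir $REMOTE_BASE/$container
put $logdir/*.gz $REMOTE_BASE/$container/
bye
EOF
done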


We had this implemented using Perl and FTP, but now we need to implement it as a shell script using SFTP.
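One practical difference from the old FTP flow: sftp will not accept a password from a script the way ftp here-documents do, so the usual pattern is key-based authentication plus a batch file. sftp -b exits non-zero if any transfer fails, which the script can test. The names below are assumed:

# One-time setup per node (not part of the nightly run): create a key
# and install it on the archive server.
#   ssh-keygen -t rsa -f ~/.ssh/archiver_key -N ''
#   ssh-copy-id -i ~/.ssh/archiver_key.pub archiveuser@archiver.example.com

batch=/tmp/sftp_batch.$$
cat > "$batch" <<'EOF'
put /opt/fuse/container1/data/log/*.gz /app/Prod/Cluster1/Node1/Container1/
EOF

if sftp -i ~/.ssh/archiver_key -b "$batch" archiveuser@archiver.example.com; then
    echo "transfer OK"
else
    echo "transfer FAILED" >&2
fi
rm -f "$batch"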

So why can't you just take the Perl script and modify it to use SFTP instead of FTP?

There is no SFTP Perl module supported by Red Hat as of now. We verified this with Red Hat support, so we cannot take that approach.

Can you show us how you call ftp from your Perl?

Do you have a file with the nine servers' access info listed, including the number of containers?
Do you want all log files in a directory compressed (by which mechanism / tool?) into one single archive?
Do you mean "delete original files" when you say "logs moved"?
Where is that script going to run? On a central server that is neither one of the nine nodes nor the archive server?

@RudiC
Do you have a file with the nine servers' access info listed, including the number of containers?
Ans: The existing Perl script read a second file that contained the log paths for that node. So node 1 would have two files: file 1, the Perl script; file 2, a text file listing that node's log locations.
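For example, the list file could look like the comment below, and the replacement shell script could loop over it the same way (paths are made up):

# node1_logs.txt -- one log directory per line, e.g.:
#   /opt/fuse/container1/data/log
#   /opt/fuse/container2/data/log
#   /opt/fuse/container3/data/log

while IFS= read -r logdir; do
    [ -d "$logdir" ] || continue    # skip blank lines or stale entries
    echo "would archive $logdir"    # compression/transfer steps go here
done < node1_logs.txt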

Do you want all log files in a directory compressed (by which mechanism / tool?) into one single archive?
Ans: As of now, no compression is in place. We would like the script to compress the logs and then SFTP them once compression completes.
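A sketch of that compression step, using gzip (our assumption; any compressor would do) inside the loop above, skipping the live fuse.log:

# Compress only the rotated logs (fuse.log.1 ... fuse.log.XX); the active
# fuse.log is left alone because Fuse is still writing to it.
for f in "$logdir"/fuse.log.*; do
    case $f in *.gz) continue ;; esac   # already compressed on an earlier run
    [ -e "$f" ] || continue             # glob matched nothing
    gzip "$f"                           # fuse.log.N -> fuse.log.N.gz
done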

Do you mean "delete original files" when you say "logs moved"?
Ans: Yes, to save space on the servers; something like log purging.
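Since the originals get deleted, the purge should probably happen only after sftp reports success, along these lines (a sketch reusing the assumed names from above):

# Delete local copies only if the whole batch transferred cleanly;
# sftp -b returns non-zero if any command in the batch failed.
if sftp -b "$batch" "$ARCHIVE"; then
    rm -f "$logdir"/fuse.log.*.gz   # safe to purge: confirmed on the archiver
else
    echo "sftp failed for $logdir; keeping local files" >&2
fi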

Where is that script going to run? On a central server that is neither one of the nine nodes nor the archive server?
Ans: No, not on a central server. The script will run individually on each node and SFTP its logs to the archive server.
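So each node would carry the same script plus its own list file, scheduled locally, e.g. with a cron entry like this (user, times, and paths are illustrative):

# /etc/cron.d/fuse_log_archive -- runs the archiver daily at 02:00 on this node.
0 2 * * * fuse /usr/local/bin/archive_fuse_logs.sh /usr/local/etc/node_logs.txt >> /var/log/fuse_log_archive.log 2>&1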

@rbatte1

Attached is the existing Perl script.