Do you have a file with the nine servers' access info listed, including the number of containers?
Do you want all log files in a directory compressed (by which mechanism / tool?) into one single archive?
Do you mean "delete original files" when you say "logs moved"?
Where is that script going to run — on a central server distinct from both the nine nodes and the archive server?
@RudiC
Do you have a file with the nine servers' access info listed, including the number of containers?
Ans: Yes. The existing Perl script reads a second file that contains the log paths for that node.
So each node has two files: file 1 is the Perl script, and file 2 is a text file listing the log locations for that node.
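For illustration, the per-node path file could be consumed like this — sketched in plain shell rather than Perl, with a hypothetical file name `logpaths.txt` (substitute whatever your existing Perl script reads; the demo paths are made up):

```shell
#!/bin/sh
# Sketch: read log directories from a per-node config file, one path per line.
# "logpaths.txt" and the demo paths are assumptions, not from the real setup.
CONF=logpaths.txt
printf '/tmp/demo_logs_a\n/tmp/demo_logs_b\n' > "$CONF"   # demo content only

while IFS= read -r dir; do
    # Skip blank lines; each remaining line is one log directory to archive.
    [ -n "$dir" ] && echo "would archive: $dir"
done < "$CONF"
```

Keeping the paths in a separate text file means the same script can be copied to all nine nodes unchanged; only the path file differs per node.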
Do you want all log files in a directory compressed (by which mechanism / tool?) into one single archive?
Ans: As of now, no compression is in place. We would like the script to compress the logs and then SFTP the archive once compression completes.
Do you mean "delete original files" when you say "logs moved"?
Ans: Yes, to save server space. Something like log purging.
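Since the originals are deleted, it is worth guarding the purge: only remove logs after the archive passes an integrity check, so a failed compression never destroys data. A sketch with made-up paths:

```shell
#!/bin/sh
# Sketch: purge originals ONLY after the archive verifies cleanly.
# Paths are illustrative, not the real node layout.
LOGDIR=/tmp/demo_purge
mkdir -p "$LOGDIR"
echo sample > "$LOGDIR/old.log"            # demo data only

ARCHIVE=/tmp/demo_purge.tar.gz
tar -czf "$ARCHIVE" -C "$LOGDIR" .

if gzip -t "$ARCHIVE"; then                # integrity check before deleting
    rm -f "$LOGDIR"/*.log                  # the "purge" step
else
    echo "archive verification failed; keeping originals" >&2
fi
```

On a real node you would also want to confirm the SFTP transfer succeeded (e.g. check `sftp`'s exit status) before the `rm`.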
Where is that script going to run — on a central server distinct from both the nine nodes and the archive server?
Ans: No, not on a central server. The script will run individually on each node and SFTP the archive to the archive server.
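Putting the pieces together, the per-node flow could look like the sketch below: read the node's path file, build one archive per configured directory, upload it, then purge only after verification. Everything here (paths, user, host, file names) is an assumption for illustration; the real SFTP lines stay commented out:

```shell
#!/bin/sh
# Sketch of the full per-node cycle: config -> archive -> upload -> purge.
# Demo paths throughout; replace with the node's real path file and dirs.
CONF=/tmp/demo_node/logpaths.txt
ARCHDIR=/tmp/demo_node/outbound
mkdir -p "$ARCHDIR" /tmp/demo_node/logs
echo hello > /tmp/demo_node/logs/a.log     # demo data only
echo /tmp/demo_node/logs > "$CONF"

STAMP=$(date +%Y%m%d)
while IFS= read -r dir; do
    name=$(basename "$dir")
    archive="$ARCHDIR/${name}_${STAMP}.tar.gz"
    tar -czf "$archive" -C "$dir" . || continue   # skip dir on tar failure

    # Upload (uncomment on a real node; assumes key-based auth):
    # sftp -b - archiveuser@archive-server <<EOF
    # put $archive /archive/$(hostname)/
    # EOF

    gzip -t "$archive" && rm -f "$dir"/*.log      # purge only after verify
done < "$CONF"
```

Run from each node's crontab, this keeps the design the thread describes: no central server, each of the nine nodes pushes its own archive independently.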