Need an automation script to execute on multiple servers from one server

I have a tar file uploaded to an FTP server, and it needs to be deployed to some 300 Linux machines in a path like /opt/oracle/scripts and untarred there. This should be driven by a single shell script executed against all the servers. The tar needs to be deployed after switching user with pbrun su - user. Installing public/private keys on all 300 servers isn't possible.

Please provide a solution.

What have you tried so far?

You can use ssh in a script to execute a remote command.
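For example, a minimal sketch of that approach: loop over a host list and run one command on each machine over ssh. Everything here (the host names, servers.txt, the paths, and the tar file name) is illustrative, not from your setup. SSH defaults to a dry-run echo so you can see what would run; set SSH=ssh for a real run.

```shell
#!/bin/sh
# Run one command on every host listed in servers.txt.
# SSH defaults to a dry-run echo; export SSH=ssh to actually connect.
SSH=${SSH:-"echo ssh"}

# Example inventory, one hostname per line (placeholder hosts).
cat > servers.txt <<'EOF'
db01.example.com
db02.example.com
EOF

run_on_all() {
    cmd=$1
    while read -r host; do
        # -o BatchMode=yes makes ssh fail fast instead of prompting
        # for a password, so one bad host doesn't hang the whole loop.
        $SSH -o BatchMode=yes "$host" "$cmd" || echo "FAILED: $host" >&2
    done < servers.txt
}

run_on_all "cd /opt/oracle/scripts && tar xf scripts.tar"
```

Since you can't install keys, each ssh call will still prompt for a password unless you wrap it with something like pbrun or a tool such as sshpass; that part depends on what your security policy allows.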

Why isn't using public/private keys for passwordless login possible? If some servers don't support SSH, can you use a remote shell (rsh/resh/remsh, depending on your client OS and version)?

You say that you have several hundred Linux servers: what are they (CentOS, Red Hat, ...) and roughly what versions? Obviously don't list them all!

  • What have you tried so far?
  • What are your preferred tools?
  • Would you be better with an FTP-push to all or each one doing a get?
  • Could you NFS share the file? (or is it Windoze?)
  • What general logic were you thinking?

Most importantly, what have you tried so far?

I ask this so we can see your thinking and advise in a way that suits you, and so that you learn; after all, we all came here to learn.

Robin

There are a lot of ways you can do that. In my case I was maintaining database shell scripts on more than 100 servers. I set up a Subversion repository with svn 1.6.2, and I had a local script that ran "svn update" to refresh the local working copy with any changes that I had made. I then rsync'd the changes to the bin and sql directories where they lived. I used svn over ssh, without authentication.

You can also set up an ssh/sftp/ftp server that your servers can log into to download the scripts. You should prevent the scripts from being changed by the servers that download them; otherwise someone might be able to make a script do something malicious.
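A sketch of that download step as it might run on each server. The URL, destination path, and checksum are placeholders; an sftp-only server would use sftp or scp instead of curl. Verifying a published checksum before unpacking is one way to address the tampering concern above.

```shell
#!/bin/sh
# Fetch the tar from the central server, verify it, and unpack it.
URL=${URL:-ftp://ftphost/pub/scripts.tar}   # placeholder URL
DEST=${DEST:-/opt/oracle/scripts}           # placeholder destination
SUM=${SUM:-}    # expected sha256, published out of band (optional)

fetch_and_unpack() {
    tmp=$(mktemp) || return 1
    # -f: fail on HTTP/FTP errors; -sS: quiet, but still show errors
    curl -fsS -o "$tmp" "$URL" || { rm -f "$tmp"; return 1; }
    if [ -n "$SUM" ]; then
        # Refuse to unpack anything that doesn't match the published sum.
        echo "$SUM  $tmp" | sha256sum -c - >/dev/null \
            || { rm -f "$tmp"; return 1; }
    fi
    tar -xf "$tmp" -C "$DEST" && rm -f "$tmp"
}

# On each server, a cron job or the deployment script would call:
#   fetch_and_unpack
```

Keeping the download account read-only on the central server, plus the checksum check, means a compromised client can fetch the scripts but cannot alter what the other machines receive.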