How to run the same script on multiple servers

Hi All,
I have a script that runs some commands and sends the results to my email. I want to run the same script on multiple servers. How can I do that? I know there is an option, "ssh", but I'm not quite sure how to use it in the script.

Also, the script has some parameters like the following:

if [ -f "/some/some/some1" ]; then
    # do something
fi

"/some/some/some1" will vary by server.

Anybody have an idea? Thanks

I recommend setting up an ssh key pair for passwordless logins on the machines you will be running the script on. For this to work you will need to put a copy of the script on each server. Then you can write a simple "launcher" script that sshes into each machine and runs the script you want to run there. The launcher script doesn't need to be anything more than a simple for loop.
Step by step:

  1. Set up the ssh keys on each machine. The ssh man page explains how to do it (there's a sketch of the commands after this list).

  2. Put a copy of the script on each machine

  3. Write a simple launcher script. It will look something like this:

    #!/bin/ksh
    for server in host1 host2 host3
    do
        ssh "$server" /path/to/script
    done

  4. Enjoy your time saved
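
For step 1, here is a minimal sketch of the key setup, assuming OpenSSH with ssh-copy-id available (user@host1 etc. are placeholders; if ssh-copy-id isn't installed, you can append your ~/.ssh/id_rsa.pub to ~/.ssh/authorized_keys on each server by hand):

# Run once on the machine that will launch the script
# (press Enter for an empty passphrase if the runs must be unattended)
ssh-keygen -t rsa

# Copy the public key to each server you want passwordless access to
ssh-copy-id user@host1
ssh-copy-id user@host2
ssh-copy-id user@host3

# Test: this should print the remote hostname without asking for a password
ssh user@host1 hostname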

Thanks for your reply.
I forgot to mention that I get an email after I run the script. So if I set up ssh to run the scripts on multiple servers, can I put the results from all the servers into one file and then send a single email, instead of sending mail from each server? Thanks

You could have the script running on each machine append its results to a text file on an NFS share, and then have the launcher script either mail you that file as an attachment or read the file and mail the text to you.
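
A minimal sketch of that approach, assuming the NFS share is mounted at /mnt/shared (a hypothetical mount point) on every host including the launcher machine, and that the mail command is available there:

#!/bin/ksh
RESULTS=/mnt/shared/results.txt      # shared file, visible to every host over NFS
: > "$RESULTS"                       # truncate before a new run

for server in host1 host2 host3
do
    # each remote run appends its own output to the shared file
    ssh "$server" "/path/to/script >> $RESULTS"
done

# one mail containing everything
mail -s "Script results" you@example.com < "$RESULTS"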

Thanks for your quick reply.

I am able to ssh to the other servers and run the command, but I'm getting emails from each server. To avoid this, I'm trying to append to a file over ssh using the following command

ssh user@some.domain "cat remote-source-file-name" >> local-target-file-name

Since I have to add this command to the script for several servers, and I also need to remove the file from all the servers afterwards, the script will be too long.
If I could do it in the opposite direction, using the following command,

cat local-source-file-name | ssh user@some.domain "cat >> remote-target-file-name"

I could send the result file from every server and append it to the file on the server where the initial command runs (server A), and from there I could send one email with the result file. But for this, do I need to copy the RSA keys from all the servers to server A? When I run the above command it asks me for a password, even though I can ssh from server A to all the other servers without a password. Anyone have an idea? Thanks.
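
(For the push direction, each server would indeed need a key authorized on server A. One way to avoid copying keys around is to stick with the pull direction, since server A can already ssh everywhere without a password, and wrap it in a loop so the script stays short. A rough sketch, assuming the result file sits at the same hypothetical path on every server:)

#!/bin/bash
# runs on server A, which already has passwordless ssh to the other servers
REMOTE_FILE=/tmp/script-result.txt    # hypothetical path to the result file on each server
LOCAL_FILE=/home/uname/all-results.txt

: > "$LOCAL_FILE"
for server in host1 host2 host3
do
    # pull the remote result into the combined file, then remove it remotely
    ssh "user@$server" "cat $REMOTE_FILE && rm $REMOTE_FILE" >> "$LOCAL_FILE"
done

mail -s "Combined results" uname@company.com < "$LOCAL_FILE"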

Do you have any NFS servers in your environment? If you do, just append the output of the script to the same file on an NFS share and then mail that one file to yourself.

If you run a command remotely over ssh, the command's stdout is returned to your local terminal. You can capture it there and send it to a local file. So, say you want the output of the date command from multiple servers put in a file and then emailed. You could just:

#!/bin/bash
hosts=( host1 host2 host3 )
for ((i=0;i<${#hosts[@]};i++)); do
  ssh me@${hosts[$i]} "date" >> outputfile &
done
wait
mail -s "Results" me@domain.com < outputfile

If it's a script that has to run, then you could have it in a shared location and run it:

  ssh me@${hosts[$i]} "/path/to/script.sh" >> outputfile &

If there's something in the script that varies by server, you're just going to have to work that out. Either have a different script in each location or, if you need more automation, have it accept command-line input for the difference and pass it with the script:

#!/bin/bash
hosts=( host1 host2 host3 )
var=( var1 var2 var3 )
for ((i=0;i<${#hosts[@]};i++)); do
  ssh me@${hosts[$i]} "/path/to/script.sh ${var[$i]}" >> outputfile &
done
wait
mail -s "Results" me@domain.com < outputfile

(As you can see, I'm an array junkie. Might be overkill here. Dunno. Works for me, though :) If you need any parts of my example explained, please don't hesitate to ask.)

Thanks for your reply, Mglenny.
Since I'm new to scripting, I have some questions. With

ssh me@${hosts[$i]} "/path/to/script.sh" >> outputfile &

I see the output file, but there is nothing in it. When I run

ssh me@${hosts[$i]} "/path/to/script.sh"

I see some output delivered by the script on each server. Any idea? Thanks again.

I'm typing this on my phone so I can't provide a code example, but try removing the & from the end of the line and moving the 'wait' up one line, above 'done'.

Thanks for your reply...
I just tried that but still the same thing. Here is the code I'm using:

#!/bin/bash

for server in ipaddress
do
ssh $server /usr/local/bin/service-mon.sh >> /home/uname/servicemon
wait
done
cat /home/uname/servicemon | mail -s "Service Status" uname@company.com

I'm just testing with one server now.

service-mon.sh generates output files itself.

I also just tried running it locally with the simple command "/path/to/script >> outputfile", but I don't see anything in outputfile.

Thanks

Does service-mon.sh output any data to stdout, or does it all get redirected to an output file? If it all goes to a file, you'll need to tee it to stdout if you're going to catch it with ssh.
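
A minimal sketch of that change inside service-mon.sh itself (the command and output path below are placeholders): pipe the output through tee instead of redirecting it, so the results land in the file and on stdout, where the ssh call in the launcher can capture them.

# inside service-mon.sh

# before: output only goes to the file, so ssh captures nothing
#   some_check_command > /var/log/service-mon.out

# after: output goes to the file AND to stdout, so ssh picks it up
some_check_command | tee /var/log/service-mon.out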