Execute commands in parallel in a for loop?

Hi,

Please can someone point me in the right direction with a shell scripting problem?
I want to execute a command in a for loop, but not one by one with the loop waiting for each command's exit code; the commands should be started in parallel.

I have a plain text file with one hostname per line.

#!/bin/bash

file="/tmp/serverlist"

for server in `cat $file` ; do
    ssh admin@$server "command to run"
done

The problem is: the command I execute with SSH takes approx. 10 minutes to complete.

I have to run the command on 145 servers, and I don't want to wait 10 minutes before moving on to the next one. I want to execute it in parallel, say on 10 servers at a time.

Any ideas? Simply putting a & after the ssh command didn't work, and neither did continue.

Thanks for your help again...

Try this, with every instance executing the command on a chunk of 10 servers (not tested):

>subscript.sh

#!/bin/bash

file=$1
# run the command serially on every server in this chunk
for server in `cat $file` ; do
    ssh admin@$server "command to run"
done
# remove the temp chunk file when done
/bin/rm $1
exit 0
>mainscript.sh

#!/bin/bash

file="/tmp/serverlist"
TMPFILE="/tmp/"
cnt=0
filecnt=1
headcnt=10
tailcnt=`cat $file | wc -l`

while [ $cnt -lt $tailcnt ]
do
head -$headcnt $file | tail -10 > $TMPFILE$filecnt
subscript.sh $TMPFILE$filecnt &
cnt=$(($cnt + 10)
headcnt=$(($headcnt + 10))
filecnt=$(($filecnt + 1))
done

exit 0
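Alternatively, you can skip the temp files entirely by backgrounding each ssh and calling wait after every tenth one, so at most 10 run at once. A minimal sketch (also untested; assumes bash and that key auth is already set up):

#!/bin/bash

file="/tmp/serverlist"
cnt=0

while read server ; do
    # -n gives ssh its stdin from /dev/null so it cannot swallow the loop's input
    ssh -n admin@$server "command to run" &
    cnt=$(($cnt + 1))
    # after every 10th job, block until the whole batch is done
    if [ $(($cnt % 10)) -eq 0 ] ; then
        wait
    fi
done < $file

# pick up the final partial batch
wait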

ssh should be able to run as a background process. Maybe enclose the command you want to issue in double quotes to make clear where it ends:

ssh $user@$host "$command" &

Another gotcha: if the keys are not exchanged, ssh drops into interactive mode asking for a password - really silly programming IMHO - so your script would hang there indefinitely, prompting perhaps cron for an answer.
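One way around that is ssh's BatchMode option, which makes ssh fail immediately instead of prompting when no key is in place - untested in this exact loop, but it is a standard ssh option:

ssh -o BatchMode=yes $user@$host "$command" &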

bakunin

Put the ssh command in a sub-shell in the background...

#!/bin/bash

file="/tmp/serverlist"

for server in `cat $file` ; do
    # stdin from /dev/null so the backgrounded ssh does not grab the terminal
    (ssh admin@$server "command to run" < /dev/null &)
done
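One caveat with the ( ... & ) form: the subshell detaches each ssh from the script, so a wait at the end would not see them and the script exits right away. If you need the script to block until every ssh has finished, a rough sketch (untested) is to background ssh directly, with -n taking the place of the /dev/null redirect:

for server in `cat $file` ; do
    ssh -n admin@$server "command to run" &
done

# block until every backgrounded ssh has exited
wait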