how to delete files on two remote servers simultaneously?

dear all,

i'm preparing a script that does these actions:

  1. stop a certain service on the remote servers
  2. clean up the files on the remote servers simultaneously (because lots of files need to be deleted)
  3. after the files/logs are removed, restart the service

i'm stuck on how to clean the remote servers' files at the same time, to save time.

this is the code i have right now:
(by the way, i have set up passwordless ssh login, so no password is needed when running the ssh commands)

MAILERS=(192.168.1.1 192.168.1.2)

# stop service
for host in "${MAILERS[@]}"; do
   ssh "$host" "service my_service stop"
done

# delete those files matching $1 variable
for host in "${MAILERS[@]}"; do
   ssh "$host" "find /path -name '*$1*' -exec rm {} \;"
done

# start the service again
for host in "${MAILERS[@]}"; do
   ssh "$host" "service my_service start"
done

to be more specific: how do i run the "find-exec-rm" command on both servers simultaneously?

just putting an & at the end of it is not what i want...

any advice on this? Thanks.

And why is & not a sufficient solution?

Your ONLY choice is to create two separate processes on the remote boxes, either serially or in parallel.

because lots of files on the two servers need to be deleted before the service starts....

if i put the 'find-exec-rm' command into the background, then i can't control when it's finished..

Yes, you can.

# delete those files matching $1 variable
for host in "${MAILERS[@]}"; do
   ssh "$host" "find /path -name '*$1*' -exec rm {} \; >/dev/null 2>&1" &
done
wait

The shell will wait for all of the ssh invocations to complete before continuing to the next step.
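if you also want to know whether each host's cleanup succeeded, bash's `wait` accepts a PID and returns that job's exit status. a minimal sketch of the pattern (the ssh call is replaced by a local placeholder so the sketch runs standalone; the host list is the same hypothetical one as above):

```shell
#!/bin/bash
# One background job per host; remember each PID so its exit
# status can be collected individually with `wait <pid>`.
MAILERS=(192.168.1.1 192.168.1.2)
pids=()

for host in "${MAILERS[@]}"; do
    # Real script would run:
    #   ssh "$host" "find /path -name '*$1*' -exec rm {} \;" &
    true &                      # local stand-in for the ssh command
    pids+=("$!")
done

failed=0
for i in "${!MAILERS[@]}"; do
    if wait "${pids[$i]}"; then
        echo "${MAILERS[$i]}: cleanup finished"
    else
        echo "${MAILERS[$i]}: cleanup failed"
        failed=1
    fi
done
```

with the per-PID `wait`, the script can choose to restart the service only on hosts whose deletion actually succeeded.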


if you administer multiple servers, search for:

clusterit
and cssh

They can seriously cut down your workload.
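one more speed note, since the slow step here is removing a large number of files: `-exec rm {} \;` forks one rm process per file, while POSIX find's `-exec rm -f {} +` (or GNU find's `-delete`) batches many files into a single invocation. a quick local demo on a scratch directory (the `core` file names are made up for illustration):

```shell
#!/bin/bash
# Create a scratch dir with files matching and not matching the pattern.
tmpdir=$(mktemp -d)
touch "$tmpdir/core.1" "$tmpdir/core.2" "$tmpdir/keep.log"

# `+` passes many paths to a single rm, instead of one rm per file.
find "$tmpdir" -name '*core*' -exec rm -f {} +

remaining=$(find "$tmpdir" -type f | wc -l)   # only keep.log should remain
rm -rf "$tmpdir"
```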