How can I automate running commands remotely over SSH to multiple servers in parallel?


Question

I've searched around a bit for similar questions, but everything I've found covers running a single command, or maybe a few, with something like:

ssh user@host -t sudo su -

What if I essentially need to run a whole script on (let's say) 15 servers at once? Is this doable in bash? In a perfect world I need to avoid installing applications if at all possible to pull this off. For argument's sake, let's just say that I need to do the following across those hosts:

  1. Deploy a new Tomcat container
  2. Deploy an application in the container, and configure it
  3. Configure an Apache vhost
  4. Reload Apache

I have a script that does all of that, but it relies on me logging into each server, pulling the script down from a repo, and running it by hand. If this isn't doable in bash, what alternatives do you suggest? Do I need a bigger hammer, such as Perl? (Python might be preferable, since I can guarantee Python is on all boxes in a RHEL environment thanks to yum/up2date.) Any pointers to useful information would be greatly appreciated, especially if this is doable in bash. I'll settle for Perl or Python, but I just don't know them as well (working on that). Thanks!
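For concreteness, here is the sort of thing I'm imagining in bash: pipe a local script to each host's shell over SSH and background the connections. This is an untested sketch; deploy.sh and the hostnames are placeholders, and it assumes passwordless key-based auth so nothing stops to prompt.

#!/bin/bash
# Sketch: run a local script on several hosts in parallel over SSH.
# Assumes key-based auth; deploy.sh and the hostnames are placeholders.
HOSTS="web01 web02 web03"

for host in $HOSTS; do
    # 'bash -s' makes the remote shell read the script from stdin,
    # so nothing has to be copied to the box first.
    ssh -o BatchMode=yes "user@$host" 'bash -s' < deploy.sh > "deploy.$host.log" 2>&1 &
done
wait    # block until every background ssh has finished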


Accepted Answer

Often I'll just use the original Tcl version of Expect; you only need to have that on the local machine. If I'm doing this from inside a Perl program, I use Net::SSH::Expect. Other languages have similar "expect" tools.
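For example, a minimal Expect script along these lines drives the ssh login and runs a command. This is a sketch only; the user, the prompt patterns, and the restart command are placeholders you would adapt.

#!/usr/bin/expect -f
# Sketch: drive an interactive ssh session; host and password are passed as arguments.
set host [lindex $argv 0]
set password [lindex $argv 1]

spawn ssh -t user@$host sudo su -
expect "assword:"            ;# matches "Password:" or "password:"
send "$password\r"
expect "# "                  ;# wait for the root prompt
send "/etc/init.d/httpd restart\r"
expect "# "
send "exit\r"
expect eof

You would call it once per host (for example, ./deploy.exp web01 secret) and loop over your host list just as with any other command.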


You can run a local script as shown by che and Yang, and/or you can use a here document:

ssh root@server /bin/sh <<\EOF
wget -O app.war http://server/warfile    # could pull this over NFS instead
cp app.war /location
command 1
command 2
/etc/init.d/httpd restart
EOF
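Note that the backslash in <<\EOF quotes the delimiter, so the body of the here document reaches the remote shell verbatim; any $variables in it are expanded on the server, not locally. And because the here document is just standard input to ssh, the same approach fans out to several hosts in parallel. A sketch, with placeholder hostnames:

for host in web01 web02 web03; do
    ssh "root@$host" /bin/sh > "deploy.$host.log" 2>&1 <<\EOF &
wget -O app.war http://server/warfile
cp app.war /location
/etc/init.d/httpd restart
EOF
done
wait    # wait for every host to finish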
