Run Multiple Bash Commands In Parallel
Category : How-to
Bash, whilst great for simple things, makes it tricky to use more advanced programming techniques that come easily in languages like Java or Go.
Multithreading is one such technique. I often find myself with a series of tasks that I'd like to run in parallel, up to a predefined concurrency limit.
My recent task, which I'll use as an example, was to run multiple curl commands against an endpoint. These commands were independent, in that they could be executed in any order, so the job would benefit from running several API calls at once.
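Roughly, the commands file for that job looked something like the sketch below; the endpoint, flags and JSON payloads are placeholders I've made up for illustration, not the real API.

# /tmp/myCommands - one independent API call per line (URL and payloads are made up)
curl -s -X POST https://api.example.com/v1/jobs -d '{"id": 1}'
curl -s -X POST https://api.example.com/v1/jobs -d '{"id": 2}'
curl -s -X POST https://api.example.com/v1/jobs -d '{"id": 3}'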
The first step is to create your list of commands in a file. For this, I’ll use the echo and sleep commands to demonstrate.
vi /tmp/myCommands

echo 1 && sleep 2
echo 2 && sleep 2
echo 3 && sleep 2
echo 4 && sleep 2
echo 5 && sleep 2
echo 6 && sleep 2
echo 7 && sleep 2
echo 8 && sleep 2
echo 9 && sleep 2
echo 0 && sleep 2
Once you have your list of commands, it’s time to run them!
cat /tmp/myCommands | while read n; do printf "%q\n" "$n"; done | xargs --max-procs=2 -I LC bash -c LC
The first command, cat /tmp/myCommands, simply reads your list of commands to run. The printf "%q\n" stage in the middle shell-quotes each line, so that xargs doesn't mangle any quotes or special characters before handing the command to bash -c. The only other part to worry about is the --max-procs=2 option to xargs – this is what defines the concurrency, and therefore how many 'threads' will run at once. xargs will do the rest – each command in your source file will be executed, with 2 running at any one time!
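If you'd rather scale the concurrency to the machine than hard-code it, nproc (from GNU coreutils) reports the number of CPU cores, so something along these lines should work with GNU xargs:

cat /tmp/myCommands | while read n; do printf "%q\n" "$n"; done | xargs --max-procs="$(nproc)" -I LC bash -c LC

You can also sanity-check that the parallelism is real by timing the run:

time (cat /tmp/myCommands | while read n; do printf "%q\n" "$n"; done | xargs --max-procs=2 -I LC bash -c LC)

With the demo file above (ten commands that each sleep for 2 seconds) and --max-procs=2, this should report roughly 10 seconds of wall-clock time rather than the 20 or so a serial run would take.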
So there you have it – threaded command execution in Bash!