Executing Multiple Commands – A Bash Productivity Tip

I love shell productivity hacks. I believe that if you work in a Linux environment, you owe it to yourself to become as good as you can be at working with your shell of choice (e.g. bash). You see, while most people who have worked with Linux for any length of time have some level of skill with the shell, that level is usually mediocre at best. But you do meet the occasional person who wows you with what they can accomplish, and I will tell you right now that their skill does not come from superior knowledge (or at least not fully from superior knowledge). It is all about maximizing the effect of the knowledge you do have: finding little tricks and hacks that will each save you, potentially, less than a second every time you perform a particular action. The thing is, some actions you might do hundreds of times per day, so those seconds really start to add up, especially once you have accumulated dozens of these hacks. Anyway, here is one such tip.

Running Multiple Commands

Whenever you work with the shell, you almost always need to run several commands in a row. I am not talking about piping commands to each other, just running several in sequence. Surprisingly, most people tend to wait for each command to finish before typing the next one. Why not issue all the commands at once? Bash has decent support for this, and if you train yourself to do it this way, not only will it potentially save you some time, it will also force you to think further ahead; all of which will make you that little bit more productive. Let's have a look.

Firstly, we need to create some commands to make it easier to see what is happening. The first one will be called cmd1, and all it will do is sleep for 2 seconds and then print its name to the screen.

#!/bin/bash
sleep 2
echo "cmd1"

The second command will be called long-cmd, and it will sleep for 30 seconds before printing its name to the screen.

#!/bin/bash
sleep 30
echo "long-cmd"

The third command will be called cmd-fail, and it will exit with a non-zero exit status to simulate a failing command.

#!/bin/bash
echo "Failing..."
exit 2
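For these to be runnable as ./cmd1 and friends, the script files need the executable bit set. Here is a minimal sketch for the first one (assuming you save the script under the name cmd1 in the current directory):

```shell
# create the cmd1 script described above, then mark it executable
printf '#!/bin/bash\nsleep 2\necho "cmd1"\n' > cmd1
chmod +x cmd1     # without this, ./cmd1 fails with "Permission denied"
./cmd1            # prints "cmd1" after about 2 seconds
```

The same chmod +x treatment applies to long-cmd and cmd-fail.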

We're now ready to run several commands in sequence by typing them all on one line. All you need to do is separate the commands with semicolons and they will execute one after the other.

user@host:~/tmp$ ./cmd1 ; ./cmd1 ; date
cmd1
cmd1
Sat May  1 23:29:49 EST 2010

As you can see, the two cmd1 commands executed first and then the date command ran – as expected. The only problem here is this: if a preceding command fails, the subsequent ones will still run:

user@host:~/tmp$ ./cmd1 ; ./cmd-fail ; date
cmd1
Failing...
Sat May  1 23:32:33 EST 2010

This may be the behaviour we desire, in which case all is well. But what if we do care about the success of the preceding commands? In that case, separate the commands with a double ampersand (&&) instead of a semicolon.

user@host:~/tmp$ ./cmd1 && ./cmd-fail && date
cmd1
Failing...

As soon as one of the commands fails (exits with a non-zero status), no subsequent command will be executed – handy.
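Under the hood, && keys off the exit status: the next command runs only when the previous one exits with status 0. A quick sketch of the difference, using the shell built-in false (|| is the mirror image, running its right-hand side only on failure):

```shell
false && echo "skipped"          # false exits non-zero, so this echo never runs
false ; echo "runs regardless"   # a semicolon ignores the exit status
false || echo "runs on failure"  # || runs the next command only on failure
echo $?                          # $? holds the exit status of the last command
```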

Here is another thought: if you don't need to run multiple commands in sequence, but simply want to kick off several commands, why not execute all of them at once by putting them all in the background? All you need to do is separate each command with a single ampersand:

user@host:~/tmp$ ./long-cmd & ./long-cmd & ./long-cmd &
[1] 2643
[2] 2644
[3] 2645
user@host:~/tmp$

All three of our long commands have kicked off in the background. This works because, as soon as we background the first command by putting an ampersand after it, the shell gives us another prompt and is ready to accept more input, so we can keep going and execute as many commands as we want on the same line by backgrounding each one. It is a slightly quicker way (than waiting for each command to finish, or backgrounding each one separately) to execute multiple commands when they don't rely on each other. If the commands produce output, it will all still go to the screen, so the output from the different commands can potentially get mixed up, but you can, of course, still redirect the output of each command somewhere else. You can also easily check on the status of each of the backgrounded commands:
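Because backgrounded jobs all write to the same terminal, a common pattern is to give each one its own log file and then use the wait built-in to block until they have all finished. A sketch with sleep/echo stand-ins rather than the long-cmd script above:

```shell
# run two commands in the background, each with its own log file
(sleep 1; echo "first done")  > first.log  2>&1 &
(sleep 1; echo "second done") > second.log 2>&1 &
wait    # blocks until every background job of this shell has exited
cat first.log second.log
```

wait also accepts a job spec (e.g. wait %1) if you only care about one particular job.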

user@host:~/tmp$ jobs
[1]   Running                 ./long-cmd &
[2]-  Running                 ./long-cmd &
[3]+  Running                 ./long-cmd &

The number in square brackets is the id of the job, and you can use it (prefixed with %) to perform actions on the running job, such as bringing it to the foreground or killing it:

user@host:~/tmp$ ./long-cmd & ./long-cmd & ./long-cmd &
[1] 2679
[2] 2680
[3] 2681
user@host:~/tmp$ kill %2
user@host:~/tmp$ jobs
[1]   Running                 ./long-cmd &
[2]-  Terminated              ./long-cmd
[3]+  Running                 ./long-cmd &
user@host:~/tmp$ fg %1

If you foreground a job using fg and then want to put it back in the background, all you need to do is press Ctrl-Z to suspend it and then resume it in the background with bg and its job id.

user@host:~/tmp$ fg %1
[1]+  Stopped                 ./long-cmd
user@host:~/tmp$ bg %1
[1]+ ./long-cmd &
user@host:~/tmp$ jobs
[1]   Running                 ./long-cmd &
[2]-  Running                 ./long-cmd &
[3]+  Running                 ./long-cmd &
user@host:~/tmp$

There you go, not revolutionary, but some handy tips to keep in mind to shave off a few seconds here and there. And believe you me, those seconds really do start to add up after a while. More about shell stuff soon.

Image by Roger Smith

  • Thanks for sharing this. Good ideas I will (or already am) use.


    • Hi Ruben,

      Yeah I find it comes in really handy once in a while, but I do need to consciously remember to use it more since it is almost always applicable.


  • Kiran

    Neat. Tfs!

    Also, found your site very informative n impressive. Gonna add it to my company’s bookmark list n am sure many newbies like me will find it very helpful.




  • Love

    Amazing! Thanks!

  • Waqas Ashraf

    Handsome thoughts :-)

  • Fabian Zeindl

Another important trick: add "set -e" at the top of your shell scripts to terminate the script when a single command fails.


  • Joseph in Atlanta


You wanted a way for "awk" to filter out first/last/blank lines…

How about: awk 'LN{print LN}NR>1{LN=$0}'

And to do the same thing in shell (even on one line, if you want):

NR=0;while read LINE;do ((${#LN}))&&echo "$LN";[ $((NR+=1)) -gt 1 ]&&LN="$LINE";done

expanded to:

NR=0
while read LINE
do ((${#LN}))&&echo "$LN"
[ $((NR+=1)) -gt 1 ]&&LN="$LINE"
done

be careful with that punctuation!!


  • prem

    how to give multiple printout (shall i use recorder software)


  • Patrick Horgan

    Pretty cool basic tutorial. Have you thought about changing the CSS for your code boxes so that on narrow width (phones) devices they would either wrap or put up scroll bars? With long lines like in some of your commands, visually it just cuts off with no indication that there’s more. If people don’t already know what you’re doing it is confusing.

  • Goodun. Thanks!