I love shell productivity hacks. I believe that if you work in a Linux environment, you owe it to yourself to become as good as you can be at working with your shell of choice (e.g. bash). You see, while most people who have worked with Linux for any length of time have some level of skill with the shell, that level is usually mediocre at best. But you do meet the occasional person who wows you with what they can accomplish, and I will tell you right now that their skill does not come from superior knowledge (or at least not fully from superior knowledge). It is all about maximizing the effect of the knowledge you do have, finding little tricks and hacks that will save you, potentially, less than a second every time you perform a particular action. Thing is, some actions you might do hundreds of times per day, so those seconds really start to add up, especially once you have accumulated dozens of these hacks. Anyway, here is one such tip.

Running Multiple Commands

Whenever you work with the shell you almost always need to run several commands in a row. I am not talking about piping commands to each other, but just running several in sequence. Surprisingly, people tend to wait for each command to finish before running the next one. Why not issue all the commands at once? Bash has some decent support for this, and if you train yourself to do it this way, not only will it potentially save you some time, but it will also force you to think further ahead; all of which will make you that little bit more productive. Let’s have a look.

Firstly we need to create some commands to make it easier to see what is happening. The first one will be called cmd1, and all it will do is sleep for 2 seconds then output something to the screen.

#!/bin/bash
sleep 2
echo "cmd1"

The second command will be called long-cmd and it will sleep for 30 seconds before outputting something to the screen.

#!/bin/bash
sleep 30
echo "long-cmd"

The third command will be called cmd-fail and it will exit with a non-zero exit status to simulate a failing command.

#!/bin/bash
echo "Failing..."
exit 2
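
Assuming the three scripts are saved as cmd1, long-cmd and cmd-fail in the current directory, they also need to be made executable before they can be run as ./cmd1 and friends:

chmod +x cmd1 long-cmd cmd-fail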

We’re now ready to run some commands in sequence by kicking them all off from a single command line. All you need to do is separate the commands with semicolons and they will execute one after another.

alan@alan-ubuntu-vm:~/tmp$ ./cmd1 ; ./cmd1 ; date
cmd1
cmd1
Sat May  1 23:29:49 EST 2010

As you can see, the two cmd1 commands executed first and then the date command ran – as expected. The only problem is this: if a preceding command fails, the subsequent ones will still run:

alan@alan-ubuntu-vm:~/tmp$ ./cmd1 ; ./cmd-fail ; date
cmd1
Failing...
Sat May  1 23:32:33 EST 2010

This may be the behaviour we desire, in which case all is well, but what if we do care about the success of the preceding commands? In that case, separate the commands with a double ampersand (&&) instead of a semicolon.

alan@alan-ubuntu-vm:~/tmp$ ./cmd1 && ./cmd-fail && date
cmd1
Failing...

As soon as one of the commands fails, no subsequent command will be executed – handy.
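
The reason this works is that && will only run the command on its right if the command on its left exits with a zero (success) status. You can check the exit status of the last command yourself via the special $? variable; for example, using the cmd-fail script from above:

./cmd-fail
echo $?    # prints 2, the status that cmd-fail exited with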

Here is another thought: if you don’t need to run multiple commands in sequence, but simply want to kick off multiple commands, why not execute all of them at once by putting them all in the background? All you need to do is separate each command with a single ampersand:

alan@alan-ubuntu-vm:~/tmp$ ./long-cmd & ./long-cmd & ./long-cmd &
[1] 2643
[2] 2644
[3] 2645
alan@alan-ubuntu-vm:~/tmp$

All three of our long commands have kicked off in the background. This works because as soon as we background the first command by putting an ampersand after it, the shell gives us another prompt and is ready to accept more input, so we can keep going and execute as many commands as we want on the same line by backgrounding each one. It is a slightly quicker way (than waiting for each command to finish, or backgrounding each one separately) to execute multiple commands if they don’t rely on each other. If the commands produce output, it will all still go to the screen, so the output from the different commands can potentially get mixed up, but you can, of course, still redirect the output of each command somewhere else (there is a quick sketch of this further down). You can also easily check on the status of each of the backgrounded commands:

alan@alan-ubuntu-vm:~/tmp$ jobs
[1]   Running                 ./long-cmd &
[2]-  Running                 ./long-cmd &
[3]+  Running                 ./long-cmd &

The number in square brackets is the id of the job, and you can use it to perform actions on the running job, such as bringing it to the foreground, or killing it:

alan@alan-ubuntu-vm:~/tmp$ ./long-cmd & ./long-cmd & ./long-cmd &
[1] 2679
[2] 2680
[3] 2681
alan@alan-ubuntu-vm:~/tmp$ kill %2
alan@alan-ubuntu-vm:~/tmp$ jobs
[1]   Running                 ./long-cmd &
[2]-  Terminated              ./long-cmd
[3]+  Running                 ./long-cmd &
alan@alan-ubuntu-vm:~/tmp$ fg %1
./long-cmd

If you foreground a job by using fg and then want to put it back in the background, all you need to do is press Ctrl-Z to suspend it, and then background the job again by using its id.

alan@alan-ubuntu-vm:~/tmp$ fg %1
./long-cmd
^Z
[1]+  Stopped                 ./long-cmd
alan@alan-ubuntu-vm:~/tmp$ bg %1
[1]+ ./long-cmd &
alan@alan-ubuntu-vm:~/tmp$ jobs
[1]   Running                 ./long-cmd &
[2]-  Running                 ./long-cmd &
[3]+  Running                 ./long-cmd &
alan@alan-ubuntu-vm:~/tmp$
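
One more thing: as I mentioned above, if the output of several backgrounded commands would otherwise get jumbled on the screen, you can send each command’s output to its own file instead. A minimal sketch (the log file names are just examples):

./long-cmd > out1.log 2>&1 &
./long-cmd > out2.log 2>&1 &
./long-cmd > out3.log 2>&1 &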

There you go, not revolutionary, but some handy tips to keep in mind to shave off a few seconds here and there. And believe you me, those seconds really do start to add up after a while. More about shell stuff soon.
