That will tell each command to run in the background, sending its output and error information to files. When a command runs in the background, it detaches from the current session, allowing the next command to run. It's important to put the ampersand (&) as the last character of the line; otherwise, it may not be recognized correctly.

Alternatively, have each of your shell scripts run the associated .php script you're apparently calling, with the same semantics as above. In this scenario, your foo.sh might contain:
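For example, a minimal sketch (the names foo.php, foo.out, and foo.err are illustrative; the second command is a runnable stand-in showing the same redirect-and-background pattern):

```shell
#!/bin/sh
# Hypothetical foo.sh -- in practice the line would be something like:
#   php foo.php > foo.out 2> foo.err &
# Runnable stand-in with the same redirection and backgrounding:
sh -c 'echo to stdout; echo to stderr >&2' > foo.out 2> foo.err &
wait          # only so this example can show the files; foo.sh itself need not wait
cat foo.out   # the stdout line landed here
cat foo.err   # the stderr line landed here
```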

Note: If you really want the output going to the screen/terminal, you can leave off the output redirection:

foo.sh &

Remember, though, that running multiple commands concurrently can cause confusing output (you may not know which output or errors came from which command), and if you log out, the output has nowhere to go.

Hi Chris. Thanks for the reply. Given that my scenario has each shell test running forever, I really don't want to write the stdout/stderr to a file; rather, I would like to see the output/stderr just displayed on the screen... I thought running in the background meant I didn't have to wait for completion before running the next script. Have I missed something? Also, as to the acceptance rate: I've tried to delete a bunch of things in the past but never could... in other cases, the answers were totally off base and no help.
–
tom smith Mar 28 '12 at 0:24

Running in the background does mean you don't have to wait for completion before running the next script/command. That's what the & at the end of the command does. If you don't place the ampersand at the end, it doesn't mean the same thing. Redirecting output to a file (especially for background commands) is usually a good thing, unless they send logs somewhere else (like syslog). Otherwise, it's easy to get output mixed up between the multiple commands, since they're running at the same time. I'll update the answer to add a note about output to the terminal.
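The difference is easy to see with a timed sketch (sleep stands in for any long-running command):

```shell
#!/bin/sh
# Three backgrounded jobs start immediately and run concurrently;
# without the trailing &, this sequence would take ~3 seconds instead of ~1.
start=$(date +%s)
sleep 1 &
sleep 1 &
sleep 1 &
wait   # block until every background job has finished
elapsed=$(( $(date +%s) - start ))
echo "elapsed: ${elapsed}s"   # roughly 1s, since the sleeps overlapped
```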
–
Christopher Cashell Mar 28 '12 at 1:49

By default, this will start as many processes of the same command as you have CPUs.

parallel -j 3 sh -c "echo hi; sleep 2; echo bye" -- 1 2 3

This forces 3 processes to run at the same time, even if you have only 2 CPUs.
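If parallel isn't installed, a similar cap on concurrency can be approximated in plain shell by launching jobs in batches and waiting between them. This is only a sketch: real parallel keeps N jobs running continuously, while this waits for each batch of N in lockstep.

```shell
#!/bin/sh
# Run at most N background jobs at a time, in batches.
N=3
i=0
: > jobs.log
for arg in 1 2 3 4 5 6; do
    sh -c "sleep 0.1; echo done $arg" >> jobs.log &
    i=$((i + 1))
    if [ "$i" -ge "$N" ]; then
        wait   # let the current batch of N finish before starting more
        i=0
    fi
done
wait           # catch any jobs left over from a final partial batch
```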

Example:

[me@neo]<bash><~> 31
06:09 Tue Mar 05 > parallel -j 10 sh -c "echo Hello World; sleep 3; echo Good morning" -- $(seq 1 10)
Hello World
Hello World
Hello World
Hello World
Hello World
Hello World
Hello World
Hello World
Hello World
Hello World
Good morning
Good morning
Good morning
Good morning
Good morning
Good morning
Good morning
Good morning
Good morning
Good morning

Chris has the right general idea for most scripts: end each command with & and they'll go into the background. That said, here's something I keep around for some higher-end tasks. Let me introduce you to my handy toolbox script, parallel-exec.pl

With some light tweaking, you can change the number of threads, provide an array of commands, etc. As it stands, it takes the number of threads specified on the command line, a database name, and a list of files to import, then spawns one instance per file, running the specified number at once.
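The Perl script itself isn't reproduced here, but the same spawn-one-per-file, N-at-once pattern can be sketched with xargs -P (a GNU/BSD extension). The echo is a stand-in for whatever the real import command would be, and every name below is hypothetical:

```shell
#!/bin/sh
# One job per input line, at most 3 running at once. For the database
# import case, the inner command might instead be: mysql dbname < "$0"
printf '%s\n' file1.sql file2.sql file3.sql file4.sql file5.sql > files.txt
xargs -P 3 -n 1 sh -c 'echo imported "$0"' < files.txt > import.log
```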