Here I go again touting my AsynQueue package. Sorry, but it just seems
to be a very appropriate solution to many of the problems being raised
recently.
I've recently added a "processworker" module that does just what it
sounds like. You can now queue up jobs to be run on a separate Python
interpreter. If the interpreter crashes due to a segfault or anything
else, you just construct a new worker instance and attach it to the
queue, and the jobs continue merrily along.
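That crash-and-rehire pattern can be sketched with nothing but the standard library. This is not AsynQueue's actual API, just the behavior it automates, with `ProcessPoolExecutor` standing in for a process worker:

```python
import os
from concurrent.futures import ProcessPoolExecutor
from concurrent.futures.process import BrokenProcessPool

def square(n):
    # A normal job, executed on a separate Python interpreter.
    return n * n

def crash(_):
    # Simulate a segfault-style death of the worker interpreter.
    os._exit(1)

def run():
    out = []
    pool = ProcessPoolExecutor(max_workers=1)
    out.append(pool.submit(square, 3).result())     # first job succeeds
    try:
        pool.submit(crash, None).result()           # this job kills the worker
    except BrokenProcessPool:
        pool = ProcessPoolExecutor(max_workers=1)   # "hire" a replacement
    out.append(pool.submit(square, 4).result())     # later jobs continue
    pool.shutdown()
    return out

if __name__ == "__main__":
    print(run())  # [9, 16]
```

The difference is that AsynQueue does the replacement bookkeeping for you and hands back Twisted Deferreds instead of blocking on results.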
In addition to deferred-based priority queuing, the queue object has
powerful capabilities for hiring and firing workers, letting workers
resign when they can no longer perform their duties, assigning tasks to
appropriate workers, and reassigning tasks from terminated workers.
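In outline, that hiring/firing bookkeeping amounts to something like the toy scheduler below. This is pure illustration, not AsynQueue's interface: tasks carry priorities, and a fired worker's unfinished tasks go back on the queue for reassignment:

```python
import heapq
import itertools

class ToyQueue:
    """Toy priority queue that hires/fires workers and reassigns
    the unfinished tasks of a fired worker. Illustrative only."""
    def __init__(self):
        self._heap = []                  # (priority, seq, task); lower runs first
        self._seq = itertools.count()    # tie-breaker preserves FIFO order
        self._ids = itertools.count(1)
        self.assigned = {}               # worker id -> tasks it currently holds

    def put(self, task, priority=100):
        heapq.heappush(self._heap, (priority, next(self._seq), task))

    def hire(self):
        wid = next(self._ids)
        self.assigned[wid] = []
        return wid

    def assign(self, wid):
        # Hand the highest-priority pending task to this worker.
        _, _, task = heapq.heappop(self._heap)
        self.assigned[wid].append(task)
        return task

    def fire(self, wid):
        # A terminated worker's tasks are re-queued at top priority.
        for task in self.assigned.pop(wid):
            self.put(task, priority=0)

q = ToyQueue()
w1 = q.hire()
q.put("compile"); q.put("urgent job", priority=1)
q.assign(w1)         # w1 gets "urgent job" (higher priority)
q.fire(w1)           # w1 dies; its task goes back on the queue
w2 = q.hire()
print(q.assign(w2))  # urgent job
print(q.assign(w2))  # compile
```

AsynQueue layers Deferred-based results on top of this kind of dispatch, so callers never block while a task waits for a worker.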
See http://tinyurl.com/349k2o
(http://foss.eepatents.com/AsynQueue/browser/projects/AsynQueue/trunk/asynqueue/processworker.py)
By the way, AsynQueue (without the new processworker stuff) is now
available in Debian testing, thanks to the efforts of Eric Evans. Just
apt-get install python-asynqueue.
Best regards, Ed