Venus and multithreading

> And that's mostly fine by me. I think, if the parser fails let's go on
> and drop the parsing of the URL. Venus instead goes in an infinite
> loop and keeps trying again and again with the same error as above. It
> seems the thread doesn't quit if some kind of errors arise.

I fixed the bug.
The patch is really trivial, and the tests still pass. Here it is:
=== modified file 'planet/spider.py'
--- planet/spider.py	2007-03-14 12:16:04 +0000
+++ planet/spider.py	2007-03-28 15:43:47 +0000
@@ -323,7 +323,6 @@
             for line in (traceback.format_exception_only(type, value) +
                 traceback.format_tb(tb)):
                 log.error(line.rstrip())
-            continue
         output_queue.put(block=True, item=(uri, feed_info, feed))
         uri, feed_info = input_queue.get(block=True)
As you can see, when parsing failed Venus meant to skip the feed, but the
`continue` jumps back to the top of the worker loop without calling
input_queue.get() again, so the thread keeps retrying the same failing
feed forever.
Removing it does the trick.
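
To make the failure mode concrete, here is a minimal, self-contained sketch
of the worker-loop pattern involved. The names (parse, worker, the uri
strings) are illustrative stand-ins, not the real Venus code; the point is
only where a `continue` in the except branch would send control:

```python
import queue

def parse(uri):
    # Stand-in for feed parsing: fail on one particular uri.
    if uri == "bad://feed":
        raise ValueError("boom")
    return ("ok", uri)

def worker(input_queue, output_queue):
    uri = input_queue.get(block=True)
    while uri is not None:          # None acts as the shutdown sentinel
        try:
            feed = parse(uri)
        except Exception as exc:
            feed = ("error", uri, str(exc))
            # The bug: a `continue` here would jump straight back to the
            # `while` test, skipping the get() at the bottom, so the
            # thread would re-parse the same failing uri forever.
            # Falling through instead lets it report the error and move
            # on to the next item.
        output_queue.put(feed, block=True)
        uri = input_queue.get(block=True)

in_q, out_q = queue.Queue(), queue.Queue()
for u in ("good://a", "bad://feed", "good://b", None):
    in_q.put(u)
worker(in_q, out_q)
results = [out_q.get() for _ in range(3)]
```

With the `continue` removed, the failing feed produces one error result and
the worker drains the rest of the queue instead of spinning.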
HTH
--
Lawrence, oluyede.org - neropercaso.it
"It is difficult to get a man to understand
something when his salary depends on not
understanding it" - Upton Sinclair