[Andrae Muys]
>>>>> Found myself needing serialised access to a shared generator from
>>>>> multiple threads. Came up with the following:
>>>>> def serialise(gen):
>>>>>     lock = threading.Lock()
>>>>>     while 1:
>>>>>         lock.acquire()
>>>>>         try:
>>>>>             next = gen.next()
>>>>>         finally:
>>>>>             lock.release()
>>>>>         yield next
[Ype Kingma]
>>>> Is there any reason why the lock is not shared among threads?
>>>> From the looks of this, it doesn't synchronize anything
>>>> between different threads. Am I missing something?
[Jeff Epler]
>>> Yes, I think so. You'd use the same "serialise" generator object in
>>> multiple threads, like this:
>>>     p = serialise(producer_generator())
>>>     threads = [thread.start_new(worker_thread, (p,))
>>>                for t in range(num_workers)]
[Alan Kennedy]
>> Hmm. I think Ype is right: the above code does not correctly serialise
>> access to a generator.
[Ype Kingma]
> Well, I just reread PEP 255, and I can assure you I was missing
> something...
Ype,
Ah: I see now. You thought it didn't work, but for a different reason
than the one I pointed out. You thought that the lock was not shared
between threads, though as Jeff pointed out, it is if you use it the
right way.
But it still doesn't work.
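[The reason the generator-based wrapper fails is that the wrapper is itself a generator: its lock is only acquired *after* a thread has entered the generator frame, and CPython forbids two threads being inside the same generator frame at once, raising ValueError. A minimal Python 3 rendering of the failure (the `source`/`started`/`release` scaffolding is mine, added just to force the race deterministically):]

```python
import threading

def serialise(gen):
    # the generator-based wrapper from the top of the thread,
    # in Python 3 spelling (next(gen) instead of gen.next())
    lock = threading.Lock()
    while True:
        lock.acquire()
        try:
            item = next(gen)
        finally:
            lock.release()
        yield item

started = threading.Event()
release = threading.Event()

def source():
    while True:
        started.set()    # signal: we are now running inside next(gen)
        release.wait()   # keep the wrapper generator "executing"
        yield 1

p = serialise(source())

t = threading.Thread(target=lambda: next(p))
t.start()
started.wait()           # the wrapper is now mid-next() in the other thread

err = None
try:
    next(p)              # a second concurrent next() on the same generator
except ValueError as e:
    err = e              # "generator already executing"

release.set()
t.join()
print(err)
```

The lock never gets a chance to serialise anything: the ValueError is raised on frame entry, before the `lock.acquire()` line is reached.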
[Alan Kennedy]
>> I believe that the following definition of serialise will correct the
>> problem (IFF I've understood the problem correctly :-)
>> #-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
>> import time
>> import thread
>> import threading
>>
>> class serialise:
>>     "Wrap a generator in an iterator for thread-safe access"
>>
>>     def __init__(self, gen):
>>         self.lock = threading.Lock()
>>         self.gen = gen
>>
>>     def __iter__(self):
>>         return self
>>
>>     def next(self):
>>         self.lock.acquire()
>>         try:
>>             return self.gen.next()
>>         finally:
>>             self.lock.release()
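[For readers on current Pythons: a sketch of the same class in Python 3 spelling (`__next__`, `with self.lock`), with a worker setup along the lines Jeff described. The `producer`/`worker` names are illustrative, not from the original posts:]

```python
import threading

class serialise:
    "Wrap a generator in an iterator for thread-safe access"

    def __init__(self, gen):
        self.lock = threading.Lock()
        self.gen = gen

    def __iter__(self):
        return self

    def __next__(self):
        with self.lock:             # one thread at a time may advance gen
            return next(self.gen)

def producer():
    yield from range(100)

shared = serialise(producer())
results = []
results_lock = threading.Lock()

def worker():
    for item in shared:             # each next() is serialised by the lock
        with results_lock:
            results.append(item)

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Because the lock lives in an ordinary method rather than inside a generator frame, concurrent `next()` calls simply queue on the lock instead of blowing up, and every worker's `for` loop terminates cleanly when the shared generator is exhausted.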
[Ype Kingma]
> Looks like a candidate for inclusion in a standard library to me.
Well, maybe :-)
To be honest, I don't have the time to write test cases, docs and
patches. So I think I'll just leave it for people to find in the
Google Groups archives ...
[Alan Kennedy]
>> Also, I don't know if I'm happy with relying on the fact that the
>> generator raises StopIteration for *every* .next() call after the
>> actual generated sequence has ended. The above code depends on the
>> exhausted generator raising StopIteration in every thread. This seems
>> to me the kind of thing that might be python-implementation specific.
>> For example, the original "Simple Generators" specification, PEP 255,
>> makes no mention of the expected behaviour of generators when multiple
>> calls are made to its .next() method after the iteration is
>> exhausted. Not that I can see, anyway. Am I wrong?
[Ype Kingma]
> Quoting from PEP 234:
> http://www.python.org/peps/pep-0234.html
>
> "Once a particular iterator object has raised StopIteration, will
> it also raise StopIteration on all subsequent next() calls?
> ...
> Resolution: once StopIteration is raised, calling it.next()
> continues to raise StopIteration."
Yes, that clears the issue up nicely. Thanks for pointing that out.
So the same code will run correctly in Jython 2.3 and IronPython
(awaited with anticipation).
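[The PEP 234 guarantee is easy to check directly; a small sketch in Python 3 spelling:]

```python
def gen():
    yield "only item"

it = gen()
first = next(it)         # consumes the single item

# once exhausted, every further next() raises StopIteration again,
# per the PEP 234 resolution quoted above
caught = 0
for _ in range(3):
    try:
        next(it)
    except StopIteration:
        caught += 1

print(first, caught)     # each of the three extra calls raised
```

It is exactly this repeatability that lets every worker thread see its own StopIteration and shut down cleanly.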
regards,
--
alan kennedy
------------------------------------------------------
check http headers here: http://xhaus.com/headers
email alan: http://xhaus.com/contact/alan