I have implemented a generator-based scanner in Python that tokenizes a string into tuples of the form (token type, token value):

for token in scan("a(b)"):
    print(token)

would print

("literal", "a")
("l_paren", "(")
...

The next task involves parsing the token stream, and for that I need to be able to look one item ahead of the current one without also advancing the pointer. Since iterators and generators do not provide the complete sequence of items at once but produce each item on demand, lookaheads are a bit trickier than with lists: the next item is not known unless __next__() is called.

What could a straightforward implementation of a generator-based lookahead look like? Currently I'm using a workaround that involves making a list out of the generator:
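That workaround might look something like the following (the `scan` body here is a stand-in for illustration, since the original tokenizer isn't shown):

```python
# Stand-in for the tokenizer from the question.
def scan(s):
    for ch in s:
        kind = "l_paren" if ch == "(" else "r_paren" if ch == ")" else "literal"
        yield (kind, ch)

# Materialize the whole stream so ordinary indexing works.
tokens = [token for token in scan("a(b)")]
print(tokens[1])   # current token: ('l_paren', '(')
print(tokens[2])   # lookahead without advancing anything: ('literal', 'b')
```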

Really nice, both simple and flexible. I think this implementation mostly fits what I would have imagined, thank you. By the way, I'm wondering how issues like this are commonly handled by scanners, parsers and the like in Python. I've gone through some Python core library code like the SRE module or the tokenizer, but I haven't seen anything like a lookahead iterator being used.
– jena, Oct 5 '09 at 13:03

You might use a deque for the buffer, although efficiency probably doesn't matter too much for small lookaheads.
– kindall, Jun 23 '11 at 4:47
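The deque-based buffer the comment suggests could be sketched like this (the `Lookahead` class and its `peek` method are hypothetical names of mine, not from any of the answers):

```python
from collections import deque

class Lookahead:
    """Wrap an iterator with an n-item lookahead buffer."""
    def __init__(self, iterator):
        self._it = iterator
        self._buffer = deque()

    def peek(self, n=0):
        """Return the item n steps ahead without consuming it, or None past the end."""
        while len(self._buffer) <= n:
            try:
                self._buffer.append(next(self._it))
            except StopIteration:
                return None
        return self._buffer[n]

    def __iter__(self):
        return self

    def __next__(self):
        # Serve buffered items first, then fall through to the wrapped iterator.
        if self._buffer:
            return self._buffer.popleft()
        return next(self._it)

la = Lookahead(iter("abc"))
print(la.peek())   # 'a', nothing consumed yet
print(next(la))    # 'a'
print(la.peek(1))  # 'c', two items ahead of the cursor
```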

Pretty good answers there, but my favorite approach would be to use itertools.tee -- given an iterator, it returns two (or more if requested) that can be advanced independently. It buffers in memory just as much as needed (i.e., not much, if the iterators don't get very "out of step" from each other). E.g.:

You can wrap any iterator with this class, and then use the .lookahead attribute of the wrapper to know what the next item to be returned in the future will be. I like to leave all the real logic to itertools.tee and just provide this thin glue!-)
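Since the original snippet isn't reproduced here, a sketch of such a wrapper might look like this (only the `.lookahead` attribute is described above; the class name and the rest of the code are my own guess):

```python
import itertools

class IteratorWithLookahead:
    """Wrap an iterator; .lookahead is the next item that will be returned."""
    def __init__(self, iterable):
        # tee gives two independent cursors over the same stream;
        # nextit always runs one step ahead of it.
        self.it, self.nextit = itertools.tee(iter(iterable))
        self._advance()

    def _advance(self):
        # None marks exhaustion (assumes None is not a valid stream item).
        self.lookahead = next(self.nextit, None)

    def __iter__(self):
        return self

    def __next__(self):
        self._advance()
        return next(self.it)

w = IteratorWithLookahead([1, 2, 3])
print(w.lookahead)  # 1, peeked without consuming anything
print(next(w))      # 1
print(w.lookahead)  # 2
```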

Here is an example that allows a single item to be sent back to the generator:

def gen():
    for i in range(100):
        v = yield i        # when the caller uses next(), v is set to None
        if v is not None:  # "is not None" so that falsy tokens can be pushed back too
            yield None     # this None becomes the return value of the send() call
            v = yield v    # so this yield serves the first next() after send()

g = gen()
x = next(g)
print(0, x)
x = next(g)
print(1, x)
x = next(g)
print(2, x)   # oops, push it back
x = g.send(x)
x = next(g)
print(3, x)   # x is 2 again
x = next(g)
print(4, x)

Since you say you are tokenizing a string and not a general iterable, I suggest the simplest solution: just expand your tokenizer to return a 3-tuple
(token_type, token_value, token_index), where token_index is the index of the token in the string. Then you can look forward, backward, or anywhere else in the string. Just don't go past the end. Simplest and most flexible solution, I think.
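A sketch of that approach (the `scan` body here is my own illustration, based on the simple scanner from the question):

```python
def scan(s):
    """Tokenize s into (token_type, token_value, token_index) triples."""
    for i, ch in enumerate(s):
        if ch == "(":
            yield ("l_paren", ch, i)
        elif ch == ")":
            yield ("r_paren", ch, i)
        else:
            yield ("literal", ch, i)

source = "a(b)"
for token_type, token_value, token_index in scan(source):
    # The index lets you peek at the raw string without touching the stream.
    next_char = source[token_index + 1] if token_index + 1 < len(source) else None
    print(token_type, token_value, "next char:", next_char)
```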

Also, you needn't use a list comprehension to create a list from a generator. Just call the list() constructor on it:
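For instance, with a stand-in for the scan generator from the question:

```python
def scan(s):
    # Stand-in for the tokenizer from the question.
    for ch in s:
        yield ("literal", ch)

tokens = list(scan("ab"))   # instead of [t for t in scan("ab")]
print(tokens)               # [('literal', 'a'), ('literal', 'b')]
```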

This is a very interesting idea since it avoids the issue in the first place. But I think there are two downsides: First, if accessing a token from the token stream is up to a different component than the scanner, both the token stream and the original string would have to be provided. However, I could live with that, and it might be a good idea to let the scanner do the accessing work anyway. Second, peeking at a token via the original string only provides the token's value, not annotations like the token's type, which might be essential in some cases (as in mine).
– jena, Oct 5 '09 at 12:50