> Either way, it clarifies how to deal with the state. (Which is why
> I'm surprised that the point didn't come up so far.)
Right, so we have stateless seqs that are only used to create
stateful producers of their values (iterators). `for` and any other
iteration construct asks for an iterator and uses it. Each iterator is
implemented in the best way for its seq, so for in-range it's just a
wrapped counter. There are, of course, many optimizations possible,
like the ones you described. You can do even more if you have a
compiler pass that understands sequences. This all sounds pretty
close to Python.
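To make the seq/iterator split concrete, here is a minimal Python
sketch of that design (the class names are illustrative, not any real
API): a stateless range-like seq whose iterator really is just a
wrapped counter, so the seq itself can be traversed any number of
times.

```python
class InRange:
    """Stateless description of a half-open integer range."""
    def __init__(self, start, stop):
        self.start, self.stop = start, stop  # immutable; no iteration state here

    def __iter__(self):
        # Each traversal asks for a fresh stateful iterator.
        return InRangeIterator(self.start, self.stop)

class InRangeIterator:
    """The stateful producer: just a wrapped counter."""
    def __init__(self, current, stop):
        self.current, self.stop = current, stop

    def __iter__(self):
        return self

    def __next__(self):
        if self.current >= self.stop:
            raise StopIteration
        value = self.current
        self.current += 1
        return value

r = InRange(0, 3)
assert list(r) == [0, 1, 2]
assert list(r) == [0, 1, 2]  # the seq is stateless, so it can be re-iterated
```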
One question is whether functions like stream-map produce a seq or an
iterator. In Python they produce the latter, so in
(let ([s2 (stream-map func s)])
  (for ([i s2]) ...)
  (for ([i s2]) ...))
the second loop will do nothing (the iterator is already exhausted by
the first loop). I think it's better to return seqs, so both loops do
the same thing. This is how it works at the moment.
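The Python behavior described above is easy to demonstrate: map
returns an iterator, not a re-iterable sequence, so a second traversal
sees nothing.

```python
# map produces an iterator; iterating it consumes it.
s2 = map(lambda x: x * 10, [1, 2, 3])
first = [i for i in s2]
second = [i for i in s2]
assert first == [10, 20, 30]
assert second == []  # exhausted by the first loop
```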
This shows why a memoizing iterator can be useful. It mutates the seq
to store elements that were already calculated, so they are not
recalculated on subsequent iterations. In the example above, if func
is expensive, using a memoizing iterator can be a win, since it avoids
recalculating func in the second loop.
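A rough Python sketch of the memoizing idea (hypothetical names; it
also simplifies by consuming the source as a plain iterator, where a
real seq would hand out its own iterator): the mapped seq caches each
computed element on itself, so every iterator over it shares the cache
and func runs at most once per element.

```python
class MemoizedMapSeq:
    """Hypothetical sketch of a stream-map-like seq that mutates itself
    to cache computed elements, so func runs at most once per element."""
    def __init__(self, func, source):
        self.func = func
        self.source = iter(source)  # simplification: source traversed once
        self.cache = []             # elements computed so far, shared by all iterators

    def __iter__(self):
        index = 0
        while True:
            if index == len(self.cache):
                try:
                    x = next(self.source)
                except StopIteration:
                    return            # source exhausted; yield only cached values
                self.cache.append(self.func(x))  # memoize the expensive call
            yield self.cache[index]
            index += 1

calls = []
def expensive(x):
    calls.append(x)   # record each invocation
    return x * x

s2 = MemoizedMapSeq(expensive, [1, 2, 3])
assert list(s2) == [1, 4, 9]
assert list(s2) == [1, 4, 9]   # second loop sees the same elements
assert calls == [1, 2, 3]      # func ran only once per element
```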
Since each seq type has its own iterator implementation, the
"memoizing" iterator for a particular sequence can skip memoization if
it's always cheaper to recalculate. Examples of such seqs are
in-range, in-naturals, immutable lists, and other immutable containers
(mutable containers are trickier: they are either not seqs or should
never memoize).
Eugene