A new TokenStream API has been introduced with Lucene 2.9. This API
has moved from being Token-based to Attribute-based. While
Token still exists in 2.9 as a convenience class, the preferred way
to store the information of a Token is to use AttributeImpls.

TokenStream now extends AttributeSource, which provides
access to all of the token Attributes for the TokenStream.
Note that only one instance per AttributeImpl is created and reused
for every token. This approach reduces object creation and allows local
caching of references to the AttributeImpls. See
incrementToken() for further details.

The workflow of the new TokenStream API is as follows:

1. The consumer retrieves attributes from the stream and stores local
references to all attributes it wants to access.

2. The consumer calls incrementToken() until it returns false,
consuming the attributes after each call.

3. The consumer calls end() so that any end-of-stream operations
can be performed.

4. The consumer calls close() to release any resources when finished
using the TokenStream.
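The workflow above can be sketched as a small consumer. This is an illustrative sketch, not official example code: it assumes the 2.9-era classes WhitespaceTokenizer, TermAttribute, and OffsetAttribute, and uses explicit casts to match the pre-generics signature of addAttribute().

```java
import java.io.IOException;
import java.io.StringReader;

import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.WhitespaceTokenizer;
import org.apache.lucene.analysis.tokenattributes.OffsetAttribute;
import org.apache.lucene.analysis.tokenattributes.TermAttribute;

public class ConsumeTokens {
  public static void main(String[] args) throws IOException {
    TokenStream stream = new WhitespaceTokenizer(new StringReader("the quick brown fox"));

    // 1. Retrieve the attributes once and keep local references; the same
    //    instances are reused and updated in place for every token.
    TermAttribute term = (TermAttribute) stream.addAttribute(TermAttribute.class);
    OffsetAttribute offset = (OffsetAttribute) stream.addAttribute(OffsetAttribute.class);
    try {
      // 2. Advance the stream token by token, consuming the attributes
      //    after each call.
      while (stream.incrementToken()) {
        System.out.println(term.term() + " " + offset.startOffset() + "-" + offset.endOffset());
      }
      // 3. Perform any end-of-stream operations (e.g. final offset).
      stream.end();
    } finally {
      // 4. Release the underlying resources.
      stream.close();
    }
  }
}
```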

To make sure that filters and consumers know which attributes are available,
the attributes must be added during instantiation. Filters and consumers are
not required to check for availability of attributes in
incrementToken().

You can find some example code for the new API in the analysis package level
Javadoc.

The TokenStream-API in Lucene is based on the decorator pattern.
Therefore all non-abstract subclasses must either be final or at least have
a final implementation of incrementToken()! This is checked when Java
assertions are enabled.
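As a sketch of a filter that obeys this rule, consider the following hypothetical lowercasing filter (the class name and behavior are invented for illustration; TokenFilter and TermAttribute are the 2.9 classes). Declaring the class final satisfies the check; alternatively, only incrementToken() could be declared final.

```java
import java.io.IOException;

import org.apache.lucene.analysis.TokenFilter;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.tokenattributes.TermAttribute;

// Hypothetical filter that lowercases each term in place. The class is
// final, which satisfies the rule checked when assertions are enabled.
public final class LowerCaseSketchFilter extends TokenFilter {
  private final TermAttribute termAtt;

  public LowerCaseSketchFilter(TokenStream input) {
    super(input);
    // The attribute is added during instantiation, so downstream filters
    // and consumers know it is available.
    termAtt = (TermAttribute) addAttribute(TermAttribute.class);
  }

  public boolean incrementToken() throws IOException {
    if (!input.incrementToken()) {
      return false;                      // end of the wrapped stream
    }
    // Update the shared AttributeImpl in place rather than creating
    // a new object per token.
    final char[] buffer = termAtt.termBuffer();
    final int length = termAtt.termLength();
    for (int i = 0; i < length; i++) {
      buffer[i] = Character.toLowerCase(buffer[i]);
    }
    return true;
  }
}
```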

Method Detail

incrementToken

Consumers (e.g., IndexWriter) use this method to advance the stream to
the next token. Implementing classes must implement this method and update
the appropriate AttributeImpls with the attributes of the next
token.

The producer must make no assumptions about the attributes after the method
has returned: the caller may arbitrarily change them. If the producer
needs to preserve the state for subsequent calls, it can use
AttributeSource.captureState() to create a copy of the current attribute state.
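A sketch of this pattern is a hypothetical filter that emits every token twice: because the consumer may modify the attributes between calls, the repeated copy is replayed from a captured state rather than from the live attributes. The class is invented for illustration; captureState() and restoreState() are the AttributeSource methods.

```java
import java.io.IOException;

import org.apache.lucene.analysis.TokenFilter;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.util.AttributeSource;

// Hypothetical filter that emits each input token twice.
public final class RepeatTwiceFilter extends TokenFilter {
  private AttributeSource.State saved;  // preserved copy of the last token

  public RepeatTwiceFilter(TokenStream input) {
    super(input);
  }

  public boolean incrementToken() throws IOException {
    if (saved != null) {
      // Replay the preserved copy; this overwrites whatever changes the
      // consumer may have made to the attributes in the meantime.
      restoreState(saved);
      saved = null;
      return true;
    }
    if (!input.incrementToken()) {
      return false;
    }
    // Snapshot the attribute state before handing it to the consumer.
    saved = captureState();
    return true;
  }
}
```

A real duplicate-emitting filter would typically also set the position increment of the repeated token to 0 via PositionIncrementAttribute; that is omitted here to keep the state-preservation pattern in focus.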

To ensure that filters and consumers know which attributes are available,
the attributes must be added during instantiation. Filters and consumers
are not required to check for availability of attributes in
incrementToken().

end

This method is called by the consumer after the last token has been
consumed, after incrementToken() returned false
(using the new TokenStream API). Streams implementing the old API
should upgrade to use this feature.

This method can be used to perform any end-of-stream operations, such as
setting the final offset of a stream. The final offset of a stream might
differ from the offset of the last token, e.g., when one or more whitespace
characters followed the last token and a WhitespaceTokenizer was used.
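The following sketch shows a tokenizer whose end() reports the true final offset. The class is hypothetical and simplified (it counts raw characters and does not apply CharStream offset correction); Tokenizer, TermAttribute, and OffsetAttribute are the 2.9 classes.

```java
import java.io.IOException;
import java.io.Reader;

import org.apache.lucene.analysis.Tokenizer;
import org.apache.lucene.analysis.tokenattributes.OffsetAttribute;
import org.apache.lucene.analysis.tokenattributes.TermAttribute;

// Hypothetical whitespace tokenizer that tracks the total number of
// characters read, so end() can report the final offset even when
// trailing whitespace follows the last token.
public final class SketchWhitespaceTokenizer extends Tokenizer {
  private final TermAttribute termAtt;
  private final OffsetAttribute offsetAtt;
  private int pos = 0;          // absolute read position in the input
  private int finalOffset = 0;  // set once the reader is exhausted

  public SketchWhitespaceTokenizer(Reader input) {
    super(input);
    termAtt = (TermAttribute) addAttribute(TermAttribute.class);
    offsetAtt = (OffsetAttribute) addAttribute(OffsetAttribute.class);
  }

  public boolean incrementToken() throws IOException {
    clearAttributes();
    int c;
    // Skip leading whitespace.
    while ((c = input.read()) != -1 && Character.isWhitespace((char) c)) {
      pos++;
    }
    if (c == -1) {
      finalOffset = pos;  // may lie past the last token's end offset
      return false;
    }
    final int start = pos;
    StringBuffer sb = new StringBuffer();
    do {
      sb.append((char) c);
      pos++;
    } while ((c = input.read()) != -1 && !Character.isWhitespace((char) c));
    if (c != -1) pos++; else finalOffset = pos;

    termAtt.setTermBuffer(sb.toString());
    offsetAtt.setOffset(start, start + sb.length());
    return true;
  }

  public void end() {
    // End-of-stream operation: set the offset past any trailing
    // whitespace that followed the last token.
    offsetAtt.setOffset(finalOffset, finalOffset);
  }
}
```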

reset

Resets this stream to the beginning. This is an optional operation, so
subclasses may or may not implement this method. reset() is not needed for
the standard indexing process. However, if the tokens of a
TokenStream are intended to be consumed more than once, it is
necessary to implement reset(). Note that if your TokenStream
caches tokens and feeds them back again after a reset, it is imperative
that you clone the tokens when you store them away (on the first pass) as
well as when you return them (on future passes after reset()).
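One way to satisfy the cloning rule under the attribute API is to cache captured states, since captureState() stores a copy of the AttributeImpls and restoreState() copies it back. The following hypothetical filter is a sketch in the spirit of Lucene's CachingTokenFilter:

```java
import java.io.IOException;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

import org.apache.lucene.analysis.TokenFilter;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.util.AttributeSource;

// Hypothetical caching filter whose tokens can be consumed repeatedly.
public final class SketchCachingFilter extends TokenFilter {
  private List<AttributeSource.State> cache = null;
  private Iterator<AttributeSource.State> iterator = null;

  public SketchCachingFilter(TokenStream input) {
    super(input);
  }

  public boolean incrementToken() throws IOException {
    if (cache == null) {
      // First pass: consume the wrapped stream and store copies of each
      // token's state (captureState clones, so later consumer changes to
      // the live attributes cannot corrupt the cache).
      cache = new ArrayList<AttributeSource.State>();
      while (input.incrementToken()) {
        cache.add(captureState());
      }
      iterator = cache.iterator();
    }
    if (!iterator.hasNext()) {
      return false;
    }
    // Later passes: copy the cached state back into the attributes.
    restoreState(iterator.next());
    return true;
  }

  public void reset() throws IOException {
    // Replay the cache from the beginning; the wrapped stream is not
    // reset because its tokens were fully consumed on the first pass.
    if (cache != null) {
      iterator = cache.iterator();
    }
  }
}
```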