The Lexer is the most fundamental concept. A Lexer
has a member function, collect,
that accepts a token definition container, Def,
and the name of the lexer state (a string) to which the token
definitions of the lexer component need to be added. It does not
return anything (its return type is void). Each
Lexer can represent a specific pattern or algorithm, or it can be a more
complex lexer component formed as a composition of other Lexers. Additionally,
a Lexer exposes a member function, add_actions,
that accepts the token definition container Def
and likewise returns nothing (again, the return type is void).

In the expressions below, the behavior of the lexer component, l, is left unspecified in the base
Lexer concept. It is
specified by subsequent, more refined concepts and by the actual
models thereof.

For any Lexer the following expressions must be valid:

Expression: l.collect(def, state)
Semantics: Adds all token definitions provided by this Lexer instance to
the lexer state state of the token definition container def.
Return type: void

Expression: l.add_actions(def)
Semantics: Adds all semantic actions provided by this Lexer instance to
the token definition container def.
Return type: void

Upon return from l.collect the following post conditions
should hold:

On return, def holds
all token definitions defined in the Lexer, l.
This includes all Lexers contained inside l.

Upon return from l.add_actions the following post conditions
should hold:

On return, def holds
all semantic actions correctly associated with the corresponding token
definitions as defined in the Lexer, l.
This includes all semantic actions defined by the Lexers contained
inside l.