> If it doesn't work: Where are the problems with it? Do you know
> counter-examples of programming languages where one can't do
> lexical analysis like that?
> (I know of Pascal's '..' problem; are there other problem cases?)

Currently I'm trying to construct a C scanner and parser, for cross
compilation. The C specification defines several phases of lexical
processing (trigraph replacement, line splicing, tokenization) before
tokens can be created. IMO the only practical solution here is a
multi-stage scanner, where each stage performs its substitutions
before passing the characters on to the next stage.

I also had some problems with the C preprocessor, which must
distinguish escaped from non-escaped line ends in a #define (a
backslash-newline continues the directive). Also, in a #define the
leading '(' of a parameter list must immediately follow the macro
name, with no whitespace allowed in between; with whitespace, the '('
is just part of the replacement text. In #include I had problems with
the <file> syntax, because '<' is an operator in other contexts
(expressions), and the set of characters allowed in a path
specification differs from the other character sets (literals,
identifiers). To me this looks like a context-sensitive lexical
grammar?