3 Examples

Various concepts follow here, which can be seen as concrete examples covered by the arrow concept. Not all of them provide links to Haskell-related materials: some are included only to keep the material self-contained (e.g. the #Automaton section links only to the finite state machine concept itself).

The funny thing that took me a long time to understand about arrow parsers is their sort of differential approach, in contrast to the well-known parser approaches. (In some way, the well-known parsers take a differential approach too, in the sense that they manage state transitions where the states are the remainder streams -- but here I mean differential in another sense: arrow parsers seem differential in the way they consume and produce values, i.e. their input and output.)

The idea of borrowing this image from mathematical analysis comes from another topic: the version control systems article Integrals and derivatives, written by Martin Pool, uses a similar image.

A paper written by Magnus Carlsson (page 9) mentions that computation (e.g. state) is threaded through the operands of the &&& operation. Indeed, even the mere definition of the &&& operation

p &&& q = arr dup >>> first p >>> second q

shows that the order of the computation (of the side effects) matters when using &&&, and this can be exemplified very well with parser arrows. See an example found in PArrows, written by Einar Karttunen (module Text.ParserCombinators.PArrow.Combinator):

-- | Match zero or more occurrences of the given parser.
many :: MD i o -> MD i [o]
many = MStar

-- | Match one or more occurrences of the given parser.
many1 :: MD i o -> MD i [o]
many1 x = (x &&& MStar x) >>> pure (\(b, bs) -> b : bs)
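The claim that &&& threads effects through its operands in order can be made concrete with a small, self-contained sketch using the standard Kleisli arrow over IO from Control.Arrow (the label helper below is a hypothetical name introduced only for this example): running p &&& q performs p's effects before q's.

```haskell
import Control.Arrow (Kleisli (..), (&&&))
import Data.IORef (modifyIORef, newIORef, readIORef)

main :: IO ()
main = do
  ref <- newIORef []
  -- An arrow that records its name as a side effect and passes its input through.
  let label name = Kleisli $ \x -> modifyIORef ref (++ [name]) >> return x
  _ <- runKleisli (label "p" &&& label "q") ()
  order <- readIORef ref
  print order  -- the left operand's effects run first: ["p","q"]
```

Because the default (&&&) is built from first and second, the Kleisli instance sequences the left operand's monadic action before the right one's.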

The definition of the between parser combinator shows another example of the importance of the order in which the computation (e.g. the side effects) takes place when using the &&& operation:

between :: MD i t -> MD t close -> MD t o -> MD i o
between open close real = open >>> (real &&& close) >>^ fst
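The effect order in such a combinator can be observed with a minimal sketch assuming only Control.Arrow from base: between is restated here generically for any arrow (with arr fst in place of PArrows' >>^ fst), and the announce helper is a hypothetical name introduced for this example. The open effects fire first, then real's, then close's, because real &&& close threads effects left to right.

```haskell
import Control.Arrow (Arrow, Kleisli (..), arr, (&&&), (>>>))

-- A generic restatement of the between combinator for any arrow.
between :: Arrow a => a i t -> a t close -> a t o -> a i o
between open close real = open >>> (real &&& close) >>> arr fst

-- An arrow that announces its name and passes its input through.
announce :: String -> Kleisli IO a a
announce name = Kleisli $ \x -> putStrLn name >> return x

main :: IO ()
main = runKleisli (between (announce "open") (announce "close") (announce "real")) ()
-- prints: open, real, close (each on its own line)
```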

So the order in which effects occur when using the &&& operation can be important. But let us mention also a counterexample, e.g. nondeterministic function arrows, or more generally, the various implementations of binary relation arrows -- there is no such sequencing of effect order there. Let us see this fact on the pure mathematical concept of binary relations (not minding how they are implemented):

For relations ρ and σ:

(ρ &&& σ) = { (a, (b, c)) | (a, b) ∈ ρ and (a, c) ∈ σ }

(ρ ||| σ) = { (Left a, c) | (a, c) ∈ ρ } ∪ { (Right b, c) | (b, c) ∈ σ }

Neither definition imposes any ordering between the two operand relations.
The picture illustrating

***

in Programming:Haskell_arrows article of Wikibooks suggests exactly such a view: order of side effects can be unimportant at some arrow instances, and the symmetry of the figure reflects this. In generally, however, the figure should use a notation for threading through side effects in a sequence.

3.3 Stream processor

The Lazy K programming language is an interesting esoteric language (from the family of pure, lazy functional languages), whose I/O concept is approached by streams.

Arrows are also useful for grasping the concept of stream processors; see John Hughes's papers among the external links below.
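As a small self-contained illustration, following the stream-processor type from Hughes's arrow papers, a stream processor either emits an output or demands the next input (the runSP interpreter and the double example are illustrative names, not from any library):

```haskell
-- Hughes-style stream processors: a process either emits an output
-- or waits for the next input.
data SP a b = Put b (SP a b) | Get (a -> SP a b)

-- Feed a list of inputs through a stream processor, collecting outputs.
runSP :: SP a b -> [a] -> [b]
runSP (Put b sp) xs       = b : runSP sp xs
runSP (Get k)    (x : xs) = runSP (k x) xs
runSP (Get _)    []       = []

-- A processor that doubles every input it receives.
double :: SP Int Int
double = Get (\x -> Put (2 * x) double)
-- runSP double [1,2,3] == [2,4,6]
```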

5 External links

Programming with Arrows, also written by John Hughes. A more recent paper on arrows, and a very didactic one, introducing the arrow subclasses with detailed examples and rich explanations of the motivation behind each decision.

Programming:Haskell arrows (article of the English Wikibooks) is not only a good introduction; it also shows a nice metaphor for arrows, the factory/conveyor-belt metaphor. We know this image from monads, but here it is adapted for arrows, too.