Meijer, Fokkinga & Paterson, in the paper Functional programming with bananas, lenses, envelopes and barbed wire, identified a number of generic patterns for recursive programming that they had observed, catalogued and systematized. The aim of that paper is to establish a number of rules for modifying and rewriting expressions involving these generic recursion patterns.

As it turns out, these patterns are instances of the same phenomenon we saw last lecture: the recursion comes from specifying a different algebra, and then taking the uniquely existing morphism induced by initiality (or, as we shall see, finality).

Before we go through the recursion patterns, we need to establish a few pieces of theoretical language, dualizing the Eilenberg-Moore algebra constructions from the last lecture.

Definition If P is an endofunctor, then a P-coalgebra on A is a morphism a: A → PA.

A morphism of coalgebras f: (A, a) → (B, b) is a morphism f: A → B such that Pf ∘ a = b ∘ f, i.e. such that the corresponding square commutes.
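In Haskell, a coalgebra is just a function unwrapping one layer of structure. A minimal sketch, with names (Coalgebra, NatF, natCoalg) chosen by us for illustration:

```haskell
{-# LANGUAGE DeriveFunctor #-}

-- A P-coalgebra on a: one step of unfolding.
type Coalgebra p a = a -> p a

-- The "Peano" functor P X = 1 + X.
data NatF x = ZeroF | SuccF x deriving (Show, Functor)

-- A NatF-coalgebra on Int: a number is either zero or a successor.
natCoalg :: Coalgebra NatF Int
natCoalg 0 = ZeroF
natCoalg n = SuccF (n - 1)
```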

Just as with algebras, we get a category of coalgebras. And the interesting objects here are the final coalgebras. Just as with algebras, we have

Lemma (Lambek) If a: A → PA is a final coalgebra, then a is an isomorphism.

Finally, one thing that makes us care highly about these entities: in an appropriate category (such as ω-CPO), initial algebras and final coalgebras coincide, with the correspondence given by inverting the algebra/coalgebra morphism. In Haskell this is not quite true (specifically, the final coalgebra for the lists functor gives us streams...).
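The Haskell divergence is easy to observe: lazy lists behave like the final coalgebra of the list functor, so the standard library anamorphism unfoldr can produce an infinite stream, something no element of the initial algebra (a finite list) could be. A small sketch:

```haskell
import Data.List (unfoldr)

-- The anamorphism for lists: unfoldr from Data.List.
-- Here it produces the infinite stream of natural numbers.
nats :: [Integer]
nats = unfoldr (\n -> Just (n, n + 1)) 0
```

Only finite prefixes such as `take 5 nats` can ever be inspected, of course.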

Onwards to recursion schemes!

We shall define a few specific morphisms we'll use repeatedly. The notation introduced here occurs all over the place in these corners of the literature, and is good to be aware of in general:

If a: TA → A is an initial algebra for T, we write a = inA.

If a: A → TA is a final coalgebra for T, we write a = outA.

We write μf for the fixed point operator

mu f = x where x = f x
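As a quick sanity check on this operator, here is a sketch using mu to tie the recursive knot for factorial (mu is repeated with its type signature so the block stands alone; fact is our name):

```haskell
-- The fixed point operator from above, with its type signature.
mu :: (a -> a) -> a
mu f = x where x = f x

-- Factorial as the fixed point of a non-recursive step function.
fact :: Integer -> Integer
fact = mu (\rec n -> if n == 0 then 1 else n * rec (n - 1))
```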

We note that in the situation considered by MFP, initial algebras and final coalgebras coincide, and thus inA, outA are the pair of mutually inverse isomorphisms induced by either the initial algebra or the final coalgebra structure.

A catamorphism is the uniquely existing morphism from an initial algebra to a different algebra. We have to define maps down to the return value type for each of the constructors of the complex data type we're recursing over, and the catamorphism will deconstruct the structure (trees, lists, ...) and do a generalized fold over the structure at hand before returning the final value.

The intuition is that for catamorphisms we start essentially structured, and dismantle the structure.

Example: the length function from last lecture. This is the catamorphism for the list functor TX = 1 + A × X, given by the maps 1 → ℕ: * ↦ 0 and A × ℕ → ℕ: (a, n) ↦ n + 1.
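To make this concrete, here is a Haskell sketch (Fix, cata, ListF, len are our names; MFP write the catamorphism with "banana brackets"):

```haskell
{-# LANGUAGE DeriveFunctor #-}

-- The initial algebra as a fixed point of the functor.
newtype Fix f = In { out :: f (Fix f) }

-- The catamorphism: the unique algebra morphism out of the initial algebra.
cata :: Functor f => (f b -> b) -> Fix f -> b
cata alg = alg . fmap (cata alg) . out

-- The list functor T X = 1 + A × X.
data ListF a x = NilF | ConsF a x deriving Functor

-- Length as the catamorphism for the algebra {NilF ↦ 0, ConsF _ n ↦ n + 1}.
len :: Fix (ListF a) -> Int
len = cata alg
  where
    alg NilF        = 0
    alg (ConsF _ n) = n + 1
```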

The hylomorphisms capture one of the two possible compositions of anamorphisms and catamorphisms. Parametrized over an algebra a: TB → B and a coalgebra c: A → TA, the hylomorphism is a recursion pattern that computes a value in B from a value in A by generating some sort of intermediate structure and then collapsing it again.

It is, thus, the composition of the uniquely existing morphism from a coalgebra to the final coalgebra for an endofunctor, followed by the uniquely existing morphism from the initial algebra to some other algebra.
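A Haskell sketch of this composition, with factorial as the classic example (the intermediate structure is the list [n, n-1, ..., 1]; hylo, ListF, factorial are our names):

```haskell
-- hylo alg coalg: unfold with the coalgebra, fold with the algebra.
-- Written this way, the intermediate structure is never built in full.
hylo :: Functor f => (f b -> b) -> (a -> f a) -> a -> b
hylo alg coalg = alg . fmap (hylo alg coalg) . coalg

-- The list functor, with an explicit Functor instance.
data ListF e x = NilF | ConsF e x

instance Functor (ListF e) where
  fmap _ NilF        = NilF
  fmap g (ConsF e x) = ConsF e (g x)

factorial :: Integer -> Integer
factorial = hylo alg coalg
  where
    coalg 0 = NilF               -- unfold n into the list [n, n-1, .., 1]
    coalg n = ConsF n (n - 1)
    alg NilF        = 1          -- fold the list back up by multiplying
    alg (ConsF e r) = e * r
```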

The metamorphism is the other composition of an anamorphism with a catamorphism. It takes some structure, deconstructs it, and then reconstructs a new structure from it.

As a recursion pattern, it's kinda boring - it'll take an interesting structure, deconstruct it into a scalar value, and then reconstruct some structure from that scalar. As such, it won't even capture the richness of hom(FX, GY), since any morphism expressed as a metamorphism will factor through a map between the underlying scalar objects.

Paramorphisms were discussed in the MFP paper as a way to extend the catamorphisms so that the operating function can access its arguments in computation as well as in recursion. We gave the factorial above as a hylomorphism instead of a catamorphism precisely because no simple enough catamorphic structure exists.
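A Haskell sketch of the paramorphism, in which the algebra sees both the recursive result and the substructure it came from (Fix, cata, para, NatF and the helpers are our names; factorial on Peano naturals is the illustration):

```haskell
{-# LANGUAGE DeriveFunctor #-}

newtype Fix f = In { out :: f (Fix f) }

cata :: Functor f => (f b -> b) -> Fix f -> b
cata alg = alg . fmap (cata alg) . out

-- The paramorphism: each position carries the substructure
-- alongside the recursively computed value.
para :: Functor f => (f (Fix f, b) -> b) -> Fix f -> b
para alg = alg . fmap (\x -> (x, para alg x)) . out

data NatF x = ZeroF | SuccF x deriving Functor

toNat :: Integer -> Fix NatF
toNat 0 = In ZeroF
toNat n = In (SuccF (toNat (n - 1)))

fromNat :: Fix NatF -> Integer
fromNat = cata step
  where
    step ZeroF     = 0
    step (SuccF n) = n + 1

-- Factorial as a paramorphism: the algebra uses the predecessor m itself,
-- not just the recursively computed value r.
factP :: Fix NatF -> Integer
factP = para alg
  where
    alg ZeroF          = 1
    alg (SuccF (m, r)) = (fromNat m + 1) * r
```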

The functor – × A has right adjoint (–)^A, the exponential. The universal mapping property of exponentials follows from the adjointness property.
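In Haskell, this adjunction Hom(X × A, B) ≅ Hom(X, B^A) is witnessed by the standard functions curry and uncurry; a sketch (phi, psi and the addition helpers are our names):

```haskell
-- The two directions of the hom-set bijection for (– × A) ⊣ (–)^A:
phi :: ((x, a) -> b) -> (x -> (a -> b))
phi = curry

psi :: (x -> (a -> b)) -> ((x, a) -> b)
psi = uncurry

-- Addition in both guises:
addPair :: (Int, Int) -> Int
addPair (x, a) = x + a

addCurried :: Int -> Int -> Int
addCurried = phi addPair
```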

The diagonal functor Δ: C → C × C has a left adjoint given by the coproduct and a right adjoint given by the product.

More generally, the functor that takes c to the constant functor constc (with constc(j) = c and constc(f) = 1c) has left and right adjoints given by colimits and limits, respectively.

Pointed rings are pairs of a ring and one element singled out for attention. Homomorphisms of pointed rings need to take the distinguished point to the distinguished point. There is an obvious forgetful functor from pointed rings to rings, and this has a left adjoint - a free ring functor that adjoins a new indeterminate. This gives a formal definition of what we mean by formal polynomial expressions etc.

Given sets A, B, we can consider the powersets P(A), P(B) containing, as elements, all subsets of A, B respectively. Suppose f: A → B is a function; then f⁻¹ takes subsets of B to subsets of A.

Viewing P(A) and P(B) as partially ordered sets under the inclusion relation, and then as categories induced by the partial order, f⁻¹ turns into a functor between partial orders. It turns out f⁻¹ has a left adjoint given by the operation im(f) taking a subset to the set of its images under the function f. And it has a right adjoint, taking U ⊆ A to {b ∈ B : f⁻¹(b) ⊆ U}.
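A small finite sketch of both adjoints in Haskell (the universe, the function f, and all names here are toy choices of ours):

```haskell
import qualified Data.Set as Set
import Data.Set (Set)

-- A finite carrier for A, and a sample function f : A -> B.
univA :: [Int]
univA = [0 .. 9]

f :: Int -> Int
f = (`div` 2)

-- f⁻¹ : P(B) -> P(A)
preimage :: Set Int -> Set Int
preimage t = Set.fromList [a | a <- univA, f a `Set.member` t]

-- Left adjoint: direct image. image s ⊆ t  iff  s ⊆ preimage t.
image :: Set Int -> Set Int
image = Set.map f

-- Right adjoint: preimage t ⊆ s  iff  t ⊆ forallImage s.
forallImage :: Set Int -> Set Int
forallImage s =
  Set.fromList [ b | b <- map f univA
                   , all (`Set.member` s) [a | a <- univA, f a == b] ]
```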

We can introduce a categorical structure to logic. We let L be a formal language, say of predicate logic. Then for any list x = x1, x2, ..., xn of variables, we have a preorder Form(x) of formulas with no free variables not occurring in x. The preorder on Form(x) comes from the entailment relation: f ⊢ g if in every interpretation of the language, everything satisfying f also satisfies g.

We can build an operation on these preorders - a functor on the underlying categories - by adjoining a single new variable: *: Form(x) → Form(x, y), sending each formula to itself. Obviously, if f ⊢ g with x the source of free variables, then introducing a new allowable free variable without actually changing the formulas leaves the entailment intact.

It turns out that there is a right adjoint to * given by ∀y, and a left adjoint to * given by ∃y. The adjointness properties give us classical deduction rules from logic.
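Unpacking the two hom-set bijections gives, schematically, the familiar quantifier rules (a sketch; we write f* for the formula f regarded as an element of Form(x, y)):

```latex
% Left adjoint:  \exists y \dashv {*}
\exists y.\, f \vdash g \quad\Longleftrightarrow\quad f \vdash g^{*}
% Right adjoint: {*} \dashv \forall y
f^{*} \vdash g \quad\Longleftrightarrow\quad f \vdash \forall y.\, g
```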

and demonstrate how this can be written as a catamorphism by giving the algebra it maps to.

Write the Fibonacci function as a hylomorphism.

Write the Towers of Hanoi as a hylomorphism. You'll probably want to use binary trees as the intermediate data structure.

Write a prime numbers generator as an anamorphism.

* The integers have a partial order induced by the divisibility relation. We can thus take any integer and arrange all its divisors in a tree, with an edge between d and n if d | n and d does not divide any other proper divisor of n. Write an anamorphic function that will generate this tree for a given starting integer. Demonstrate how this function is an anamorphism by giving the coalgebra it maps from.

Hint: You will be helped by having a function to generate a list of all primes. One suggestion is: