These two last constructions are directly motivated by the maps induced from the universal properties of products and coproducts.


We shall write <math>(f\times g)</math> and <math>(f+g)</math> for the <math>\Delta</math> and <math>\nabla</math> constructions, respectively.

Revision as of 18:40, 17 November 2009

IMPORTANT NOTE: THESE NOTES ARE STILL UNDER DEVELOPMENT. PLEASE WAIT UNTIL AFTER THE LECTURE BEFORE HANDING ANYTHING IN OR TREATING THE NOTES AS READY TO READ.

1 Recursion patterns

Meijer, Fokkinga & Paterson identified, in the paper Functional programming with bananas, lenses, envelopes and barbed wire, a number of generic patterns for recursive programming that they had observed, catalogued and systematized. The aim of that paper is to establish a number of rules for modifying and rewriting expressions involving these generic recursion patterns.

As it turns out, these patterns are instances of the same phenomenon we saw last lecture: the recursion comes from specifying a different algebra, and then taking the uniquely existing morphism induced by initiality (or, as we shall see, finality).

Before we go through the recursion patterns, we need to establish a few pieces of theoretical language, dualizing the Eilenberg-Moore algebra constructions from the last lecture.

1.1 Coalgebras for endofunctors

Definition If <math>P: C \to C</math> is an endofunctor, then a P-coalgebra on A is a morphism <math>c: A \to PA</math>.

A morphism of coalgebras <math>f: (A, c) \to (B, d)</math> is some <math>f: A \to B</math> such that the diagram corresponding to the equation <math>d \circ f = Pf \circ c</math> commutes.

Just as with algebras, we get a category of coalgebras. And the interesting objects here are the final coalgebras. Just as with algebras, we have

Lemma (Lambek) If <math>c: A \to PA</math> is a final coalgebra, it is an isomorphism.

Finally, one thing that makes us care highly about these entities: in an appropriate category (such as <math>\omega</math>-CPO), initial algebras and final coalgebras coincide, with the correspondence given by inverting the algebra/coalgebra morphism. In Haskell this is not quite true (specifically, the final coalgebra for the list functor gives us streams...).

Onwards to recursion schemes!

We shall define a few specific morphisms we'll use repeatedly. This notation, introduced here, occurs all over the place in these corners of the literature, and is good to be aware of in general:

If <math>a: TA \to A</math> is an initial algebra for T, we denote <math>a = in_A</math>.

If <math>a: A \to TA</math> is a final coalgebra for T, we denote <math>a = out_A</math>.

We write μf for the fixed point operator

mu f = x where x = f x
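As a quick illustration of the fixed point operator (a minimal sketch of our own; the `fact` example is not part of the notation being introduced), laziness lets `mu` tie the recursive knot for us:

```haskell
-- the fixed point operator from above: mu f = f (mu f), via a lazy knot
mu :: (a -> a) -> a
mu f = x where x = f x

-- factorial obtained as the fixed point of a non-recursive step function
fact :: Integer -> Integer
fact = mu (\rec n -> if n == 0 then 1 else n * rec (n - 1))
```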

We note that in the situation considered by MFP, initial algebras and final coalgebras coincide, and thus <math>in_A, out_A</math> are the pair of mutually inverse isomorphisms induced by either the initial algebra- or the final coalgebra-structure.

1.2 Catamorphisms

A catamorphism is the uniquely existing morphism from an initial algebra to a different algebra. We have to define maps down to the return value type for each of the constructors of the complex data type we're recursing over, and the catamorphism will deconstruct the structure (trees, lists, ...) and do a generalized fold over the structure at hand before returning the final value.

The intuition is that for catamorphisms we start essentially structured, and dismantle the structure.
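A minimal Haskell sketch of this (the names `Fix`, `cata` and `ListF` are our choices here, not notation fixed by the notes), with the length function as the algebra that forgets elements and counts constructors:

```haskell
-- Fix f is the carrier of the initial algebra for the functor f
newtype Fix f = In { out :: f (Fix f) }

-- the catamorphism: given an algebra phi, fold the structure down to a value
cata :: Functor f => (f a -> a) -> Fix f -> a
cata phi = phi . fmap (cata phi) . out

-- the underlying functor for lists with elements of type e
data ListF e x = NilF | ConsF e x

instance Functor (ListF e) where
  fmap _ NilF        = NilF
  fmap g (ConsF e x) = ConsF e (g x)

fromList :: [e] -> Fix (ListF e)
fromList = foldr (\e r -> In (ConsF e r)) (In NilF)

-- length as a catamorphism: the algebra sends NilF to 0 and ConsF to successor
len :: Fix (ListF e) -> Int
len = cata alg
  where
    alg NilF        = 0
    alg (ConsF _ n) = n + 1
```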

Example: the length function from last lecture. This is the catamorphism for the list functor, given by the maps from the last lecture.

1.3 Anamorphisms

The anamorphism is the dual notion: the uniquely existing morphism from some coalgebra to the final coalgebra for an endofunctor. Where the catamorphism starts structured and dismantles the structure, the anamorphism starts with an essentially unstructured seed value and builds structure up from it. Unfolding an integer n into a list this way, the resulting function will - as we can verify by compiling and running - give us the same kind of reversed list of the n first integers as the first function above would.
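Dually to the catamorphism, the anamorphism can be sketched in Haskell like this (again, `Fix`, `ana`, `ListF` and `countdown` are our own names, used for illustration):

```haskell
newtype Fix f = In { out :: f (Fix f) }

-- the anamorphism: given a coalgebra psi, unfold a seed into structure
ana :: Functor f => (a -> f a) -> a -> Fix f
ana psi = In . fmap (ana psi) . psi

data ListF e x = NilF | ConsF e x

instance Functor (ListF e) where
  fmap _ NilF        = NilF
  fmap g (ConsF e x) = ConsF e (g x)

toList :: Fix (ListF e) -> [e]
toList (In NilF)        = []
toList (In (ConsF e r)) = e : toList r

-- unfold n into the reversed list [n, n-1, ..., 1]
countdown :: Int -> Fix (ListF Int)
countdown = ana coalg
  where
    coalg 0 = NilF
    coalg n = ConsF n (n - 1)
```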

1.4 Hylomorphisms

The hylomorphisms capture one of the two possible compositions of anamorphisms and catamorphisms. Parametrized over an algebra <math>\phi: TB \to B</math> and a coalgebra <math>\psi: A \to TA</math>, the hylomorphism is a recursion pattern that computes a value in B from a value in A by generating some sort of intermediate structure and then collapsing it again.

It is, thus, the composition of the uniquely existing morphism from a coalgebra to the final coalgebra for an endofunctor, followed by the uniquely existing morphism from the initial algebra to some other algebra.
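In Haskell the composition fuses, so the intermediate structure is never materialized; a sketch (our own names), using the factorial that the paramorphism section below mentions was given as a hylomorphism:

```haskell
-- the hylomorphism: unfold with psi, fold with phi, fused into one pass
hylo :: Functor f => (f b -> b) -> (a -> f a) -> a -> b
hylo phi psi = phi . fmap (hylo phi psi) . psi

data ListF e x = NilF | ConsF e x

instance Functor (ListF e) where
  fmap _ NilF        = NilF
  fmap g (ConsF e x) = ConsF e (g x)

-- factorial: unfold n into (conceptually) the list [n, n-1, ..., 1],
-- then collapse it again by multiplying
factorial :: Integer -> Integer
factorial = hylo alg coalg
  where
    coalg 0 = NilF
    coalg n = ConsF n (n - 1)
    alg NilF          = 1
    alg (ConsF n acc) = n * acc
```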

1.5 Metamorphisms

The metamorphism is the other composition of an anamorphism with a catamorphism. It takes some structure, deconstructs it, and then reconstructs a new structure from it.

As a recursion pattern, it's kinda boring - it'll take an interesting structure, deconstruct it into a scalar value, and then reconstruct some structure from that scalar. As such, it won't even capture the richness of <math>\hom(Fx, Gy)</math>, since any morphism expressed as a metamorphism will factor through a map out of the underlying scalar type.
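A sketch of this composition (our own names; the example forgets everything about the input list except its sum, illustrating the factoring through a scalar):

```haskell
newtype Fix f = In { out :: f (Fix f) }

cata :: Functor f => (f a -> a) -> Fix f -> a
cata phi = phi . fmap (cata phi) . out

ana :: Functor f => (a -> f a) -> a -> Fix f
ana psi = In . fmap (ana psi) . psi

-- the metamorphism: fold the input to a plain value, then unfold that
-- value into a new structure
meta :: (Functor f, Functor g) => (a -> g a) -> (f a -> a) -> Fix f -> Fix g
meta psi phi = ana psi . cata phi

data ListF e x = NilF | ConsF e x

instance Functor (ListF e) where
  fmap _ NilF        = NilF
  fmap g (ConsF e x) = ConsF e (g x)

fromList :: [e] -> Fix (ListF e)
fromList = foldr (\e r -> In (ConsF e r)) (In NilF)

toList :: Fix (ListF e) -> [e]
toList (In NilF)        = []
toList (In (ConsF e r)) = e : toList r

-- sum the input list, then unfold the sum s into [s, s-1, ..., 1]:
-- only the sum survives the middle of the computation
regroup :: Fix (ListF Int) -> Fix (ListF Int)
regroup = meta coalg alg
  where
    alg NilF          = 0
    alg (ConsF n acc) = n + acc
    coalg 0 = NilF
    coalg n = ConsF n (n - 1)
```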

1.6 Paramorphisms

Paramorphisms were discussed in the MFP paper as a way to extend the catamorphisms so that the operating function can access its arguments in computation as well as in recursion. We gave the factorial above as a hylomorphism instead of a catamorphism precisely because no simple enough catamorphic structure exists.
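A sketch of the paramorphism in Haskell (our own names; the natural-number encoding is purely for illustration). The algebra receives, at each constructor, both the recursive result and the as-yet-unconsumed substructure, which is exactly what the factorial needs:

```haskell
newtype Fix f = In { out :: f (Fix f) }

-- para: like cata, but the algebra also sees the remaining substructure
para :: Functor f => (f (Fix f, a) -> a) -> Fix f -> a
para phi = phi . fmap (\x -> (x, para phi x)) . out

-- natural numbers as a fixed point
data NatF x = ZeroF | SuccF x

instance Functor NatF where
  fmap _ ZeroF     = ZeroF
  fmap g (SuccF x) = SuccF (g x)

toNat :: Int -> Fix NatF
toNat 0 = In ZeroF
toNat n = In (SuccF (toNat (n - 1)))

fromNat :: Fix NatF -> Int
fromNat (In ZeroF)     = 0
fromNat (In (SuccF n)) = 1 + fromNat n

-- factorial as a paramorphism: at SuccF we use both the predecessor
-- (still in data-type form) and the recursive result
factorial :: Fix NatF -> Int
factorial = para phi
  where
    phi ZeroF            = 1
    phi (SuccF (p, rec)) = (1 + fromNat p) * rec
```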

1.7 Apomorphisms

The apomorphism is the dual of the paramorphism - it stands to the anamorphism as the paramorphism stands to the catamorphism, retaining values along the way: the unfolding step may emit already-built structure instead of a new seed.
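A sketch (our own names): the coalgebra returns `Left` with a finished substructure to stop unfolding early, or `Right` with a new seed to continue. Insertion into a sorted list is the classic example, since the tail past the insertion point can be emitted wholesale:

```haskell
newtype Fix f = In { out :: f (Fix f) }

-- apo: like ana, but the coalgebra may stop early by returning an
-- already-built structure (Left) instead of a new seed (Right)
apo :: Functor f => (a -> f (Either (Fix f) a)) -> a -> Fix f
apo psi = In . fmap (either id (apo psi)) . psi

data ListF e x = NilF | ConsF e x

instance Functor (ListF e) where
  fmap _ NilF        = NilF
  fmap g (ConsF e x) = ConsF e (g x)

fromList :: [e] -> Fix (ListF e)
fromList = foldr (\e r -> In (ConsF e r)) (In NilF)

toList :: Fix (ListF e) -> [e]
toList (In NilF)        = []
toList (In (ConsF e r)) = e : toList r

-- insert an element into a sorted list; once the position is found,
-- the rest of the list is passed through untouched via Left
insert :: Ord e => e -> Fix (ListF e) -> Fix (ListF e)
insert e = apo psi
  where
    psi l = case out l of
      NilF -> ConsF e (Left (In NilF))
      ConsF x xs
        | e <= x    -> ConsF e (Left l)
        | otherwise -> ConsF x (Right xs)
```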

2 Further reading

3 Further properties of adjunctions

3.1 RAPL

Proposition If F is a right adjoint, thus if F has a left adjoint, then F preserves limits in the sense that <math>F(\lim_j D_j) \cong \lim_j F(D_j)</math>.

Example: the forgetful functor from groups to sets is right adjoint to the free group functor, and indeed the underlying set of a product of groups is the product of the underlying sets.
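The shape of the argument (a sketch; writing <math>L</math> for the left adjoint of F) runs entirely at the level of hom-sets, using the adjunction twice and the defining property of limits in between:

<math>\hom(X, F(\lim_j D_j)) \cong \hom(LX, \lim_j D_j) \cong \lim_j \hom(LX, D_j) \cong \lim_j \hom(X, F D_j) \cong \hom(X, \lim_j F D_j)</math>

Since each isomorphism is natural in X, the Yoneda machinery discussed below lets us conclude <math>F(\lim_j D_j) \cong \lim_j F D_j</math>.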

We can use this to prove that things cannot be adjoints - since all right adjoints preserve limits, if a functor G doesn't preserve limits, then it doesn't have a left adjoint.

Similarly, and dually, left adjoints preserve colimits. Thus if a functor doesn't preserve colimits, it cannot be a left adjoint, thus cannot have a right adjoint.

The proof of these statements builds on the Yoneda lemma:

Lemma If C is a locally small category (i.e. all hom-sets are sets), then for any <math>A \in C_0</math> and any functor <math>F: C^{op} \to Set</math> there is an isomorphism

<math>\hom_{[C^{op}, Set]}(yA, F) \cong F(A)</math>

where we define <math>yA = \hom_C(-, A)</math>.

The Yoneda lemma has one important corollary:

Corollary If <math>yA \cong yB</math> then <math>A \cong B</math>.

This, in turn, has a number of important corollaries:

Corollary

Corollary Adjoints are unique up to isomorphism - in particular, if <math>F: C \to D</math> is a functor with right adjoints U and V, then <math>U \cong V</math>.

Proof <math>\hom_C(C, UD) \cong \hom_D(FC, D) \cong \hom_C(C, VD)</math>, and thus by the corollary to the Yoneda lemma, <math>UD \cong VD</math>, naturally in D.

3.2 Functors that are adjoints

The functor <math>- \times A</math> has right adjoint <math>(-)^A</math>. The universal mapping property of the exponentials follows from the adjointness property.
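In Haskell, the two directions of the adjunction isomorphism <math>\hom(X \times A, B) \cong \hom(X, B^A)</math> are exactly currying and uncurrying (written out here with our own names to make the types explicit; the Prelude's `curry` and `uncurry` are the same functions):

```haskell
-- one direction of hom(X * A, B) ~ hom(X, B^A)
curryH :: ((x, a) -> b) -> (x -> (a -> b))
curryH f x a = f (x, a)

-- and its inverse
uncurryH :: (x -> (a -> b)) -> ((x, a) -> b)
uncurryH g (x, a) = g x a
```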

The diagonal functor <math>\Delta: C \to C \times C</math> has a left adjoint given by the coproduct <math>(A, B) \mapsto A + B</math> and right adjoint given by the product <math>(A, B) \mapsto A \times B</math>.

More generally, the functor <math>\Delta: C \to [J, C]</math> that takes c to the constant functor constc(j) = c, constc(f) = 1c has left and right adjoints given by colimits and limits: <math>\mathrm{colim}_J \dashv \Delta \dashv \lim_J</math>.

Pointed rings are pairs of a ring and one element singled out for attention. Homomorphisms of pointed rings need to take the distinguished point to the distinguished point. There is an obvious forgetful functor from pointed rings to rings, and this has a left adjoint - a free ring functor <math>R \mapsto R[x]</math> that adjoins a new indeterminate x as the distinguished point. This gives a formal definition of what we mean by formal polynomial expressions etc.

Given sets A,B, we can consider the powersets P(A),P(B) containing, as elements, all subsets of A,B respectively. Suppose <math>f: A \to B</math> is a function; then <math>f^{-1}</math> takes subsets of B to subsets of A.

Viewing P(A) and P(B) as partially ordered sets by the inclusion operations, and then as categories induced by the partial order, <math>f^{-1}</math> turns into a functor between partial orders. And it turns out <math>f^{-1}</math> has a left adjoint given by the operation im(f) taking a subset to the set of its images under the function f. And it has a right adjoint taking a subset <math>U \subseteq A</math> to the set <math>\{b \in B : f^{-1}(\{b\}) \subseteq U\}</math>.
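The left adjunction <math>im(f) \dashv f^{-1}</math> can be checked mechanically on finite sets: it says precisely that <math>im(f)(S) \subseteq T</math> if and only if <math>S \subseteq f^{-1}(T)</math>. A small sketch using `Data.Set` from the containers package (the names `im`, `preim` and `adjoint` are ours; the preimage is computed over an explicit finite domain):

```haskell
import qualified Data.Set as Set
import Data.Set (Set)

-- image: the left adjoint im(f)
im :: (Ord a, Ord b) => (a -> b) -> Set a -> Set b
im f = Set.map f

-- preimage of t under f, computed over an explicit finite domain dom
preim :: (Ord a, Ord b) => Set a -> (a -> b) -> Set b -> Set a
preim dom f t = Set.filter (\a -> Set.member (f a) t) dom

-- the adjunction condition: im f s <= t  iff  s <= preim dom f t
adjoint :: (Ord a, Ord b) => Set a -> (a -> b) -> Set a -> Set b -> Bool
adjoint dom f s t =
  (im f s `Set.isSubsetOf` t) == (s `Set.isSubsetOf` preim dom f t)
```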

We can introduce a categorical structure to logic. We let L be a formal language, say of predicate logic. Then for any list x = x1,x2,...,xn of variables, we have a preorder Form(x) of formulas with no free variables not occurring in x. The preorder on Form(x) comes from the entailment relation: <math>f \vdash g</math> if in every interpretation of the language, every assignment of the variables that satisfies f also satisfies g.

We can build an operation on these preorders - a functor on the underlying categories - by adjoining a single new variable: <math>*: Form(x) \to Form(x,y)</math>, sending each formula to itself. Obviously, if <math>f \vdash g</math> with x the source of free variables, and we introduce a new allowable free variable but don't actually change the formulas, the entailment stays the same.

It turns out that there is a right adjoint to * given by <math>\forall y</math>, and a left adjoint to * given by <math>\exists y</math>. The adjointness properties give us classical deduction rules from logic.
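Spelled out (a sketch of these standard quantifier adjunctions), the adjointness conditions say, for <math>f \in Form(x)</math> and <math>g \in Form(x,y)</math>:

<math>\exists y. g \vdash f \iff g \vdash {}^*f \qquad\qquad {}^*f \vdash g \iff f \vdash \forall y. g</math>

Reading left to right, these are exactly the classical introduction and elimination rules: to use an existential on the left, or to prove a universal on the right, it suffices to argue with a fresh free variable y.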

4 Homework

Write a fold for the data type

data T a = L a | B a a | C a a a

and demonstrate how this can be written as a catamorphism by giving the algebra it maps to.

Write the Fibonacci function as a hylomorphism.

Write the Towers of Hanoi as a hylomorphism. You'll probably want to use binary trees as the intermediate data structure.

Write a prime numbers generator as an anamorphism.

* The integers have a partial order induced by the divisibility relation. We can thus take any integer and arrange all its divisors in a tree by having an edge if d | n and d doesn't divide any other divisor of n. Write an anamorphic function that will generate this tree for a given starting integer. Demonstrate how this function is an anamorphism by giving the coalgebra it maps from.

Hint: You will be helped by having a function to generate a list of all primes. One suggestion is: