Eff - Language of the Future

This is just a series of blog posts so far, as far as I can tell. But Andrej Bauer's work has been mentioned here many times, I am finding these posts extremely interesting, and I'm sure I will not be alone. So without further ado...

I just returned from Paris where I was visiting the INRIA πr² team. It was a great visit, everyone was very hospitable, the food was great, and the weather was nice. I spoke at their seminar where I presented a new programming language eff which is based on the idea that computational effects are algebras. The language has been designed and implemented jointly by Matija Pretnar and myself. Eff is far from being finished, but I think it is ready to be shown to the world. What follows is an extended transcript of the talk I gave in Paris. It is divided into two posts. The present one reviews the basic theory of algebras for a signature and how they are related to computational effects. The impatient readers can skip ahead to the second part, which is about the programming language.


It was always kind of interesting to note how few of the discussions on LtU are about new languages, or even about specific languages. This makes total sense, of course, but may be kind of surprising to some. More to the point, even while this makes sense, we are language geeks, aren't we?! So I am glad to note that the FOUR most recent home page items are about new and exciting languages!

It's not quite right to say that k is a map from memory cells to functions. In what you wrote, k is the continuation. And yes, of course, eff has mutable store, which is implemented exactly with the lookup and update that you wrote down (it is in prelude.eff).

The content of the cell can be read with operation lookup and written to with operation update.

When the content of the cell is read with lookup() I'd expect a value of type S to be returned and not a function of type T^S.

Note that I read this as an introductory text for the language and not as a supplement to a bigger theory where even the most basic operation is modelled as a higher-order function. I guess this is somehow meaningful for the author, but it doesn't shine through in the introductory exposition.

I think that the lookup() example is just that, an example for the concepts of a carrier set, algebra, etc. Bauer doesn't explain why this choice captures state exactly. But he does give an informal explanation why:

In the single-cell example, our programs cannot be represented simply by return values, as they depend on the current state of the memory.

Thus, programs are functions: for each initial content of the cell, the program gives the corresponding return value. In other words, they are functions from S to T.

Note that this is no different than implementing state in a purely functional language.
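To make this concrete, here is a minimal sketch in Python (standing in for eff's ML-style syntax; the names ret, bind, lookup, update, and prog are all illustrative, not eff's actual API). A program over a single memory cell of type S is modelled as a function from the initial contents to a pair of the return value and the final contents, i.e. the familiar purely functional state-passing style:

```python
# A minimal sketch (Python stand-in for eff's syntax; names are
# illustrative): a "program" over a single memory cell of type S is
# modelled as a function from the initial contents to the pair
# (return value, final contents).

def ret(x):
    """Embed a pure value as a stateful program."""
    return lambda s: (x, s)

def bind(m, f):
    """Sequence two stateful programs, threading the cell contents."""
    def run(s):
        x, s2 = m(s)
        return f(x)(s2)
    return run

lookup = lambda s: (s, s)          # read the cell
def update(s2):                    # write the cell
    return lambda _s: (None, s2)

# Double the cell's contents and return the old value.
prog = bind(lookup, lambda x: bind(update(2 * x), lambda _: ret(x)))

print(prog(21))  # (21, 42)
```

A program that only reads the cell, such as bind(lookup, ret), never changes the final contents, which recovers the simpler "functions from S to T" reading quoted above.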

In the general theory, covered by Plotkin, Power, Hyland, Pretnar and more, higher-order functions are not a central ingredient. Rather, the theory is presented using equational logic terminology. Programs are interpreted as terms denoting the effects that arise in them, modulo the equations for each particular theory.

So the program that simply looks up the contents of the memory will be denoted by the equivalence class of lookup(\s . return s). You can show that if the program deals with values of type T, then the set of all equivalence classes of terms involving lookup() and update() is isomorphic to (SxT)^S.

As a side note, one consequence of the general theory is that if we want to define state in terms of lookup() and update(), satisfying Plotkin's and Power's seven equations of state (which are intuitive for any non-relaxed memory model), then we necessarily get the monad A |-> (SxA)^S. This follows from the argument in the previous paragraph, and the fact that the theory for state is Hilbert-Post complete.
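The equational-logic reading can also be sketched in code (again in Python as a stand-in, with hypothetical names): programs are terms built from lookup(-) and update(-, -) with explicit continuations, and an interpreter witnesses that every such term denotes a function S -> (A, S), i.e. an element of (SxA)^S:

```python
# Hypothetical sketch: terms built from lookup(-) and update(-, -)
# with explicit continuations, plus an interpreter showing that every
# such term denotes a function S -> (A, S), an element of (S x A)^S.

from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Return:
    value: Any

@dataclass
class Lookup:
    k: Callable[[Any], Any]        # continuation receives the cell's contents

@dataclass
class Update:
    new: Any
    k: Callable[[], Any]           # continuation runs after the write

def run(term, s):
    """Interpret a term as a function of the initial cell contents s."""
    while True:
        if isinstance(term, Return):
            return (term.value, s)
        if isinstance(term, Lookup):
            term = term.k(s)
        elif isinstance(term, Update):
            s = term.new
            term = term.k()

# lookup(\s. return s): read the cell and return its contents.
get = Lookup(lambda s: Return(s))

# Read the cell, increment it, and return the old value.
prog = Lookup(lambda x: Update(x + 1, lambda: Return(x)))

print(run(prog, 41))  # (41, 42)
```

The interpreter here quotients by the state equations operationally; two terms that the seven equations identify (e.g. two consecutive lookups versus one) run to the same function of the initial contents.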

But you have raised a good point. Programming in CPS is not something programmers always like to do. When we write "lookup()" we would like a single S-value, and not a complicated computation. Note that this "lookup()" is nothing else but "lookup(\s. return s)". This idea, of replacing an operation "op(t)" by "op(\s.return s)" holds in general, and gives rise to the concept of a generic operation.
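The move from the continuation form to the generic form can be sketched as follows (Python stand-in again; lookup, bind, and run are illustrative names, not eff's surface syntax). Writing "lookup()" for "lookup(\s. return s)" lets the programmer receive the S-value directly, and a bind on computation trees restores direct-style composition:

```python
# Sketch of a generic operation (illustrative names, not eff syntax):
# "lookup()" abbreviates "lookup(\s. return s)", so the programmer
# gets the S-value directly instead of writing a continuation.

from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Return:
    value: Any

@dataclass
class Lookup:
    k: Callable[[Any], Any]

def bind(m, f):
    """Graft f onto every leaf of the computation tree m."""
    if isinstance(m, Return):
        return f(m.value)
    return Lookup(lambda s: bind(m.k(s), f))

def lookup():
    """The generic operation: op(t) specialised to op(\\s. return s)."""
    return Lookup(lambda s: Return(s))

def run(term, s):
    """Interpret with a fixed cell contents s (reads only, no update)."""
    while isinstance(term, Lookup):
        term = term.k(s)
    return (term.value, s)

# Direct style: x is the cell's contents, no explicit continuation.
prog = bind(lookup(), lambda x: Return(x * 2))

print(run(prog, 21))  # (42, 21)
```

The point is that the programmer writes the second, direct style, while the CPS tree is what the semantics (and the implementation) actually manipulates.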

I don't think that generic operations are covered in the blog (yet?), so if you want to learn more about them, try Matija's thesis.