Call by push-value

Call-by-push-value is a programming language paradigm that, surprisingly, breaks down the call-by-value and call-by-name paradigms into simple primitives. This monograph, written for graduate students and researchers, exposes the call-by-push-value structure underlying a remarkable range of semantics, including operational semantics, domains, possible worlds, continuations and games.

This topic is not unknown on LtU, but my search did not turn up any thread focused on the subject. I am looking for some help understanding the concepts.

I am currently reading "CBPV: Decomposing CBV and CBN".
Unfortunately my theoretical background is lacking, so please be patient with me.

In the CK-machine, I see to x. N :: K on the configuration stack. Can I interpret that as a continuation? Is the stack not implicitly typed, since the computation will stop if I don't put the correct type on the stack?

The 'tags' are confusing the hell out of me. What does the î (i with a circumflex accent) notation stand for? What do I do with Mî? Is there any relation with named channels?

Can I say that values and computations are separate: everything is a computation except for values, and a thunk is a value that can be forced into a computation?
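To check my own understanding, here is a tiny sketch of that split in Python (my own hypothetical encoding, not Levy's notation): values and computations as two separate syntactic classes, with thunking as the only way to treat a computation as a value, and forcing as the way back.

```python
# A tiny sketch of the CBPV value/computation split (my own encoding,
# not Levy's notation). Values "are"; computations "do"; a thunk is a
# value wrapping a computation, and force unwraps it again.
from dataclasses import dataclass

class Value: pass
class Comp: pass

@dataclass
class IntV(Value):     # a ground value
    n: int

@dataclass
class Thunk(Value):    # a suspended computation, usable as a value
    body: Comp

@dataclass
class Return(Comp):    # "return V": the trivial computation producing V
    val: Value

@dataclass
class Force(Comp):     # "force V": run a thunked computation
    val: Value

def force(v: Value) -> Comp:
    """Forcing is only defined on thunks; other values are not runnable."""
    if isinstance(v, Thunk):
        return v.body
    raise TypeError("only thunks can be forced")
```

So forcing the thunk of a computation gives that computation back, which seems to be exactly the "thunk is a value that can be forced" reading.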

Noel kindly pointed me to this discussion, and I'll try to answer questions.

"What does the î notation stand for"

You're referring to Fig. 2 in the paper you mentioned, right?

I find it helpful to use î to indicate when just one particular element of I is required (more precisely, when there's a side-condition that î ∈ I), and i when there's a subterm for each i ∈ I (i.e. when there's a premise for each i ∈ I). It doesn't really matter in Fig. 2 - I could have called both i - but in Fig. 12, for example, the fourth beta-law would be unclear if I called both things i.

As far as implicit vs explicit typing is concerned, CBPV is an explicitly typed calculus. But I am treating typing rules as defining inductively the set of terms inhabiting each judgement. I never defined a set of "raw terms"; that's actually unnecessary.

If you prefer to define a set of raw terms, and regard the typing rules as inductively defining a relation between terms, contexts and types, then you should write into each term all the typing information.

A continuation is a stack from an F type (sorry, the terminology changed between the thesis and the book). A stack of the form to x. N :: K is necessarily from an F type, so it is indeed a continuation.

Andris is correct that the transitions of the CK-machine preserve well-typedness.
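To make the stack discipline concrete, here is a toy sketch of such a machine in Python (a drastic simplification I wrote for illustration, not the machine from the paper): a configuration pairs a computation with a stack K, and pushing "to x. N" onto K is precisely pushing a continuation that waits for the value the current computation produces.

```python
# A toy CK-machine sketch (a hypothetical simplification, not the machine
# from the paper). A configuration is (computation, stack); a stack frame
# ("x", N) represents "to x. N", i.e. a continuation awaiting a value.
from dataclasses import dataclass

class Value: pass
class Comp: pass

@dataclass
class IntV(Value): n: int

@dataclass
class Var(Value): name: str

@dataclass
class Thunk(Value): body: Comp

@dataclass
class Return(Comp): val: Value       # produce a value

@dataclass
class To(Comp):                      # "M to x. N"
    m: Comp
    x: str
    n: Comp

@dataclass
class Force(Comp): val: Value        # run a thunk

def subst_v(x, v, w):
    """Substitute value v for variable x inside a value."""
    if isinstance(w, Var):
        return v if w.name == x else w
    if isinstance(w, Thunk):
        return Thunk(subst_c(x, v, w.body))
    return w

def subst_c(x, v, m):
    """Substitute value v for variable x inside a computation."""
    if isinstance(m, Return):
        return Return(subst_v(x, v, m.val))
    if isinstance(m, To):
        n = m.n if m.x == x else subst_c(x, v, m.n)  # m.x shadows x
        return To(subst_c(x, v, m.m), m.x, n)
    if isinstance(m, Force):
        return Force(subst_v(x, v, m.val))
    return m

def run(m):
    k = []                                    # the stack K
    while True:
        if isinstance(m, To):
            k.append((m.x, m.n))              # push continuation "to x. N"
            m = m.m
        elif isinstance(m, Return) and k:
            x, n = k.pop()                    # continuation consumes the value
            m = subst_c(x, m.val, n)
        elif isinstance(m, Force) and isinstance(m.val, Thunk):
            m = m.val.body                    # force a thunk
        elif isinstance(m, Return):
            return m.val                      # terminal: empty stack
        else:
            raise RuntimeError("stuck (ill-typed configuration)")
```

For example, (return 3) to x. force (thunk (return x)) pushes one frame, pops it when the 3 is produced, and terminates with 3 on an empty stack; getting stuck in the last branch corresponds to an ill-typed configuration, which is the type-preservation point above.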

My question would be: how does one get from the necessity of supporting general tuple types (as defined on page 259) to only sum types and binary pattern-match product types in JWA (e.g., page 119 of the same paper)?
Also, if this simplification is sound, can we make it more symmetric (either make sums binary or products indexed)?

Your first question, as I understand it, is: why are jumbo connectives needed for jumbo lambda-calculus, but not in CBPV and JWA, right?

In the former case (at least in the presence of effects) the denotational semantics is not known a priori, so it's not legitimate to assume that jumbo connectives can be decomposed into sums and products. But in the case of CBPV and JWA, by the time we come to define them, we already know the denotational semantics (more precisely, examples of it). So we can see that the decomposition is legitimate.

However, one could also consider the jumbo versions of CBPV and JWA.

As for your second question, the reason that I provided indexed sums but only nullary and binary products (which is really a question about jumbo lambda-calculus, a kind of prequel to CBPV) was for the sake of infinitary syntax. Quite often in semantics, it's helpful to think about infinitary versions of syntax - you can think of this as a kind of first step towards denotational semantics. In an infinitary version of CBPV, you could have countable sums because, after all, tags are just implemented as natural numbers. But you can't have countable products (of value types), because bindings are made one at a time, and therefore typing contexts are finite.

Currently, Wikipedia doesn't have any content on the subject. If it still doesn't when I'm done reading (and if I don't find myself completely confused, which can happen sometimes), I'll take a crack at it. (Someone else can do so first if I take too long.)

The argument seems to be about combining effects and functions. At the same time we are combining CBN and CBV into CBPV. We also have terms and computations. Can we associate terms and functions with CBN, and computations and effects with CBV? Or is this an oversimplification?

Edit: Also, can we associate monads with computations, and functions with terms? A need to compute something at a certain "time" or in a certain order implies a monad.

It's certainly true that CBPV combines terms and functions, things that "are", with effects and computations, things that "do". But I don't see the specific connection of terms and functions to CBN, computations and effects to CBV, that you suggest.

What is true though, is that function types work well in CBN, whereas sum types work well in CBV. (I tend to think that research in this area has been somewhat distorted by a bias towards function types.)

As for monads, it is certainly true that many of the denotational semantics for CBPV are built out of a monad. Others are not. My paper "CBPV: Decomposing CBV and CBN" is largely about the similarities and differences between CBPV and the monadic framework for semantics.
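For concreteness, here is one way the connection is usually sketched (a hypothetical Python rendering of mine, picking "output" as the example effect): a monad T on values arises as the composite U∘F, where F sends a value type to the type of computations producing it, and U thunks a computation back into a value. With the output effect, the induced T is just the writer monad.

```python
# A sketch (my own illustration, not a construction from the book) of how
# a monad T arises from CBPV's type constructors as T = U . F: here F A
# is a computation producing an A alongside some output, and Thunk plays
# the role of U. The induced T is the writer monad.
from dataclasses import dataclass
from typing import Callable, Generic, TypeVar

A = TypeVar("A")
B = TypeVar("B")

@dataclass
class F(Generic[A]):        # a producer: some output, then a result
    out: str
    result: A

@dataclass
class Thunk(Generic[A]):    # U: suspend a computation as a value
    comp: F[A]

def ret(x: A) -> Thunk[A]:
    """Unit of T = U . F: thunk the trivial producer of x."""
    return Thunk(F("", x))

def bind(t: Thunk[A], f: Callable[[A], Thunk[B]]) -> Thunk[B]:
    """Bind of T: force the thunk, run it, sequence the outputs."""
    inner = f(t.comp.result).comp
    return Thunk(F(t.comp.out + inner.out, inner.result))
```

The writer monad here is purely an illustrative assumption; the point is only that the monadic structure factors through the F/U decomposition.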

I saw this book at Amazon a while back, and it looked interesting, but for crying out loud, it's $156! I think the author should think a bit about whether he really wants to get the word out about this theory, and if so, he should provide a cheaper alternative (perhaps making his thesis available for free online?).

Probably because most of the people who would be interested in the subject matter are too cheap or impecunious to buy it, so the publisher has to rely on a few hundred copies being sold to university libraries to make its money.

We really should feel lucky that so much good material is available for free or at low cost these days: it wasn't always like this, and it may not always stay this way.

I completely agree with you about the price of the book! I should never have submitted it to Kluwer. Although Kluwer has now merged with Springer, the latter has apparently stuck to Kluwer's pricing policy.

Springer also has developed the annoying habit--recently--of shrinkwrapping their textbooks(!). Down at Powell's, I've encountered several shrinkwrapped Springer books--with no media or other inserts that the publisher was trying to keep together with the text. (Fortunately, the good folks at Powell's permitted me to unwrap the book in question--in this case, it wasn't Levy's book, BTW--so I could read it before buying.)

At least there wasn't a shrinkwrap license--"by breaking this seal, reader agrees to submit to numerous egregious terms". But still--shrinkwrapping this week's issue of Hot Cars and Hotter Chicks to keep greasy-fingered teens from reading it at the newsstand is one thing. Shrinkwrapping scientific texts is patently obnoxious. Has anyone else encountered this?

I don't know how many published authors inhabit this forum besides Levy (I've seen quite a few here)... but it would be cool if this site, in addition to its role as an excellent discussion forum, could also host research papers and other texts--including some sort of peer review mechanism.