
SKI calculus is a subsystem of combinatory logic, itself a precursor of lambda calculus. A SKI calculus program is a binary tree whose leaves are combinators: the three symbols S, K and I. Using parentheses to notate the tree, a simple example of a SKI program is (((SK)S)((KI)S)). We can omit parentheses by treating application as left-associative by default, so we simplify our program to SKS(KIS).
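As a sketch (the representation is my own, not part of the article), such trees can be modeled in Python as nested tuples: a leaf is one of the strings "S", "K", "I", and an application node is a pair (function, argument). The printer below applies the left-associativity convention to drop redundant parentheses.

```python
def show(term):
    """Render a term, omitting the parentheses that left-associativity
    makes redundant: only a right child that is itself an application
    needs explicit parentheses."""
    if isinstance(term, str):          # a single combinator leaf
        return term
    f, x = term
    rendered = show(x)
    if isinstance(x, tuple):           # right child is an application
        rendered = "(" + rendered + ")"
    return show(f) + rendered

# The example program (((SK)S)((KI)S)) as nested tuples:
program = ((("S", "K"), "S"), (("K", "I"), "S"))
print(show(program))                   # prints SKS(KIS)
```

Printing the fully parenthesized tree recovers exactly the abbreviated form used in the article.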

Like lambda calculus, SKI calculus has a process called beta-reduction, here denoted with \(\Rightarrow\). We take the leftmost combinator and reduce the tree according to the following rules:

\(\mathbf{I}x \Rightarrow x\)

\(\mathbf{K}xy \Rightarrow x\)

\(\mathbf{S}xyz \Rightarrow xz(yz)\)

A few clarifications are necessary for these rules. Here, \(x,y,z\) represent any valid trees, not just single combinators. These rules apply to the leftmost part of the tree, so any remaining combinators are left untouched by these transformations.
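A single reduction step can be sketched in code (the nested-tuple representation and the name step are my own: a leaf is one of the strings "S", "K", "I", an application is a pair (function, argument)). The head combinator and its arguments are found by unwinding the left spine of the tree.

```python
def step(term):
    """Apply one beta-reduction at the head (leftmost combinator).
    Returns (new_term, True) if a rule fired, (term, False) otherwise."""
    # Unwind the left spine to find the head and its argument list.
    args, head = [], term
    while isinstance(head, tuple):
        head, arg = head
        args.append(arg)
    args.reverse()                         # args[0] is the first argument
    if head == "I" and len(args) >= 1:     # Ix => x
        new, rest = args[0], args[1:]
    elif head == "K" and len(args) >= 2:   # Kxy => x
        new, rest = args[0], args[2:]
    elif head == "S" and len(args) >= 3:   # Sxyz => xz(yz)
        x, y, z = args[:3]
        new, rest = ((x, z), (y, z)), args[3:]
    else:
        return term, False                 # too few arguments, e.g. Kx
    for a in rest:                         # reattach untouched arguments
        new = (new, a)
    return new, True
```

For example, step((("K", "S"), "I")) returns ("S", True), matching the rule Kxy ⇒ x, while step(("K", "S")) fires no rule because K has only one argument.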

We repeat this process, and we say that beta-reduction terminates if we reach a point where none of the three above cases apply (for example, if we reach something of the form Kx). Some SKI expressions beta-reduce to a single I, some reduce to another small expression, while others keep on growing forever. If a SKI expression beta-reduces to a string consisting of n combinators, we say that it has output of size n.
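Putting the rules together, here is a self-contained sketch of the whole process (all names are my own). A step budget is unavoidable, since some expressions keep growing forever and we cannot decide halting in general.

```python
def step(term):
    """One head reduction; returns (new_term, fired?)."""
    args, head = [], term
    while isinstance(head, tuple):
        head, arg = head
        args.append(arg)
    args.reverse()
    if head == "I" and len(args) >= 1:     # Ix => x
        new, rest = args[0], args[1:]
    elif head == "K" and len(args) >= 2:   # Kxy => x
        new, rest = args[0], args[2:]
    elif head == "S" and len(args) >= 3:   # Sxyz => xz(yz)
        x, y, z = args[:3]
        new, rest = ((x, z), (y, z)), args[3:]
    else:
        return term, False                 # no rule applies, e.g. Kx
    for a in rest:
        new = (new, a)
    return new, True

def size(term):
    """Output size: the number of combinator leaves."""
    return 1 if isinstance(term, str) else size(term[0]) + size(term[1])

def normal_form(term, budget=10_000):
    """Reduce until no rule applies; None if the budget runs out
    (the expression may keep growing forever)."""
    for _ in range(budget):
        term, fired = step(term)
        if not fired:
            return term
    return None
```

For instance, SKSK beta-reduces to the single combinator K (an output of size 1), while the classic non-terminating expression SII(SII) exhausts any budget.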

SKI calculus alone is no more powerful than a Turing machine (in fact, they have the same computational power). But we can greatly increase its strength by adding an additional symbol \(Ω\), an oracle combinator that can decide whether a given expression halts.

If we start with a string of n symbols in SKIΩ calculus and beta-reduce it, the largest possible finite output is called Ξ(n). A SKIΩ calculus statement can be paradoxical (it can end up asking about its own halting, leading to situations where a combinator halts iff it does not halt); such statements are ignored in the computation of Ξ.

We could add another oracle combinator Ω′, which works like Ω except that it can also check whether a SKIΩ formula is well-founded (i.e. does not create any kind of paradox). Using this new combinator, we can define a variant of Ξ, which Goucher denoted \(\Xi_2\) and which grows even faster than the ordinary xi function. This process can be continued, in analogy to the levels of oracle Turing machines.

Stronger bounds have been found by Lawrence Hollom by constructing the fast-growing hierarchy within SKI calculus, a method he calls top-down.[2] This construction yields lower bounds for a weaker version of the function, which lacks the \(Ω\) combinator.

Below is the list of beta-reductions of the expressions that achieve the maximal output. For \(\Xi(1),\Xi(2),\Xi(3)\) and \(\Xi(4)\) the process is trivial: they are S, S(S), S(SS) and S(SSS) respectively. For \(5 \leq n \leq 7\), it goes as follows:
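The trivial small cases can be checked exhaustively, because the search space is tiny: a sketch of my own (not from the article) showing that the number of expressions with n combinators is the (n−1)-th Catalan number times \(3^n\), since there are Catalan-many binary tree shapes and three choices of combinator per leaf.

```python
from math import comb

def terms(n):
    """Yield every SKI tree with exactly n combinator leaves
    (leaves are the strings "S", "K", "I"; applications are pairs)."""
    if n == 1:
        yield from "SKI"
        return
    for k in range(1, n):              # k leaves in the left subtree
        for f in terms(k):
            for x in terms(n - k):
                yield (f, x)

def catalan(m):
    return comb(2 * m, m) // (m + 1)

counts = [sum(1 for _ in terms(n)) for n in range(1, 5)]
print(counts)                          # prints [3, 9, 54, 405]
assert counts == [catalan(n - 1) * 3 ** n for n in range(1, 5)]
```

Running every one of these candidates through a bounded beta-reducer gives lower bounds for the weak (Ω-free) version of the function; a step budget can only ever stand in for true halting detection.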

A common way to write combinators more visually is as binary trees, with the S, K and I combinators at the leaves. A combinator's first argument is then the other child of its parent, its second argument is the other child of its grandparent, and similarly for the third argument.
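This correspondence between tree position and argument number can be sketched in code (same nested-tuple representation as before, with my own helper name): walking up from the head along the left spine, each ancestor contributes one argument as its right child.

```python
def head_and_args(term):
    """Walk down the left spine: the head combinator's first argument is
    the right child of its parent, the second the right child of its
    grandparent, and so on."""
    args = []
    while isinstance(term, tuple):
        term, arg = term               # descend into the left child
        args.append(arg)
    return term, args[::-1]            # head leaf, arguments in order

# For SKIS = (((S K) I) S), the head is S with arguments K, I, S:
print(head_and_args(((("S", "K"), "I"), "S")))
```

This is exactly the spine-unwinding used when deciding which reduction rule applies to the head.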

The figure on the right shows the sequence of combinators resulting from \(\beta\)-reduction of SSS(SI)S.