If we have a rewrite system for primitive recursive functions, which simplifies each term according to how the function was defined, then what is the computational complexity of this calculation? That is, what is the complexity of the normalization procedure? I have heard a claim that, for a closed term, calculating the value of the function requires transfinite induction up to $\epsilon_0$. Is this true, and where can I find a proof of this?
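For concreteness, the kind of rewrite system I have in mind (a standard presentation, in my own notation) turns each definition by primitive recursion from $g$ and $h$ into two rules:

```latex
f(0, \vec{y}) \longrightarrow g(\vec{y}), \qquad
f(S(x), \vec{y}) \longrightarrow h\bigl(x,\, f(x, \vec{y}),\, \vec{y}\bigr)
```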

For example, in (Schwichtenberg & Wainer 2012) there is a lemma which says that a primitive recursive function is computable in $F_{\alpha}$-bounded time for some $\alpha<\omega$, where the functions $F_{\alpha}$ for $\alpha<\epsilon_0$ form the fast-growing hierarchy.

Is the measure of transfinite induction related to this way of bounding the complexity?

3 Answers

I find some aspects of this question difficult to understand. For example, the part about "calculating the value of the function requires transfinite induction up to $\varepsilon_0$" seems to conflate methods of proof (like transfinite induction) with methods of computation. Also, much (if not all) of the question appears to be about primitive recursive functionals of higher type (as in Gödel's Dialectica interpretation) rather than mere primitive recursive functions. So, the following may not be what the OP was looking for, but here goes anyway:

Not much can be said about the time complexity of computing primitive recursive functions (from natural numbers to natural numbers) except that it's primitive recursive and thus bounded by a finite level $F_n$ of the Grzegorczyk hierarchy (essentially equivalent to the fast-growing hierarchy, I believe).
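To fix notation, the finite levels $F_k$ mentioned here can be sketched directly. This follows one common convention, $F_0(n)=n+1$ and $F_{k+1}(n)=F_k^{\,n+1}(n)$; conventions differ in the number of iterations, so treat the exact values as an assumption of this sketch:

```python
def F(k, n):
    """Finite levels of the fast-growing hierarchy (one common convention):
    F_0(n) = n + 1,  F_{k+1}(n) = F_k^{n+1}(n)  ((n+1)-fold iteration of F_k)."""
    if k == 0:
        return n + 1
    for _ in range(n + 1):
        n = F(k - 1, n)
    return n
```

Under this convention $F_1(n) = 2n+1$ and $F_2(n) = 2^{n+1}(n+1)-1$, so already $F_2$ grows exponentially, and each finite level dominates all polynomial iterates of the previous one.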

On the other hand, for primitive recursive functionals of finite type, you can consider the problem of evaluating terms of type 0 (i.e., terms that denote natural numbers). Here, the only bound I can see for the time-complexity is level $\varepsilon_0$ of the Grzegorczyk hierarchy. For any fixed primitive recursive functional, evaluation of its numerical values would take time bounded by $F_\alpha$ for some $\alpha<\varepsilon_0$, but different functionals would need different $\alpha$'s, with no uniform bound below $\varepsilon_0$.

The relevance of the proof principle of transfinite induction up to $\varepsilon_0$ (for open formulas) is that this is what is needed, on top of primitive recursive arithmetic, to prove that every closed term of type 0, built from primitive recursive functionals of higher type, can be reduced to a numerical value.

Thanks. I have confirmed that the claim I heard about TI up to $\epsilon_0$ was meant for Gödel's T (that is, for functionals of higher types). My interest was in primitive recursive functions and their time-complexity. I'll have a look at the Grzegorczyk hierarchy.
–
AKS May 28 '12 at 7:29

Actually there is no need for any transfinite induction. A monotone interpretation yields that the derivation lengths for any fixed term will be primitive recursive in the values of the numerals plugged in. I have a paper in APAL with Cichon on this topic and, in addition, a paper in JSL regarding Gödel's T showing that for T one needs $\epsilon_0$-recursion for proving strong normalization by a monotone interpretation.

Yes indeed. We have the following (not too difficult) fact: let $S^m(0)$ be the numeral for $m$. For any PRA term $t(x_1,\ldots,x_n)$, the derivation length of $t(S^{m_1}(0),\ldots,S^{m_n}(0))$ will be bounded by a primitive recursive function (depending on $t$) with arguments $m_1,\ldots,m_n$. (Here I assume standard rewrite systems for modelling primitive recursion.)
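As a small toy illustration of this fact (my own sketch, not from the papers mentioned), take the usual rules for addition, add(0,y) → y and add(S(x),y) → S(add(x,y)), and count rewrite steps under a leftmost-outermost strategy. The derivation length of add(S^m(0), S^n(0)) comes out as m+1, which is certainly primitive recursive in m and n:

```python
def num(m):
    """Build the numeral S^m(0) as a nested tuple."""
    t = ("0",)
    for _ in range(m):
        t = ("S", t)
    return t

def step(t):
    """Apply one leftmost-outermost rewrite step; return (new_term, changed)."""
    if t[0] == "add":
        x, y = t[1], t[2]
        if x == ("0",):          # add(0, y) -> y
            return y, True
        if x[0] == "S":          # add(S(x), y) -> S(add(x, y))
            return ("S", ("add", x[1], y)), True
    for i in range(1, len(t)):   # otherwise descend into subterms
        s, changed = step(t[i])
        if changed:
            return t[:i] + (s,) + t[i + 1:], True
    return t, False

def normalize(t):
    """Rewrite to normal form, returning the result and the derivation length."""
    steps = 0
    changed = True
    while changed:
        t, changed = step(t)
        steps += changed
    return t, steps
```

For example, normalize(("add", num(3), num(4))) yields num(7) in 4 steps: three unfoldings of the successor rule plus one application of the base rule.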

For terms from Gödel's T the derivation lengths become more complex, depending on the type level of the terms in question.

My general conjecture is that for typical (not too small) subrecursive classes $C$ the derivation-length functions for terms in $C$ will not leave $C$.

The result for primitive recursion is not difficult, whereas the result for Gödel's T is somewhat involved.

For PRA there is also a tradeoff in terms of termination orderings. Termination for rewrite systems for prim rec functions can be shown by the multiset path ordering, and termination proofs via the multiset path ordering lead to prim rec derivation lengths (a result by Dieter Hofbauer). The corresponding result for multiple recursion is one of my earliest results in term rewriting theory. Wilfried Buchholz gave a nice proof-theoretic proof of these results in APAL.

There are also some nice applications of term rewriting to more involved schemes of primitive recursive functions. For example, it can be used to show that prim rec functions are closed under parameter recursion, simple nested recursion, or even unnested multiple recursion.

Thanks for the information. The reason I asked is that, in the original question, AKS wrote about primitive recursive functions but meant primitive recursive functionals of arbitrary types (see his comment on my answer). As a result, there has been a bit of ambiguity here. Speaking of ambiguity, I see that you have (at least) two accounts on MO, one under your full name, and one omitting your first name and the final n of your last name. If you register one of them, the moderators can merge the other(s) with it.
–
Andreas Blass Jul 27 '12 at 16:47