> I have difficulty locating the notion of computation behind the Blum
> et al model. Blum lectured on it at U of Chicago a few years ago and
> spoke as though there were two ideas of computation, one starting
> with the *assumption* of discreteness and the other with continuity.
> But Turing's analysis didn't *start* with the assumption of
> discreteness. As I understand him, he began simply with the aim of
> analyzing the notion of (human) mechanical computation, and
> discreteness was a result of his analysis. (As I recall, he derived
> it from the requirement that states be readable.)
If I understand the Blum et al. model correctly, it is perhaps
most easily understood as an imaginary calculator with
infinite precision: the data structures are arbitrary real
numbers. This might sound unreasonable, since our
computational practice involves only rational approximations
(hence the bit model). On the other hand, if you think of the
classical theory of geometrical constructions, it seems more
sensible, since those constructions are also thought of as
operating on arbitrary real numbers.
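
To make the picture concrete, here is a toy sketch (my own
hypothetical encoding, not the authors' formalism) of such a
machine: registers hold real numbers, the machine applies exact
arithmetic operations, and control flow branches on sign tests.
Python floats are of course only rational stand-ins for the
infinite-precision registers the model postulates.

```python
def bss_run(program, registers):
    """Interpret a tiny instruction list over real-valued registers.

    Hypothetical instruction encoding:
      ("op", f, dst)       -- dst := f(registers), f built from +,-,*,/
      ("branch", i, t, e)  -- jump to t if registers[i] >= 0, else to e
      ("halt", i)          -- return registers[i]
    """
    pc = 0
    while True:
        instr = program[pc]
        if instr[0] == "op":
            _, f, dst = instr
            registers[dst] = f(registers)  # exact arithmetic step
            pc += 1
        elif instr[0] == "branch":
            _, i, t, e = instr
            pc = t if registers[i] >= 0 else e  # sign test on a real
        else:  # "halt"
            return registers[instr[1]]

# |x| computed via a sign test -- the model's form of a conditional:
abs_prog = [
    ("branch", 0, 1, 2),            # if r0 >= 0 goto halt, else negate
    ("halt", 0),
    ("op", lambda r: -r[0], 0),     # r0 := -r0
    ("halt", 0),
]
```

The point of the sketch is only that the branching primitive
compares exact reals, which is precisely what the bit model
cannot do.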