
On Hofstadter’s Law (Part 2)

(Read Part 1 before reading this.)

What still remains interesting is to get rid of this weird-looking function and replace it with something purely mathematical. In a purely mathematical setting we get no help from global variables, so a and r are out of the question.

So, let’s say there is a function f that describes what we are doing inside HowLong(), i.e., a function that takes as its input the current value of HowLong() and returns something larger than it. In the first case, f(x) = x + 4. The second case, however, cannot be modeled by f. Now, let’s use the notation f[n](x) to denote f applied n times to x. So f[2](x) is f(f(x)), f[0](x) = x, and so on. Notice that our friend HowLong() is now f[infinity](x).
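The iterated-function notation is easy to play with in code. Here is a minimal sketch in Python (the post itself contains no code, and the helper name `iterate` is mine):

```python
def iterate(f, n, x):
    """Return f[n](x): f applied n times to x."""
    for _ in range(n):
        x = f(x)
    return x

f = lambda x: x + 4            # the first case: each revision adds 4

print(iterate(f, 0, 10))       # f[0](x) = x, so 10
print(iterate(f, 2, 10))       # f[2](x) = f(f(x)), so 18
```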

Using the above notation, what we are looking for is a function f such that f[n](x) > f[n – 1](x) for all n and x, and yet f[infinity](x) is finite. This, as it turns out, is an interesting and non-trivial exercise. You may want to try it yourself.

A possible function that satisfies the above properties is this –

f(x) = (x + [x + 1]) / 2

where [x] denotes the largest integer not exceeding x (the floor), so that for non-integer x, [x + 1] is the next integer above x.

All it does is take the point x halfway closer to [x + 1], i.e., the next integer. So each time you apply f, you cover half the remaining distance to the next integer, and hence reach the integer – a finite value – only after an infinite number of iterations.
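To watch this convergence numerically, here is a sketch in Python (my own illustration, reading [x + 1] as floor(x) + 1, the next integer above a non-integer x):

```python
import math

def f(x):
    """Move x halfway toward the next integer, [x + 1] in the post's notation."""
    return (x + math.floor(x) + 1) / 2

x = 0.5
for _ in range(50):
    assert f(x) > x            # every application strictly increases x ...
    x = f(x)

print(x)                       # ... yet x stays below 1, creeping up on it
```

After 50 iterations x is within one part in 10^15 of 1, but no finite number of applications ever reaches it.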

This function is still not that cute. The ultimate mathematical triumph would be to write f(x) without resorting to complicated devices like the integer-part function. Does anybody have an idea?

Note – A lot of the material above was developed during discussions with Robin Kothari and Nitin Basant (independently). Also, if you find a flaw in the post, please point it out.

Wow! The function you give carries with it the property that in all cases f[n](x) – f[n – 1](x) < f[n – 1](x) – f[n – 2](x). This echoes Zeno’s paradox, but is it implicit in Hofstadter’s Law as such? As in, does the law say that the time taken in successive revisions shall be less than the one before? Of course, if that were not the case, nothing would ever get done, you’d say. But then does any work, on a philosophical plane, ever get done the way you initially envisioned it? Aren’t targets and expectations revised throughout to meet deadlines? Hence, while this function might be a good representation of how things practically move, it might not always be the correct representation of Hofstadter’s Law as such.

In a discussion about this with the previously mentioned Nitin Basant, another observation was made: the computer-that-said-42 wouldn’t have said so if it didn’t have a 7 million year deadline.

True, the function does not accurately represent Hofstadter’s Law. The law is more general than the function. The function points to only one specific case, the case where the successive modifications in the expectation are decreasing. That was in fact the aim of the article – to think of ONE function that proves that it is possible to follow Hofstadter’s Law and still take a finite amount of time.

It might be fun to think of a function that does not have f[n](x) – f[n – 1](x) < f[n – 1](x) – f[n – 2](x) all the time, though. Hmmmm…

In our current version of f(x), we take x to the point exactly midway between x and [x + 1], i.e., we divide the interval in the ratio 1:1. If we make this ratio depend on [x + 1] – x, we can probably achieve what we want. So, for example, let d = [x + 1] – x; then
f(x) = (x + (sin(d))^2 * [x + 1]) / (1 + (sin(d))^2)
seems to be the desired function.

That is, instead of dividing in the ratio 1:1, we are now dividing the interval from x to [x + 1] in the ratio (sin(d))^2 : 1, since by the section formula f(x) – x and [x + 1] – f(x) come out in that proportion.
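A sketch of this modified rule (Python, my own illustration; again taking [x + 1] to be floor(x) + 1 for non-integer x):

```python
import math

def g(x):
    n = math.floor(x) + 1      # [x + 1]: the next integer above x
    d = n - x                  # distance still to cover
    s = math.sin(d) ** 2
    # weighted average of x and n: divides [x, n] in the ratio s : 1
    return (x + s * n) / (1 + s)

x = 0.5
for _ in range(20):
    x = g(x)
print(x)                       # strictly increasing, still below 1
```

Since s > 0 whenever x is not an integer, the iterates remain strictly increasing and bounded above by the next integer.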