At the cutting edge of mathematics is a function sometimes called “tetration” or the “hyperpower,” with which you stack exponentials, like 2 to the (2 to the (2 to the 2)), which you can then write as ⁴2 (the height of the stack goes in a superscript before the base). The parentheses are there because you have to evaluate the top power in the stack first. Compared with common operations like exponentiation or addition it is poorly understood, and we don’t really have a convenient or fully consistent way of finding values for things like ^(½)2. While 2^(½) is just the number that gives 2 when you bring it to the 2nd power, i.e. (2^(½))² = 2, it turns out that ²(^(½)2) is not automatically 2, and there isn’t any easy way of defining the hyperpower function for anything but integer heights (that is to say, with f(x) = ˣ2, only letting x be an integer). You can even go into negative integer heights by taking logarithms.

If you have Excel, you can type in a simple formula: in cell A1, put a 2; in cell A2, type =2^(1/A1); then do the click-drag trick (click and drag the little black square in the lower right corner of the selected cell) to copy the formula into the cells below. You end up with a column of values where each is computed from the value of the cell above it. It’s a “recursive” formula, and by the time you get to A60 the value it shows should be something like 1.55961, and 1.55961^1.55961 ≈ 2. Or we can say ²1.55961 ≈ 2 (“≈” means “approximately equal to”; we have to say “approximately” because there are infinitely many digits to this number that starts 1.55961, but six of them are enough to make the point I am trying to make).
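The same column of cells is easy to reproduce in a few lines of code. Here is a minimal Python sketch (the function name `hyper_root_of_2` is my own, not from the post) that applies the Excel formula sixty times, mirroring cells A1 through A60:

```python
def hyper_root_of_2(iterations=60):
    """Iterate y <- 2**(1/y), mirroring the Excel column A1..A60."""
    y = 2.0  # cell A1
    for _ in range(iterations):
        y = 2.0 ** (1.0 / y)  # each cell computes =2^(1/<cell above>)
    return y

y = hyper_root_of_2()
print(round(y, 5))       # 1.55961
print(round(y ** y, 5))  # 2.0, since y = 2^(1/y) implies y^y = 2
```

The loop converges because each step shrinks the error (the iteration map has slope of magnitude ln 2 / y ≈ 0.44 near the answer, so the error roughly halves every step); sixty iterations is far more than double precision needs.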

Why do I think this is interesting? Because despite the mathematical establishment’s efforts (and there have been a fair number; just Wikipedia “tetration” or go to here), this Excel method is probably by far the easiest way to come up with these hyperpower “roots” (to borrow the term). It doesn’t have the rigor of methods like finding a complete series expansion for the function, which could be used with any complex number and to find derivatives and integrals, but there is some basic logic here: if Excel is going to settle on a number (and not infinity, or zero, or some chaotic values), the number it settles on must be one that satisfies the relation y = x^(1/y). And if x equals y^y, this relation is always true.
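That logic generalizes beyond x = 2, with one caveat worth flagging. A hedged sketch (the function name `hyper_square_root` is my own): the same fixed-point iteration y ← x^(1/y) settles on the y satisfying y^y = x, but only while the slope of the iteration map stays below 1 in magnitude, which works out to y < e, i.e. roughly x < e^e ≈ 15.15. For larger x, such as 27, the sequence bounces around the answer instead of settling on it.

```python
def hyper_square_root(x, iterations=200):
    """Fixed-point iteration y <- x**(1/y); at the fixed point, y**y == x.

    Converges only when the solution y is below e (roughly x < e**e);
    beyond that the iteration oscillates rather than settling.
    """
    y = float(x)
    for _ in range(iterations):
        y = x ** (1.0 / y)
    return y

print(round(hyper_square_root(4.0), 6))  # 2.0, because 2^2 = 4
# For x = 27 the true answer is 3 (3^3 = 27), but since 3 > e the
# iteration is unstable and never converges to it.
```

This instability is one concrete reason the Excel trick, convenient as it is, falls short of a full definition of the hyperpower function.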

In essence, what we have done is set a constraint, in the form of this recursive formula, and then let the program run. Excel calculates by using computer code, which is processed by the CPU, which is really just a finely manufactured collection of basic circuit elements like transistors. Physicists can describe very precisely how these basic circuit elements work. In a broad stroke of reductionism, the universe is doing the calculation. We may write the code and design the CPU, but if the way electrons behave were to change, the computer screen would display different results. If you are one to believe in the material nature of the mind, you might argue that all the calculations that have ever been or will ever be done are actually done through the repetition of events that take place in the physical universe and that we associate with an abstract mathematical process: the universe is doing all our work.

That’s not to say the universe is perfect. Quantum mechanics implies a sort of “truth in the limit”: the average result of an identical event measured an infinite number of times will be a mathematically predictable value, but for any given event there is some degree of deviation from this average. Through the ordered and also inherently random nature of the universe, its components evolve and move toward the stabilities provided by natural constraints. Just like the numbers in the Excel file, which are actually physical processes that we correlate with the abstract theory of mathematics, the particles of the universe interact solely through events that correspond (though not completely) to mathematical entities and reveal the qualities of reality. The anthropic nature of the universe, that we and everything in it exist because of its properties, is the fulcrum of physics. And if you believe in the power of reduction, we exist as the collection of events and changes that arise out of a combination of mathematics and inherent randomness. The various parts of ourselves that comprise any substance or action are a retention of data and the program of natural process that erases part of the data and then conducts an action based on it, but in a manner that is obscured from us. We are the likely and stable members that arise from the universal tendency. But we have to hope that we don’t have the full story, or that entropy is conquerable, or that dark energy doesn’t exist, or else we are only finitely stable.


12 Responses to “Universal Attractors”

This is really fascinating. I like the “universe doing the calculation” bit, because people often forget that computers and media, as abstract as their operation may seem to us, are still operating on a very real, physical set of components and processes.

Another point: “And if you believe in the power of reduction, we exist as the collection of events and changes that arise out of a combination of mathematics and inherent randomness. The various parts of ourselves that comprise any substance or action are a retention of data and the program of natural process that erases part of the data and then conducts an action based on it, but in a manner that is obscured from us.”

This part is particularly interesting to me, because I actually see quite a few parallels with Deleuze, mainly something that he dragged out from the Hume swamp: the concept of contemplation, which also ties into Bergson a bit. But the main idea is that being is not a state, it’s a process. In this particular instance they talk about it as a synthesis of time, wherein the “contemplating” life synthesizes time by contracting what has happened toward the present and creating expectations, judging possibilities for the future. In an essay I read about recording media, someone characterized Deleuze’s descriptions of them as essentially a system of separate data banks in a flowing process of transcoding information more or less perfectly, sometimes with remainders.

Maybe this is something to discuss further, but I would be interested to see what you think about that, because there are some weird consonances here.

no, that’s not really what it means at all, the idea is that when a code is exchanged between two coded bodies, the transcoding is more or less perfect, so sometimes parts of the codes from one thing are either changed or elided by the other.

So, for example, we eat food, but our digestion is not perfect, so we produce waste. The transcoding is imperfect. This is a very reductive example, but the excess, or remainders, are often then wrapped up in other systems. So, not really after-life or after-death, but just life as such.

but think about it — a human as a code (which is what we are, encoded information), and the universe, or perhaps, less the universe and a ‘something else’. we perish, but are transferred, or transcoded, into another coded body.

the waste from that being the many ‘illogical’ things that seem to appear throughout the world, which we perceive. you know, ‘disturbances’.

okay, so i veered off the idea of what you’re talking about being an after-death idea, but i find them to be linked, but not the genesis.

shoot, humans could very well be the waste of the universe, the “waste code” waiting to be reassimilated into the main coding bodies…

where’s the connection between the illogical waste element, and then vanishing into the quantum uncertainty?

can any one thing ever be removed forever from the universe?

to be removed means there is something on the other end of the spectrum where the ‘removed’ must exist.

removed is linked to moving and movement, and remove is just another term for movement, no? so we’re just bouncing between one known aspect of the universe and one unknown. but still a part of the same universe…

The connection I’m (very loosely) hypothesizing about lies in how Quantum Mechanics and theories of “completeness” handle how we look into the world: They break it down into the logical (the predictable, the understandable, the formalisable) and the illogical (the inherently random, the parts that we can’t understand, the “elsewhere” parts of the universe that are physically beyond all our scope).
I was using “removed” in the sense of “destroyed.” Yes, the universe does do this; and as for positing some “other end of the spectrum,” I don’t see why that must exist. The universe does not retain 100% of all information. This loss of information has a lot to do with entropy, if you want to look into it further.

The illogical waste element vanishes into the quantum uncertainty because if it was some logical part it would be understandable and therefore not something I would call waste. Quantum Mechanics tells us that we can have some limited degree of predictability but there is also an inherently random bit that we fundamentally can’t understand or get rid of. It is simply how the universe works.

I forgot to consider entropy. Information loss seems to be a much-overlooked idea in contemporary philosophy.

The idea I was pushing at is that each body, or heterogeneous network, is in some manner constructed by a code. Each single molecule within a coded network (not necessarily a molecule in the scientific sense) probably belongs to multiple other networks, and might even be a network itself. These networks interact, so this is the transcoding I was talking about. Seems to dovetail with what you are talking about, Jim.

The remainder in this case, since no transcoding is perfect, is just something that is excised from the network, which I suppose would also allow for the concept of entropy and information loss. It is a philosophy of uncertainty and indeterminacy, much like what you seem to describe.

I have no real squabbles or points to make, I just thought that it might help a bit if I were to clarify my point some.