More on M-space, N-space, and so on

I've been reading Norman Friedman's 1997 book The Hidden Domain: Home of the Quantum Wave Function, Nature's Creative Source. One of the interesting things about this book is that Friedman, in addition to relying on standard sources, takes advantage of insights provided by Seth, the discarnate entity channeled by Jane Roberts. Friedman finds Seth's interpretation of ultimate reality to be useful in making sense of subatomic phenomena, which is perhaps noteworthy in light of the common complaint that channeled material never contributes anything to the understanding of science.

Incidentally, the Friedman excerpts presented below do not owe anything to Seth, whose influence comes later in the book. Here, Friedman is relying only on mainstream sources.

Friedman, page 23:

Einstein ... rejected Newton's notion of an absolute space, at rest and immovable, relative to which all objects in the universe are moving. Einstein suggested that this description of space be discarded because any observer, in whatever frame of reference, could say that he or she was at rest and all else was moving relative to them.

M-space, or what we call the "physical world," is a projection of a virtual reality tailored to the point of view of our particular consciousness. Therefore, each of us is the focal point of our own private M-space, and from our point of view, everything else is in motion relative to our fixed vantage point. By analogy, the VR environment of a computer game typically moves relative to the point of view of the user's avatar.

Einstein's original four-dimensional space-time is now understood to be a projection, or a shadow, of a larger higher-dimensional space called the "fiber bundle." The elementary particles and nongravitational forces are then interpreted as geometric structures in the higher dimensions beyond Einstein's space-time.

M-space is a projection or shadow of the larger higher-dimensional N-space. Elementary particles are interpreted as informational structures in N-space.

Friedman, page 45:

The uncertainty principle leads to the idea that one electron (or any particle) is in principle indistinguishable from any other.

The best comment on this is from Whitworth's essay, linked above:

Quantum equivalence: All quantum entities, like photons or electrons, are equivalent.

Digital equivalence: Every digital "object" created by the same code must be equivalent.

Or as Whitworth put it in an earlier version of his essay (I'm paraphrasing): Every digital symbol calculated by the same program is identical to every other, just as every photon is identical to every other photon because each is created by the same digital calculation.
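The digital half of the parallel is easy to demonstrate. Here is a minimal toy sketch of my own (not from Whitworth's essay; the `Photon` class and its attributes are purely illustrative): two objects produced by the same code with the same parameters are indistinguishable in every attribute.

```python
from dataclasses import dataclass

# Toy illustration (my own, not Whitworth's): two "photons" produced by
# the same code are equal in every attribute, hence interchangeable.
@dataclass(frozen=True)
class Photon:
    frequency_hz: float
    polarization: str

def emit_photon() -> Photon:
    # The same calculation always yields the same kind of object.
    return Photon(frequency_hz=5.0e14, polarization="vertical")

a = emit_photon()
b = emit_photon()
print(a == b)  # True: nothing distinguishes one from the other
```

Nothing in the program can tell `a` from `b`, just as nothing in nature distinguishes one photon from another.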

Friedman, page 49:

Thus, it would seem that we live in a world of at least two levels: the wave function is on one level and correlates with an object of affection in the other level, which is ordinary three-dimensional reality.

N-space, corresponding to the wave function, is one level, and M-space is what we call three-dimensional reality.

Friedman on the two-slit experiment, page 254:

If we duplicate this experimental arrangement throughout the world with many experimenters, each firing just one electron at a given prearranged time, with each individual photographic plate showing the arrival of the one electron, and the results from all the plates are added together, then, amazingly, the interference pattern shows up again! These experiments are arranged so that no signal can travel between them at less than the speed of light, so there can be no physical communication between the electrons. Just how does each electron know where to strike the plate so that the interference pattern appears?

The electrons know where to strike the plate because the calculations have already been performed in N-space. The fact that the experiments are being carried out in different labs is irrelevant, since N-space is nonlocal. Since it is all one experiment (albeit broken up into different parts), the outcome is determined by a single set of calculations in N-space.

Friedman, page 56:

An explanation for this strange wave-particle duality, called the Copenhagen interpretation, was presented by Bohr in 1927 at the fifth Solvay conference in Brussels. The simplest statement of Bohr's view is this: the quantum world is not real. Bohr recognized that the quantum world is completely different from our normal everyday world governed by the familiar laws of classical physics. Though we have mathematical formalisms to describe it, the unreal world of the quantum relates to the real world only by an act of measurement.

I would prefer to flip this around and say that M-space is not fully real, while N-space – the underlying information realm – is what is ultimately real. However, it depends on how you look at it. The key point is that the two dimensions of "reality" are qualitatively different, and that N-space can be understood only in terms of "mathematical formalisms," which makes sense given that it is a realm of pure information and information processing.

Friedman, page 58:

In classical physics, mathematics is used to represent the attributes of a system. The formulae are taken at face value and assumed to be actual descriptions of the evolution of the system. In quantum theory, the situation is quite different. Here the mathematics is an algorithm for calculating the results of experiments, at least as far as the Copenhagen interpretation is concerned. Actuality is no longer considered, but has evaporated into the mists of the mystical.

I would look at this differently, and say that N-space is the actuality, while our three-dimensional "physical" world is, in a sense, part of the "mists of the mystical" – in the sense that it is a projection of consciousness.

Friedman describes his own point of view on page 62:

It is our thesis that the wave function is not merely a symbol to be used in a calculational procedure. Rather, it has real meaning and describes a hidden domain that is the creative source of our three-dimensional world.

This, unlike the views quoted earlier by Friedman, is essentially the same as the M-space/N-space idea. N-space is the "hidden domain that is the creative source of our three-dimensional world," and "is not merely a symbol to be used in a calculational procedure." Rather, the calculational procedures are the basis of what we call reality.

Friedman, footnote on page 64:

Some scientists think the space continuum may actually be grainy, which implies a universal minimum length, usually estimated to be 10^-33 cm. Every gravitational wave in space has a zero-point energy and because of that, the lengths below 10^-33 cm become undefinable.

The universal minimum length in a virtual-reality universe would correspond to a single pixel on a computer screen, the smallest image that can be displayed. The universal minimum time would correspond to the length of time between screen refreshes.

Friedman, page 68:

We can conjecture that there are two types of time. One type describes a sequence of actual events brought about by the collapsing wave function, which is the time we are aware of in the three-dimensional universe. The other kind of time, used to create space-time in Einstein's relativity theory, refers to the evolution of possibilities in the Schrodinger equation. Since possibilities are not real events, that type has been called imaginary or virtual time.

The "time" that applies to N-space is qualitatively different from the "time" through which we proceed in M-space. This is why precognition and retrocognition are possible; these abilities apparently involve tapping into N-space directly.

In what way are the mathematical formulae of N-space "rendered" into the multidimensional, multisensory images of M-space? Friedman discusses how the probabilities of the "hidden domain of the quantum wave function" can be translated into what we know as realities. This is a little complicated, so I will summarize in a series of steps.

1. Schrodinger's equation expresses the quantum wave function as a complex equation, i.e., a mixture of real numbers and imaginary numbers. This wave can be written as a sum of sine waves using Fourier's equations, but the sine waves are also complex functions (with imaginary numbers).

2. Imaginary numbers cannot be plotted as coordinates in physical space. Therefore Schrodinger's equation cannot tell us the location of the subatomic particle in space.

3. Is there any way to convert imaginary numbers into real numbers and thus locate the particle in physical space? Yes. If you multiply an imaginary number by its complex conjugate, the result is a real number.

4. What is the complex conjugate of the quantum wave function? All waves can be understood as "retarded waves," which travel forward in time, and "advanced waves," which travel backward in time. The advanced quantum wave is the complex conjugate of the retarded quantum wave; the product of these two waves gives us a wave function that can be expressed purely in real numbers, and which thus provides us with a location in physical space.

In terms of our scenario, we might think of N-space as consisting of complex equations - a mix of real numbers and imaginary numbers. What we call "rendering" would consist of multiplying these equations by their complex conjugates. The result would be real numbers only, or "physical reality." Note that this process can be understood either in terms of mathematical calculations (multiplication) or wave interactions. N-space, then, could be said to consist of "complex equations" or of "quantum waves" - two different ways of looking at the same thing. Equations relate better to the computer analogy, while waves relate better to the hologram analogy. The two analogies (or metaphors, or models) are very similar, since a hologram can be created entirely by a computer, and since the wave interference patterns on a holographic plate can be translated into data.
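Steps 3 and 4 can be checked numerically. A minimal sketch of my own (the particular amplitude chosen is arbitrary): a complex amplitude multiplied by its complex conjugate always yields a purely real, non-negative number, which quantum mechanics interprets as a probability density.

```python
import cmath

# Illustrative sketch: a complex quantum amplitude at a single point,
# with magnitude 0.6 and an arbitrary phase of 0.7 radians.
psi = 0.6 * cmath.exp(1j * 0.7)

# Multiplying by the complex conjugate (the "advanced" wave times the
# "retarded" wave, in Cramer's picture) leaves a purely real number.
density = (psi.conjugate() * psi).real

print(abs(psi) ** 2)  # ~0.36
print(density)        # ~0.36, with no imaginary part remaining
```

However the phase of `psi` is varied, the imaginary parts cancel exactly; only the real magnitude squared survives, which is what lets the result be plotted in ordinary space.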

Friedman sums up the sequence of steps outlined above, page 73:

The basic tenet of Cramer's transactional interpretation of quantum theory (as it is called) is that every quantum event involves a kind of 'handshake' between the past and the future, so that in some way, the future is affecting the past.

In other words, the retarded wave and the advanced wave "shake hands" across time - something that's hard to visualize in our three-dimensional world (M-space), but easier to understand in the context of N-space, where time behaves differently.

Some additional points are made in Brian Whitworth's essay. (The following are not verbatim excerpts but paraphrases and abridgements.)

Processing load effects could explain relativity effects. Space and time arise from a fixed information processing allocation, so the sum total of space and time processing adds up to the local processing available.

In other words, time appears to expand and space appears to contract as you approach light speed because the information processing system is reaching the limit of its processing load. (See Whitworth's essay for details.)

The algorithmic simplicity of fundamental physical laws and constants is explained by the needs of the information processing system. In a virtual reality, the basic rules must be simple because they must be constantly calculated and recalculated.

If complementary object properties use the same memory location, the object can appear as having either position or momentum, but not both at once. (See the website The Bottom Layer for a step-by-step discussion of this point.)
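A toy sketch of my own may make the shared-memory idea concrete (the class and method names are hypothetical, purely for illustration; see The Bottom Layer for the actual discussion): if position and momentum share a single storage slot, writing one necessarily evicts the other.

```python
class ComplementarySlot:
    """Toy model (hypothetical): one memory slot shared by two
    complementary properties. Writing either value overwrites the
    other, so the object 'has' position or momentum, never both."""

    def __init__(self):
        self._slot = None  # holds (property_name, value) or None

    def set_position(self, x):
        self._slot = ("position", x)

    def set_momentum(self, p):
        self._slot = ("momentum", p)

    def read(self, name):
        if self._slot and self._slot[0] == name:
            return self._slot[1]
        return None  # the complementary property is undefined

particle = ComplementarySlot()
particle.set_position(3.2)
print(particle.read("position"))  # 3.2
print(particle.read("momentum"))  # None: storing position erased it
particle.set_momentum(0.8)
print(particle.read("position"))  # None: now only momentum exists
```

The point of the sketch is only that mutual exclusivity falls out naturally from a storage constraint; no extra rule forbidding "both at once" needs to be added.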

In a VR universe, all object movement would be expected to be by state transitions.

Quantum jumps and quantum tunneling are known examples of state transitions - i.e., discontinuous movement, in which a particle shifts its energy level or its position from one state to another without passing through the intervening states. This is difficult to explain in terms of objective reality, but easy enough to understand if the transitions reflect calculations taking place in N-space. The particle does not need to pass through the intervening states, because it simply shifts from one state to another as a result of a behind-the-scenes calculation. The new state shows up as soon as the virtual-reality screen is refreshed.
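A discrete update loop of my own devising makes the refresh idea concrete (the function and its update rule are purely illustrative): the state exists only at refresh ticks, so it never occupies any value in between.

```python
# Toy sketch (my own): movement as state transitions. The state is
# computed "behind the scenes" and shown only at each refresh tick;
# nothing ever passes through the values in between, analogous to a
# quantum jump between energy levels.
def refresh(state, rule, ticks):
    history = [state]
    for _ in range(ticks):
        state = rule(state)      # next state, calculated off-screen
        history.append(state)    # ...displayed only at the refresh
    return history

# An "energy level" that jumps by 3 per tick: 0 -> 3 -> 6,
# never taking the intermediate values 1, 2, 4, or 5.
print(refresh(0, lambda level: level + 3, ticks=2))  # [0, 3, 6]
```

In such a scheme, asking what the particle was doing "between" states is as meaningless as asking what a game character was doing between frames.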

A virtual-reality system may start with a sudden influx of information as the virtual-reality universe boots up. This corresponds to the apparent origin of the universe out of nothing, as posited by the Big Bang theory.

Finally, Whitworth asks:

Given the speed of light is a universal maximum, what is simpler, that it depends on the properties of featureless space, or that [it] represents a maximum network processing rate?

In this view, the speed of light as an absolute maximum simply indicates the maximum speed at which the information processing system can crunch the numbers. In the M-space/N-space scenario, the limit may also involve the maximum capacity of the render engine. Note that the rendering is done individually for each observer, so the render effects would be apparent only to an observer who was approaching light speed in his particular M-space. The render effects would not be apparent to anyone not approaching light speed, because that person's unique M-space would not be under an unusually high processing load.

Comments

Michael, I object to much of what you (your sources) present above, from a scientific standpoint. But it's too much for my split mind to handle. You are remarkably productive. I am really impressed! You win by brute force. My second-language writing process is like a snail crossing the trail. And yours is like the guy who comes walking along it. Crrsshhh..

I would say that the theory that this blog is bringing forward is very much like panpsychism, a theory that was strongly favoured at the Esalen Survival conference and one that is gaining popularity among philosophers and even mainstream scientists.

Ok, just thought I would drop this in, some of you may know more about her, or the method used.

Michelle is a trained hypnotist as well as a medium who hypnotises her clients to allow them to communicate on a personal level with spirits that want to connect. If you are visual, for example, like the spokesperson on the video, you may end up in the presence of the spirit and communicate in person. She acts as the facilitator of the exchange.

The tape is long - 3/4 of an hour, and I found myself wishing they would talk faster, but of interest all the same. "Michelle" (calls herself The Corporate Woowoo) has a web site called soul-felt.com, for those who may want to experience the same.

"Is this the Norman Friedman who is described in wiki as a 'naval analyst'? If so maybe that should have read 'navel analyst' :)"

Different guy. Good joke, though!

"You win by brute force."

Not trying to win anything, really. Just tossing a bunch of ideas against the wall and seeing if any of them stick.

"I would say that the theory that this blog is bringing forward is very much like panpsychism"

I'm not very knowledgeable about panpsychism, but isn't it the view that everything is conscious - that mind pervades the universe and can be found in all things? If so, I would say that it differs from the view suggested here. Everything physical may be reducible to information, but information isn't mind (as I see it). An encyclopedia consists of a great deal of information, but it has no awareness and it can't think.

A great deal of this is beyond me (though interesting to see Seth referenced). I probably have it all wrong, but from where is everything being projected? I can sort of see the universe as being constantly refreshed - but from where or what? Interesting too that as with an image made of pixels (even just a photo in a newspaper), you can only see or understand the image from a certain vantage point (if you're too close, you just see meaningless dots). It's the same with a test for color blindness - if you can't see colors, you won't be able to see the characters in the image.

"Just tossing a bunch of ideas against the wall and seeing if any of them stick."

But you seem to find these ideas very credible. If everything is indeed calculated, do you think it possible, or even probable, that deceased spirits and mediumistic insights are simply calculations too - "imaginary beings" who don't actually occupy any space or astral dimension?

"the universe as being constantly refreshed - but from where or what?"

M-space, the mind-space we explore and call reality, is constantly refreshed for each of us. From where or what? From N-space, the underlying information field.

"If everything is indeed calculated, do you think it possible, or even probable, that deceased spirits and mediumistic insights are simply calculations too - "imaginary beings" who don't actually occupy any space or astral dimension?"

We are all "imaginary beings" in that sense. It doesn't matter if we are deceased or not. Discarnate spirits continue to project their own M-space, just as they did when incarnated. (Actually the term "discarnate" is misleading, since they still experience themselves as having bodies and moving through space, just as we do.) Nothing occupies any space, because there is no physical space. There is only N-space (pure information), M-space (a mental projection or construct), and the mind (consciousness). The "astral dimension" is just a particular kind of M-space, originating in the same information matrix that undergirds all experiences.

I'm not claiming to be sure of this, of course - just explaining what the ideas would logically entail.

I always thought of the "other side" or "heaven" as being another dimension. After learning about the holographic universe theory I figure that the "holographic film" is in another dimension, and that it's the place we call "heaven."

I don't believe there could really be "calculations" going on. If the calculations were themselves performed by *something*, then that something would be a system (like our Universe) that would require its own explanation, and we would be back to square one, so to speak. (IOW, if we are like a virtual reality program running on a physical or even non-physical system, that system would require a science of its own.)

Since there are no calculations IMO, then there is no limitation to processing power, and I think any explanation in terms of such a processing limitation is on the wrong track.

To me, it makes sense to see the speed of light as an arbitrary rule of our system and not a limitation to processing power or anything of that sort.

I think it boils down to information, but it is not like information inside a computer.

"Since there are no calculations IMO, then there is no limitation to processing power"

I'd be reluctant to give up on the idea of a limitation on processing power, because it is so useful in explaining things.

Heisenberg's uncertainty principle can be understood as the inability of the system to hold more than one of two complementary values in the same memory space at the same time. This would not be a problem in a system without limitations.

Wave-particle duality can be understood as a consequence of the preservation of processing power, with the necessary calculations being performed and the "images" being rendered only when called for (i.e., when observed). Again, this would not be an issue for a system with unlimited power.

Relativity effects, as mentioned in the post, also can be understood in terms of the limitations of processing power. As the system strains its capacity in attempting to handle near-light-speed conditions, it must slow down some of its calculations to compensate.

"If the calculations were themselves performed by *something*, then that something would be a system (like our Universe) that would require its own explanation"

I'm assuming the calculations are performed by an information processor, what Thomas Campbell calls the cosmic CPU. Of course, it is not a physical information processor. Where it comes from and how it all got started is beyond me. The cosmic CPU could be "God." But why does God have to be infinite and unlimited? (In fact, certain philosophical problems are easier to address if we take God to be limited in some respects.)

"it is not like information inside a computer."

Well, not literally a computer, of course. But remember that wave functions can be expressed as equations, and can be calculated. So if we want to picture N-space as quantum wave functions, or as wave-interference patterns in an evolving holographic plate (Bohm's "holomovement"), it still amounts to the same thing: information forming patterns and undergoing transformations in accordance with algorithmic rules.

||I'd be reluctant to give up on the idea of a limitation on processing power, because it is so useful in explaining things.||

It's not useful if it makes no philosophical sense. :)

||Heisenberg's uncertainty principle can be understood as the inability of the system to hold more than one of two complementary values in the same memory space at the same time. This would not be a problem in a system without limitations.||

We are talking about a system with near-infinite processing power anyway. It's not as though the system *tries* to do both but merely fails. It *never* does both. That means that there is a law that says that it never *can* do both.

Now, you could argue that the law was established to preserve processing power, but that doesn't make a lot of sense either. The only time this particular type of processing would be needed was when humans were doing a certain type of experiment. "Go ahead, give them both this time--just this once! Oh, they did the experiment again? OK, just give it to them every time they do the experiment--no biggie. We've got bigger fish to fry."

Same thing about wave-particle duality.

||Relativity effects, as mentioned in the post, also can be understood in terms of the limitations of processing power. As the system strains its capacity in attempting to handle near-light-speed conditions, it must slow down some of its calculations to compensate.||

I don't see why the system would "strain"; the kinds of calculations we are talking about could be performed on ordinary physical computers. It's not as though calculating in a pure Newtonian system would be harder; if anything, it would be easier.

||I'm assuming the calculations are performed by an information processor, what Thomas Campbell calls the cosmic CPU. Of course, it is not a physical information processor. Where it comes from and how it all got started is beyond me. The cosmic CPU could be "God." But why does God have to be infinite and unlimited? (In fact, certain philosophical problems are easier to address if we take God to be limited in some respects.)||

If the non-physical Cosmic CPU is limited, then there would have to be *reasons* for such a limitation--and what could they be? There would also have to be an explanation for why it (or someone/something else) chooses to save processing power in one way and not another way. Even if we call such a system "non-physical," it would be analogous to a physical system and would require its own explanation, probably its own complete science. Thus, we are back to the problem of explaining a physical system in terms of another physical or analogously physical system. We could mentally draw a big circle around the whole thing (the Universe and the Cosmic CPU) and say, "This *all* needs to be explained."

||Well, not literally a computer, of course. But remember that wave functions can be expressed as equations, and can be calculated. So if we want to picture N-space as quantum wave functions, or as wave-interference patterns in an evolving holographic plate (Bohm's "holomovement"), it still amounts to the same thing: information forming patterns and undergoing transformations in accordance with algorithmic rules.||

I agree. But I don't think actual computations are being made. I think the whole idea of "calculations" is a metaphor that leads to philosophical error in this case.

"Relativity effects, as mentioned in the post, also can be understood in terms of the limitations of processing power. As the system strains its capacity in attempting to handle near-light-speed conditions, it must slow down some of its calculations to compensate."

It is interesting that a rough value of the cosmic computer operation rate could be estimated for a "Universe simulation". This rate would be limited by the so-called "Planck time", supposedly the absolute minimum time for any physical transition, a sort of ultimate granulation of the Universe. "The Planck time is the unique combination of the gravitational constant G, the relativity constant c, and the quantum constant h, to produce a constant with units of time" (Wiki).

This estimate would be the number of elementary particles in the observable universe (about 10^80) times the maximum possible rate per second of transitions in physical states (the inverse of the Planck time, 10^45). This is 10^125 operations per second. This estimate would actually be very very low because of not accounting for dark energy and dark matter (which have been discovered to make up most of the energy and mass of the Universe), and the fact that each cosmic computer "operation" would have to actually be a complex computation of its own embodying a large number of elementary operations involving other elementary particles. This unimaginably great total computational burden would only be part of the total, since it represents just the calculations necessary to maintain "N space" independent of whether or not it is rendered into any consciousness. We still have to add the rendering calculations. And this doesn't include any simulations of other N-spaces and other M-spaces to account for afterlife states of consciousness.
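The arithmetic of the estimate is easy to verify, using the comment's own rounded order-of-magnitude figures (10^80 particles and 10^45 transitions per second; these are the commenter's values, not precise physical constants):

```python
# Back-of-envelope check using the rounded figures from the comment.
particles = 10**80            # elementary particles in the observable universe
transitions_per_sec = 10**45  # ~ inverse of the Planck time, as rounded above

ops_per_sec = particles * transitions_per_sec
print(ops_per_sec == 10**125)  # True: about 10^125 operations per second
```

Since the exponents simply add, the product is exactly 10^125 under these rounded inputs; any correction for dark matter, dark energy, or per-operation complexity only pushes the figure higher, as the comment notes.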

"But I don't think actual computations are being made. I think the whole idea of 'calculations' is a metaphor that leads to philosophical error in this case."

This reminds me of Searle's idea that nature does not calculate; rather, calculations are only an instrument that we apply to nature. It would be like saying that the planets solve Newtonian calculations in order to orbit, when in fact the planets simply orbit, and Newtonian calculations are our way of describing their orbits. Even so, this does not exclude that fundamental limitations are to be admitted as primitive.