You can use the terms "and" & "or" in your search; "or" phrases are resolved
first, then the "and" phrases. For example, searching for "black hole and
galaxy or universe" will find articles that have the phrase "black hole" in them
and also have either "galaxy" or "universe" in them. Please note that other
search syntax, such as quote marks and hyphens, is not currently supported.
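The precedence rule above ("or" phrases resolved first, then the "and" phrases) can be sketched roughly as follows. This is only an illustration of the stated behaviour, not the site's actual implementation:

```python
def matches(query: str, text: str) -> bool:
    """Evaluate an and/or search query: "or" phrases are resolved first,
    then the "and" phrases, as described above."""
    text = text.lower()
    for clause in query.lower().split(" and "):
        # Each "and" clause is a list of "or" alternatives.
        alternatives = clause.split(" or ")
        # The document must contain at least one alternative from every clause.
        if not any(alt in text for alt in alternatives):
            return False
    return True
```

With this reading, "black hole and galaxy or universe" matches any page containing "black hole" together with either "galaxy" or "universe".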

When you view web pages with matches to your search, the terms you searched for will be highlighted in yellow.

If you are aware of an interesting new academic paper (published in a peer-reviewed journal or posted on the arXiv), a conference talk (at an official professional scientific meeting), an external blog post (by a professional scientist), or a news item (in the mainstream news media) which you think might make an interesting topic for an FQXi blog post, please contact us at forums@fqxi.org with a link to the original source and a sentence about why you think the work is worthy of discussion. Please note that we receive many such suggestions, and while we endeavour to respond to them, we may not be able to reply to all of them.

Please also note that we do not accept unsolicited posts, and we cannot review, or open new threads for, unsolicited articles or papers. Requests to review or post such materials will not be answered. If you have your own novel physics theory or model which you would like to post for further discussion among the FQXi community, please add it directly to the "Alternative Models of Reality" thread or to the "Alternative Models of Cosmology" thread. Thank you.

There is no way to reduce physics to information theory. Matter is not just empty space with isolated bits of information. The quantum is not digital data, logic, probability, or information. There is a long history of trying to understand the ethereal mysteries of quantum mechanics by reduction to discrete information, as if the universe were a giant ghostly digital computer without the hardware. These attempts have failed, and should be seen as evasions of the central truths of quantum mechanics. In short, there is no it from bit.

Author Bio

Roger Schlafly has a BSE from Princeton U, and a PhD in Mathematics from U California Berkeley, under I. Singer. He blogs at DarkBuzz.com.

While I find nothing I can disagree with in your essay, I think you're far too sober. You do not tolerate mystical views of non-observable "entities" that form the basis of much, if not most, of today's physics. How can you not be obsessed about whether information is lost in black holes (based on the yet-to-be-proven existence of black holes as singularities behind event horizons)? What's wrong with you? Maybe there's something wrong with your imagination (as with mine).

So I thank you for your impeccable logic and your being grounded in the sanity of real territory versus abstract maps and sophisticated fairy tales. You clarify concepts that are muddied and confused by long habits of "rigorization" of math (per James Beichler) and "ephemeralization" of matter.

You wrote an interesting essay, and I agree with what you wrote about matter. Although contemporary physics gives us better insight into matter, we still have a long way to go, and in fact we can never be sure that we have a true understanding of reality. But we can agree that, whatever is in the background, and no matter what size the elementary particles may have, one thing is clear: matter creates fields and interactions.

The other fact is that what we see is the trace of something that is in fact not visible. What's your view on that?

You seem to put forward a number of propositions as being commonly regarded as true, then state that they are not. But I am not sure that they are commonly regarded as true in the first place.

For example:

- “but the suggestion has been made that information is more fundamental”. Where is this the case, when it is not translated as ‘information is all we have with which to discern the it’? Similarly…

- “The history of science could be viewed as systematically denying the substantiality of matter”. Again, I am not aware of any mainstream thinking that has ever argued that ‘there is nothing there’. “It is now commonplace to say that modern physics has proved that there is no such thing as solid matter because it is almost entirely empty space”. Indeed, but this is not the same as saying there is no substance, just that there is less than we thought, or than there appears to be.

By definition, if we physically receive something, which we do, then it physically exists. Indeed, we physically exist. The issue is that our ability to receive what is there is limited. And if something exists, then it is comprised of something. And to exist inherently involves a discrete physically existent state of that something. To deny that is to deny existence as manifest to us. What may or may not be ‘actually’ happening, we can never know.

The issue with QM is that it asserts a form of indefiniteness in existence, which then, because that presumption is wrong, has to be rationalised by other incorrect assertions, for example, the role of observation.

I supplied a quote for matter being mostly empty space. The quote did not say that matter was entirely empty space.

I do not agree that QM asserts a form of indefiniteness in existence.

Jacek Safuta wrote on Jun. 22, 2013 @ 10:45 GMT

Hi Roger,

I agree that there is no way to reduce physics to information theory. But I think that it is possible to reduce physics to pure geometry. You should like that view as a mathematician.

But then matter is just empty space, or rather a deformed region of conformally flat spacetime (so every entity is a wave packet, and we are close to Schrödinger and Einstein at the same time).

I think my publications could help you change your mind, because my concept generates clear predictions that can be falsified by experiment. If the experiment falsified my idea, then of course the Standard Model and your concept would win.

Roger, I fully agree with your sentiments and your statement "Whatever uncertainty there is may be entirely due to our lack of knowledge about the state, and the discreteness imposed by the measuring process." I would add that our mathematics is not complete enough (Gödel) to describe quantum processes.

I can add support with a simple analogy: these guys can control their spinning tops, but can we model deterministically the motion of two spinning tops after they collide? I bet you the best we can do is a QM-like probabilistic calculation, the reason being that there are simply too many variables.

The only critique I have is that, after you raise doubt in the field of QM, this doubt is not carried forward when you raise the subject of black holes and quantum information; black holes, in my opinion, are figments of our imagination and artifacts of general relativity.

I personally believe in the power of mathematics; in my essay I present a simple problem in physics. Others may call it a paradox; mathematicians would call it a counterexample. I am curious about your professional reaction to my essay.

PS. I just discovered your blog, excellent work!

PPS. To counter the cowardly 1 that every new essay seems to be given by some joker, a 10 to even things out will not harm.

I found your essay to be superbly constructed and exquisitely written, with all of its major points meticulously analyzed and dealt with in perfect thoroughness and clarity.

Please bear with me; I am a creaky old self-taut (thinking makes me tense) realist. I am uneasy about the assertion that just because physicists can break some purified matter down into particles that occupy scads of empty space, that means that I and the chair I sit on have to be made of the same sort of particles. The chair and I obviously occupy a different part of the Universe than your particles do. Where does the reality lie, sir? Is it in my head and my seated bottom, or is reality located in your spaced-out particles? It cannot be in both places, can it? How can I know about the space of your particles and not know of the supposedly only empty space surrounding the particles you are suggesting are contained compactly in my brain?

The “free willers” who are convinced that only determinism or randomness could exist are completely wrong. As I have pointed out in my splendid essay BITTERS, only unique exists, once. Unique cannot be determined. Unique cannot be random. Unique cannot be probable. Unique can only ever be inevitable.

It is very refreshing to read a well written skeptical essay and I share much of your skepticism. However, in my view, there are a few misconceptions in your essay.

- The so-called "free will theorem" does not establish that particles have free will or exhibit genuine stochasticity, whatever those terms may mean. It is just another proof of Bell's theorem, pure and simple. Of course, Kochen and Conway do not conclude this, stating instead that measurement outcomes must be undetermined prior to measurement. However, they fail to note that this is incompatible with the other assumptions they have made. In particular, TWIN implies that measurement outcomes on the two wings have to be perfectly correlated, and the only way this can happen in a hidden variable theory is if it is deterministic. Therefore, undetermined measurement outcomes are not an option unless you give up at least one of their other assumptions, with locality and realism being the obvious choices.

- In quantum cryptography, or more accurately quantum key distribution, the goal is for Alice and Bob to end up sharing correlated secret bits that are uncorrelated with an eavesdropper. It is not to share some kind of ill-defined quantum information. The laws of QM show that they can do this, or more accurately that they can expand an existing secret key into an arbitrarily long one if we are relying on a classical protocol for authentication. This is not something you can do with classical systems, so I do not understand what your criticism of this is about.

- There are secure quantum authentication protocols. Of course, one would need a quantum computer to implement them, and you may be skeptical of that.

- Speaking of which, I don't understand your skepticism of quantum computing. Surely the free will theorem has nothing to do with it. If the mathematics of QM says that we can efficiently factor integers, then we can do so, assuming we can get over the engineering difficulties. What precisely is the problem with Shor's algorithm?

Thanks for your comments. The free will theorem may be overstated. I do not dispute that. My only point is that it is an additional reason to believe that whatever info is communicated by an electron beam, it is not classical info like ordinary bits.

Expanding a short shared secret to a long shared secret is easily done with classical cryptography. Just iterate a secure hash function, for example. It is vulnerable to someone with infinite computing resources, as is all practical cryptography. Quantum cryptography substitutes some other vulnerabilities. I just don't see any practical utility to it.
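The classical key-expansion idea mentioned here ("just iterate a secure hash function") can be sketched as below, assuming SHA-256 as the hash. This is a bare-bones illustration of the principle, not a vetted construction; a production system would use a standard KDF such as HKDF:

```python
import hashlib

def expand_key(secret: bytes, length: int) -> bytes:
    """Stretch a short shared secret into a longer pseudorandom key stream
    by iterating SHA-256 (a sketch of the idea, not a production KDF)."""
    out = b""
    state = secret
    while len(out) < length:
        state = hashlib.sha256(state).digest()  # feed each digest back in
        out += state
    return out[:length]
```

As the comment notes, the stream is only as strong as the attacker's computing budget: anyone who can invert or exhaust the hash recovers everything, which is true of all practical cryptography.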

My skepticism about quantum computing has nothing to do with the free will theorem, and I do not dispute the mathematics of Shor's algorithm. Mainly I just think that a quantum computer executing Shor's algorithm would be surprising in a way that goes way beyond the standard experiments confirming quantum mechanics. But that is outside the scope of this essay.

Author Roger Schlafly replied on Jun. 23, 2013 @ 03:04 GMT

I just stumbled across this quote: "Free will is to mind what chance is to matter." — Charles Darwin, Notebook M (begun July 1838), in Paul H. Barrett and Peter J. Gautrey (eds.), Charles Darwin's Notebooks, 1836-1844 (1987, 2009), 536.

basudeba mishra wrote on Jun. 23, 2013 @ 02:21 GMT

Dear Sir,

Mathematics explains only “how much” one quantity accumulates or reduces in an interaction involving similar or partly similar quantities and not “what”, “why”, “when”, “where”, or “with whom” about the objects involved in such interactions. These are the subject matters of physics. The validity of a physical statement is judged from its correspondence to reality. The validity of a mathematical statement is judged from its logical consistency. Your essay is logically consistent.

Because of over-dependence on mathematical modeling, the cult of incomprehensibility, the search for easier and faster ways like reductionism, and superstitious belief in established theories, the progress of science has been hampered. Hence there is a need to look afresh at the prevailing theories in a logically consistent manner, based on the data now available, and to make changes wherever necessary. One such area is division by zero. Because of your mathematical background, we are putting these points before you.

Division of two numbers a and b is the reduction of dividend a by the divisor b or taking the ratio a/b to get the result (quotient). Cutting or separating an object into two or more parts is also called division. It is the inverse operation of multiplication. If: a x b = c, then a can be recovered as a = c/b as long as b ≠ 0. Division by zero is the operation of taking the quotient of any number c and 0, i.e., c/0. The uniqueness of division breaks down when dividing by b = 0, since the product a x 0 = 0 is the same for any value of a. Hence a cannot be recovered by inverting the process of multiplication (a = c/b). Zero is the only number with this property and, as a result, division by zero is undefined for real numbers and can produce a fatal condition called a “division by zero error” in computer programs. Even in fields other than the real numbers, division by zero is never allowed.
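The "division by zero error" described above is easy to reproduce. A minimal Python illustration, in which the undefined case is caught rather than allowed to crash the program:

```python
def safe_divide(c, b):
    """Return c/b, or None when b is zero, where the quotient is undefined."""
    try:
        return c / b
    except ZeroDivisionError:  # the fatal runtime condition described above
        return None

print(safe_divide(10, 2))  # 5.0
print(safe_divide(10, 0))  # None
```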

Now let us evaluate (1+1/n)^n for any number n. As n increases, 1/n shrinks. For very large values of n, 1/n becomes almost negligible. Thus, for all practical purposes, (1+1/n) = 1. Since any power of 1 is also 1, the result is unchanged for any value of n. This position also holds when n is very small and negligible, because in that case we can treat it as zero, and any number raised to the power of zero is unity. There is a fatal flaw in this argument, because n may approach ∞ or 0, but it never "becomes" ∞ or 0.

On the other hand, whatever the value of 1/n, it will always be more than zero, even for large values of n. Hence, (1+1/n) will always be greater than 1. When a number greater than 1 is raised to increasing powers, the result becomes larger and larger. Since (1+1/n) will always be greater than 1, for very large values of n the result of (1+1/n)^n will also grow ever bigger. But what happens when n is very small and comparable to zero? This leads to the problem of "division by zero". The contradictory results shown above were sought to be resolved by the concept of the limit, which is at the heart of calculus. The generally accepted concept of the limit leads to the result: as n approaches 0, 1/n approaches ∞. Since that created all the problems, let us examine this aspect closely.
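For reference, the quantity under discussion can be computed directly. In standard analysis (1+1/n)^n is indeed increasing in n, but it does not grow without bound; it approaches Euler's number e ≈ 2.71828:

```python
import math

# (1 + 1/n)**n increases with n but levels off near e = 2.71828...
for n in (10, 1000, 100000, 10**7):
    print(n, (1 + 1/n) ** n)
```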

Now, let us take a different example: a_n = (2n^2 + 1)/(3n + 4). Here n^2 represents a two-dimensional object, such as an area or a graph. Areas or graphs are nothing but sets of continuous points in two dimensions. Thus, it is a field that varies smoothly without breaks or jumps and cannot propagate in a true vacuum. Unlike a particle, it is not discrete, but continuous. For n = 1, 2, 3, ..., the value of a_n diverges as 3/7, 9/10, 19/13, ... For every value of n, the value at n+1 grows faster than the earlier rate of divergence, because the term n^2 in the numerator grows at a faster rate than the denominator. This is not what happens in physical accumulation or reduction. In division, the quotient always increases or decreases at a fixed rate in proportion to the changes in either the dividend or the divisor or both.

For example, 40/5 = 8 and 40/4 = 10. The ratio of change of the quotient from 8 to 10 is the same as the inverse of the ratio of change of the divisor from 5 to 4. But in the case of our example a_n = (2n^2 + 1)/(3n + 4), the ratio of change from n = 2 to n = 3 is from 9/10 to 19/13, which is different from 2/3 or 3/2. Thus, the statement:

lim_{n→∞} a_n = lim_{n→∞} (2n^2 + 1)/(3n + 4) → ∞,

is neither mathematically correct (as the value for n+1 is always greater than that for n, and never in a fixed ratio n/(n+1)) nor can it be applied to discrete particles (since it is indeterminate). According to relativity, wherever a speed comparable to light is involved, as for a free electron or photon, the Lorentz factor invariably comes in to limit the output. There is always a length, mass, or time correction. But there is no such correcting or limiting factor in the above example. Thus, the present concept of the limit violates the principle of relativistic invariance for high velocities and cannot be used in physics.

If we divide 20 by 5, what we actually do is take bunches of 5 out of the lot of 20. When the lot becomes empty, or the remainder is below 5 (the divisor) so that it cannot be considered a bunch and taken away, the number of bunches of 5 is counted. That gives the result of the division as 4. In the case of division by zero, we take out bunches of zero. At no stage does the lot become zero or less. Thus, the operation never completes, and the result of the division cannot be known, just as, while dividing 20 by 5, we cannot start counting the result after taking away only three bunches. Conclusion: division by zero is mathematically void; hence it leaves the number unchanged.
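The "taking out bunches" picture of division described above can be written out directly. The sketch also shows why a divisor of zero must be rejected: removing bunches of zero never empties the lot, so the loop would never terminate:

```python
def divide_by_subtraction(lot, bunch):
    """Division as repeatedly taking bunches out of the lot."""
    if bunch == 0:
        # Bunches of zero never empty the lot, so the loop below
        # would never terminate: the operation is undefined.
        raise ZeroDivisionError("cannot take out bunches of zero")
    count = 0
    while lot >= bunch:
        lot -= bunch   # take one bunch out of the lot
        count += 1
    return count, lot  # (quotient, remainder)

print(divide_by_subtraction(20, 5))  # (4, 0)
```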

Mathematics is also related to the measurement of the time evolution of the state of something. These time evolutions depict rates of change. When such change is related to motion, like velocity or acceleration, it implies total displacement from the position occupied by the body and movement to the adjacent position. This process is repeated due to inertia until it is modified by the introduction of other forces. Thus, these are discrete steps that can be related to three-dimensional structures only. Mathematics measures only the number of these steps, the distances involved (including amplitude, wavelength, etc.), the quanta of energy applied, and so on. Mathematics is also related to the measurement of areas or curves on a graph, the so-called mathematical structures, which are two-dimensional. Thus, the basic assumptions of all topologies (including symplectic topology), linear and vector algebra, the tensor calculus, all representations of vector spaces (whether abstract or physical, real or complex, composed of whatever combination of scalars, vectors, quaternions, or tensors), and the current definitions of the point, line, and derivative are necessarily at least one dimension short of physical space.

The graph may represent space, but it is not space itself. The drawings of a circle, a square, a vector or any other physical representation, are similar abstractions. The circle represents only a two dimensional cross section of a three dimensional sphere. The square represents a surface of a cube. Without the cube or similar structure (including the paper), it has no physical existence. An ellipse may represent an orbit, but it is not the dynamical orbit itself. The vector is a fixed representation of velocity; it is not the dynamical velocity itself, and so on. The so-called simplification or scaling up or down of the drawing does not make it abstract. The basic abstraction is due to the fact that the mathematics that is applied to solve physical problems actually applies to the two dimensional diagram, and not to the three dimensional space. The numbers are assigned to points on the piece of paper or in the Cartesian graph, and not to points in space. If one assigns a number to a point in space, what one really means is that it is at a certain distance from an arbitrarily chosen origin. Thus, by assigning a number to a point in space, what one really does is assign an origin, which is another point in space leading to a contradiction. The point in space can exist by itself as the equilibrium position of various forces. But a point on a paper exists only with reference to the arbitrarily assigned origin. If additional force is applied, the locus of the point in space resolves into two equal but oppositely directed field lines. But the locus of a point on a graph is always unidirectional and depicts distance – linear or non-linear, but not force. Thus, a physical structure is different from its mathematical representation.

Since you already called Einstein overestimated, it is not a surprise to me that you called a spade a spade and ridiculed Wheeler-related ideas: “as if the universe were a giant ghostly digital computer without the hardware”.

Your essay was enlightening and enjoyable to me, but certainly not to everybody. Let me just add one more aspect. You mentioned: “Einstein … later became dissatisfied with a theory of observables, and wanted a more “complete” view of reality.” I am the one who argues that future data are not observable in advance, and that this restriction provides a more complete view of reality. Please try to jump over Singer's shadow.

I find your opening statement, "I consider different ways in which physics might be reduced to bits of information, but argue that none of these is more fundamental than quantum mechanics," true to the content of your well-presented and logical essay. Although I agree with your position that QM is about 'predictable' measurable variables, I did not find an answer from you to 'how' these variables come into existence in the first place. Herein lies the chasm created by QM when it assumes measurements (effects) without their cause.

I appreciated your coin analogy comment, "The coin itself may be deterministic. Likewise a quantum mixture of two eigenstates could be a deterministic object that only seems like a coin toss because of the way that it is measured. Whatever uncertainty there is may be entirely due to our lack of knowledge about the state, and the discreteness imposed by the measuring process."

I like what you write, and cannot argue 0’s or 1’s. But, tell me, do electrons flowing in a wire move by jumping from atom to atom, and are these jumps quanta of negative charge with a charge of -1? And, do these transitions generate electromagnetic waves? And, would not such waves, if they exist, be at an ultrahigh frequency, but occur randomly adding up to a single wavefront which we call a magnetic field, whether AC or DC?

And upon what do these waves ride, if not the Dark Mass, a mass which is gravitationally responsive and is therefore real, and to which, in my essay, I assign the permeability and permittivity of space. This Dark Mass fills the interstices of the atom as well as of all of space, so gives us a handle with which we can do wonders.

A computer with no visible hardware is a consequence of my conjecture, which specifically concerns how and what things are visible.

Imagine a computer model of fundamental particles, arranged into atoms, molecules, and finally a clock.

As the program runs, the simulated clock runs too.

A faster computer would run the simulated clock faster compared to a slower computer, but an observer inside, made of the same types of simulated particles as the simulated clock, would not observe any difference.

In fact, you could pause the whole simulation for a month; the CPU's clock would progress, but the simulated model clock would stop.

If you restarted after a month-long pause, the simulated observer would not realize that any time had passed.

Likewise, the only material that would exist to the simulated observer is stuff like the simulated clock; the hardware of the computer providing the simulated world with simulated particles of matter and light would not itself be measurable.
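The thought experiment in this comment can be made concrete with a toy simulation: the simulated clock advances only with simulation steps, so pausing the host program (or running it on faster hardware) is invisible from inside. The class and names here are purely illustrative:

```python
import time

class SimulatedClock:
    """A clock made only of simulated parts: it advances one tick per step."""
    def __init__(self):
        self.ticks = 0
    def step(self):
        self.ticks += 1  # simulated time advances only when the host steps

sim = SimulatedClock()
for _ in range(5):
    sim.step()
time.sleep(0.05)        # the host computer 'pauses'; wall-clock time passes...
assert sim.ticks == 5   # ...but no simulated time has elapsed
```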

To change the "abstract" atmosphere of the competition, to demonstrate the real preeminent possibility of the Absolute theory, to clarify the issues I mentioned in the essay, and to avoid duplicate questions after receiving your opinion, I will add a reply to you:


1 . THE ADDITIONAL ARTICLES

A. What is new and different in the absolute theory compared to other theories?

The first is the concept of "Absolute" in my absolute theory, defined as: there is only one; it has nothing similar; no two things are exactly alike.

The most important difference of this theory is that it is built on an entirely new basis and a different platform compared to current theories.

B. Why can we claim that all things are absolute and nothing is relative?

It can be affirmed that no two states or phenomena that are the same can exist in the same location in space at the same moment of time; thus everything must be absolute, and nothing can be relative. The relative is only a concept created by us.

C. Why can we confirm that the conclusions of the absolute theory are the most specific and detailed, and unique?

The conclusions of the absolute theory must always be unique and must identify the most specific and detailed answers for all issues related to any situation or phenomenon; that is the mandatory rule of this theory.

D. What is the applicability of the absolute theory in practice?

The absolute theory is applicable to everything; there is no limit on the issues and no restriction to any field, because this theory is a method of determination for all matters and is of course not reserved for any single area.

E. How can the claims of the Absolute Theory be proven?

To demonstrate the above statements in fact, we will come together to a specific experience. I have a small test, absolutely realistic, for you, with the title:

2 . A SMALL TEST FOR MUTUAL BENEFIT :

“Absolute determination to resolve issues of reality”

That is, based on my Absolute theory, I will help you determine, in a new way, a reasonable and most effective settlement of your difficulties, when you have not yet found an appropriate remedy, for any problems actually happening in reality. You need only notify me clearly and specifically about the current status and phenomena of the problems, along with the requirements and expectations to be resolved.

I may collect fees, as a percentage of the benefits that you get, with a commission for you when you promote and recommend this to others.

Condition: I will not explain problems that are impractical, of no practical benefit, or impossible to determine in practice.

To avoid affecting the contest you can contact me via email : hoangcao_hai@yahoo.com

I hope this will satisfy you and bring you real benefits, along with the desire that we will find common ground on which to live together happily.

Thank you for presenting your nice essay. I saw the abstract and will post my comments soon. I totally accept your viewpoint.

I am requesting you to go through my essay also. And I take this opportunity to say: come to reality, and base your arguments on experimental results.

I failed mainly because I worked against the mainstream. The mainstream community wants magic from science instead of reality, especially in the subject of cosmology. We all know well that cosmology is a subject where speculations rule.

"Material objects are more fundamental" is being proposed in this paper. It is well known that there is no mental experiment which has produced material. ... Similarly, the creation of matter from empty space, as required in Steady State theory or in the Big Bang, is another such problem in the cosmological counterpart. ... In this paper we will see, regarding the CMB, how it is generated from the stars and galaxies around us. And here we show that NO microwave background radiation has been detected till now after excluding radiation from stars and galaxies. ...

... We should use our minds for down-to-earth realistic thinking. There is no point in wasting our brains on total imagination that can never be reality. It is something like mixing cartoon characters with normal people in movies, or people entering game-space in virtual-reality games, or firing antimatter into a black hole! It is sheer madness that such concepts go on in many fields like science, mathematics, computer IT, etc. ...

B.

Francis V wrote on May. 11, 2013 @ 02:05 GMT

Well-presented argument about the absence of any explosion for a relic frequency to occur and the detail on collection of temperature data……

C

Robert Bennett wrote on May. 14, 2013 @ 18:26 GMT

"Material objects are more fundamental"..... in other words "IT from Bit" is true.

…And your question is like asking which comes first, the egg or the hen? In other words, is matter first or is information first? Is that so? In reality there is no way that matter comes from information.

Matter is another form of energy. Matter cannot be created from nothing; no type of vacuum can produce matter. Energy has many forms: mechanical, electrical, heat, magnetic, and so on.

E

Antony Ryan wrote on Jun. 23, 2013 @ 22:08 GMT

…Either way, your abstract argument based on empirical evidence is strong, given that "a mere description of material properties does not produce material", while of course materials do give information.

I think you deserve a place in the final based on this alone. Concise - simple - but undeniable.

Roger: interesting essay. I think you are too quick in rejecting the digital computer hypothesis. As I argue in my essay, and in my published article "A New Theory of Free Will", quantum uncertainty and other such phenomena are an inevitable emergent result of peer-to-peer networked digital computers. But I agree with you (and say as much in my essay): not everything can be reduced to digital bits.

Interesting essay and good responses to comments above. On your skepticism of quantum computing, I have a motion/question I am trying to formulate appropriately for those enamored of the Qubit idea:

Given the different examples of binary choices and their physical supports (implicit in the assumption that bits must be carried by Its), e.g. vertical/horizontal polarization = photon; spin up/spin down = electron, etc.

And, if existence/non-existence is a binary choice of messages (as has been admitted in evidence, e.g. by Georgina Parry on 28 June), viz."the binary choice of there being an atom or no atom at a location... is like existence or non existence...seems to me a most basic attribute, like 1 and 0. The way I was thinking about it a material structure of some kind is required to carry the absence so that it is communicated. It is a really interesting point though that the existent Bit has a corresponding It but the non existent one does not but the absence can still be information. Thank you for raising that very interesting question."

Whereas no material thing can carry non-existence as information, and

Whereas no superposition can be contemplated between existence/non-existence, unlike some other binary choices,

The least that can be said is that this particular information is not a Qubit,

And if this information is what lies at the "very very deep bottom" and "the ontological basement",

I hereby move that all further discussion of Qubits and Quantum computing, where Bits entangle themselves and are superposed on each other, should henceforth be suspended unless further evidence contradicting the above is presented.

Best regards and all the best,

'Senator' Akinbo

*You are free to second or modify above motion before presenting to FQXi parliament :).

A very sound and well-written thesis for reality, which I appreciate and find generally very agreeable.

I must question your 'one liner' summing up von Neumann's position as perhaps a little misleading, with particular regard to the EPR paradox. I discuss this in my essay, including the quote containing his proposal that, for a consistent QM:

"...as the 'system' and the 'meter' physically interact both must act as quantum mechanical systems, so each 'meter' should equally obey the uncertainty principle."

Now this would actually 'preclude' the need for FTL or 'saaad', as the cosine curve would first be produced at each detector. Not the kind of stochastic hidden variable anticipated, but resolving the paradox nonetheless, and also as Bell anticipated (also quoted).

My essay describes a way of obtaining this, which you almost derive yourself, related to Gödel's n-valued ('fuzzy') logic. You seem uniquely qualified to analyse this and I'd be most grateful for your comments. You'll also find the essay consistent with your last year's thesis regarding mathematics, hopefully pinning down a description using the "Dirac Line" or limit of describability.

Very well done for your essay. I'm quite relieved to find I may exist after all! Best wishes for the contest.

This is our post to Dr. William McHarris in his thread. We thought it may be of interest to you.

Mathematics is the science of accumulation and reduction of similars or partly similars. The former is linear and the latter non-linear. Because of the high degree of interdependence and interconnectedness, it is no surprise that everything in the Universe is mostly non-linear. The left-hand sides of all equations depict free will, as we are free to choose or change the parameters. The equality sign depicts the special conditions necessary to start the interaction. The right-hand side depicts determinism, as once the parameters and special conditions are determined, the results are always predictable. Hence, irrespective of whether the initial conditions could be precisely known or not, the results are always deterministic. Even the butterfly effect would be deterministic, if we could know the changing parameters at every non-linearity. Our inability to measure does not make it chaotic – “complex, even inexplicable behavior”. Statistics only provides the minimal and maximal boundaries of the various classes of reactions, but never solutions to individual interactions or developmental chains. Your example of “the deer population in Northern Michigan” is related to the interdependence and interconnectedness of the ecosystem. Hence it is non-linear.
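The point that the butterfly effect is deterministic rather than truly random can be illustrated with a standard textbook example of my own choosing (the logistic map, not anything from the post above): the update rule is exact, so identical inputs always reproduce identical trajectories, yet an unmeasurably small difference in the starting value grows until practical prediction fails.

```python
# Deterministic chaos with the logistic map x -> r * x * (1 - x).
# The rule is fully deterministic: the same input always yields the
# same trajectory. But nearby starting points separate rapidly, so
# any imprecision in measurement ruins long-range prediction.

def trajectory(x0, r=3.9, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = trajectory(0.2)
b = trajectory(0.2)           # identical input: identical output
c = trajectory(0.2 + 1e-9)    # a one-part-in-a-billion perturbation

print(a == b)                                          # True
print(max(abs(x - y) for x, y in zip(a, c)) > 0.01)    # True
```

Determinism and unpredictability-in-practice coexist here, which is exactly the distinction the paragraph draws.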

Infinities are like one – without similars. But whereas the dimensions of one are fully perceived, the dimensions of infinities are not perceptible. (We have shown in many threads here, without contradiction, that division by zero is not infinite, but leaves a number unchanged.) We do not know the beginning or end of space (interval of objects) or time (interval of events). Hence all mathematics involving infinities is void. But they co-exist with all others – every object or event exists in space and time. Length contraction is apparent to the observer due to Doppler shift, and time dilation is apparent due to the changing velocity of light in mediums with different refractive indices, like those of our atmosphere and outer space.

Your example of the computation of an evolutionary sequence of random numbers omits an important fact. Numbers are the inherent properties of everything by which we differentiate between similars. If there are no similars, then it is one; otherwise many. Many can be 2, 3, … n, depending upon the sequence of perceptions leading to that number. Often this happens so fast that we do not realize it. But once the perception of many is registered in our mind, it remains as a concept in our memory and we can perceive it even without any objects. When you use “a pseudorandom number generator to generate programs consisting of (almost) random sequences of numbers”, you do just that through “comparison and exchange instructions”. You develop these by “inserting random minor variations, corresponding to asexual mutations; second, by ‘mating’ parent programs to create a child program, i.e., by splicing parts of programs together, hoping that useful instructions from each parent occasionally will be inherited and become concentrated” and repeat it “thousands upon thousands of times” until the concept covers the desired number sequences. Danny Hillis missed this reasoning. Hence he erroneously thought “evolution can produce something as simple as a sorting program which is fundamentally incomprehensible”. After all, computers are GIGO: garbage in, garbage out. Brain and Mind are not redundant.
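The Hillis-style experiment being discussed (evolving sorting programs out of "comparison and exchange instructions" by mutation and mating) can be sketched in a few lines. Everything below, including population sizes, program length, and helper names, is an illustrative toy of mine, not Hillis's actual setup:

```python
import random

random.seed(0)

N = 8            # length of the lists to sort
PROG_LEN = 24    # compare-exchange instructions per program
POP = 40         # population size
GENS = 60        # generations to evolve

def random_program():
    # a program is a sequence of compare-exchange index pairs
    return [(random.randrange(N), random.randrange(N)) for _ in range(PROG_LEN)]

def run(program, data):
    data = list(data)
    for i, j in program:
        a, b = min(i, j), max(i, j)
        if data[a] > data[b]:          # compare and exchange
            data[a], data[b] = data[b], data[a]
    return data

def fitness(program, tests):
    # fraction of test lists the program sorts correctly
    return sum(run(program, t) == sorted(t) for t in tests) / len(tests)

def mutate(program):
    # "asexual mutation": replace one random instruction
    child = list(program)
    child[random.randrange(PROG_LEN)] = (random.randrange(N), random.randrange(N))
    return child

def crossover(p1, p2):
    # "mating": splice parts of two parent programs together
    cut = random.randrange(PROG_LEN)
    return p1[:cut] + p2[cut:]

tests = [[random.randrange(100) for _ in range(N)] for _ in range(50)]
population = [random_program() for _ in range(POP)]
for gen in range(GENS):
    population.sort(key=lambda p: fitness(p, tests), reverse=True)
    elite = population[:POP // 4]      # keep the fittest quarter
    population = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                          for _ in range(POP - len(elite))]

best = max(population, key=lambda p: fitness(p, tests))
print(f"best fitness after {GENS} generations: {fitness(best, tests):.2f}")
```

The evolved program is just a flat list of comparator pairs, which is why Hillis found the winners hard to read: the selection loop, not any human, decides which instructions survive.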

Much has been said about sensory perception and memory consolidation as composed of an initial set of feature filters followed by a special class of mathematical transformations which represent the sensory inputs, generating interacting wave-fronts over the entire sensory cortical area – the so-called holographic processes. This can explain the seemingly infinite capacity of memory. Since a hologram retains the complete details at every point of its image plane, even if only a small portion of it is exposed for reconstruction, we get the entire scene, though the quality is impaired. Yet, unlike an optical hologram, the neural hologram is formed by very low frequency post-synaptic potentials, providing a low information-processing capacity to the neural system. Further, the distributed memory mechanisms are not recorded randomly over the entire brain matter, as there seem to be preferred locations in the brain for each sensory input.

The impulses from the various sensory apparatus are carried upwards in the dorsal column or in the antero-lateral spinothalamic tract to the thalamus, which relays them to the cerebral cortex for perception. At any moment our sense organs are bombarded by a multitude of stimuli, but only one of them is given a clear channel up to the thalamus and then to the cerebral cortex at any instant, so that, like photographic frames, we perceive one frame at an instant. Unlike the sensory apparatuses, which are subject-specific, this happens for all types of impulses. The agency that determines this subject-neutral channel is called mind, which is powered by the heart and lungs. Thus, after the heart stops beating, mind stops its work.

However, both for consolidation and retrieval of sensory information, the holographic model requires a coherent source which literally ‘illuminates’ the object or the object-projected sensory information. This may be a small source available at the site of the sensory repository. For retrieval of the previously consolidated information, the same source again becomes necessary. Since the brain holds enormous information that is present for the whole of life, such a source should always be illuminating the required area of the brain where the sensory information is stored. Even in the dream state this source must be active, as here also local memory retrieval and experience take place. This source is Consciousness.

Roger - Nice essay, summarizing the state of the union in bits, probability, no hidden variables, free will, qubits and black holes.

Before we begin, note that while I rated your essay highly, I am on the other side of the fence. I believe that there remains at least one major gap in the analysis done so far of the potential interpretations of quantum theory which deserves our attention, and that is a critical examination of the role of a monotonically increasing irreversible background in time: a fallacy I see committed over and over by those arguing the case for “shut up and calculate” mechanics.

Since I’m homing in on your section on no hidden variables, let’s look at entanglement specifically. Note that: (1) I am not trying to defend hidden variables; (2) we do not have to sacrifice locality; (3) I eliminate the background of time, replacing it with an increment of information time/space on a photon arrival, and a decrement of information time/space on a photon departure. The theory is described in my essay [1]; this is the simplest description I can come up with that appears to predict current Bell tests, and yet has a logically consistent ontological description which can be falsified by relatively straightforward experiments.

Before you get on my case, I understand that entanglement is the term normally given to the “non-classical” phenomenon where joint measurements show correlations stronger than what would be “classically explainable”. I recognize that using the term entanglement for this “photon hot potato” protocol might provoke reactions from mainstream quantum mechanics. But unless you or someone else can find a hole in my argument, this protocol, in combination with the concept of subtime, would appear to manifest exactly the same results as the purely probabilistic quantum formalism; but it might now be considered “explainable” (I hesitate to say classically, because it isn’t that either).

The conventional formalism for entanglement says that two distantly separated quantum systems may be “coupled” via Hilbert space, such that measurement of one can suddenly change the state of the other. I have simply tried to describe an insight as to what form that “coupling” might take.

The difficulty with the entangled (pure state of vectors) in Hilbert space is that entanglement is seen as only one thing: the impossibility of writing a density matrix as a linear combination of tensor products. There are two basic issues with this. The first is theoretical: it appears to be incomplete without including (at least) a backwards-evolving quantum state [2]. The second is experimental: it appears to be a classic example of the independence fallacy [3].
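As an aside, the separability notion invoked here (whether a density matrix can be written as a convex combination of tensor products) has a concrete, computable test for two qubits: the Peres-Horodecki partial-transpose criterion, where a negative eigenvalue of the partially transposed density matrix signals entanglement. A minimal numerical sketch using NumPy (the helper names are mine):

```python
import numpy as np

# Bell state |Phi+> = (|00> + |11>)/sqrt(2)
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho = np.outer(phi, phi)               # 4x4 density matrix

# A product state |0>|0> for comparison
prod = np.zeros(4)
prod[0] = 1.0
rho_prod = np.outer(prod, prod)

def partial_transpose(rho):
    # transpose only the second qubit: rho[(i,j),(k,l)] -> rho[(i,l),(k,j)]
    r = rho.reshape(2, 2, 2, 2)
    return r.transpose(0, 3, 2, 1).reshape(4, 4)

def is_entangled_2x2(rho):
    # Peres-Horodecki: for two qubits, a negative eigenvalue of the
    # partial transpose is necessary and sufficient for entanglement
    eigvals = np.linalg.eigvalsh(partial_transpose(rho))
    return bool(eigvals.min() < -1e-12)

print(is_entangled_2x2(rho))       # True: the Bell state is entangled
print(is_entangled_2x2(rho_prod))  # False: the product state is separable
```

For the Bell state the partial transpose has eigenvalues (1/2, 1/2, 1/2, -1/2), so the test fires; in higher dimensions the criterion remains necessary but is no longer sufficient.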

This is the biggest reason the essay is completely devoid of mathematical formalism: I wanted to begin with a describable phenomenon, and not with an endless argument over the current formalisms before we can get to the real issues.

Let me know your thoughts.

Kind regards, Paul

[1] http://fqxi.org/data/forum-attachments/Borrill-TimeOne-V1.1a.pdf

[2] Lev Vaidman argues that the Two State Vector Formalism needs to consider a backwards-evolving quantum state because the information provided by a “forwards only” state is not complete. Both past and future measurements are required for providing complete information about quantum systems. [ http://www.pirsa.org/08090067/]

Thanks for your comments. I am glad to see you stick with locality and some form of causality, but I have a hard time understanding theories that try to do away with time. I will have to study your paper more.

I think that Minkowski, in 1908, was the first to explicitly make statements like "no space without time".