Remembering some of the <cough> "discussions" we've had regarding probability versus determinism, there's a new paper out from Steve Hsu of the University of Oregon.

"On the origin of probability in quantum mechanics

I give a brief introduction to many worlds or "no wavefunction collapse" quantum mechanics, suitable for non-specialists. I then discuss the origin of probability in such formulations, distinguishing between objective and subjective notions of probability."

I haven't read it yet, I'll admit.

I read it...in a nutshell, we are no wiser. The cause of stuff is unknown, nothing is certain, random fluctuations will probably happen in accordance with evolved complexity. But like a storm within a storm there are forces acting in a particular way to suggest some kind of direction although probably unpredictable. We are being swept along on a tiny cosmic wave and we can't get off!

However, rapid progress is being made toward the creation of macroscopic (Schrödinger's Cat) superposition states, including, possibly, superpositions of viruses and bugs [5]. If a bug can be placed in a superposition state, why can't you?

Quantum computing, often done optically, depends on us being able to maintain things in a superposition state long enough that we can extract useful informational work (which is, in a physics sense, simply work) from them. Thus there is a whole industry in producing superposable qubits sufficiently massive that we can reliably measure them in the real world, not just in a thought experiment. And they seem to be achieving something, although the "useful" is still missing from the "informational work". Unless the NSA are keeping it all really quiet.

If we can produce a massive object in a superposition state, what is the physics limit on how big that object can be? No-one knows (there are some claims that it may be related to gravity versus the other forces, but AFAIK these are merely well-argued hypotheses). If there is no fundamental limit, Schrödinger's Cat (or even Schrödinger's Dashing Chap) becomes a possibility. As Schrödinger created his eponymous gedanken-o-gram to satirise the whole concept that superposition is even rational applied to macroscopic objects, this is somewhat ironic.

I consider myself a reasonably intelligent bloke, with an engineering degree and quite a few post-grad technical qualifications, but I haven't got a feckin' clue what you are on about.

Okay - I'm off to bed. I'll try a re-post in the morning. Civil, mechanical, electrical or pansy-waffu, just to give me some idea whether we need the Janet and John version or can go up to Ladybird?

Telecommunications Engineering 1988. It was very heavy on physics, maths and some chemistry. I have a vague grasp of quantum mechanics and the philosophies behind it (or should that be in front of it?) including wave theory.

If someone's read a few books on QM then they may get the basic gist of things, but I think the main part where people become lost is through professional jargon, which we all do. Superposition, superposable, qubits, eponymous gedanken-o-gram (ref Schrödinger's experiment) and possibly even macroscopic. These are all terms which people do not come across if they're not familiar with QM. But then QM is not a subject that can be explained easily...

One of the weird things about quantum, whether mechanics or field theory (QM can be viewed as the 0+1 dimensional spacetime simplification of QFT, which is normally still done in 3+1 dimensional spacetime rather than any of the 'interesting' additional-dimension theories, because the maths is just horrid), is the "superposition state". Basically, particles can be in a situation where one or more facts about them are fundamentally undetermined - usually, in experiments, this is the "spin" state of the particle or the polarisation of a wave, but those are not mandatory, they are just the easiest to engineer. (Please note that this has nothing to do with Heisenberg's Uncertainty Principle - which is about the fundamental limit on the ability to get accurate measurements of both position and momentum, and is seen as a hard physics limit on divine omniscience.)
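
In the standard notation, such a two-outcome superposition (electron spin, say) is written as a weighted sum of the two definite states:

```latex
|\psi\rangle = \alpha\,|\uparrow\rangle + \beta\,|\downarrow\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1
```

where the squared magnitudes of the (complex) weights give the probabilities of each measurement outcome - that last rule is the Born rule, which is what all the fuss over probability is about.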

So, a particle can have this characteristic and it isn't actually fixed by the universe. But you can measure it, either by constructing an experiment or by the particle being in a position where the characteristic makes a difference. At that point the characteristic becomes fixed. Quite often (but not always) these particles are generated in pairs and the characteristics are linked - usually that they must be either identical or opposite - this is known as quantum entanglement. If those pairs are separated and the characteristic fixed for one by measurement, the characteristic for the other is also immediately fixed, which can be confirmed by a later measurement. And by "immediately", I mean "simultaneously" in the special-relativity sense of "there is no such thing as simultaneous". This is one of the points where QM and relativity disagree; according to experimental results, QM is right. And whatever propagates this change, if anything, would have to travel much, much faster than the speed of light in vacuum as measured by an observer at rest (this is technically just another view of the same disagreement). Note, though, that neither side can use this to send a message: each stream of results, viewed on its own, looks completely random.
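
Just to make the "correlated but useless for signalling" point concrete, here is a deliberately dumb classical sketch of a perfectly anti-correlated pair. It reproduces the statistics you'd see, but says nothing about *how* nature enforces the link, which is of course the entire argument:

```python
import random

def measure_entangled_pair():
    # Toy stand-in for an anti-correlated pair: the first measurement
    # comes up +1 or -1 at random, and the partner is instantly opposite.
    a = random.choice([+1, -1])
    b = -a
    return a, b

random.seed(0)
pairs = [measure_entangled_pair() for _ in range(10_000)]

# Either stream alone looks like a fair coin: no message can be read from it.
alice_average = sum(a for a, _ in pairs) / len(pairs)

# But the joint results are perfectly anti-correlated.
correlation = sum(a * b for a, b in pairs) / len(pairs)
print(correlation)  # -1.0
```

The real experiments (Bell tests) go further and rule out this kind of "the answers were secretly fixed at the start" model, but the no-signalling part survives intact.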

So, this leads to two different interpretations - either there is a probabilistic collapse of the particle characteristic (often these are a 50/50 choice in the simplest experiments, such as electron spin state, but they don't need to be) or there are two universes, one in which the characteristic was x and one in which it was -x (or whatever), and all you have done is determine which universe you are in. As these things are happening all the time, everywhere, without us noticing, it means that we would have to have a staggering profusion of universes - the "many worlds" interpretation. Lots of people oppose the probabilistic collapse model because it means that effects sometimes have no cause other than chance. (Although I don't see why introducing any other model adds that pre-chance cause, but hey-ho.)
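
The "probabilistic collapse" story is just the Born rule in action: the squared amplitude is the outcome frequency. A minimal sketch, with amplitudes made up purely for illustration (anything with |α|² + |β|² = 1 would do):

```python
import random

# Hypothetical amplitudes for a two-outcome measurement.
alpha = complex(0.6, 0.0)
beta = complex(0.0, 0.8)

p_x = abs(alpha) ** 2  # Born rule: probability of outcome x
print(round(p_x, 2))   # 0.36

# Simulate many independent "collapses" and check the frequency matches.
random.seed(1)
trials = 100_000
count_x = sum(1 for _ in range(trials) if random.random() < p_x)
frequency = count_x / trials
print(round(frequency, 3))  # close to 0.36
```

The many-worlds camp has to *derive* why branch-counting should reproduce these |amplitude|² frequencies, which is exactly the problem Hsu's paper is poking at.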

Back in the 1920s, a bunch of the founders of QM got together on a regular basis, mostly at Niels Bohr's institute in Copenhagen, and settled on the "wavefunction collapse" model, which became the accepted wisdom. When Everett proposed the multiple-universe alternative in the 1950s it was widely dismissed as a load of bollocks (or, more probably, anyone insisting on telling people about it would get sniggered at in the Senior Common Room bar). Which isn't too bad, as there are parallels - in Quantum Chromodynamics, for example. Except that more modern maths, and some experimental results, are taken by some to suggest that "many worlds" is actually correct.

Anyway. Quantum computing. People interested in this area are trying to manufacture quantum computers where the calculating element, a "qubit", is held in a superposition state. For various reasons (wave polarisation being easier to measure than electron spin, for example), many people are trying to make these with opto-electronics rather than simple electronics. However, this means that some very small, but still "macroscopic", optical elements have to be in a superposition state ("macroscopic" in that they are composed of many, many atoms and therefore are usually treated as behaving classically rather than quantumly; "massive" in my earlier post is another way of describing this). If this can be done, then what is the problem with putting larger yet objects into superposition states? Answer: currently unknown, although QM does not describe gravity, therefore there may be something in the consistent combined QM/gravitational description, when we get one, that gives us either a practical or an absolute limit.
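
The "qubit in superposition" idea can be sketched with plain state-vector arithmetic - a pencil-and-paper simulation, emphatically not how real hardware works:

```python
import math

def apply_hadamard(state):
    # Hadamard gate on a one-qubit state vector [amp_of_|0>, amp_of_|1>]:
    # it sends |0> to an equal superposition of |0> and |1>.
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

ket0 = [1.0, 0.0]                  # qubit prepared in the definite state |0>
superposed = apply_hadamard(ket0)  # now (|0> + |1>)/sqrt(2)

# Measurement probabilities are the squared amplitudes (Born rule again).
probs = [round(abs(amp) ** 2, 3) for amp in superposed]
print(probs)  # [0.5, 0.5]

# Applying H a second time undoes it: interference, not randomness.
back = apply_hadamard(superposed)
print([round(abs(amp) ** 2, 3) for amp in back])  # [1.0, 0.0]
```

That second step is the whole point of quantum computing: while the qubit stays in superposition you can make amplitudes interfere and cancel, which is why keeping the superposition alive long enough (coherence) is the engineering battle.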

Obviously, this all means that there are some things which are in no actual state - wavefunction not yet collapsed, or universes not yet separated (Ponder Stibbons' 'Trousers of Time' - or universe not yet 'chosen', if that is a more meaningful concept. Which it probably isn't). Schrödinger's Cat was a thought experiment to demonstrate that this concept was meaningless at the macroscopic level. Cats are dead or alive (regardless - 'bloody furious' is a subset of alive). So with a piece of radioactive material and a Geiger counter controlling a cyanide bottle and a cat, all in a sealed opaque box, you can construct a QM wavefunction that is dependent on a definitely-quantum effect (the decay or not of the radioactive material) that has macroscopic consequences. So the cat is in a superposed state of alive|not alive. Which is meaningless (all cat owners know that cats are permanently in a superposition state of asleep|demanding food with menaces). To make this even more complex, there is another thought experiment, "Wigner's Friend", where you have one observer outside the box and one inside it. And they see different wave functions.
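
The wavefunction being described, tying the quantum decay to the cat's fate, can be written (schematically) as:

```latex
|\Psi\rangle = \frac{1}{\sqrt{2}}\Big(
  |\text{not decayed}\rangle \otimes |\text{alive}\rangle
  \;+\; |\text{decayed}\rangle \otimes |\text{dead}\rangle \Big)
```

i.e. the decay and the cat are entangled, which is exactly why the "alive|not alive" superposition looks so daft at cat scale.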

But, if we can create macroscopic objects in superposition states, then there is no current reason why this is actually meaningless. Except "common sense". But if you start trying to apply common sense around QM, you get lost very quickly.

Thanks for the explanation, which I now think I understand. Perhaps I should buy a book after all.

I often find it interesting that people working in fields they perceive to be new and sexy invent terms for things even though an existing term would suit the requirement perfectly and even aid cross-specialisation understanding.

The 'not yet chosen' or 'collapse of characteristic' situation is known in electronics and communications as an indeterminate state, meaning it matches none of the expected criteria or its state cannot be determined. The superposition state is known as determinable. This can be seen in things like hysteresis loops or inter-state transitions, which only translates into whether perceivable information is carried or not. Sometimes the very fact that something is transiting between two states is information in its own right.

Mathematicians and electronics engineers have had multiple dimensions for a lot longer than QM or QFT specialists. You only have to look at Newton and Leibniz calculus, Fourier and Laplace transforms and time/frequency/polarity domains to see that this is so.

The sexy bit of QM for me is that you can cause a measurable effect in a particle (or other entity) by creating or forcing a characteristic on a partner entity in a physically separate place, and that this happens simultaneously, however you define that word.

EM waves in a waveguide possess an effect that travels faster than light, although no-one can actually prove it and it is really a presumption made by using a possibly flawed mathematical model. This is almost certainly true of all the quantum science fields too. Over the centuries we have learned that things that seem mysterious or magical can eventually be explained by an improved understanding of the world around us. God didn't bring the flood, mountains don't get upset with villagers and fire is not the work of the devil. Perhaps instead of conjuring up ever more fanciful notions, and stacking them one on top of the other in interdependent circular logical conclusions, the scientists should exploit what they know, try to estimate what they don't know and not pretend that they have understood something they haven't. (A bit like me really.)

Some of the terminology used in this field reminds me of the Catholic church, which for over 1000 years would only hold mass and print bibles in Latin, as they knew the non-ordained generally didn't understand Latin and could therefore not question it; oiks.

Mathematicians, yes. Electronic engineers (I'm one) - err, no. We barely had valves before QM was around - and the understanding of those was practical rather than mathematical. The Fleming diode valve was about 1904?

Phase velocity of waves versus group velocity of waves - proven and not flawed. But it is impossible to use it to transmit information.
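
The phase-versus-group distinction can be put in numbers. For a hollow waveguide above cutoff, the standard dispersion relation gives a phase velocity above c and a group velocity below c, with their product pinned at c²; the signal rides at the group velocity. A quick sketch (the 6.56 GHz cutoff is just an illustrative figure, roughly the TE10 cutoff of a WR-90 guide):

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def waveguide_velocities(f_hz, f_cutoff_hz):
    # Above cutoff:
    #   phase velocity v_p = c / sqrt(1 - (fc/f)^2)  -> greater than c
    #   group velocity v_g = c * sqrt(1 - (fc/f)^2)  -> less than c
    factor = math.sqrt(1.0 - (f_cutoff_hz / f_hz) ** 2)
    return C / factor, C * factor

v_phase, v_group = waveguide_velocities(10e9, 6.56e9)

print(v_phase > C)  # True: the phase fronts outrun light in vacuum...
print(v_group < C)  # True: ...but energy and information travel slower
print(abs(v_phase * v_group - C * C) / (C * C) < 1e-12)  # True: v_p * v_g = c^2
```

So the "faster than light" thing in a waveguide is real, measured, and entirely compatible with relativity, because the phase pattern carries no message.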

Basically all those magnificent blackboard theories dreamt up by overpaid early 20th century chalk sniffers are coming unravelled, whilst the EU has rebuilt the Maginot Line, at enormous cost, upside down under the Franco-Swiss border, and we are none the wiser.

They should come down to earth, work on something practical. A kettle which pours well or a smart beer mat.

Really? I am impressed with your grasp of engineering history, which I have clearly not researched, and I bow to your superior knowledge.

It's proven mathematically, I agree. Still I have never heard of anyone being able to demonstrate it practically, which would be the only real proof.

Received 10 March 1969; published in the issue dated 26 May 1969. We report experimental observation of resonance cones in the angular distribution of the radio-frequency electric field of a short antenna in a plasma in a static magnetic field. The cone angle is observed to vary with incident frequency, cyclotron frequency, and plasma frequency in agreement with simple plasma dielectric theory. We discuss the relationship of these cones to the limiting phase- and group-velocity cones which appear in the theory of plane wave propagation.

I've not phoned my Mum to check but I expect I wasn't long off the teat at that point ...