Posted
by
kdawson
on Friday May 01, 2009 @09:21AM
from the just-small-enough dept.

Urchin writes "Physicists in California have made the smallest ever incandescent lamp using a carbon nanotube as the filament. The nanotube is so small it behaves as a quantum mechanical system, but it's just large enough that the classical rules of thermodynamics should apply. Analyzing the light emitted from the tiny lamp will give the team a better picture of what happens in the twilight zone between the quantum and classical worlds." The New Scientist article doesn't mention the researchers' surprise, as the abstract does: "Remarkably, the heat equation and Planck's law together give a precise, quantitative description of the light intensity as a function of input power, even though the nanotube's small size places it outside the thermodynamic limit."
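For those who want to see what "Planck's law" gives you here, a minimal sketch: the spectral radiance of an ideal blackbody at a given wavelength and temperature. This is just the textbook formula, not the researchers' actual fit (which also involves solving the heat equation for the nanotube's temperature profile); the 550 nm wavelength and filament temperatures below are illustrative values I picked, not from the paper.

```python
import math

# Physical constants (SI units)
h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light in vacuum, m/s
k_B = 1.380649e-23   # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temperature_K):
    """Planck's law: spectral radiance B(lambda, T) in W * sr^-1 * m^-3."""
    prefactor = 2.0 * h * c**2 / wavelength_m**5
    x = h * c / (wavelength_m * k_B * temperature_K)
    return prefactor / math.expm1(x)  # expm1 avoids loss of precision for small x

# A hotter filament radiates vastly more at visible wavelengths:
green = 550e-9  # ~550 nm, middle of the visible band
cool = planck_radiance(green, 1500.0)
hot = planck_radiance(green, 2500.0)
print(hot / cool)  # intensity ratio, 2500 K filament vs. 1500 K
```

The surprise in the abstract is that this blackbody description keeps working quantitatively for a filament far too small for bulk thermodynamics to obviously apply.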

Obviously. Which isn't to say that the concept of a classical regime versus a quantum one isn't useful. You wouldn't describe the motion of a baseball using Schroedinger's equation: it's perfectly possible, but impractical.

Any information we can get about the transition between the two regimes is very valuable indeed.

"Obviously. Which isn't to say that the concept of a classical regime versus a quantum one isn't useful. You wouldn't describe the motion of a baseball using Schroedinger's equation: it's perfectly possible, but impractical."

Good point. I was browsing Douglas Hofstadter's I Am a Strange Loop recently, and he made a great point about levels of description. He notes that when we discover "X is reducible to the more fundamental phenomenon of Y", people seem to think that means Y is more important and useful. But, he says, that discovery is equivalent to "Y can be ignored at the level of X". That is, even though there might be a lower-level description, the discovery of enough regularity at the higher level is also useful, since it means a simpler way to describe what's going on.

Any pitch can be both a strike and not a strike until the home plate umpire calls it, especially at a Cardinals/Cubs game. With 60,000 in attendance, you can have around 30,000 arguments about what the call should be, but only one call.

"You wouldn't describe the motion of a baseball using Schroedinger's equation: it's perfectly possible, but impractical."

Solving certain classical problems with quantum theory (and getting the right answers) is a typical graduate level homework question. There's a classic advanced textbook of quantum problems that's full of this sort of thing.

Which is not incompatible with my point. When a way of solving a particular problem is impractical, the only value such a solution may have is as an academic exercise. You'll learn a lot from such an exercise as long as you realize it's just that, an exercise.

If you really use quantum mechanics for a purely classical problem in your research, you are a lousy researcher, because you are unable to focus on the important parts of the problem and neglect those that do not matter, thus wasting time and resources.

This is true. Quantum Mechanics' "true randomness" could sit atop a perfectly deterministic deeper reality. When Einstein said he believed in "reality", he meant there were definite objects out there with definite properties. QM wipes that away -- *at that level*. Of course, an even deeper reality would basically make the QM level a sort of virtual world, which doesn't exactly help Einstein's case, even if it were to be deterministic. The "real stuff" he believed existed would now be twice-removed.

With the proviso that the differences in those rules between the two paradigms reflect quite different views of reality. Luckily there are some comparable mathematical structures, such as transformations, that allow a good deal of extrapolation of concepts from 19C physics into the quantum realm, which greatly helps how we conceive of quantum ideas that have no exact macroscopic equivalent.

You're right. As the article implies, it's now known as thermodynamics. It can also be called Newtonian physics. Our moms tell us we have two ears and one mouth, therefore we should listen twice as much as we speak. Unfortunately for this site and its linked articles, we have ten fingers and only two eyes.

The whole point of the beautiful minds studying this nanotube filament is to observe something that truly does require the calculations from both old physics and new physics (call them whatever you want).

The wave function of all quantum physicists says the probability some physicist will have the energy to change the light bulb is spread across all physicists. It does not imply any physicist has sufficient energy to change the light bulb. So changing the light bulb may be impossible.

This experiment is important because it reveals something about physics.
However, I wonder if this could also lead to practical inventions. Could a high intensity energy efficient light bulb be made from millions of tiny nanolamps clumped together?

The history of science suggests that exploring the intersection of two bodies of theory is a very important kind of experiment. It was Thomas Young's double slit experiments [wikipedia.org], Planck's study of blackbody radiation [wikipedia.org], and Einstein's work with the photoelectric effect [wikipedia.org] that revealed the necessary clues to the quantum theory that resolved the paradox of the apparent wave/particle duality of electromagnetic radiation.

It took 19th century classical physicists an entire century to resolve this issue, so long that the discipline became a little stagnant and some folks were beginning to claim that physics had explained everything there was to explain. However, Planck's work was especially important in revealing the quantized energy nature of light that was the key to opening up 20th century physics. [wikipedia.org]

Anyway, to keep this short, I suggest that we find ourselves in a similar situation. Our current models have been played out, and are leaving a lot of important questions unanswered. There are a few candidate theories that hold promise but aren't supported by observations. Looking at the cracks between our building blocks worked before -- it opened up whole universes of possibility. We need to keep doing it. This experiment is a great example of that kind of work.

"that resolved the paradox of the apparent wave/particle duality of electromagnetic radiation."

We didn't actually resolve the paradox, we just showed that we didn't have to resolve it to do useful calculations. The legacy of positivism and the Copenhagen Interpretation has been to simply sweep the whole question under the carpet.

Even modern approaches that attempt to explain the central question of quantum theory, which is "How does the classical world arise out of quantum phenomena?" don't actually answer it. They just make you feel better about it, distracting you from the fact that they have explained nothing. The whole Many Worlds approach is like this: it actually says nothing about why consciousness experiences only one of the many possible outcomes, despite its rather clever intellectual edifice.

To look at it another way, if all you knew about was the quantum universe of smoothly evolving probability densities, you would never guess at the existence of the classical universe at all. You would never suspect there was such a thing as "wavefunction collapse" (or any of its conceptual equivalents in different interpretations.) You would simply be aware (insofar as awareness might be possible in such a universe) that the various components of wavefunctions decohere smoothly over time due to interactions and entanglements with systems that have many degrees of freedom. You would not under any circumstances say, "Hey, all the components of that wavefunction just vanished except for this one!" Yet that is what WE say all the time, and no one has a clue as to why it happens.

My own take on this is that far from being some bizarre quantum phenomenon, consciousness is fundamentally classical in a way that physics is not. This is a Kantian view, that there are necessary conditions to consciousness that are more restrictive than the general conditions of existence.

So far, no empirical test of any interpretation of quantum mechanics (except experimental violations of Bell's Inequalities, which rule out any local causal interpretation) has been proposed. It may be that systems like this one will allow for novel tests, and in any case they are likely to put a finer point on the fundamental question even if they get us no closer to answering it.

"You would simply be aware (insofar as awareness might be possible in such a universe) that the various components of wavefunctions decohere smoothly over time due to interactions and entanglements with systems"

Actually, a very great and quite underappreciated physicist, HS ("Bert") Green, did show with colleagues that this collapse occurs just because of the interaction between systems, and that mathematically it is not the least bit mysterious or spooky. Why the name of this man, whom Max Born called "brilliant", is not better known has to be the real mystery.

In 1958 Bert published one of his best papers [53]. It was entitled 'Observation in Quantum Mechanics' and addressed one of the outstanding problems of modern physics, namely the process by which indeterminate superpositions in quantum mechanics become converted to the determinate, although possibly unknown, alternatives of ordinary macroscopic physics. For many years the prescription of von Neumann, usually called the 'collapse of the wave packet', was the accepted view of how this happened. As it assumed that some processes outside quantum mechanics had to be invoked, even going so far as involving the brain of the human observer, people were not comfortable with it, although it seemed the only possible answer. The best known representation of this difficulty appears in the well-known Schrödinger's cat paradox.

Bert, together with a number of others such as Wakita and Ludwig, found a much more satisfying explanation, which is basically still the received description, although nowadays in various forms. The idea was to suppose that a measuring apparatus could be of almost any form so long as it was very complicated, that is, contained a very large number (often for mathematical convenience taken to be infinite) of components such as molecules or electrons. The system being measured could be microscopic. When the two systems interact, any 'interference terms' in the state of the microscopic system become vanishingly small purely as a consequence of the size of the measuring instrument.

There are, of course, many processes in nature in which a human observer is not involved, especially before homo sapiens evolved, and the von Neumann description is quite unable to say how these could happen. However, with Bert's theory all one has to do is to replace the measuring apparatus by the environment to bring about the necessary disappearance of interferences.
The only place where this very satisfactory explanation might run into some difficulty is in the early evolution of the universe, where there is no environment!
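The mechanism described above can be seen in a toy model. This is an illustrative sketch, not Green's actual formalism: a qubit in an equal superposition entangles with N environment "molecules", each of which ends up only partially distinguishing the two branches (single-component overlap r < 1). The interference (off-diagonal) term of the qubit's reduced density matrix is then suppressed by r**N, vanishing purely because N is large. The value r = 0.99 below is an arbitrary choice for demonstration.

```python
# Toy model of environment-induced decoherence: off-diagonal terms
# of the reduced density matrix shrink as the environment grows.

def interference_term(overlap_per_component, n_components):
    """Magnitude of the qubit's off-diagonal density-matrix element
    after entangling with n_components environment degrees of freedom.
    Starts at 0.5 for the state (|0> + |1>)/sqrt(2)."""
    return 0.5 * overlap_per_component ** n_components

r = 0.99  # each molecule barely distinguishes the two branches...
for n in (1, 100, 10_000):
    print(n, interference_term(r, n))
# ...yet with enough molecules the interference is effectively gone.
```

Note what the model does and doesn't do: the suppression is exponential in N, which is why a macroscopic apparatus (or environment) works so well, but nothing here selects a single outcome; it only explains why the outcomes stop interfering, which is exactly the complaint raised in the follow-up comment.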

Fascinating! Thanks for the link--the role of decoherence became clear to me in the '90s, and there's a whole little group pushing it as the solution to this problem as if it were new.

I never published on the topic because it rapidly became obvious to me that it in fact says nothing about the real problem. There's a subtle bait-and-switch going on. Decoherence doesn't actually address the problem: why is there a classical world at all? Why aren't we aware of the damned probability distributions?