Posted by timothy on Sunday March 11, 2012 @04:20AM
from the part-of-a-healthy-weight-loss-program dept.

ananyo writes "In 1961, IBM physicist Rolf Landauer argued that resetting one bit of information (say, setting a binary digit to zero in a computer memory, regardless of whether it is initially 1 or 0) must release a certain minimum amount of heat, proportional to the ambient temperature. New work has now finally confirmed that Landauer was right. To test the principle, the researchers created a simple two-state bit: a single microscopic silica bead held in a 'light trap' by a laser beam. (Abstract) The trap contains two 'valleys' where the particle can rest, one representing a 1 and the other a 0. It could jump between the two if the energy 'hill' separating them is not too high. The researchers could control this height by changing the power of the laser, and could 'tilt' the two valleys to tip the bead into one of them by moving the physical cell containing the bead slightly out of the laser's focus. By monitoring the position and speed of the particle during a cycle of switching and resetting the bit, they could calculate how much energy was dissipated."
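The double-well trap in the summary is easy to sketch numerically. The quartic potential and all coefficients below are illustrative assumptions, not values from the actual experiment; they just show how a barrier term (laser power) and a linear tilt (moving the cell off-focus) bias the bead toward one well.

```python
# Toy double-well "bit": the barrier term stands in for laser power,
# the linear tilt for moving the cell slightly out of the laser's focus.
# All coefficients are illustrative assumptions, not experimental values.
def potential(x, barrier=1.0, tilt=0.0):
    return barrier * (x * x - 1.0) ** 2 + tilt * x

xs = [i / 100 for i in range(-150, 151)]
U = [potential(x, tilt=0.3) for x in xs]
x_min = xs[U.index(min(U))]
print(x_min)  # the global minimum sits in the tilted-down well, near x = -1
```

With tilt = 0, the two minima at x = ±1 are degenerate (a bit with no bias); any positive tilt makes the x = -1 well the deeper one, which is the "tip the bead into one valley" step described above.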

It probably reflects the spirit of Landauer's claims. Claims such as this depend upon an understanding of physics, which was much more common in computing back in the days when innovation depended upon an understanding of physics in order to develop new hardware. You also have to consider that a variety of different techniques were used to make computer memories back then, so his claims had to be based upon the underlying physics rather than a particular memory technology. So it is fair game to apply different physical models to test his claims.

I can appreciate that. But I question the actual relevance of the results, given that the "memory technology" used doesn't resemble anything I've ever heard of being used in a production computer in 30+ years.

The fact that energy would be needed to force a state change should have been intuitively obvious to anyone with even a Grade 12 physics education.

I've seen serious claims that "reversible computation" can be done with no energy input at all. What this doesn't cover, of course, is setting up the initial conditions, or extracting the results of the computation. One requirement is that at the end of the computation, the state of the system should be identical to the initial state.

I must admit that I don't understand either the utility, or the feasibility, of such a system. But there have been serious claims that computation does not, itself, require any energy dissipation:

"Although in practice no nonstationary physical process can be exactly physically reversible or isentropic, there is no known limit to the closeness with which we can approach perfect reversibility, in systems that are sufficiently well-isolated from interactions with unknown external environments, when the laws of physics describing the system's evolution are precisely known.

Probably the largest motivation for the study of technologies aimed at actually implementing reversible computing ..."

No kidding? You DO WORK and ENERGY IS RELEASED? Is anybody surprised to see that Landauer was right? Nobody?

What's surprising is that somebody bothered to verify a result that's obvious to everybody with a basic understanding of physics. If the claim weren't true, the machinery that they used to perform the experiment wouldn't have worked either.

Science publishing is not what it used to be.

You are absolutely right. And that's why we have modern technology and, in fact, physics itself: because people began verifying obvious "facts".

Landauer's claim was about the relationship between entropy as used in information theory and entropy as used in thermodynamics: specifically, that entropy in information theory is identical to the entropy in thermodynamics. The scientists used this set-up so they could measure a change of exactly one bit (the information-theoretic conception of entropy) while controlling outside heat influences (the thermodynamics conception of entropy), and see if the change in information corresponded to the change in heat as predicted by thermodynamics and information theory.

Without precisely controlling the change in information and precisely measuring the change in heat, the result is much less clear. That's why they used this methodology and equipment. Moreover, as this is empirical evidence for a very general identity between heat and information, the result will hold for computer memory as well.

In that case, I could have used a mechanical switch to represent 0 and 1 and told you that heat was dissipated. There needs to be a little more to draw a parallel between a random experiment and computer memory.

Computer memory is a bunch of mechanical switches. The point is that they have a lot of sources of heat aside from reductions in the information content of the physical system. The researchers built a switch that was as efficient as possible, so the vast majority of heat dissipation could be attributed to changes in the information content of the switch. Real computer memory will have heat dissipation due to changes in information content along with heat dissipation from such things as moving read/write heads.

I'm going out on a limb here, not having had the time to study this stuff enough, but my intuition says that the unification of information theory and physics will yield a great breakthrough in physics.

I take the view that thermodynamics and Shannon information theory are literally about the same thing exactly, not just by weak analogy.

Related factoids:
1. All information is embodied mutual information.
   a. It must be embodied in some local configuration of matter/energy.
   b. It must be mutual in that the information ...

It says that information is disorder. And thermodynamic entropy is (for some definitions of order) about order as well. If you have all of the air molecules in a room compressed into the corner, maybe that's ordered? But that's one small lump of air, and a whole lot of vacuum. Evenly distributed air is more ordered because it is uniform.

If you let a system starting in any arbitrary corner-gas configuration (and there are a lot, since each molecule can have any number of different values describing it) progress for X amount of time, you find that almost certainly you have ended up in an even-gas configuration. On the other hand, if you start in an even-gas configuration and progress for X amount of time, you will almost certainly still be in an even-gas configuration.

This may seem at odds with the fact that the laws of motion are time-reversible (at least if you assume that molecules are like frictionless billiard balls, as physicists are wont to do). But it's not. If you take some specific corner-gas start A and run it for X time, you will (probably) have an even-gas configuration B. If you take B, reverse the velocity of all molecules, and run it for X time again, you will be back at A (again, assuming molecules are frictionless billiard balls).

But, with discrete space and velocity, you can count the possible velocity and position vectors. There are a LOT more even-gas configurations than there are corner-gas configurations. So, with a tiny room and only a few molecules, you can establish the chance that after X time starting at even-gas, you end up at corner-gas. And even for very small systems it is basically 0.

Entropy is the concept of changes to a system that are not reversible, not because of the laws of PHYSICS but because of the laws of STATISTICS. The second law is the observation that, by statistics, you will tend toward a uniform (ordered) system because there are a lot of ways to go in that direction, and very few ways to go the other direction.
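The counting argument above can be made concrete with a tiny toy model. The cell and molecule counts here are made-up illustrative numbers; the point is only that corner-gas configurations are a vanishing fraction of all configurations, even for absurdly small systems.

```python
# Toy microstate count: C cells, N distinguishable molecules (both
# numbers are illustrative assumptions). A "corner-gas" configuration
# confines every molecule to the C//4 cells in one corner.
C, N = 16, 10
total = C ** N            # all ways to place N molecules in C cells
corner = (C // 4) ** N    # placements with every molecule in the corner
print(corner / total)     # = 4**-10, i.e. roughly one in a million
```

Scale C and N up to anything like a real room of gas and the fraction is so close to zero that a spontaneous return to the corner is, statistically, never going to happen.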

Landauer's observation is that any computational device, at the end of the day, stores information mechanically (again, I refer you to the fact that for our purposes, subatomic particles are frictionless billiard balls, so even things like the atom trap from TFA are mechanical devices). So if you have a 32-bit register, it has 2^32 configurations. If you consider how many possibilities there are for ordered sequences of X bit flips, it's 32^X. And if you start at 0, almost all of those flip sequences will take you to a pretty chaotic state. But if you start from a random state, almost none of those same flip sequences will get you to 0. So, treating the system as a completely mechanical one, thermodynamics applies and puts statistical limits on such changes.

What Landauer did is assume a maximum circuit temperature T for your memory/CPU, and observe that you won't want Brownian motion breaking your system, so 0 and 1 need a minimum separation for the system to be useful at temperature T. This puts a lower bound on the state counts, and lets traditional thermodynamics establish a minimum energy dissipation to go from a high-entropy state to a low one (like a zeroed-out register).

What information entropy does is take the same thing and say that the disordered information therefore has intrinsic entropy, since regardless of system design it requires a certain minimum entropy to store that information. It's avoidable if your system is reversible, which is possible if you have more ways to represent a bit pattern the more ordered that bit pattern is: fewer ways to store 10010101 than to store 00000000. It's also beatable if you find a way to store information non-physically. But good luck on that front.
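The zeroed-out-register example has a concrete minimum cost. This is a sketch of the bound, assuming room temperature (300 K is my assumption, not a figure from the comment): erasing an n-bit register dissipates at least n·kT·ln 2.

```python
from math import log

k_B = 1.380649e-23        # Boltzmann constant, J/K
T = 300.0                 # assumed circuit temperature, K

# Zeroing a 32-bit register can erase up to 32 bits of information,
# so Landauer's bound puts the minimum dissipated heat at 32*k_B*T*ln 2.
E_min = 32 * k_B * T * log(2)
print(E_min)              # ~9.2e-20 J
```

That number is absurdly small next to what real hardware dissipates per register write, which is exactly why the thread below argues the limit is of theoretical rather than engineering interest today.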

Neat, huh? I took a course on Kolmogorov Complexity [wikipedia.org], which is somewhat related, and pretty cool.

Pretty far afield followup question: every time Work is performed, Entropy increases. Using the Landauer Principle, it seems like you could consider information processing to be a sort of Work being done, leading to a similar increase in entropy. If our conscious minds are a form of information-processing engine, could consciousness be a byproduct of the Work being conducted by the information processing, which manifests itself simply as extra heat being radiated by the system?

It's also beatable if you find a way to store information non-physically.

I think this is what throws everyone when they think about the physics of knowledge. The vast majority of people don't realize that the physical embodiment of information must obey the laws of physics, and even many who do seem to believe knowledge ought to have some form of "soul" not shackled by physical constraints.

To store information, you need the ability to set something into at least two possible states, one of which can be the intrinsic state. No matter what you use for storage, you'll always need energy to reach the non-intrinsic state(s), since the intrinsic state is, essentially by definition, the state achieved with no external energy applied.

If you must add energy to enter a non-intrinsic state, it makes perfect sense that the energy would need to be dissipated to return to the intrinsic state (which equates to erasing whatever was stored).

Say you have two valleys named 0 and 1, and a mountain between. Setting our bit by rolling a ball from 0 to 1 would require energy expenditure, but once the ball is in the valley it is stable and won't roll out again without further input. 0 and 1 may be at different heights relative to each other, but need not be. They might even be at the same altitude. But if 1 were higher than 0, then yes, you would be storing energy in some sort of potential-energy form, and may be able to recover that energy when resetting the bit back to 0.

It's theoretically possible to change the state of a bit without spending energy. Here's a dumb example: think of a closed system (so no energy is being gained or lost) consisting of a box filled with oxygen and only one molecule of water. Divide the box in two halves and say a bit is "0" if the molecule of water is in the left half and "1" if it's in the right half. If you wait a while, eventually the bit will flip with absolutely no change in energy. That's a dumb example, but it shows that there's nothing that requires an "intrinsic state" and energy loss when you move away from it, like you described.

The only time energy dissipation is unavoidable (in theory) is when you erase information. That's a strange concept because, usually, we don't think about "conservation of information" in the same sense of conservation of energy, but there's a relation [wikipedia.org]. A little more discussion with more relevance to computing can be found here: http://en.wikipedia.org/wiki/Reversible_computing [wikipedia.org].

Yes, because of the zero point energy, since we're using a molecule. The bond has a minimum vibrational energy of 1/2 h*nu when the vibrational quantum number is 0 (ground state), so even when the temperature is 0 K, the bond still has energy and the molecule will still move around.

I was going to post something about reversible computing. I found it an interesting concept when I read that Richard Feynman did some work in computation and was a proponent of it. As far as I can tell, the idea was largely ignored.

I think reversible computing would not only be more energy efficient, but from what I understand might make for some interesting debugging, because I think you could run the program counter backward to an error.

Changing the state of a bit is not necessarily the same as storing information. To be used for information storage, the system can only move between valid states through external stimuli. If it changes to a different state without external stimuli, then it either doesn't store information or the states are not defined correctly.

The whole point of storing something is to have it maintain its state. If an item is not maintaining a single state, then it's not storing information. And if the item is maintaining its state only until random chance flips it, then it isn't reliable storage either.

You're thinking in terms of storing information the way a normal (irreversible) computer does. Not all computation must be done that way; I was describing a specific way that's not like that. Imagine that in the system of my (dumb, as I said) example, the problem being calculated was, conveniently, the equivalent of "in which side of the box will the water molecule be after 3 days". In this case, I have to spend no energy at all to compute that, assuming the box is perfectly isolated from the environment.

Since we're discussing information storage rather than calculations (certainly the two are related but not the same), then per your example the information-storage act would require energy to place the water molecule into the box in the first place. If you ignore that by assuming the molecule is already there, then you haven't stored anything and are simply in the intrinsic state of the box like I discussed originally. A computation with no controlled inputs yields no information; it's just nature running its course.

Perhaps you are thinking of this in a purely theoretical sense. In that case then yes, if you can harvest 100% of the energy stored when changing a value, then no additional energy is required.

That's the whole point of IBM's experiment and Landauer's principle: even in a purely theoretical sense, if you erase information when you're changing the state of a bit, you necessarily spend a minimum amount of energy. You can't, even theoretically, harvest 100% of the energy back. I was showing that there are other useful ways to change the state of a bit (e.g. in reversible computing) that do not incur this purely theoretical energy cost, where you could theoretically harvest 100% of the energy back.

You don't need to keep checking whether the bit has flipped. In fact, look at the most "quantum world" example possible: the usual way to define a quantum computer uses reversible computing (because quantum logic gates are reversible [wikipedia.org]).

Sure. But the whole point is that the *computation* itself doesn't need to spend energy. The only time you have to spend energy, in principle, is when preparing the computation and when measuring; the computation itself could run for years and no energy would be necessary (if the system is sufficiently isolated from the environment). In contrast, with "normal" irreversible computation, every time you irreversibly flip a bit (e.g., when you apply an AND gate) you must spend energy.

Specifically, the calculation is the Landauer limit, E = kT(ln 2), the minimal energy needed to erase a single bit. The interesting thing is that 10^20 bit operations per second dissipates only on the order of a watt at this limit. This means that the efficiency of today's computers is just 0.00001%. More details at http://tikalon.com/blog/blog.php?article=2011/Landauer.
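As a sanity check on those figures, here is the arithmetic, assuming room temperature (300 K) since the comment doesn't state one:

```python
from math import log

k_B = 1.380649e-23            # Boltzmann constant, J/K
T = 300.0                     # assumed room temperature, K
E_bit = k_B * T * log(2)      # Landauer bound per erased bit
print(E_bit)                  # ~2.87e-21 J

# 1e20 bit erasures per second, each at the Landauer limit:
power = 1e20 * E_bit
print(power)                  # ~0.29 W, i.e. order of a watt
```

The exact wattage depends on the temperature assumed, but the order of magnitude matches the comment's claim.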

Not really that surprising, a silicon atom is about 0.11nm and the lattice grid in a silicon crystal 0.54nm, which is still way smaller than the 32nm processors he's talking about. I don't know how many electrons flow down each 32nm path but they're between 0.1nm and 0.000006nm in diameter depending on what model you use - quantum mechanics makes a mess of this anyway - so it's way more than one. If you want single electron calculations you'll have single electron signals, one quantum event and your signal is lost. So the limit is likely to remain a very theoretical limit.

The other thing is that this only includes the operation itself: no clock cycle, no instruction pointer, no caching, prefetching, or branching. This is the ideal you could get out of a fixed-function ASIC that only does one thing, not even as programmable as a GPU shader. We already know that there's a significant gain to that, but even supercomputers aren't built that specifically for the task. Formulas must be tweaked, models adjusted, and parts must be usable in many computers. We've already seen that a GPGPU can beat a CPU by far on some tasks, but even they aren't close to such an ideal.

If you think about this in encryption terms it's not that much: it says efficiency can improve by at most 23-24 bits' worth (a factor of about 10^7). In encryption, most have used the Landauer limit to "prove" there's not enough energy to break a 256-bit cipher by brute force. In some places I don't think it's that relevant either; in mobile, for example, I think the energy involved in bandwidth use will be more significant. Want to stream an HD movie? It's not the decoding that kills the battery, it's the 3G/4G data connection. Just like cameras get better, but good optics still aren't small, light, or cheap.
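Both numbers in that comment can be checked directly. This sketch assumes room temperature and charges only one bit erasure per key tried, which is a generous underestimate of a real brute-force search:

```python
from math import log, log2

k_B = 1.380649e-23
T = 300.0                        # assumed room temperature, K
E_bit = k_B * T * log(2)         # Landauer bound per erased bit

# Energy floor for merely counting through a 256-bit keyspace,
# at one bit erasure per key tried (a deliberate underestimate):
E_brute = 2 ** 256 * E_bit
print(E_brute)                   # ~3e56 J, astronomically large

# The "23-24 bits" of headroom: an efficiency of 0.00001% (1e-7 of
# the Landauer limit) leaves a factor of ~1e7 to gain, i.e.:
print(log2(1e7))                 # ~23.25 doublings
```

For scale, ~3e56 J dwarfs the Sun's total lifetime energy output, which is the usual form of the "not enough energy" argument.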

Sure, Leonard Susskind talks about computer memory and entropy on some recent Youtube video as part of a lecture on the holographic principle and the total amount of information in a system. OK, this is sort of a duh moment but I suppose it's good science to test it anyway.

Let 0s be room temperature and let 1s be somewhat below room temperature. Then to erase the memory I expose it to the room. As it erases the memory will absorb some heat from the room instead of releasing heat.

Not really a practical form of computer memory, but seems sufficient to disprove Landauer.

That the mechanism you select absorbs heat does not mean that no heat was released. Possibly a small amount of heat was released by the state change whilst the mechanism also absorbed heat. It is not the net effect (overall heat absorbed) that is the critical point here.

I'm not convinced that the thought experiment disproves Landauer (but it's very interesting, thank you AC). I am not an expert in this area, though; I'd be very interested if someone with deeper specific knowledge could enlighten us.

Yes, the memory will absorb heat, but it costs heat from the hot room. You have to consider the total energy of a closed system, and in your naïve approach the best you can get is a net-neutral energy balance. The argument is primarily about the fundamental increase of entropy associated with erasing a bit, and thermal equilibration (between a hot and a cold object) definitely represents an increase in entropy.

By your own example, it took energy to erase the bit, just that the energy came from the pre-erasure difference in temperature between the bits and the environment. And the end result of the erasure in your example is an increase in entropy for the (assumedly closed) system of the room plus the bits. So, no, your example does not come close to disproving Landauer.

In 1961, resetting a bit involved passing a huge current through the wires surrounding a toroidal core which represented one memory bit. So to say that it releases heat is ridiculous, it actually consumes orders of magnitude more heat than could possibly be considered in theory or measured in practice.

So to say that it releases heat is ridiculous, it actually consumes orders of magnitude more heat than could possibly be considered in theory or measured in practice.

You're misunderstanding the statement. Actually flipping the bit releases heat. Doing the work required to flip the bit also involves the generation of heat, but that heat isn't flipping the bit, and therefore it's not CONSUMED in the process of flipping the bit, just WASTED.

it actually consumes orders of magnitude more heat than could possibly be considered in theory or measured in practice.

What? Scientific models of how our Sun works exist, as do measurements of heat from it. They are big. I strongly doubt any old/inefficient human-made computing system has yet got anywhere near our Sun.

Could you be writing from some amazingly dangerous alternative universe with a massive energy cost on information? Or am I the victim of a sophisticated troll employing meaningless hyperbole?

Does this suggest that by saving up erasures to be done more slowly, perhaps by flipping bits to 0 near the time when they are flipped to 1, energy could be saved and the Landauer limit approached? Also, are there architectures in which flipping a bit in one direction uses less power, or in which blocks of bits can be deselected by some pointer instead of actually erased, trading memory hardware space for power usage?

A more practical way of improving efficiency would be to move to reversible computing [wikipedia.org]. However, we are far, far away from the Landauer limit in any practical computers, so this is not what is limiting efficiency.

So in an expanding universe there is a loss of information -- and by Landauer's principle this loss of information should release dissipated energy -- and Gough claims that this dissipated energy accounts for the dark energy component of the current standard model of universe.

There are rational objections to this proposal. Landauer's principle is really an expression of entropy in information systems, which can be mathematically modeled as though they were thermodynamic systems. It's a bold claim to say ...

Back in the uni library, I once had an old ('60s?) book in my hands which stated that for every logical AND circuit, combining two '1' bits would also result in heat. The author suggested designing AND circuits so that they would have two results: the logical outcome, and the overflow 'exhaust', both connected to the rest of the circuitry. This would be used to keep the processor from generating heat, but might also have more practical, logical uses. (He probably said similar things for other kinds of circuits.)

It's not necessarily a stupid test. In terms of science, we can estimate the amount of energy available from various sources, such as a nuclear plant, the total energy of the Earth, our solar system, or the galaxy. Using that estimate, we can put an upper bound on the maximum amount of computational power we have at our disposal: if a certain problem is shown to require X calculational complexity, and X exceeds the amount of disposable energy in our solar system, then X is incalculable given current technology.

Now, let X be some sort of encryption complexity. Now do you see how it could be useful?

Cool down the disk to a point where you can measure the temperature changes really well. Now start the encryption. How much information does the change in temperature of the disk (or SSD, or RAM) give you? Could be interesting.

Oh, it has a law on Wikipedia, must be a waste of time to test or verify it then! Seriously, have a read about how science works before attempting to comment again. A "law" in science is not like a legal law, i.e. it is not a fact merely by self-assertion (a legal law is a law because lawmakers say so). Scientific "laws" require test and proof; they often require refinement in details. Scientific "laws" do not exist as abstract facts about the universe; they are human attempts to model the universe from observation ...

I'm not attempting to challenge the "laws" of thermodynamics - my guess would be that we have the broad picture right (we have a lot of evidence in favour), but again, given the history of science I would be surprised if every detail of taught theory in that area survives the next few hundred years without some modification.

Having the broad picture right just means you have a working model, though. It doesn't mean you've actually discovered how the universe works, just that you can make accurate predictions. Maybe later it turns out that what happens, happens for a totally different reason than what you thought.

Science is all about making predictions, and not about discovering how anything works (formally, anyhow). Or as a physics professor put it: "There are no particles, only clicks in my Geiger counter".

Additionally, the prediction was a great deal more specific than "durrr it will get more hot." It was more: "the heat will change by this particular amount, relative to the ambient temperature, as predicted by these equations."