Entropy drop: Scientists create “negative temperature” system

Bizarre setup may help researchers model dark energy.

In a negative temperature system, temperatures get lower as more atoms pile up close to its maximum energy.

LMU/MPQ Munich

Over the past few decades, researchers have made significant progress in cooling objects ever closer to absolute zero, the temperature at which all molecular motion reaches its minimum. This has allowed them to study unusual states of matter, like Bose-Einstein condensates, which behave quite differently from the materials we're familiar with. But absolute zero is as low as a temperature can get, and we can't actually reach it, so progress will ultimately be limited.

Maybe not.

As thermodynamics defines temperature, it's theoretically possible to have a negative value. Yesterday, a team of German researchers reported that they were actually able to produce a system with exactly that. They found that the negative temperature system was stable for hundreds of milliseconds, raising the prospect that we can study a radically different type of material.

To understand how temperatures can go negative, you have to think in terms of thermodynamics, which is governed by energy content and entropy. In a normal system, there's a lower limit on energy content—absolute zero—but no upper limit. If you start with a system at absolute zero and add energy, the atoms or molecules it contains start occupying higher energy states. With more energy, they start spreading out evenly among these states. This in turn increases the entropy of the system, since fewer and fewer atoms are in the same energy state.

Now imagine a system where there's an upper limit on the energy state an atom can occupy. As you add more energy, more and more atoms start occupying the maximum energy state. As this happens, entropy actually starts to go down, since an increasing fraction of the atoms begin to occupy the identical energy state. In thermodynamic terms, you've reached negative temperatures.
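
To make this concrete, here is a minimal numerical sketch, assuming a hypothetical toy system of N atoms that each have only a ground state and a single capped excited state (an illustration only, not the authors' actual potassium setup):

    # Toy bounded-spectrum model; k_B = 1 and one excitation = 1 energy
    # unit are assumptions made purely for illustration.
    from math import lgamma

    N = 1000  # number of two-level atoms

    def entropy(n):
        # S = ln(microstate count) = ln C(N, n): the number of ways to
        # choose which n atoms hold the excitation energy.
        return lgamma(N + 1) - lgamma(n + 1) - lgamma(N - n + 1)

    for n in (100, 499, 501, 900):
        dS = entropy(n + 1) - entropy(n)  # entropy change per unit of added energy
        print(f"U={n}  S={entropy(n):.1f}  T={1 / dS:.1f}")

Below half filling, adding energy still increases the entropy, so T = 1/(dS/dU) is positive; just past the entropy peak, dS/dU turns negative and so does the temperature.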

This has some pretty bizarre consequences. If you could maximize the entropy in the system, temperature becomes discontinuous—it jumps from positive to negative infinity. Strange things would happen if you brought it together with a system that has a normal temperature. "In thermal contact," the authors write, "heat would flow from a negative to a positive temperature system. Because negative temperature systems can absorb entropy while releasing energy, they give rise to several counterintuitive effects, such as Carnot engines with an efficiency greater than unity."
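
The Carnot claim can be illustrated by naively plugging a negative temperature into the textbook efficiency formula (the reservoir temperatures below are arbitrary made-up values):

    # Carnot efficiency eta = 1 - T_cold / T_hot, temperatures in kelvin.
    def carnot_efficiency(t_hot, t_cold):
        return 1.0 - t_cold / t_hot

    print(carnot_efficiency(600.0, 300.0))   # 0.5: an ordinary engine
    print(carnot_efficiency(-300.0, 300.0))  # 2.0: formally above unity

A value above 1 is not free energy; it reflects the point above that a negative-temperature reservoir can give up energy while absorbing entropy.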

To create one of these systems, the authors set up an optical lattice of potassium atoms, chilled to near absolute zero. Under normal circumstances, these atoms repel, and thermodynamics behaves as we've come to expect. But the authors were able to switch things so that the atoms had attractive interactions. This created something that could be viewed as an "anti-pressure," which should cause the collection of atoms to collapse. It's only the negative temperature that keeps the cloud of atoms from collapsing in this set of circumstances.

It's important to emphasize that this negative temperature isn't some state "below" absolute zero. The atoms in this system still have energy, and the negative temperatures are reached through a sudden transition, rather than by gradually shifting to negative values by going past absolute zero.

Still, the system is more than just a quirky consequence of how we define temperature, since it really behaves quite differently from normal systems. The authors suggest it may help us model dark energy, in which the expansion of the Universe is a product of the sort of "negative pressure" that this system displays.

"Left to their own devices, drifting particles find the arrangements with the highest entropy. That arrangement matches the idea that entropy is a disorder if the particles have enough space: they disperse, pointed in random directions. But crowded tightly, the particles began forming crystal structures like atoms do -- even though they couldn't make bonds. These ordered crystals had to be the high-entropy arrangements, too.

Glotzer explains that this isn't really disorder creating order -- entropy needs its image updated. Instead, she describes it as a measure of possibilities. If you could turn off gravity and empty a bag full of dice into a jar, the floating dice would point every which way. However, if you keep adding dice, eventually space becomes so limited that the dice have more options to align face-to-face. The same thing happens to the nanoparticles, which are so small that they feel entropy's influence more strongly than gravity's."

In statistical physics entropy can be defined as a measure of the available states. This is why entropy here goes to zero as the gas is forced into the highest energy state.

In unbounded systems there are, of course, more energy states available as you increase temperature. More states means more system configurations, hence it looks like disorder.

But there is no one-to-one relationship. At best one can say that, like Newton's classical gravity, it is a model that works inside its region of applicability. But unlike gravity it is an ad hoc proxy model. The best analogy is that it is like a model of an ant hill. Look for higher ant hills, and you will likely find more ant tunnels. But it doesn't follow from first principles of ant construction.

"Disorder is like an ant hill."

This example is extraordinarily misleading. This is a tiny edge case where the physical constraints of a system force polyhedrons into a shape, and that shape is narrowly defined by the shape of the polyhedrons. Just because order can arise from simulated systems where entropy is the only interparticle interaction, does not mean that the general conception of entropy as "randomness" is any less correct than it ever was.

Obviously, entropy is far more about the number of possible energetic states available for a specific configuration, but there is only so much space to explain the concept in an article like this, and for 99.9% of usages, randomness is a valid descriptor.

If you could maximize the entropy in the system, temperature becomes discontinuous—it jumps from positive to negative infinity.

Sounds like an overflow error with a signed integer, since the MSB signifies the sign. Maybe this world really is a computer simulation.

If you increase resolution enough you can always distinguish between a simulation and the real system's reactions, as quantum mechanics forbids hidden variables (say, simulations) at the level of particle fields (quantum field theory).

I am disappointed that you call it misleading, and "extraordinarily" at that, because I was attempting to point out and clearly describe that "entropy is disorder" is misleading!

It is misleading in the same way as believing that newtonian gravity is all you need to predict gravitational effects, and that you can construct GPS systems or understand general relativistic cosmology with it. And the reference is explicit about how it can, and likely will, help make designer materials.

[And I honestly think that similar effects have been noted in gravitational sorting, but I don't have the energy to find these results right now.]

I will not go into the finer points of "randomness" and disorder, but they too are not to be confused. See a statistical physics text for that.

Yes, newtonian gravity is applicable in, say, 99.9 % of usages, but we are all keenly aware that there is a difference between it and general relativity.

But when it comes to entropy, and we here have an explicit example of a "0.1 %" system that works according to the statistical physics theory of thermodynamics, we suddenly shouldn't be keenly aware!?

I fail to see the logic, and I will not take your end of the shit stick. You are the one that is attempting to mislead, sir. Not extraordinarily so, but boringly traditionally so.

If you could maximize the entropy in the system, temperature becomes discontinuous—it jumps from positive to negative infinity

I thought that when a system reaches its maximum entropy, it is in thermodynamic equilibrium, and its temperature is not only positive but stays constant with time. Am I wrong?

Ouch, that is another misleading conception, depending on whether you mean a local or global maximum. It is true for nice systems, but as Landauer showed, you can have several maxima. Looking at the derivatives of entropy can be misleading (that 99.9 % situation again) if you want to know whether the system is in a true equilibrium (or steady state).

This is a similar situation, I think (must rush to a shop). The authors, and commenters here, claim equilibrium. Look at their figure 1 to understand how the system can be so. [ http://arxiv.org/pdf/1211.0545v1.pdf ]

I meant, they are decreasing the possible accessible states of the system. Why is this an entropy increase?

No wonder I didn't understand your question, we were misunderstanding each other. I explicitly claimed "entropy here goes to zero as the gas is forced into the highest energy state."

Ok, but "entropy is not disorder" is not well illustrated here. Maybe the maximum entropy of the tight system is just different, lower, thus more organized, or with less disorder if you wish. I do think that the article is a little bit misleading in saying that 'These ordered crystals had to be the high-entropy arrangements, too.' Well, if you say so, but that doesn't mean that it has the same or higher entropy than the sparse and more disorganized system.

Even if the simulation is unrealistic and the particles don't release or transfer any heat by friction, deformation, etc., and no bonds are allowed, you are still decreasing the number of accessible states of the system, thus decreasing its entropy. That's why it looks more organized. So less entropy <-> more organized. Am I wrong?

You obviously don't understand the work you are quoting. The key point (relating to entropy anyway) was that order could arise in simulated systems where no enthalpic contributions were considered. It's not at all that we need to rethink how we colloquially explain entropy. You would understand this if you read more than the press release. All they did was slowly confine "hard sphere" interacting polyhedra to smaller spaces and discern what types of structures emerge. It says nothing at all like you are trying to push here. The real crux of the paper is the unbelievable abundance of structures that can emerge from such simple interactions and shapes.

These two descriptions are nothing at all like the difference between newtonian gravity and relativity. It's far more like the difference between describing spacetime in 3 dimensions (2 space dimensions and 1 time) so that people can understand it, and forcing people to just stare at the equations while screaming, "Don't you idiots see the obvious ramifications of this??"

As far as randomness and disorder, I certainly did get those crossed up in my original post. My statistical thermo profs will be ashamed, but hopefully they will understand what it's like to accurately discuss thermo with a one-year-old attached to your leg! They are certainly less pedantic and patronizing than you.

Please help me understand this, because it is interesting....

You are confused because Torbjörn is a pedantic douche and he is wildly misleading about the contents of that paper.

The point of that paper, if one actually reads it instead of just the press release, is that order can arise even if you only consider entropic interactions (no bonds form between the nanoparticles). It has almost nothing to do with what he is claiming here. (In the press release, which Torbjörn apparently read, one of the paper's authors discusses these issues, and he is right, but it is pretty far outside the scope of the paper.)

In statistical thermodynamics, entropy is proportional to (the natural log of) the number of states that can satisfy a condition divided by the total possible states in the system (this is the partition function). Thus, if a system is ordered, the placement of one atom (or molecule, or colloid, etc.) determines the position of many of the others. This dramatically reduces the number of possible configurations that satisfy the conditions imposed by the crystalline lattice. Thus, an ordered array of atoms (ie. a crystal) is generally a lower entropy configuration than a disordered system. This is why materials melt at high temperature and freeze at low temperature.

To be more precise, entropy is concerned with the number of energetic states available to a system, rather than simply the geometric arrangement of atoms/colloids/whatever. In most cases, the geometric arrangement is very strongly tied to the available energetic states, so the two are one and the same. The real key is how you define a "disordered" state, because the entropy of any precisely defined arrangement of atoms (in a closed system) is the same. We only think that disorder increases entropy because once the atoms are not in a precise lattice, we really don't care about the exact arrangement.
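
One way to see the counting argument in the two posts above is to compare configuration counts directly. A minimal sketch, with made-up numbers (100 sites, 10 particles) that are not from the paper:

    from math import comb, log

    sites, particles = 100, 10

    # Unconstrained placement: any 10 of the 100 sites.
    free_states = comb(sites, particles)
    # Rigid lattice: placing one particle fixes all the others, so the
    # only freedom left is where the whole repeating pattern sits.
    ordered_states = sites // particles

    print(log(free_states))     # S_free    ~ 30.5 (in units of k_B)
    print(log(ordered_states))  # S_ordered ~  2.3

The ordered arrangement has vastly fewer configurations and hence far lower entropy, which is exactly the point about crystals made above.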

Yes, it is true for nice systems; that was the deal when the thermodynamic laws first appeared. You can always observe fluctuations within a large, classical system, or in a tiny, quantum one, so what? It is possible to observe the sudden cooling of tiny regions that last a few milliseconds within a large system in equilibrium. I don't think that a local temperature can be well defined at all.

And the system that appears in this Ars article is certainly not in a steady or stable equilibrium, since it only lasts a few milliseconds despite the aid of the magnetic field. The equilibrium they are talking about looks more like a forced, unstable mechanical equilibrium, like trying to balance a pencil on its point using cheap glue. This is not how a system in thermodynamic equilibrium behaves. So it's not surprising to find weird temperatures.

Of course, I might be wrong. I just want to understand.

I think you are confusing the stability of the Bose-Einstein Condensate (BEC) used to make these measurements with the thermodynamic stability of the negative temperature state of the BEC. It all comes down to your definition of the "system" (in my experience, the core problem in most of thermo is accurately defining the system!)

For the authors, the "system" is the BEC and, according to them, the negative temperature state that they can put the system in is a global free energy minimum for the system. It cannot decay into other, lower energy states while the BEC remains intact. If the BEC could last forever, it would be likely to stay in the negative temperature state forever (depending on the available energy and the spacing of the energy levels above this state).

I have never worked with them (and never will), but I think BECs themselves are very unstable because they are easy to disturb with outside, non-ideal influences that arise in experimental physics. Thus, they last a short time and decay quickly. For the authors of this paper, that type of decay has nothing to do with the thermodynamic stability of the state that they can put the system in.

In short, if you define the BEC as the system, the negative temperature state is stable. If you define the whole experimental apparatus as the system, some other, more disordered state is lower energy.

I am just an engineering student, and as such, I am beneath "High Science", but how is what the authors did with their experiment fundamentally different from pressing the button on a normal red laser pointer that you can buy at Staples?

You just kind of squashed the assertion that it, unlike the laser pointer, is in thermodynamic equilibrium by saying that that is only true if you look at a sub-system of the experiment. What is novel about this journal article?

So, is "negative temperature" then really a matter of perspective seen from the specifics of the given conditions of the experiment?

If absolute zero means all molecular motion stops, but the molecules in the experiment kept moving, was absolute zero or below technically reached? Are there any time-space ramifications for molecules that have reached absolute zero or negative temperature?

Is negative temperature even the proper terminology here, since it seems more likely, based on reader commentary, that what was actually achieved was not really a temperature below absolute zero but instead a reversal of the rules governing the transfer of energy? (i.e., someone mentioned that negative temperature would actually be even hotter; is that so?) If that is the case, could the same process then be applied in balance with the normal processes to actually achieve zero molecular movement?

Ugh, sometimes this site makes me feel like I have an aneurysm heading my way.

Think of it this way - you have an enormous experiment set up in your kitchen with tons of very fragile glassware. Using this apparatus, you are able to get some chemicals in that glassware into an interesting thermodynamic equilibrium state. However, because your glassware is very fragile, it all shatters every hour leaving you with a huge mess, but interesting data. I don't think one would argue that the true equilibrium state of your chemicals is lying in a pool with broken glass on the floor, rather than the interesting state you were able to achieve with your apparatus. This is one way of thinking about what happens when the bose-einstein condensate decays in the paper discussed above.

The system discussed here is much closer to the analogy above than to the laser pointer analogy. When you take your thumb off the button of a laser pointer, the diode (or crystal, or gas) still exists in a similar physical state, you just have stopped adding energy to it. In the case of the experiment discussed in the article, the state of matter changes destroying the system, rather than the system dropping into a lower energy state.

Why can't you lump in the battery with the laser diode and call that the "laser system"? Now there is no energy source external to the system.

Edit: I guess if you include the battery, you may not have a negative temperature anymore. The battery has electrical noise at room temperature, although it doesn't seem like attaching a battery to a laser system stops it from lasing. I don't know how to think about this. Maybe what I am saying is nonsensical.

This is the same basic effect (the Carnot engine is one instance) seen in almost all reported free energy / over-unity systems developed in the last 60 (120 if you want to be picky) years.

You begin with a broken symmetry in local energy distribution created artificially (whether heat, electron mass distribution, plasma discharge, etc... effectively creating a temporary dipole of one sort or another) and the surrounding environment responds in whatever way is appropriate for conditions to rebalance the symmetry.

As has been previously mentioned, the favorite example of many is the energy vacuum created in a magnetic field collapse and the subsequent "magical" current inrush seen on nearby conductors. This is why transformers and magnetic motors work, although they also work against themselves because they are balanced in their design, hence the sub 100% efficiency.

A Tesla coil (also a transformer), by contrast, is not a balanced machine, and thus causes aberrant current effects in the local environment (lighting fluorescent bulbs with no conductor contact at distance) without a marked (read - conventionally appropriate) increase in current consumption in the primary. The current effects also change in character with significant change in the pulse frequency on the primary and according to the tuning and resonance of the system.

The plasma arcs emanating from the secondary in a Tesla coil are actually more akin to BEC than ionized gas plasma and are longitudinal waves rather than transverse in orientation due to the configuration of the primary in relation to the secondary. Furthermore, radio signals passed through tesla coils move through the earth itself as freely as the atmosphere because the waveform does not interfere with the structure of matter, instead passing through it like sound through the ocean. This would make the energy coming off the secondary inherently "cold". It can cause elevation of energy levels in gases (in enclosed, resonant chambers), and cause photon release, but through what action? Tesla himself was pictured throwing these BEC arcs to conductors in his own hand at energy levels of several million volts. Normal plasma would atomize (or at least immolate) a human in these conditions.

So it would seem that controlled feedback from resonance is a desired effect where self-sustaining power systems are concerned as well, leading me to my next point.

This is also the same effect seen in most cold fusion experiments - they all use a HF pulsed DC current to power the reactor and the post asymmetry collapse of "cold" inrush catalyzes the reaction, thousands or millions of times per second.

This is also why transmutative effects (alchemical) are usually seen in these instances - though they create no radioactive isotopes, because the reactions are focused on rebalancing (fusion) a system rather than unbalancing it (fission).

This is, of course, overly simplified and redacts the mathematics of the physics but the concepts are key to the understanding of all next gen energy systems. An interesting analogue to the experiment outlined in the article here by John is the Chung negative resistor, which you can read about here and here. This may seem like pseudo-science to some, but to those that have actually done the experiments in the lab, they are very real.

The unfortunate aside is that the politics of the energy sector are repressive and monopolistic, and this is unlikely to change until a confluence of materials science advancements and market conditions make it inevitable. Physicists have been trying to sneak this under and over the radar of the oil companies for decades, but it would seem that a different tack will be required if the situation is to improve.

"negative temperature states are HOTTER than positive temperature states, i.e. in thermal contact heat would flow from a negative temperature to a positive temperature system." pg2 of the arXiv: 1211.0545v1 [cond-mat.quant-gas] 2 Nov 2012 article. That's why Norman Ramsey wrote, "When account is taken of the possibility of negative temperatures, various modifications of conventional thermodynamics statements are required. For example, heat CAN be extracted from a negative-temperature reservoir with no other effect than the performance of an equivalent amount of work," in Phys. Rev. 103, 20–28 (1956).

Forgive my naivety on the subject, but does this mean theoretically one could heat particles to absolute zero? It sounded like, as matter was heated to higher temperatures, entropy dropped, and before the switch to gaining entropy after crossing back to positive temperatures, absolute zero would have to be crossed. While I'm sure it is easier to cool to absolute zero, I am curious if it is possible to go the other way.

No. Adding energy to a system isn't the same as heating it. Heat only exists when there is thermal contact between two systems. So, for an isolated system (without thermal contact) there is no such thing as heating.

The physicists made a Bose-Einstein Condensate (BEC) of about 10,000 potassium atoms, trapping them in a 3-D optical lattice (a crystal made of photons), in a near perfect vacuum, at near absolute zero Kelvin. Then they accomplished their feat: they were able to add energy to the system and decrease the entropy of the system at the same time. This means that the temperature of the BEC became FORMALLY negative, even though the BEC became HOTTER than it was before (its internal energy increased even though its entropy decreased). This weirdness is brought to you by the thermodynamic definition of temperature: the partial change in entropy with respect to internal energy. Normally, the entropy of a system increases when you add energy to it. So, the temperature of the system increases too.
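
In symbols, the definition being invoked is the standard textbook one (volume and particle number held fixed):

    \frac{1}{T} = \left( \frac{\partial S}{\partial U} \right)_{V,N},
    \qquad \frac{\partial S}{\partial U} < 0 \;\Rightarrow\; T < 0

So a negative temperature just means the entropy decreases as internal energy is added, as described above.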

Thank you, guys. This is why I like ars; someone like me, with a degree in art and art education, and just a real interest in science can learn about our crazy wonderful universe without feeling looked down on for being ignorant on a subject.

Correction: My full answer regarding HEAT is as follows: HEAT exists if and only if there is THERMAL CONTACT (the ability to conduct, convect, or radiate energy as heat) between TWO systems that have DIFFERENT TEMPERATURES. So, for a SINGLE isolated system (no thermal contact with another system) there is no such thing as heating. Moreover, if two systems are in thermal contact and yet have the same temperature (are in thermal equilibrium), then no heating/cooling occurs.

That's also why you can't extract useful work from two reservoirs that are close in temperature: a heat engine's Carnot efficiency goes to zero as the two temperatures approach the same value.
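
Numerically, with arbitrary example temperatures (a sketch of the limit, not a model of any real machine):

    # Carnot efficiency of an ideal heat engine as its two reservoirs
    # approach the same temperature.
    t_hot = 400.0
    for t_cold in (300.0, 350.0, 390.0, 399.0):
        print(f"T_cold={t_cold} K  eta={1.0 - t_cold / t_hot:.3f}")
    # eta -> 0 as T_cold -> T_hot: no temperature difference, no work.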

I'm glad you like the responding posts. Try reading the works of Alan Lightman, especially "Einstein's Dreams." Here is a description of him and his work from his webpage at MIT: "As both a distinguished physicist and an accomplished novelist, Lightman is one of only a small number of people who straddle the sciences and the humanities. He was the first professor at MIT to receive a joint appointment in the sciences and the humanities. His essay "In the Name of Love?" was the first article about love and language published in Nature, the prestigious international science journal (October 8, 2001), and his "The First Law of Thermodynamics" was the first short story published in the physics journal Physics Today (May 2005). He has lectured at more than 100 universities nationwide about the similarities and differences in the ways that scientists and artists view the world." http://writing.mit.edu/people/faculty/h ... #BIOGRAPHY accessed 7 January 2013 1:57 PM

I will look into his works. They look very interesting. I am a firm believer that humanities and science are interconnected and should be taught as such. There is a reason the government pushed art classes in the '50s and '60s for scientists and engineers to get us into space and on the moon.

Now imagine a system where there's an upper limit on the energy state an atom can occupy.

This doesn't make sense. Energy levels are well defined in atoms. If we take electronic energy levels, for example, they have a well defined set of quantum states that the electrons can be promoted to. If we were to give it an upper limit, I would say it is where ionisation occurs. If we look at nuclear energy levels, it is the same situation: nucleons go into quantised levels in a potential well, and if they are excited to a point where their E > energy of the shells, they will be emitted. There are rotational and vibrational states, but once again, too much energy in the system would cause ionisation.

This concept makes no sense, or it hasn't been explained correctly. The paper it's referring to is a tough read, and without a degree in physics I don't think anyone could understand it!

I wish they published more detail on the characteristics of the material being used. In semiconductors it is possible to give enough energy to valence electrons to pack the dopant band, yet not enough for them to jump to the conduction band.

This weirdness is brought to you by the thermodynamic definition of temperature: the partial change in entropy with respect to internal energy. Normally, the entropy of a system increases when you add energy to it. So, the temperature of the system increases too.

Maybe the fundamental quantity being measured is really minus the partial derivative of entropy with respect to energy. That is -1 times the reciprocal of the temperature (barring edge cases). The values would run from negative infinity (just above zero Kelvin) to zero (infinite temperature) and then into positive numbers (negative absolute temperature) eventually heading toward positive infinity (just below zero Kelvin). "Heat" always flows from the system with the higher measurement on this scale.

Does such a scale have any real physical meaning?

Edit: Had a brain-fart... it's just the reciprocal rather than -1 times the reciprocal. So the scale runs from positive infinity (just above zero Kelvin) down to zero (infinite temperature) and then into negative numbers (negative absolute temperature) eventually heading toward negative infinity (just below zero Kelvin). "Heat" always flows from the system with the lower measurement on this scale.

Though I think normalizing with "times -1" makes the scale more intuitive.

"Heat" always flows from the system with the higher measurement on this scale. Does such a scale have any real physical meaning?

There might be a physical application in terms of heat transport. An analogy in mass transfer would be where, under the right conditions, molecules can flow from regions of lower concentration to higher ones. This is useful in creating thin films on various surfaces.
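
Here is a small sketch of the scale proposed a few posts up, with arbitrary sample temperatures (H is the poster's minus-one-times-the-reciprocal coordinate, i.e. minus the thermodynamic beta with k_B = 1):

    def hotness(t_kelvin):
        return -1.0 / t_kelvin  # H = -1/T

    samples = [0.001, 300.0, 1e6, -1e6, -300.0, -0.001]  # kelvin
    for t in sorted(samples, key=hotness):
        print(f"T = {t:>10} K   H = {hotness(t):+.3e}")
    # Sorted by H, the list runs from coldest to hottest; heat always
    # flows from the system with the higher H to the one with the lower H.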

Thanks for pointing out my mistake. I was typing too fast. The full correct equation defining thermodynamic temperature is: 1/T = (dS/dU)v. Namely, the reciprocal of temperature (unit: per kelvin, 1/K) is equal to the ratio of the change in entropy (S) to the change in internal energy (U), under constant volume (v). Therefore, T = (dU/dS)v; (absolute) thermodynamic temperature is equal to the ratio of the change in internal energy (U) to the change in entropy (S), under constant volume (v).

We do have the hertz, which is defined as 1/second, so it's not without precedent. There currently aren't any SI units named after me, so the coast is clear

In the context of the article, I think ∂(entropy)/∂(energy) makes more sense... only problem is that annoying sign reversal. Even annoying sign reversals have precedent, such as price elasticity of demand.

Theoretical physicists tend to use thermodynamic Beta (= 1/T) as the fundamental parameter rather than Temperature in order to avoid the annoying sign reversal and the jump discontinuity between maximum positive temperature and minimum negative temperature. Using absolute temperature, the theoretical 'hotness' scale goes from coldest to hottest like this: 0 K, 300 K, 3000 K, ..., +∞ K, −∞ K, ..., −3000 K, −300 K, ..., −0 K. The startling implication here is that, if these data are confirmed, the physicists were able to take one of the coldest substances in the universe, a Bose-Einstein Condensate of 10,000 potassium atoms, and make it hotter than anything in the universe at a positive temperature (so hot, so to speak, that it had a negative absolute temperature) and keep it stable for 600 ms.

On the other hand, this is why many argue such a scale has little, if any, real meaning. Consider the following recent article: http://arxiv.org/abs/1301.0852

Thanks, that's what I was looking for. Beta does seem to make a lot more sense in this situation.

Doesn't that just mean the thermodynamic definition of temperature is wrong or worded incorrectly?

How did they add the energy?

Using a laser would create heat, because it would touch the atoms and heat them up.

Perhaps it's the whisky, but I'm confused. Not enough info was provided in the article.

Looks like somebody caught a bad definition and used it to get in the news and gain some useless fame. In other words, wasted time that could have been used to do real science.

I think that would justify the firing of those "scientists".

Or is this another case of a news site misinterpreting the results of a science experiment?

On the whole, yes. Local decreases in entropy are nothing new, though. In fact, that's the only way things get done. When you air condition your house, the temperature inside goes down while the outside of your house gets hotter. Doing any kind of work results in an increase in entropy somewhere along the line. And I guarantee you that it took a lot of doing on the outside to make the entropy of this small system of potassium atoms decrease. The interesting thing here isn't that entropy went down, it's that the scientists created a "negative temperature."

Thank you for that. I always hate these approximations that do not include the full picture. In my electrical engineering classes I always had to create an equation for the full problem; just solving it piecewise was never enough for me. I guess that's why I stuck to software...