US boffins say that the tendency of laptops to run hotter as performance increases is out of control - and notebooks will be as hot as the surface of the sun by 2030.
The American brainboxes reckon they can solve the problem by using nanotech demons to violate the Second Law of Thermodynamics.
"Laptops are very hot now, so hot that …

Thermodynamics

Surely the heat output of a processor is a function of the power input? In order to be as hot as the sun, either the battery life will be minuscule, or the power supply will be huge.

Have they done a straight-line extrapolation from historic and existing temperatures? That sort of reasoning seems a little flawed - a bit like saying that because cars are safer now than they were 20 years ago, by 2030 it will be possible to drive a car into a wall at full speed without any injury.

That's fine by me

Gaah!

Two comments. Firstly, saying laptops will be as hot as the surface of the sun by 2030 is like the Victorian predictions that London would be 12 feet deep in horse manure in the twentieth century - it's a dumb extrapolation. Secondly, I thought that Feynman showed that a mechanism based on the concept of Maxwell's Demon wouldn't work?

If your laptop's running that hot

by 2030 similes will be as dumb as a plank

Really, who comes up with these notions?

Don't they ever stop to think that maybe, just maybe, there'll be a limiting factor that will kick in before their conclusions are reached? Or do they just want to manipulate people with an eye-catching headline (hint: you succeeded, but I have no respect for your ability to draw conclusions, and therefore for any of the "facts" you state)? Worse, do they think that the general readership is stupid enough to believe this stuff?

What Nick Said

The bleeding obvious

Not sure about nanotechnological demons, although it's bound to give rise to a heap of user-reported errors to go along with "byte rot" and "drive worms": "nano gremlins". The easiest thing to do is to use superconducting materials that don't generate heat. We'll probably all be fine using our laptots (sic) in the LHC at CERN! In the meantime, replacing DRAM would go a long way towards reducing power consumption, as would dropping x86 compatibility.

Why not save the world?

Why tinker with laptops, when we can give the whole of mankind cheap energy and greatly reduce global warming. Maybe these folks have finally managed to persuade world legislators to repeal the second law of thermodynamics. Law of gravity next to fall?

Hot as the sun?

Extrapolating to the ridiculous

While wattage for microprocessors has indeed increased dramatically since the 80s, mostly due to higher and higher clock frequencies, the tendency today is to reduce clock speed and get performance through parallelism. This works because wattage increases roughly as the square of the clock frequency (all else being equal), while it increases only linearly with added processor cores. Granted, wattage may move from the cores to the interconnection network, but it is possible to limit this by not having full interconnection, only neighbour-to-neighbour links and a few longer ones.
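The commenter's rule of thumb can be put in numbers. This is a minimal Python sketch of that scaling model only - the quadratic-vs-linear assumption and the 10 W base figure are illustrative, not real chip data:

```python
# Rule-of-thumb power model from the comment above:
# power grows ~quadratically with clock frequency (all else equal),
# but only linearly with the number of cores.
def power_freq_scaled(base_watts, freq_ratio):
    """Power for one core clocked up by freq_ratio (quadratic model)."""
    return base_watts * freq_ratio ** 2

def power_core_scaled(base_watts, n_cores):
    """Power when the work is spread over n_cores at the base clock."""
    return base_watts * n_cores

base = 10.0  # watts for one core at base clock (made-up figure)
print(power_freq_scaled(base, 2))  # one core at 2x clock: 40.0 W
print(power_core_scaled(base, 2))  # two cores at 1x clock: 20.0 W
```

Under this model, doubling the core count buys the same nominal throughput as doubling the clock for half the power - which is exactly why the industry went multi-core.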

And, as Nick stated, you can't get ridiculous temperatures without ridiculous battery drain, so this definitely won't happen with laptops. But one day your water heater may be your stationary home computer. :-)

Next: Schroedinger's cat harnessed to bring dead kittens back!

I just hope this is ultra-bad reporting, otherwise these "brainboxes" deserve to be handed their Ig Nobel prizes once they leave the auditorium.

There is a reason why plants don't grow under red light, ya know.

There is a reason that heat cannot be converted back into a usable energy differential, and Maxwell's Demon is not going to help, any more than Turing's "Oracle" concept will help you build more powerful computers (yeah, New Scientist, that was for you).

But what has that to do with reducing heat production? In principle you can make your computer so efficient that the only heat generated is that made when erasing bits (e.g. dumping a capacitor's electrons to ground). See reversible computing for an idea along those lines.
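The "heat only when erasing bits" point is Landauer's principle: erasing one bit must dissipate at least kT·ln(2) joules. A quick Python sanity check of that floor (the 10^15 bits/s rate is just an illustrative figure, far beyond any real erasure-limited machine):

```python
import math

# Landauer's principle: erasing one bit dissipates at least k_B * T * ln(2).
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

energy_per_bit = k_B * T * math.log(2)
print(energy_per_bit)  # ~2.87e-21 J per erased bit

# Even erasing 10^15 bits every second at this limit is only microwatts:
print(energy_per_bit * 1e15)  # ~2.9e-6 W
```

Real chips dissipate many orders of magnitude more than this floor, which is why the commenter's "in principle" is doing a lot of work.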

Forget the LHC

First it was laptop batteries exploding - soon it will be "My laptop was left on overnight and has burned its way down to the Earth's core!!" - is this the next threat to our existence once we have forgotten about mini black holes?

Knees?

Balderdash

If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations — then so much the worse for Maxwell's equations. If it is found to be contradicted by observation — well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.

Maxwell's Demon

I wish I could have heard this at first hand. Maxwell's demon is a thought experiment, and from what I was taught many years ago, it provides no loophole for breaking the second law of thermodynamics (which states that the entropy of a closed system must increase). That's because the very decision-making process on whether to open the trapdoor (to let only the faster gas molecules through to the second chamber) does, itself, generate more entropy than was lost by separating the fast (that is, hotter) gas molecules from the slower (cooler) ones.

So on that basis, it doesn't matter what technology you use to implement an analogue of Maxwell's demon - it just makes things worse. At heart, thermodynamics comes down to statistical probabilities, mathematics and information theory. Loss of organisation is the very definition of an increase in entropy, the physical consequence of which is that essentially everything ends up at the same temperature and the ability to do useful work disappears.

Unless these two guys have some Nobel-worthy theoretical work on their side, I rather doubt either them or, more likely, the way this has been reported. Incidentally, chlorophyll and photosynthesis do not break any known laws of physics. They take in a very low-entropy energy source (in the form of photons) and convert it to rather higher-entropy forms, at pretty low efficiency - burn a tree and you get nothing like the energy back that was put into it in the form of absorbed sunlight; photovoltaic cells are much more efficient, save for the little problem that they can't reproduce themselves. The fact that humans can't yet reproduce some aspects of the chemistry of photosynthesis does not mean it violates the laws of physics, any more than our inability to produce sustainable power from nuclear fusion on Earth is evidence that the Sun breaks the laws of physics.

NB: on a side issue, Flanders & Swann have a wonderful song called "First and Second Law" which puts the principles involved to music (and even includes the heat death of the universe, albeit under a different name).

Six months out!

Bad reporting

This is not New Scientist. There may very well have been some bad reporting - but if so, it took place in the Virginia U press office, not at the Reg. Live by the press release, die by your double-edged sword, academics ...

Too bad they won't be able to patent.

The US patent office, on principle, does not examine patents for devices that violate the second law of thermodynamics, as such devices imply a perpetual motion machine.

Now there may be opportunities at the logic-gate level. Entropy and information go hand in hand. Every AND, OR, etc. gate takes two inputs and provides only one output - thus losing information. This prodigious loss of information adds to the entropy of the universe, and entropy equals heat (more or less). Finding a way to recycle discarded logic bits could lead to inherently cooler chips. I leave this as an exercise for the class.
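One standard answer to that exercise is reversible logic: a Toffoli gate keeps all of its inputs recoverable, so no bit - and in principle no Landauer heat - is discarded the way an ordinary AND discards one. A minimal Python sketch of the idea (the function name and bit-tuple encoding are just this sketch's conventions):

```python
# A Toffoli (controlled-controlled-NOT) gate: flips c iff a and b are both 1.
# With c = 0 the third output equals a AND b, but unlike a plain AND gate,
# the inputs a and b survive, so the mapping loses no information.
def toffoli(a, b, c):
    return a, b, c ^ (a & b)

# Reversibility check: the gate is its own inverse on every input.
for bits in [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0), (1, 1, 1)]:
    assert toffoli(*toffoli(*bits)) == bits

print(toffoli(1, 1, 0))  # (1, 1, 1): third output is 1 AND 1
```

Because the map is a bijection on the eight input states, running the computation backwards can "uncompute" intermediate bits instead of erasing them - the core trick of reversible computing.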

@Lewis Page

Looks like they are skewered on this one. It's also in Science Daily. I'm reminded of the late Professor Eric Laithwaite, who headed up the department of heavy electrical engineering at my old university. In the physics department he was viewed either as an annoyance or as a clown once he started straying into physics with his confounded theories on gyroscopes. It was great as a parlour act on Parkinson, but not much beyond that.

Anyway, any time somebody talks about beating the Second Law of Thermodynamics, smell a rat: beating it leads directly to a perpetual motion machine.

@Tony W

"""Law of gravity next to fall?"""

Didn't you read that goldenballs story a week or two ago? DARPA is trying to do levitation with quantum fringe magic or something.

I think the real problem here is not hot laptops, but consumers who demand a quad-core RAID 0 setup with an 18-inch widescreen. For some reason people have got it into their heads that a laptop can replace a desktop without drawbacks (except maybe price, but people will spend money on /anything/).

Oh and the researchers mentioned in this document seem dead set on wasting a hell of a lot of time in the near future. Maybe it's just their excuse for only publishing the occasional contentless paper while still sitting on the university payroll.

Interesting concept!

@ Destroy All Monsters

I had to laugh at the subject line because that was almost exactly what I thought as I read the article.

All in all, what these scientists are proposing is a nice theory, but they are making some rather large assumptions - not to mention attempting to apply Maxwell's Demon in a way that they will "discover" isn't going to work out. I say "discover" because I suspect that at the root of all this is a good marketing play on their part to score a large sum of grant money.

Maxwell's Demon was beaten ages ago

I'm guessing these guys don't read Make magazine. It published an article showing how to conduct the Maxwell's Demon experiment in real life and get an apparently impossible result. Maxwell's Demon supposedly lets you create a temperature differential from a constant-temperature source by using a friendly demon. However, by using a cyclonic concentric air stream you can produce hot and cold air from a room-temperature feed.

Maxwell's Demons

The demons or Imps have been discussed on a number of occasions not least in science fiction.

I do not remember who the 1960s writer was, but he provided a highly entertaining piece which imagined several different types of imp: one passing fast electrons for the generation of power, one harnessing Brownian motion for heat, others selecting atoms for refining minerals. The imps were also self-propagating.

Of course there was a downside: when the thermal imps got loose they started fires everywhere. So, when this technology is sorted, just be careful where you put your laptop, because your bum might catch fire.

The opposite will happen

In the longer term, overall performance per watt will keep climbing, but the need for maximum possible performance will diminish. There aren't likely to be any truly revolutionary new battery technologies, so what we are left with is the other problem laptops have: their short runtime.

Laptop heat levels will be reduced because people will want longer runtime more than a performance level that allows doing exotic things with their laptop. They'll want other factors that coincide with less heat and more runtime: smaller, lighter, cheaper - a basic terminal to interface with the cloud is what everyone will be using instead of today's crude mobile version of a desktop PC.