Yes, Intel did demo a solar cell powering a Pentium, but that was merely to make a point about the extreme power efficiency of near-threshold voltage (NTV) CPUs. They have no particular focus on solar-powered processors.

Near-threshold voltage (NTV) CPUs are the focus of Intel's research here. NTV transistors can switch at voltages just above the threshold for the device's powered state, and CPUs made of them can idle along at extremely low voltage doing real work (slowly), or they can ramp up the power and work much faster.

The idea is to have devices run at low voltages and power consumption rates that would be akin to a sleep mode in today's chips. And NTV techniques are not limited to processors used in hand-held devices like smartphones and tablets; they apply to everything all the way up to exascale supercomputers, says Rattner. The important thing is that NTV techniques allow a chip's performance and power to scale as voltage scales up and down, and to do so across a wide dynamic range.

Marketing spin aside, the "near-threshold voltage" chip is quite an achievement. Intel first revealed in March 2010 that it had a prototype chip running at such low voltages, but Claremont's creators took that technology and baked it into a full IA architecture processor. Based on a Pentium core, Claremont can not only be throttled down to "within a couple of hundred millivolts of the threshold voltage of the transistors," said Intel engineer Sriram Vangal, who demoed the chip during Rattner's turn, but – equally important – it also has a high dynamic range that allows it to be cranked up to deliver ten times the low-power performance by increasing the voltage.

Once again, the Register does a better job of reporting than Techworld.

Power is proportional to switching frequency and to the square of the supply voltage. Reducing the supply voltage is the main vehicle to reduce power consumption, but with standard CMOS you run into the problem that transistors leak a little current when they're run at or near the threshold voltage, because they don't turn off completely (you need significantly more than the threshold voltage for that). So in a totem-pole circuit (used in standard CMOS), current leaks straight from Vcc to ground - not good. They must have designed some tricky circuits that avoid this current path even though the transistors are still conducting a little.

Of course, the real reason behind this is that even standard CMOS designs suffer from leakage -- the smaller the process, the more the leakage -- so they can apply these techniques to standard designs as well. That will probably be a necessity at some point beyond 22nm.
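The frequency-times-voltage-squared relationship above is easy to sketch numerically. The capacitance and activity-factor figures below are made-up illustrative values, not measurements of any real chip:

```python
# Rough sketch of dynamic CMOS switching power: P = alpha * C_eff * V^2 * f.
# c_eff (effective switched capacitance) and activity are illustrative values.

def dynamic_power(v_supply, freq_hz, c_eff=1e-9, activity=0.2):
    """Switching power in watts: activity factor * capacitance * V^2 * f."""
    return activity * c_eff * v_supply**2 * freq_hz

# Nominal operation: 1.2 V at 1 GHz.
nominal = dynamic_power(1.2, 1e9)

# Near-threshold operation: drop to 0.45 V; frequency must fall too, say to 3 MHz.
ntv = dynamic_power(0.45, 3e6)

print(nominal / ntv)  # switching power falls by a factor of ~2370
```

The quadratic voltage term is why dropping supply voltage beats dropping frequency alone: halving V alone quarters the switching power before the frequency reduction is even counted.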

Reducing the supply voltage is the main vehicle to reduce power consumption, but with standard CMOS you run into the problem that transistors leak a little current when they're run at or near the threshold voltage, because they don't turn off completely (you need significantly more than the threshold voltage for that).

Of course they do; CMOS transistors are analog circuit components. Yes, usually they're driven into a state where their non-linearity makes them behave almost like binary components, but they're very much not that. The closer you drive them to the limit, whether through raising the speed or through lowering the voltage, the more they behave like the analog devices they truly are.

So it's essentially a throttle? You can use however little power you have time for? So my netbook can render a big Blender animation on a single battery charge, I'd just have to wait for a few weeks? Sounds very useful indeed.

Essentially a throttle, but more likely a demand-based system, such that non-busy processors can run at the lowest possible speed and voltage, and when work stacks up, it ramps up.

Great for the smartphone in your pocket, which has nothing to do for hours at a time other than check email and listen for calls. Since its screen is off, you really don't care how fast it does those things, as long as they are just barely fast enough.

There is a great deal of "stare time" that happens when people look at computers, and the processors are spinning away all the time while you are reading this. They could just as well drop to an extremely low power state, and wait for a mouse move, finger tap, or something else.

This much we've been doing all along, for the last 20 years. But power consumption still remained high, because even simple tasks like checking the clock to see if it's time to increment that digital time readout took processing power, and historically any use of the processor kept it awake at something like full power for that task.

Now, those tasks can be performed at extremely low power, without ramping up the speed. Only when the processor can't meet the demand would the system increase the voltage and speed up the chip.

Another method which (I assume) addresses the problem is running a more full-featured BIOS [betanews.com] that could operate "basic" applications like web browsers and Skype without having to load a full-fledged OS.

You have essentially instant-on access to the most basic popular applications, and then boot to a real operating system when you have to edit that film or play that game.

Toshibas (and probably others) had the ability to play audio CDs from within the BIOS way back in 2003 when I was fixing laptops fo

So many people are worried about how technological advances are ruining the environment. What many often forget is that technology is also the answer (unless you want to go back to a hunter-gatherer lifestyle, and I hear that the drum/smoke-signal bandwidth really sucks; it takes forever to download the latest movie).

We're in a race - computational speed, new materials, new efficiencies versus the rate at which we're polluting the environment. Many things make me optimistic: photovoltaic paints, for one - and now processing power so efficient that it can be solar powered. Wow. We may win this race after all.

So many people are worried about how technological advances are ruining the environment. What many often forget is that technology is also the answer (unless you want to go back to a hunter-gatherer lifestyle, and I hear that the drum/smoke-signal bandwidth really sucks; it takes forever to download the latest movie).

We're in a race - computational speed, new materials, new efficiencies versus the rate at which we're polluting the environment. Many things make me optimistic: photovoltaic paints, for one - and now processing power so efficient that it can be solar powered. Wow. We may win this race after all.

You insensitive clod! Smoke signals release carbon into the atmosphere!

Regarding your second statement about noise pollution: I believe AC/DC found an alternative to this issue. They established back in the day that Rock and Roll Ain't Noise Pollution and is sustainable indefinitely (ain't gonna die). :)

When I consider that the human brain is many orders of magnitude more powerful than any electronic computer, and uses only a few hundred calories a day, it makes me realize that our electronic computers have a huge potential for improvement in both energy efficiency and power.

Oh, how many 7-digit-mantissa, 2-digit-exponent floating point operations a second can your brain do? 0.01? The brain isn't a digital computer; rather, it's some kind of funky, kludgy signal-processing system. It's not a question of less or more power, but a different kind of power.

Saying the human brain is "more powerful" makes no sense by itself. It's better at certain tasks (like pattern recognition, jumping to conclusions and holding contradictory beliefs) because it's hard-wired to do them. When it has to use general-purpose computing (like when you try to do floating-point math), you'll find most computers a great deal faster and more efficient.

When it has to use general-purpose computing (like when you try to do floating-point math), you'll find most computers a great deal faster and more efficient.

True, there are "sweet spots" such as this where computers have an advantage over humans. However, as the math gets more advanced, computers rapidly start losing steam. Humans can prove advanced theorems such as Fermat's Last Theorem that computers can't even begin to touch, even with state-of-the-art automated theorem provers.

The basic rule is that neural networks can solve problems without knowing precisely *how*, and digital computers can do anything if you know exactly how. See the difference? You can't compare brains and computers. They are good at diametrically opposed things and always will be. That's the law (of physics and computation).

About 100 teraflops, according to what I could dig up on Google. By comparison, the highest-end single GPUs can do about 2.5 teraflops (and at raw computation they destroy general-purpose CPUs), and those generally consume a few hundred watts. Obviously supercomputers are a lot faster, but per unit of input energy, the human brain is much faster than a computer. Our minds just aren't designed to handle numerical calculations, but they certainly could outperform a computer.

Granted, someone whose brain was wired to do that would probably be completely non-functional, since we need so much power for our other activities, but it is certainly possible.
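Taking the figures in this thread at face value - 100 teraflops for the brain, about 2.5 teraflops for a high-end single GPU at a few hundred watts - the efficiency gap is easy to work out. These are rough forum estimates, not measurements:

```python
# Back-of-envelope efficiency comparison using the rough figures above.
brain_flops, brain_watts = 100e12, 20   # ~20 W is a commonly cited brain estimate
gpu_flops, gpu_watts = 2.5e12, 300      # high-end single GPU of the era

brain_eff = brain_flops / brain_watts   # "operations" per watt
gpu_eff = gpu_flops / gpu_watts

print(brain_eff / gpu_eff)  # brain comes out ~600x more efficient on these numbers
```

The conclusion is only as good as the 100-teraflop guess, of course; estimates of the brain's "equivalent compute" vary by orders of magnitude.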

Indeed you are right on this point. I once saw a documentary about savants [wikipedia.org], and this is one of the ideas you come away with. These people master skills like no one else could imagine. The most famous savant, Kim Peek [wikipedia.org] (the inspiration for Dustin Hoffman's character in Rain Man), had memorized thousands of books. Others played instruments flawlessly even after hearing a new piece just once. Another could draw Manhattan in impressive detail after seeing only one photo. The list goes on.
The inabilities they had were qui

When it has to use general-purpose computing (like when you try to do floating-point math), you'll find most computers a great deal faster and more efficient.

Is that true? I thought that the human brain was very good at all sorts of calculation, but that this was hidden from consciousness. The computational power required to walk and chew gum at the same time is impressively high, no?

While it's accurate that HYBRID processing systems are certainly a bright spot in the future, it's amazing to me how many people totally fail to realize that neurons are analog and computers are digital. They solve problems in completely different ways and domains, and there are tasks suited to both, but rarely at the same time. For instance, as mentioned in a sibling post, brains ain't gonna have FLOPS. More like FLOMS - bad jokes aside, digital computers are not going to be able to identify orthogonal pat

1) The brain is more parallel and fuzzy than traditional CPUs, but "more powerful" is getting really blurry with today's machines.

2) Remember that each "Calorie" is 1000 calories. You're going through at least hundreds of thousands, more likely millions, of calories of energy per day. All estimates of average human energy usage I've seen tend to be in the range of 200-300W. Though that's not just the brain; one can assume that a reasonable percentage of that is spent on it, and even the majority when sit
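For reference, the Calorie-to-watts arithmetic works out like this (2000 kcal/day is a typical dietary intake, and the 20% brain share is a commonly cited rough estimate, not a measurement):

```python
# Converting food-energy intake to average power draw.
KCAL_TO_J = 4184          # one dietary Calorie (kcal) in joules
SECONDS_PER_DAY = 86400

def kcal_per_day_to_watts(kcal):
    """Average power in watts for a given daily energy intake in kcal."""
    return kcal * KCAL_TO_J / SECONDS_PER_DAY

print(kcal_per_day_to_watts(2000))        # ~97 W whole-body average
print(0.2 * kcal_per_day_to_watts(2000))  # ~19 W if the brain takes ~20%
```

Note that a 2000 kcal/day diet averages out closer to 100 W than 200-300 W; the higher figures would correspond to sustained heavy activity, not a resting average.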

Sure, technology helps - why not use all the available tools? But executing business as usual with the expectation that technology will save you is not responsible. You don't have to increase pollution to accommodate new discoveries, so why make it a race between technology and death? Einstein didn't have a computer, and he was no hunter-gatherer.

No, Einstein didn't have a computer. Technology amplifies your ability to do a task. It allows us to find research information faster, and to disseminate important information FASTER. This is important. We're not only exponentially increasing the amount of information but also getting it to people who can then use it to create something new.
I personally don't think we are in *danger* of losing this race (except for wars over resources ballooning out of control when it combines with religious fer

I'm not sure what transistor geometry Claremont is manufactured at, but for really small transistors (e.g. 32nm), process variation is a serious problem, making it hard to scale voltage down that low. The results are unpredictable performance from die to die and within a die, plus major reliability problems. Static RAMs are hit the hardest, because they use the smallest transistors. "http://www.cse.ohio-state.edu/~millerti/parichute-camera.pdf" is an example of a paper that explores the consequences of ultra-low-voltage SRAMs and tries to solve them with forward error correction.
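For a flavor of what forward error correction over marginal SRAM cells looks like, here is a minimal single-error-correcting Hamming(7,4) sketch - a textbook code chosen purely for illustration, not the scheme the linked paper actually uses:

```python
# Minimal Hamming(7,4) code: 4 data bits + 3 parity bits, corrects any 1-bit flip.
# Illustrative only; real low-voltage SRAM schemes use stronger, wider codes.

def encode(d):
    """Encode 4 data bits [d1,d2,d3,d4] into a 7-bit codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4           # covers codeword positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4           # covers positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4           # covers positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based index of the bad bit, 0 if clean
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
stored = encode(word)
stored[4] ^= 1                  # simulate one bit flipped by a marginal SRAM cell
assert decode(stored) == word   # the flip is located and corrected
```

The trade-off is the same one the paper wrestles with: the parity bits cost area and energy, which eats into the savings from running the array at a lower voltage.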

That is not hard with a Via C3. The board I have has VGA and all the other connectors onboard (it looks like an ATX motherboard) but ran off of 5 watts at 5 volts. All from outdated 5-year-old tech from little old VIA.

1.2V @ 1GHz is not power efficient at speed. Existing Core designs are running much faster at lower voltages. Based upon what they've demonstrated so far, it's useful for devices that need moderate speed on an occasional basis, but spend the majority of their time at idle.

Now, if they can scale it up to 2-3GHz at around 1V and idle at less than 0.5V at a reduced freq, then it'll be something worth looking at for common applications.

True, but P=VI and I=V/R, therefore P=V^2/R. R is unlikely to differ by an order of magnitude, so voltage is a surrogate for power efficiency. The fact that existing chips run significantly faster at significantly lower voltages is an indication that these are not currently as efficient as they could be when under load.

No - did you miss the part where I said these would only be useful where they're running at idle most of the time? That's an idle speed, and yes, it's extremely low voltage, and presumably ultra-low power. But 3MHz isn't fast enough to do much work these days, so it'll have to ramp up the speed and voltage to do any useful work.

You mean like how your phone sits idle waiting for network activity or background tasks to run? Or when you're reading the content of that document? Or between the key presses when you're writing the document? Intel SpeedStep is already very quick at ramping up voltages and frequency.

I don't know about using my computer outside - especially in the summer when it's very hot, and in the winter when it's very cold. I might be able to manage spring and fall, but not on windy days, as my papers would fly about.

Android is not Linux; Android is Linux plus a whole lot of crud that's waking up too often. Blame userspace, not Linux. Nokia's Linux devices have far better battery life (over 2 weeks on my N9). And in a desktop/laptop context, you also have to remember that MS has NDAs with the hardware manufacturers and BIOS writers regarding power control, which prevents Linux from being as aggressive. Linux hackers are trying to reverse engineer these interfaces, clearly, but progress is slow. Have you run powerto