Did this guy read "The Singularity is Near" by Ray Kurzweil before he "discovered" this? I don't see anything in this article that wasn't already laid out seven years ago in that book. Even that chart looks like a ripoff of the ones in Kurzweil's book. Koomey had better have a conversation with Kurzweil before he goes running around calling this Koomey's law.

That's only a coincidence; if an Earth year were longer, the number would be smaller, unless you assume that the rate of technological progress is significantly affected by longer or shorter seasons or trips around the sun, which is doubtful.

Moore's 'law' had become a self-fulfilling prophecy: it spurred us on to make computers that were twice as fast about every two years, and trapped us in a mindset that forbade us from doing much better. RIP.

The one thing that the technology optimists are failing to take into account is the USEFULNESS of ever-increasing computing power. We are rapidly reaching a point of technological plateau. Why? Because our processors already work about as fast as most of us could ever need, and so even if Moore's law continues with respect to processor power, it will necessarily become less and less useful to have that additional power. This means that we will replace our products less and less often, thus limiting the demand for insane amounts of power.

This is sheer ignorance: more computing power per inch means electronic products can get smaller and smaller. Small size enables Google's Project Glass today, and nanobots in the future.
What we learn from the history of technological progress is that comments claiming we have reached a final end point become a joke a decade later.

Yeah, it sure will be great when my phone can do a trillion computations in a nanosecond, or when my television screen has a billion pixels.
Of course, I won't actually notice those improvements because of one inescapable limiting factor: I am human, and thus my perceptual/cognitive capabilities limit the usefulness of those improvements. Still think I will BUY that stuff?

Advances in computing power will eventually shrink devices so small that you can put them on your eyes like contact lenses. Now that's a product I'm going to buy.
The thing with current smartphones is that their battery life is not impressive at all, and their screens are very small.
Futurists even predict that nanomachines will become so small they can enter your bloodstream and enhance your perceptual/cognitive capabilities.
The possibilities are endless!

As a consumer, what you don't see is that consumer devices these days are just made to display information sent to them by remote servers. Huge datacenters and super computers are gathering vast amounts of data, and are boiling it down for your use. With the rapidly growing number of smart devices, these data centers need more and more power. With those datasets, data scientists can apply machine learning and AI techniques to optimize user experience, customise advertising, or do a bunch of other things. So you may think that these advances don't affect you, but they do.

I agree with your statement completely. But don't think that the groups who require lots of computer power are just names like NASA, Google, or Facebook. Even small startups require more processing power than consumer electronics offer. Until they become bigger, they usually rent servers (technically VPSs) from companies like Amazon, Google, or even Microsoft. I might be biased because I am a mobile and web developer :P

The R-squared value on the chart is impressive; I wish more charts quoted that value. But since technology does not go backward, it would be hard for R-squared to get really low (Q: what would it be for a quarter-circle arc, starting horizontal, ending vertical?)

Also, it would help (for this chart and others) if the web version started out with no trend-line, and a user click were needed to show it.
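As a quick numeric check of the quarter-circle question above (a sketch, assuming the arc runs from (0, 0) to (1, 1), parameterised as x = sin t, y = 1 − cos t — my choice, not from the article):

```python
import numpy as np

# Quarter-circle arc from (0, 0) to (1, 1): horizontal slope at the start,
# vertical at the end, matching the commenter's question.
t = np.linspace(0.0, np.pi / 2, 1000)
x = np.sin(t)
y = 1.0 - np.cos(t)

# Ordinary least-squares line fit, then R-squared.
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)
r2 = 1.0 - residuals.var() / y.var()
print(round(r2, 2))  # about 0.84
```

So even a monotone curve as far from a straight line as a quarter circle scores around 0.84 — high, but still well below the chart's 98.3%.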

The new focus on mobile computing will be very important for consumer sales. While businesses will always use desktop computers and laptops, households are moving their work and play to their phones and tablets. One note: as computing power increases, more energy is used. The gain and loss are not equal, but most new phones are more powerful yet have the same battery life. This article is very exciting for me as a consumer; I can't wait to see what technology we have in the next decade.

In statistics, we are taught to be suspicious of any regression result with a very high R-squared, and in this case R-squared is 98.3%.

Obviously the error is not in the data, nor did Dr Koomey make an error.

It's the implication of the observer effect, whether from quantum theory or from psychology: the very act of observing alters the position of the particle being observed. When we deal with human beings, they will simply fulfil the prophecies, since that is the optimal goal to achieve commercially. Tech companies work to fulfil the prophecy.

In actual fact, if the energy efficiency of computing continues to double every 1.6 years, why not just wait three years and take a great leap that would otherwise have taken twenty?

It is surprising that The Economist did not spot this, inasmuch as the newspaper has the motto of taking part in "a severe contest between intelligence, which presses forward, and an unworthy, timid ignorance obstructing our progress."
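For reference, the growth implied by a fixed doubling time can be sketched as below (a back-of-the-envelope check using the article's 1.6-year figure; the function name is mine):

```python
# Growth implied by a fixed doubling time (Koomey's law: ~1.6 years).
DOUBLING_TIME_YEARS = 1.6

def efficiency_factor(years: float) -> float:
    """Multiplicative gain in computations-per-kWh after `years`."""
    return 2.0 ** (years / DOUBLING_TIME_YEARS)

print(round(efficiency_factor(3), 2))  # ~3.67x after three years
print(efficiency_factor(16))           # 1024x after sixteen years (ten doublings)
```

Note that waiting three years buys roughly a 3.7x gain, not twenty years' worth — exponential growth compounds, but only at its own rate.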

Well, let's keep making computers! Seems like it'll be incredibly efficient in enough time. Mine even has a button these days that puts it into "Eco mode". An actual hardware button. Technology rocks. Even though I never use Eco mode, sorry!

This article shows how far we have come with computers in less than 100 years; what we have now would have been science fiction to the builders of the first computers around the 1950s. Also, we don't know how much of the science fiction we see on TV is going to happen in our lifetime. Koomey's law may have replaced Moore's law, but for how long? We are indeed in an era of great advances in technology and evolution, my friends. This is indeed the era when the present becomes part of the past faster than ever in human history.

The concept of computing efficiency, defined in this article as the number of computations per kWh, is not very useful without some reference to the speed of the calculations. I'm not sure, but one could probably design computing platforms that work some factor k more slowly while using more than k times less energy, driving the computing efficiency arbitrarily high while accomplishing very little in a given amount of time.
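The commenter's point can be illustrated with two hypothetical platforms (the numbers below are made up purely for illustration): a machine can score higher on computations-per-kWh even while its throughput collapses.

```python
# Hypothetical platforms: "efficiency" (computations per kWh) can rise
# even as throughput (computations per second) falls.
def metrics(ops_per_sec: float, watts: float) -> tuple[float, float]:
    kwh_per_sec = watts / 1000.0 / 3600.0           # watts -> kWh drawn per second
    return ops_per_sec, ops_per_sec / kwh_per_sec   # (throughput, efficiency)

fast = metrics(ops_per_sec=1e9, watts=100.0)  # fast but power-hungry
slow = metrics(ops_per_sec=1e7, watts=0.1)    # 100x slower, 1000x less power

assert slow[0] < fast[0]  # far lower throughput...
assert slow[1] > fast[1]  # ...yet higher "efficiency" by this metric
```

Any efficiency metric of this shape rewards trading speed for power faster than one-for-one, which is exactly the loophole the comment describes.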

Cellphone development can advance much faster than computer development, since people are more likely to replace their cellphones than their computers. So mobile phone companies can earn much more, which they can invest in their own research or in buying other companies' cutting-edge technology.

Density is NOT the fundamental factor, but rather a side effect of reduced component size and spacing: smaller components mean less power to switch; smaller spacing means less power for components to communicate with each other. Moore's law also depends on those sizes. Both depend on fabrication techniques and control of purity and doping; both also appear to have limits imposed by quantum mechanics.

This is more or less deducible from Moore's law, since component density has historically been almost directly proportional to energy consumption: in any circuit, the power lost is roughly proportional to its length. Now add the fact that making small efficient processors is easier than making large efficient ones, and you can easily see that a processor's power draw after a component shrink will fall somewhat (but not much) more than what a constant-area processor with the same shrink gains in computing power.

Power gating, variable clock frequencies and similar techniques allow modern processors to reach maximum computing power and use more components (and therefore longer circuits) without a similar increase in average power draw. As far as I know, however, these are fairly new techniques born of fairly new mindsets, with little impact on the researcher's data set.

If anything, all this chart shows is how little is required to get in the news. There's nothing new to see here. At most it might have an educational effect on non-technical people. Calling it a law is ludicrous. Want some invisible shoes with those fine new invisible clothes of yours, emperor?

As stated, Moore's law is more an observation, one that will eventually, given physical constraints, meet the obstacle of heat output unless there is some nice leap in technology we can't predict. I'd expect efficiency to hold out longer than Moore's law, because if we can do the same calculations with less heat, then we can pack more transistors into the same area without things melting and generally breaking down.

One of the most fascinating things is the architecture of Intel's new Sandy Bridge chips, which can be overclocked WITHOUT degrading power efficiency. Perhaps they have already found a way past the heat barrier?

You are missing something big here. There are different classes of computers. A high-end ARM CPU (like the ones I work on) can run at 1 GHz, has dual cores, and runs on about 1 W of power. This CPU is probably equivalent to the PC or server CPU of 5 years ago. And even that sort of high-end ARM is nothing compared to the really low-power CPUs out there. The stuff I work on is what is inside your tablet. The power/performance trade-off is decided mostly by need; we tablet makers lose much more power to the backlight than to the CPU.

If you want to talk about low power, plot a line of low-power devices: start with the first Acorn chip, go through the Palm Pilot, and end at the Apple iPad today. I think your line will look a bit different.