What Then?

[This article was originally published in Enterprise Conversation, a UBM/DeusM publication.]

As 2012 passes into the bit bucket of history, and the world
once again fails to end on cue, a new litany of unfulfilled dire consequences
is emerging. Back in 2006, the trend
toward data center consolidation was just taking off, and what we now call
“cloud computing” was first catching on.
At that time, the U.S. Environmental Protection Agency estimated that
within five years, the rapid growth in power demand by data centers would lead
to a doubling of the number of kilowatt-hours (kWh) that power stations
produced to serve them.

The End

A few years ago, I compared the EPA’s numbers to the Dept.
of Energy’s electricity delivery estimates, which are based on the rate of
population growth reported by the U.S. Census. I concluded,
and later independent estimates confirmed, that if the EPA’s estimates were
accurate, the supplemental power draw required by new data centers
would be greater than the power draw required by new people (new members of
the U.S. population) in 2011 and beyond.
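Schematically, the comparison runs like the sketch below. Both sides use hypothetical placeholder numbers, not the EPA’s or DOE’s actual figures; only the structure of the argument matters here.

```python
# The comparison, in schematic form. All inputs are hypothetical
# placeholders, not the EPA or DOE figures.

per_capita_kwh = 12_000          # hypothetical: annual U.S. consumption per person
new_people_per_year = 2_500_000  # hypothetical: annual population growth

demand_from_new_people = per_capita_kwh * new_people_per_year
demand_from_new_datacenters = 40e9   # hypothetical: added annual demand (kWh)

print(f"New people:       {demand_from_new_people / 1e9:.0f} billion kWh")
print(f"New data centers: {demand_from_new_datacenters / 1e9:.0f} billion kWh")
# The claim was that, from 2011 onward, the second line exceeds the first.
```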

The cost of this supplemental power draw was left, as with
so many other topics too unpleasant to think about, as an
exercise for the reader. Since I was up
for the exercise, I calculated the annual budget cost of U.S. government
contributions to the construction of new power stations just to satisfy this
new demand. I came up with $21 billion
per year, which was 56 percent greater than the entire annual discretionary
resources budget of the Dept. of Transportation.
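For illustration, the shape of that back-of-the-envelope calculation looks something like this. Every input below is a hypothetical placeholder, not a figure from the original estimate; the point is the structure: convert added annual demand into generating capacity, price that capacity, and take the government’s share.

```python
# A rough sketch of the calculation's shape. All inputs here are
# hypothetical placeholders, not the actual figures behind the
# $21 billion estimate.

new_demand_kwh = 60e9          # hypothetical: added annual data center demand
hours_per_year = 8760
capacity_factor = 0.5          # hypothetical: fraction of the year new plants run

# Annual energy demand implies a certain amount of generating capacity
required_capacity_kw = new_demand_kwh / (hours_per_year * capacity_factor)

cost_per_kw = 2000.0           # hypothetical: construction cost per kW of capacity
government_share = 0.35        # hypothetical: share borne by the federal budget

annual_budget_cost = required_capacity_kw * cost_per_kw * government_share
print(f"Hypothetical annual budget cost: ${annual_budget_cost / 1e9:.1f} billion")
```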

A New Beginning

In the intervening years, independent analyses
of the efficiency benefits gained from cloud computing told absolutely the
opposite story: not catastrophe, but rather paradise. Citing the very same Environmental Protection
Agency, consulting firms produced evidence of tremendous carbon emission
reduction on the part of businesses that had shifted their application
workloads to Microsoft cloud-hosted SaaS.
A 2010 Accenture study (PDF
available here) revealed that carbon dioxide emissions among users of
Microsoft Exchange were reduced by 52 percent per user among large enterprises,
and by over 90 percent among SMBs, simply by switching from on-premises
deployments to Hosted Exchange.

The conclusion Accenture drew: lower carbon emissions come from more
efficient use of power, which is to say, from using less of it. “Generally speaking, the comparatively
smaller carbon footprint of cloud computing is a consequence of both improved
infrastructure efficiency,” the report read, “and a reduced need for IT
infrastructure to support a given user base,” before going on to directly link
data center efficiency to power usage effectiveness.
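Power usage effectiveness, or PUE, is the standard metric in play here: the ratio of everything a facility draws to what its IT equipment alone consumes. A minimal sketch, with illustrative numbers only:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by the
    energy consumed by IT equipment alone. An ideal facility scores 1.0;
    everything above that is overhead such as cooling and power distribution."""
    return total_facility_kwh / it_equipment_kwh

# Illustrative numbers only: a facility that draws 1.8 kWh overall for
# every 1.0 kWh of useful compute has a PUE of 1.8, i.e. 0.8 kWh of
# overhead per unit of IT work.
print(pue(1.8, 1.0))  # 1.8
```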

The following year, in its
contribution to the Carbon Disclosure Project, Verdantix used the EPA’s carbon
reduction numbers to project energy savings among some 2,653 U.S. firms whose
annual revenues were $1 billion or above.
The firm projected the combined net total energy cost savings for these
firms in 2013 to be around $2.5 billion per annum, rising to about $12.3
billion per annum in 2020 (probably also accounting for rising energy prices),
simply for having moved to cloud services delivery models.

So in seven years’ time, conceivably, about half the annual
cost of producing new power to meet the growing demand from large data centers
would be offset, if these predictions hold true, by reduced power demand among
general businesses. The suggestion is
that, within our lifetimes, it could all come out in the wash.
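The arithmetic behind “about half,” using the figures above, is straightforward:

```python
projected_savings_2020 = 12.3e9  # Verdantix projection, per annum
new_generation_cost = 21e9       # my earlier estimate of annual new-power cost

offset = projected_savings_2020 / new_generation_cost
print(f"{offset:.0%} of the cost of new generation offset")  # about 59%
```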

A New End

Not exactly. As anyone
who builds his own PCs knows, the real power drain comes not from the
processor, but from the cooling system.
In fact, thanks so very much to the laws of physics at the subatomic
level, the smaller a processor is scaled, the more densely its heat is
concentrated, so even as it draws less power, more power may be required to
keep it cool, unless you innovate the way
it’s cooled (thus, the tablet PC).

A 2010 General Services
Administration report (PDF
available here) made the clear case that the cost of cooling data center
equipment was rising faster than not only the cost of operating that
equipment, but even of acquiring it in the first place. The year in which cooling the data center
would become more expensive than owning
a data center: 2012.

But it didn’t stop there.
Because of the way the nation’s energy is delivered, the report
went on, for every 33 units of energy that a data center actually uses, another
66 units are wasted getting it there from the generator. That calculation suggests that the
energy savings corporations gain from processing efficiencies are more than
offset (perhaps doubly) by the catapulting costs of our energy infrastructure.
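In other words, only about a third of what the generator produces ever arrives. Worked out from the GSA’s own ratio:

```python
# The GSA figures above: for every 33 units a data center uses,
# another 66 are lost between the generator and the rack.
units_used = 33.0
units_wasted = 66.0

delivery_efficiency = units_used / (units_used + units_wasted)
print(f"Delivery efficiency: {delivery_efficiency:.0%}")  # 33%

# Flip side: every kWh consumed at the data center obliges the
# infrastructure to generate roughly three, which is how delivery
# costs can double back on any efficiency savings at the rack.
generated_per_kwh_used = 1.0 / delivery_efficiency
print(f"Generated per kWh used: {generated_per_kwh_used:.1f} kWh")
```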

So did this catastrophic confluence of predicted events
actually happen? What is perhaps most
astonishing of all is that we don’t know;
it would appear that no one — not the GSA, EPA, DOE, DOT, PBS — has bothered to
check. Not being able to grasp the scope
of the problem in front of us may be becoming the defining characteristic of
America in the 21st century.

Scott Fulton On Point

First there was the wheel, and you have to admit, the wheel was cool. After that, you had the boat and the hamburger, and technology was chugging right along with that whole evolution thing. Then there was the Web, and you had to wonder, after the wheel and the hamburger, how did things make such a sudden left turn and get so messed up so quickly? Displaying all the symptoms of having spent 35 years in the technology news business, Scott Fulton (often known as Scott M. Fulton, III, formerly known as D. F. Scott, sometimes known as that loud guy in the corner making the hand gestures) has taken it upon himself to move evolution back to a more sensible track. Stay in touch and see how far he gets.

Scott M. Fulton, III, is the author of this blog, and all text contained therein is his own unless otherwise noted explicitly. Some content may have appeared in other publications first, before being reprinted here, and is reprinted according to publishing agreements. Scott Fulton is always responsible for his own content.