Mobile cloud sucks power grid harder than data centers

Wi-Fi and broadband cellular will produce more CO2 in 2015 than 4.9M new cars.

Think mobile devices are low-power? A study by the Centre for Energy-Efficient Telecommunications (CEET), a joint effort between Alcatel-Lucent's Bell Labs and the University of Melbourne in Australia, finds that wireless networking infrastructure worldwide accounts for 10 times more power consumption than data centers. In total, it is responsible for 90 percent of the power used by cloud infrastructure. And that consumption is growing fast.

The study was in part a rebuttal to a Greenpeace report that focused on the power consumption of data centers. "The energy consumption of wireless access dominates data center consumption by a significant margin," the authors of the CEET study wrote. The CEET researchers also found that wired networks and data-center-based applications could actually reduce overall computing energy consumption by allowing for less powerful client devices.

According to the CEET study, by 2015, wireless "cloud" infrastructure will consume as much as 43 terawatt-hours of electricity worldwide while generating 30 megatons of carbon dioxide. That's the equivalent of 4.9 million automobiles' worth of carbon emissions. This projected power consumption is a 460 percent increase from the 9.2 TWh consumed by wireless infrastructure in 2012.

The high end of the growth projected by CEET is based on a scenario where wireless networks reach two billion users worldwide by 2015, generating traffic of 4.3 exabytes a month. A more conservative estimate of 32 TWh used a "low-uptake" scenario for wireless broadband services, with just 1.6 billion users worldwide by 2015 and 2.2 exabytes of traffic. These figures included users who accessed the cloud through in-home and public Wi-Fi and femtocells, as well as through carriers' cellular infrastructure.

Mmm. My laptop sleeps when I'm not using it. I guess my router goes into low power (I hope) when not active. But I guess my wifi hardware (the wifi part of my home router, and separate wifi access points at work) is always powered up, always active 24/7, sending out signals, waiting for a mobile device to latch on.

I'm not a radios guy, but that's how it seems to me. I can see how it adds up.

the wifi part of my home router, and separate wifi access points at work) is always powered up, always active 24/7, sending out signals, waiting for a mobile device to latch on.

Your WiFi AP is up all the time, but that doesn't mean it's using anywhere near as much power as when it's actively transmitting. It'll send out a "beacon" every second or so, but that's about it. If nothing is talking to it, it doesn't need to send out much of a signal.

And what's more, radio technology has to deal with the near-far problem. If someone nearby is broadcasting at the same power levels as someone who is further away, the more distant user is completely drowned out. My point being, our modern wireless devices are VERY smart, and cut way back on broadcast power, whenever possible. Not because they care about your electric bills, but because it's fundamentally necessary to do so.

In short, they're really complaining about LTE (base station) power consumption. WiFi is quite energy efficient, more so than even the upstream hard-wired networks of your ISP.

You too can save the planet by offering free public WiFi access. With third-party firmware like DD-WRT on a supported device (like a $35 D-Link DIR-632), you can even create a private, encrypted network for yourself and a different, unencrypted network for the public. With QoS options, you could even throttle the public access as much as you want, to ensure that your own internet performance is always top-notch.

And also available is IEEE 802.11r, a standard that allows quick, seamless hand-off between WiFi APs as you move out of range of each one. In short, it's the kind of seamless service we're used to with cellular protocols, but with the higher speeds, lower latency, and vastly lower costs we're used to getting with WiFi. As WiFi continues to improve, public hotspot networks have the very real potential to replace telcos entirely, at far lower cost. Companies like Republic Wireless are already exploiting the availability of WiFi to make their unlimited $19/mo cellular service possible. T-Mobile also has WiFi calling, and both options work great in fringe areas, where a WiFi AP is cheaper than a cellular signal booster.

And what's more, radio technology has to deal with the near-far problem. If someone nearby is broadcasting at the same power levels as someone who is further away, the more distant user is completely drowned out. My point being, our modern wireless devices are VERY smart, and cut way back on broadcast power, whenever possible. Not because they care about your electric bills, but because it's fundamentally necessary to do so.

True, but power amplifier efficiency tends to tank when you back off an amplifier from its rated power output. In other words, it's a myth that you save a lot of power by transmitting at lower power levels. Even people who work in communications often aren't aware of this. On the other hand, you do save a ton of power by putting an amp in standby or turning it off completely when nothing needs to be transmitted.
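To illustrate with a toy model (an idealized class-B amplifier, which is my assumption here, not anything from the study): efficiency falls as you back off, so the DC draw shrinks much more slowly than the transmit power does.

```python
# Idealized class-B PA model (an assumption for illustration): efficiency scales
# with the square root of output power, peaking at pi/4 (~78.5%) at rated output.
import math

PEAK_EFFICIENCY = math.pi / 4

def dc_power(p_out, p_max):
    """DC power drawn to deliver p_out watts of RF from an ideal class-B PA rated at p_max."""
    efficiency = PEAK_EFFICIENCY * math.sqrt(p_out / p_max)
    return p_out / efficiency

p_max = 20.0  # hypothetical 20 W rated base-station amplifier
for backoff_db in (0, 3, 6, 10):
    p_out = p_max / (10 ** (backoff_db / 10))
    print(f"{backoff_db} dB backoff: RF out {p_out:5.2f} W, DC draw {dc_power(p_out, p_max):5.2f} W")
# At 10 dB backoff the RF output drops 10x, but the DC draw only drops about 3x.
```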

43 TWh versus the ballpark 140 PWh the world uses means it represents about 0.03% of our energy consumption. So even if by some miracle you could reduce the energy demands of these devices to zero, you would have made a negligible change in global energy usage.
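For anyone who wants to check that ratio, a quick back-of-the-envelope sketch (the 140 PWh world figure is the ballpark used above, not a number from the article):

```python
# Sanity check: projected wireless-cloud energy vs. ballpark world energy use.
wireless_cloud_twh = 43.0           # CEET high-end projection for 2015 (TWh per year)
world_energy_pwh = 140.0            # rough ballpark for total world energy use (PWh per year)
world_energy_twh = world_energy_pwh * 1000   # 1 PWh = 1,000 TWh

share = wireless_cloud_twh / world_energy_twh
print(f"{share:.2%}")               # prints 0.03%
```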

Quote (Phys.org): "This much everyone knows: As technologies break new ground in speed and performance, mischief-makers also break new ground in finding ways to disrupt. Now an academic research group has warned a U.S. government agency of their findings, which show that the LTE high-speed wireless data networks of today and tomorrow are vulnerable to a jamming technique that could destroy service across a city. They say it could take nothing more complex than a laptop and a $650 battery-operated radio unit aimed at portions of the LTE signal to knock out an LTE base station, affecting large numbers of city residents."

In other words, it's a myth that you save a lot of power by transmitting at lower power levels. Even people who work in communications often aren't aware of this. On the other hand, you do save a ton of power by putting an amp in standby or turning it off completely when nothing needs to be transmitted.

Hmm... Plenty of electronics performs like that for sure, but cell phones in particular are aggressively designed to minimize power consumption. Lots of work has gone into keeping the power consumption of cellular radios as low as possible. Certainly, anyone who owns a cell phone and has ever traveled into a fringe area can attest to how much more power it uses there; it's unmistakable as your cell phone gets quite warm in your hand.

I'm not saying 43 TWh is insignificant, but I wonder how much energy we're saving elsewhere by, for example, routing goods more efficiently, making fewer things that people don't need, saving trees, etc.

OK greenies! Time to cancel your data plans! You can't be talking out your a$$ about conservation if you have a vehicle, house, computer, or smartphone. Time to lose it all and get a tent, sleeping bag, and bicycle if you are really serious about saving the world...

I may be missing something but how, exactly, are data centers and cloud-computing infrastructure producing CO2 emissions?

Most likely the biggest factor is dirty energy generation. Construction also has a carbon cost.

Seems like a good reason to expand hydro and nuclear capacity to me.

Or, you know, wind, which is cheaper than nuclear and less environmentally destructive than hydro.

Wind is great, but it still costs a lot to maintain. And it only works when it is windy, but not too windy. And the amount of land needed to compete with one modern nuclear facility is staggering. Multiple sources of energy production are needed.

Maybe it's just me, but the line about increasing datacenters and using less powerful client devices doesn't make loads of sense. I suppose moving all processing to datacenters might decrease the amount of traffic to the client devices, but would that not increase the need for datacenters by a fair margin? Those datacenters need power, so what would the net saving on power be?

I'm also no fan of thin clients and putting more of my data on the cloud.

Mmm. My laptop sleeps when I'm not using it. I guess my router goes into low power (I hope) when not active. But I guess my wifi hardware (the wifi part of my home router, and separate wifi access points at work) is always powered up, always active 24/7, sending out signals, waiting for a mobile device to latch on.

I'm not a radios guy, but that's how it seems to me. I can see how it adds up.

I could be wrong here, but I think what the article is saying is that the cloud itself (the datacenters) is the energy problem. Which would make sense, considering they are running 24/7.

Then again, aren't a lot of the new datacenters being designed and constructed with a lot of attention paid to energy footprint? For example, aren't Apple's datacenters using 100% renewable energy?

Actually, after rereading the article, I find it to be very confusing. Not exactly sure what the conclusions and findings are.

I may be missing something but how, exactly, are data centers and cloud-computing infrastructure producing CO2 emissions?

Most likely the biggest factor is dirty energy generation. Construction also has a carbon cost.

Seems like a good reason to expand hydro and nuclear capacity to me.

Or, you know, wind, which is cheaper than nuclear and less environmentally destructive than hydro.

Wind is great, but it still costs a lot to maintain. And it only works when it is windy, but not too windy. And the amount of land needed to compete with one modern nuclear facility is staggering. Multiple sources of energy production are needed.

Besides that, when people and animals start glowing we'll have less need for lightbulbs like those innovative new-wave Japanese energy savers. But really, focusing on this minuscule energy usage in comparison to total usage is like the talk in Washington about where to cut the deficit: the focus should be on the obvious waste first, not on the functional and usable spectrum of things. Besides, there is an obvious monetary benefit for communication companies to reduce energy consumption, so I would think energy savings is a constant focus wherever possible; unless, of course, their energy needs are being subsidized by the government, and then the sky's the limit on waste.

According to the CEET study, by 2015, wireless "cloud" infrastructure will consume as much as 43 terawatt-hours of electricity worldwide while generating 30 megatons of carbon dioxide.

TWh is a measure of energy, not power (that would be TW).

So, for the 43 TWh figure to make sense, you need to know the time interval over which that energy is used. So, are we talking about 43 TWh / hour? 43 TWh / day? 43 TWh / year? From today until 2015? Or what?

Or better yet, just do the math and convert it to power instead of energy. There are 8760 hours in a year (ignoring leap years), so if the 43 TWh figure is for one year, it works out to about 5 GW. You would need about five nuclear power plants to cover that.

There are about 6 billion mobile phones worldwide. This would mean that on average a phone is responsible for 0.8 W of the power used, if we ignore other wireless devices.
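For anyone following along, a minimal sketch of that conversion (assuming the 43 TWh figure is per year and ignoring devices other than phones, as above):

```python
# Convert an annual energy figure (TWh/year) to average power, then to power per phone.
energy_twh_per_year = 43.0
hours_per_year = 24 * 365            # 8760 hours, ignoring leap years

avg_power_gw = energy_twh_per_year / hours_per_year * 1000   # TWh/h -> TW -> GW
print(f"{avg_power_gw:.1f} GW")      # ~4.9 GW, i.e. roughly five large nuclear plants

phones = 6e9                         # rough worldwide mobile phone count
watts_per_phone = avg_power_gw * 1e9 / phones
print(f"{watts_per_phone:.1f} W per phone")   # ~0.8 W
```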

According to the CEET study, by 2015, wireless "cloud" infrastructure will consume as much as 43 terawatt-hours of electricity worldwide while generating 30 megatons of carbon dioxide.

TWh is a measure of energy, not power (that would be TW).

So, for the 43 TWh figure to make sense, you need to know the time interval over which that energy is used.

I think the obvious implication of the reference to the year 2015 is that the time period is one year. If they had said "by first quarter 2015," the time period would have been implied to be the quarter.

Maybe it's just me, but the line about increasing datacenters and using less powerful client devices doesn't make loads of sense. I suppose moving all processing to datacenters might decrease the amount of traffic to the client devices, but would that not increase the need for datacenters by a fair margin? Those datacenters need power, so what would the net saving on power be?

I'm also no fan of thin clients and putting more of my data on the cloud.

The net savings is zero. This is just a rebirth of mainframe computing over WAN instead of LAN. We've hit a wall on getting CPU sizes smaller, so in order to keep the trend of micro-sizing everything, computing power must be moved off of end-user devices and into the 'cloud'. Then the device only needs to stream and display data as opposed to also computing it. Less power means smaller chips. Smaller chips mean smaller products.

It's basically the opposite of the TV trend: TVs are becoming 'smart' as opposed to just 'dumb' displays, while they want to move mobile phones from being miniature computers to being only displays.

These figures included users who accessed the cloud through in-home and public Wi-Fi and femtocell as well as through carriers' cellular infrastructure.

That had me wondering about the cellular vs wi-fi figures, and then I read this comment...

rcxb wrote:

Some actual information, lacking from the article:

Wifi: 0.4 micro-Joules per bit

4G LTE: 73 to 136 micro-Joules per bit

Backbone/ISP networks (wired): 0.64 micro-Joules per bit

In short, they're really complaining about LTE (base station) power consumption. WiFi is quite energy efficient, more so than even the upstream hard-wired networks of your ISP. ...

... which makes perfect sense to me. I kinda figured the lion's share of the power usage was likely to be the cellular network, as rcxb states.
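To put rcxb's per-bit numbers in more familiar terms, here's a rough sketch of what moving 1 GB costs on each link type (the 1 GB transfer size is just an illustration; the per-bit figures are the ones quoted above):

```python
# Energy to move 1 GB over each link type, using the per-bit figures quoted above.
energy_per_bit_uj = {
    "WiFi": 0.4,
    "4G LTE (low end)": 73.0,
    "4G LTE (high end)": 136.0,
    "Wired backbone/ISP": 0.64,
}

bits_per_gb = 8e9                    # 1 GB = 8 billion bits (decimal GB, for illustration)

for link, uj in energy_per_bit_uj.items():
    kilojoules = uj * 1e-6 * bits_per_gb / 1000
    print(f"{link}: {kilojoules:.0f} kJ per GB")
# WiFi ~3 kJ, wired backbone ~5 kJ, LTE roughly 584-1088 kJ per GB
```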

I came to a different conclusion about how to "save the planet," though. The reality is, most home users aren't going to start sharing their personal internet connections with strangers... when weighing the advantage of reducing society's collective carbon footprint against their own security from hacker activity, and their potential culpability if a crime is committed over their internet connection, most people will choose security over activism.

However, if these activists (like Greenpeace) really wanted to push for the kind of changes that would make a difference, they'd probably do a better job of it by promoting broader distribution of home-based femtocells. Femtocells use your home internet connection to route the call, and as such, they put the physical wire that much closer to the wireless device, yielding the obvious advantage of lower overall power consumption than a connection through your local cellular tower. Additionally, they can (usually at the consumer's discretion) be locked down to serve only trusted users, or opened up to permit any compatible cell phone to gain access... depending upon how much of an "activist" you want to be.

Maybe it's just me, but the line about increasing datacenters and using less powerful client devices doesn't make loads of sense. I suppose moving all processing to datacenters might decrease the amount of traffic to the client devices, but would that not increase the need for datacenters by a fair margin? Those datacenters need power, so what would the net saving on power be?

I'm also no fan of thin clients and putting more of my data on the cloud.

The net savings is zero. This is just a rebirth of mainframe computing over WAN instead of LAN. We've hit a wall on getting CPU sizes smaller, so in order to keep the trend of micro-sizing everything, computing power must be moved off of end-user devices and into the 'cloud'. Then the device only needs to stream and display data as opposed to also computing it. Less power means smaller chips. Smaller chips mean smaller products.

It's basically the opposite of the TV trend: TVs are becoming 'smart' as opposed to just 'dumb' displays, while they want to move mobile phones from being miniature computers to being only displays.

Except very little processing, as opposed to storage, is being done in the cloud.

In short, they're really complaining about LTE (base station) power consumption. WiFi is quite energy efficient, more so than even the upstream hard-wired networks of your ISP.

That's a little misleading, since WiFi is only one hop and is only sending your data a few tens of feet. WiFi may still be the most energy-inefficient single link in the chain from server to your device (the data you quoted doesn't provide an answer one way or the other).

ARS usually writes very coherent articles, but this isn't one of them. Of course cell phones use more energy per bit than WiFi or wired connections. But this is not directly related to the cloud (as has been said). Thankfully I've always been able to depend on a large number of intelligent comments. The concern should not be a fear of using energy but rather a focus on using it wisely to the benefit of civilization. Energy is the ability to do work and thus is to be used, not feared. That work can be used to reduce entropy, which generally works out to the good if used constructively.

ARS usually writes very coherent articles, but this isn't one of them. Of course cell phones use more energy per bit than WiFi or wired connections. But this is not directly related to the cloud (as has been said). Thankfully I've always been able to depend on a large number of intelligent comments. The concern should not be a fear of using energy but rather a focus on using it wisely to the benefit of civilization. Energy is the ability to do work and thus is to be used, not feared. That work can be used to reduce entropy, which generally works out to the good if used constructively.

Nobody should fear using energy.

What we should concern ourselves with, however, are the side effects of using this energy. The methods by which we generate most of our energy have the unfortunate effect of pumping into the atmosphere millions of tons of CO2 that had, until the last 150 years, remained trapped in the Earth's crust. There are (potentially serious) ramifications to doing this. And the work that we do with this energy (in cases relevant to the article above) is often checking our Facebook feeds *right now* to see whether any of our 1,000 BFFs have eaten a really photogenic muffin. Not an especially intelligent tradeoff.

The answer, however, is not to restrict the use of mobile technology (as much as I think it would actually improve our culture by a fair amount). It just won't happen. But we do need to heavily invest in alternate methods of:

- generating the energy
- wasting less of it (so many devices are left on/running/plugged in when it isn't necessary that they do so)
- using the energy (better and more efficient technology that actually leads to a reduction in use, not just a reduction in the rate of increase)

the wifi part of my home router, and separate wifi access points at work) is always powered up, always active 24/7, sending out signals, waiting for a mobile device to latch on.

Your WiFi AP is up all the time, but that doesn't mean it's using anywhere near as much power as when it's actively transmitting. It'll send out a "beacon" every second or so, but that's about it. If nothing is talking to it, it doesn't need to send out much of a signal.

And what's more, radio technology has to deal with the near-far problem. If someone nearby is broadcasting at the same power levels as someone who is further away, the more distant user is completely drowned out. My point being, our modern wireless devices are VERY smart, and cut way back on broadcast power, whenever possible. Not because they care about your electric bills, but because it's fundamentally necessary to do so.

Ah, unless it is different from the several Wifi APs and routers I have tested... no, they don't really have a low-power mode. As one example, my Netgear 3500L (yes, rather old, but also VERY low-powered compared to a lot of newer routers) "sucks" 4.5 W at idle with the Wifi radio on and a single GbE LAN connection (no WAN). Disabling the Wifi radio reduces power consumption to 4.2 W. Removing the single connected wire brings it down to 3.8 W. Connecting the WAN port uses the same power as a LAN port: you get roughly 0.4 W per port connected and 0.3 W for the radios.

The only "significant" extra power consumption is if you connect the USB port to storage and are actively accessing (read or write) that storage off the router. Idle it increases power load by only about .1-.2w having a HDD connected, but reading/writing increases load by about .2w.

The difference between idle and "max throughput" on the LAN or Wifi side of things is less than 0.1 W.

By comparison, the craptastic Actiontec router Verizon stuck me with (no longer routing, just MoCA bridging) sucks 11.5 W of power doing nothing. Add about 0.4 W if the Wifi radios are on and around 0.5 W per LAN port connected.

Most current routers, and most are Wifi routers, seem to use in the 5-14 W range of power (most in the 8-10 W range).

So if you total all of that up, I can see how that is a pretty significant contributor. Say there are 40 million Wifi routers in US households (that comes out to something like one per internet-connected household, +/- 20% either way is my guess), not including public/corporate/gov't. 40 million times 10 W times 24 hours times 365 days is a lot of power: about 3.5 TWh per year to drive all of the household Wifi around the US.

I am sure my numbers could be off by a pretty wide margin, but I bet they land somewhere in the ballpark.

Anyway, by my sketchy math, that means home Wifi is using roughly 1/3rd of the estimated power usage of the "US Wireless economy".
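If you want to reproduce that back-of-the-envelope math (the router count and the 10 W per-unit figure are my guesses, as noted above):

```python
# Back-of-the-envelope: annual energy used by always-on US household Wifi routers.
routers = 40e6                       # guess: ~40 million home Wifi routers in the US
watts_per_router = 10.0              # typical always-on draw, per the measurements above
hours_per_year = 24 * 365

energy_twh = routers * watts_per_router * hours_per_year / 1e12   # Wh -> TWh
print(f"{energy_twh:.1f} TWh per year")   # ~3.5 TWh
```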

Sean Gallagher / Sean is Ars Technica's IT Editor. A former Navy officer, systems administrator, and network systems integrator with 20 years of IT journalism experience, he lives and works in Baltimore, Maryland.