Does Your iPhone Use As Much Electricity As A New Refrigerator? Not Even Close.

For 14 years, the coal industry has been pushing the myth the Internet is an energy hog. For 14 years, I (and other scientists) have been debunking that myth. Last week, I promised a detailed debunking of the iPhone=Refrigerator calculation from Dr. Jon Koomey, the world’s foremost authority on the electricity consumption of the Internet. Here it is — JR.

Last week, several friends alerted me to a claim that the iPhone supposedly uses as much electricity as two refrigerators — when you count the energy needed to make it, run it and power the “behind-the-wall” equipment to deliver data to the device. Discussion of the original report (“The Cloud Begins with Coal,” hereafter CBC) showed up on the Breakthrough Institute site, Time Magazine Online, MSN News, the Huffington Post, MarketWatch, and Grist, among others (with most focusing on the comparison between a smart phone and one refrigerator).

When I heard this claim, it took me back to the year 2000, when Mark P. Mills and Peter Huber first claimed that the networking electricity for a wireless Palm VII exceeded the electricity for running a refrigerator (1000 to 2000 kWh, they claimed, the lower bound of which was a bit higher than the average consumption of the installed base of US fridges at that time). It didn’t sound plausible, so I and some colleagues investigated, finding that Mr. Mills and Mr. Huber had overestimated the electricity needed to feed data to a wireless Palm VII by a factor of 2000 (Koomey et al. 2004).

Just as happened last time, Mr. Mills, in the CBC report, has made attention-getting claims that don’t stand up to scrutiny (Kawamoto et al. 2002, Koomey 2000, Koomey 2003, Koomey 2008, Koomey et al. 1999, Koomey et al. 2002, Koomey et al. 2004, Romm et al. 1999, Roth et al. 2002). He cherry picks numbers to achieve his desired results, and his report has vague or non-existent references (but lots of footnotes). This appears to be an attempt to create a patina of respectability for his calculations while obfuscating his methods, but I don’t know for sure.

The big story here is why the media is paying any attention to this report at all. Mr. Mills proved more than a decade ago that he is not a reliable source on the issue of electricity used by information technology. His recent work simply confirms this conclusion. Unfortunately, it also confirms what seems to be an inability of most media outlets to report sensibly about technical topics, in part because of the pressure to generate attention-getting headlines, regardless of their veracity. This sorry episode does not make me optimistic for our ability as a society to deal with complex issues like climate change in the 21st century unless we change the way media reporting is conducted on technical issues.

Background

To assess whether the current claim about iPhones and fridges is true, you need to understand a few things.

1. A lot has changed since 2000. In 2011, new fridges used 574 kWh/year, not the 700 kWh/year that was true in 2000. Smart phones are now commonplace in developed countries, and the amount of data flowing to each mobile device is much higher than in 2000. The energy efficiency of the network has also improved, doubling approximately every two years.

2. Estimating the network electricity use attributable to cell phones requires knowing how much electricity the cellular network uses, the total amount of data flowing over that network, and the amount of data going to each phone annually. This way you can allocate the electricity use in proportion to the amount of data for which each phone is responsible. This is the method I used back in the early 2000s in assessing the claim about the wireless Palm VII (as far as I know, no one had done that before).

An equivalent way to think about this is to divide the total electricity use of the network by the total data flows to get an electricity intensity number in kWh of electricity use per gigabyte (GB) of information transferred (kWh/GB), and then multiply this electricity intensity by the number of gigabytes used by each phone over the course of a year. That will give us an estimate of the amount of network electricity associated with each phone.
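The allocation described above is just a division and a multiplication. As a rough sketch (using purely hypothetical network numbers for illustration, not figures from any study), it looks like this:

```python
# Sketch of the allocation method described above: divide total network
# electricity by total data moved to get an intensity in kWh/GB, then
# multiply by each phone's annual data use.

def network_kwh_per_phone(total_network_kwh, total_network_gb, phone_gb_per_year):
    """Allocate network electricity to one phone in proportion to its data use."""
    intensity_kwh_per_gb = total_network_kwh / total_network_gb  # kWh/GB
    return intensity_kwh_per_gb * phone_gb_per_year

# Hypothetical example: a network using 1 billion kWh/year to move
# 500 million GB has an intensity of 2 kWh/GB; a phone downloading
# 12 GB/year is then allocated 24 kWh/year.
print(network_kwh_per_phone(1e9, 5e8, 12))  # 24.0
```

Remember (per point #3 below) that this is an average-cost allocation, not a marginal one: the network draws most of that electricity whether or not any one phone connects.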

3. Network electricity use doesn’t change much with usage. In other words, the routers and switches that deliver data to your phone don’t change their electricity use significantly, even as traffic goes from zero to full load. There’s very little marginal effect of your accessing the network; almost all of the electricity is needed just to keep the system running. This means that the allocation method in point #2 is a simplification that may not yield correct results in all cases. It allocates that fixed electricity use to the network’s users in a sensible way, but it implies that if I don’t access the network that electricity won’t be used, which is not true: it will be used whether I access the network or not.

4. There are large changes over time in the efficiency of information technology (IT) equipment, including mobile phones and the supporting systems. There are also big changes in the amount of data downloaded by each device. Because these systems change so fast, even a year or two matters when comparing data on electricity use or data flows, and large differences exist between different generations of technology.

Recent data from Sweden for 2010 that is now under peer review at The Journal of Industrial Ecology (Figure 9, from Malmodin et al 2013) shows that 2G phone systems (which exist in modest numbers in Sweden but are dominant in the developing world) have an electricity intensity of 37 kWh/GB transferred, while 3G systems use approximately 2.9 kWh/GB, and (according to the latest data from Australia) 4G systems use between 0.4 and 0.8 kWh/GB transferred (taken from this report, as explained in an email to me from Kerry James Hinton on August 20, 2013). Geographic differences can also be quite important (because the density of subscribers in a cellular network has a big effect on how electricity intensive it is).

Data flows also differ greatly for 2G, 3G, and 4G devices, so simple averages of “typical cell phones” will mislead you, especially if you apply monthly data flows for 4G phones to the electricity intensity of 2G/3G phones (as the Breakthrough Institute post and an associated email to the Time Magazine reporter appear to do). Careless choice of data can therefore result in inconsistent comparisons that are meaningless.

5. Mr. Mills has a well-documented history of substantially overestimating the electricity used by IT equipment (for the most complete and up-to-date summary, read the Epilogue to the 2nd edition of my book, Turning Numbers into Knowledge). He overestimated the electricity used for the Internet by a factor of eight, the electricity needed for all office equipment (including the Internet) by a factor of four, and the electricity associated with the Palm VII by a factor of 2000, as stated above. All of these findings are carefully documented in the peer-reviewed literature, as summarized here. It’s therefore a mystery to me why anyone would pay attention to what Mr. Mills says about the electricity used by IT equipment.

Analysis

With that background, let’s proceed to the analysis. I’m only going to focus on the iPhone vs refrigerator comparison in this post.

Here’s what the CBC report claims in the text:

“Reduced to personal terms, although charging up a single tablet or smart phone requires a negligible amount of electricity, using either to watch an hour of video weekly consumes annually more electricity in the remote networks than two new refrigerators use in a year.1” (p.3, executive summary)

Here’s the relevant footnote, which is #1 on p. 44 of the report (I reproduce it here exactly as it appears in the report, using a screen shot, then retype it so it’s searchable):

Four Initial Problems

1. A savvy Internet user would infer that the blue underlined text in the footnote above is associated with web links, but when you read the PDF version of the CBC report, you find that it’s just blue underlined text, with no links. This presentation is misleading, and I don’t know why this footnote is formatted in this way. A researcher who wanted you to read the underlying materials would have given actual links, to make it easy on the reader. It’s possible that this is just a software glitch with whatever PDF creation software they used, but there’s no way to know without more information.

2. The footnote and text lump the tablet and cell phone together, as if they were the same thing, but they most certainly are not. From an embodied energy perspective, the cell phone is much smaller, and therefore requires much less energy to manufacture than a tablet. Apple gives environmental reports for the iPhone 5 and the iPad with retina display, and these sources show embodied carbon emissions about twice as big for the iPad as for the iPhone. In addition, cell phones generally use less data than tablets in operation, so operating electricity use for the network will be less for phones (in my quick search I couldn’t find reliable data on this point, but it stands to reason that people watch more movies on iPads than on cell phones).

3. There are no references in the footnote for some key information, namely the embodied energy to produce a tablet, the HD video data flows (claimed to be from Netflix, but what’s the source? Is this for hard-wired or mobile connections?), the embodied energy associated with a refrigerator, and the claim that “cell network operating energy equals annualized embodied energy of network equipment used for 5 years”. Without such references, it’s impossible to know how Mr. Mills came to choose these particular numbers.

4. I’ve learned from experience that I need to check everything Mr. Mills says, and it’s not clear where his number of 350 kWh/year for new refrigerators comes from. He refers to Energy Star, but gives no reference. That program recognizes the best new appliances that substantially exceed typical new refrigerator efficiency.

I consulted my colleagues at Lawrence Berkeley National Laboratory, who have analyzed refrigerator efficiency for more than three decades, and they told me that the average new US refrigerator in 2011 used 574 kWh/year. Other relevant data on typical new refrigerators can be found here and here. I haven’t yet hunted down all the details, but Mr. Mills seems to have underestimated the electricity used by a new refrigerator (although it is true that there are superefficient Energy Star refrigerators that use that little). The main point for readers to note is that Mr. Mills has chosen the energy use of the most efficient refrigerators on the market as his point of comparison for iPhone electricity use, thus biasing the comparison, without explaining that this is what he’s doing.

Boundaries Of Mills’ Calculations

Footnote 1 in CBC states that the networking electricity for a tablet or cell phone is 300 kWh/year, the embodied energy for a tablet is 100 kWh/year, with the implication that the embodied energy for the cellular network (i.e. that contained in the manufacturing and construction of the base stations and related equipment) is 300 kWh/year per tablet or cell phone (that makes 700 kWh/year total). As stated in the footnote, data center operations and direct electricity used by tablets/cell phones are not included.

A Better Method

Creating an estimate of cell phone energy use comparable to that presented by Mr. Mills requires transparent calculations of three factors:

1. How much energy cellular networks consume in their operations.

2. The embodied energy contained in the devices that compose the cellular network.

3. The embodied energy contained in the phones and tablets that use the network.

I’ll cover each in turn in the sections below. Paralleling Mr. Mills, we ignore data center/core network electricity use and direct electricity used by cell phones and tablets, because these are small.

1. Operating energy of cellular network

To estimate the operating energy of cellular networks, we need the electricity intensity of that network (in kWh/GB) and the number of GB of data that flows to the mobile phone in a year. In effect, as described above, this allocates the network electricity associated with the data flows to that cell phone in proportion to the fraction of total data flows for which the phone is responsible.

Mr. Mills gives the key numbers for his calculation: 2.8 GB/hour for HD video, 1 hour of HD video downloaded per week (52 GB/year), and 2 kWh/GB for electricity intensity. When you multiply these numbers together you get 2.8 GB/hr x 52 hours/year x 2 kWh/GB, or 291 kWh/year, which matches his claim of 300 kWh/year. So far, so good. These results also imply data downloads of 2.8 x 52/12 = 12 GB/month.
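For readers who want to check the arithmetic, here is Mr. Mills’ own calculation, reproduced directly from the numbers he gives in the report:

```python
# Reproducing Mr. Mills' arithmetic from the figures stated in the CBC report:
# 2.8 GB/hour for HD video, 1 hour of video per week, and 2 kWh/GB.

gb_per_hour = 2.8           # claimed HD video data rate
hours_per_year = 52         # 1 hour per week
intensity_kwh_per_gb = 2.0  # claimed network electricity intensity

annual_kwh = gb_per_hour * hours_per_year * intensity_kwh_per_gb
monthly_gb = gb_per_hour * hours_per_year / 12  # implied data download

print(round(annual_kwh, 1))  # 291.2 kWh/year, which Mills rounds to 300
print(round(monthly_gb, 1))  # 12.1 GB/month implied
```

The arithmetic itself is internally consistent; the problem, as shown next, is with the inputs.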

Let’s begin with the data flows. Mr. Mills doesn’t give an exact source for his 2.8 GB/hour figure, although he refers to Netflix. I asked my research assistant to dig around for that, and he found this site. It says 2.8 GB/hour is the peak download rate if viewing HD video at best quality. But who does this on their smart phone?

In the meantime, I found the data calculators for AT&T and Verizon, the two biggest cellular companies in the US.

These allow you to figure out how much data you would use under different assumptions, so you can choose the appropriate wireless plan. When I plugged in 1 hour per month of video into the Verizon calculator, I got numbers that ranged from 0.25 to 0.35 GB for smartphones using 3G and 4G networks, respectively. Using the AT&T calculator for tablets I found a range of 0.12 to 0.31 GB for one hour of standard and HD video, respectively. These values are one tenth of the number Mr. Mills assumed for data flows.

There is additional confirmation that Mr. Mills’ data flow assumptions are too high. In a survey of 1000 smart phone users to which the article in Time linked, we see monthly data flows of around 1 GB/month for six months in 2012 (range 0.6 to 1.4 GB/month over that six month period). The iPhone data (a subset of the total) are for only 100 phones, and they show a bigger range (from 0.2 to 1.6 GB/month, depending on the carrier) in September 2012. I assume that these are data for the US, but the web page is not explicit about that.

The important thing to understand about these smartphone users (and particularly the iPhone users) is that almost all of them use 4G networks, which are much faster (and much more energy efficient) than 2G and 3G networks. The data flows for 3G or 2G-only phones would be much lower (because it’s too painfully slow to do large downloads on those kinds of networks).

In addition, you should note that Mr. Mills assumed 12 GB/month, while the survey results showed only 1 GB/month. Using the AT&T and Verizon web calculators and comparing to this survey of phone users, it is clear that Mr. Mills overestimated data flows by a factor of 10 to 12.

Now let’s turn to electricity intensity. Mr. Mills gives a number of 2 kWh/GB, and refers to the CEET study by the University of Melbourne. I emailed Kerry Hinton, the principal author of that study, and asked him about this number. He replied “I cannot see how they got 2 kWh/GB from our White Paper”. In reply to my query, he calculated electricity intensities from their analysis for 2012, giving a range of 0.4 to 0.8 kWh/GB (depending on the density of cell users).

You need to specify exactly which type of network you’re operating on when you describe data flows and efficiency, otherwise you will make big errors, as Mr. Mills appears to have done. The University of Melbourne study focused on 4G/LTE technology, which is much more energy efficient than 2G or 3G. How much more efficient? Malmodin et al. (described above) gives numbers for Sweden of 37 kWh/GB and 2.9 kWh/GB for 2G and 3G systems, respectively. 4G networks are therefore 62 times (37/0.6) more efficient than 2G networks and 4.8 times more efficient than 3G.
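Using the intensities quoted above (with 0.6 kWh/GB taken as the midpoint of the 0.4 to 0.8 range for 4G), the efficiency ratios work out as follows:

```python
# Relative electricity intensity of the network generations, in kWh/GB:
# 2G = 37 and 3G = 2.9 (Malmodin et al., Sweden, 2010);
# 4G ~ 0.6, the midpoint of the 0.4-0.8 Australian range.

intensity = {"2G": 37.0, "3G": 2.9, "4G": 0.6}

print(round(intensity["2G"] / intensity["4G"]))     # ~62: 4G vs 2G
print(round(intensity["3G"] / intensity["4G"], 1))  # ~4.8: 4G vs 3G
```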

The survey of smart phone users was for 4G (because it was 2012 and presumably in the US), showing about 12 GB/year of data flows. We need to pair that number with the 4G efficiency of 0.6 kWh/GB (the average of the range) to get a consistent estimate of 7.2 kWh/year. Thus, Mr. Mills’ 300 kWh/year for data flows should be about 42 times lower, or 7.2 kWh/year.
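The corrected calculation, pairing 4G data flows with 4G intensity, is simply:

```python
# Redoing the operating-energy calculation with internally consistent 4G
# numbers: ~1 GB/month from the user survey and 0.6 kWh/GB (midpoint of the
# 0.4-0.8 range for 4G from the University of Melbourne work).

survey_gb_per_year = 1.0 * 12  # ~1 GB/month observed for 4G smartphones
intensity_4g = 0.6             # kWh/GB

corrected_kwh = survey_gb_per_year * intensity_4g
print(round(corrected_kwh, 1))             # 7.2 kWh/year

# Ratio to Mills' claimed 300 kWh/year for network operating energy
print(round(300 / corrected_kwh))          # a factor of about 42
```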

Mr. Mills gives an electricity intensity range of 2 to 19 kWh/GB, and chooses the lowest value, thus seeming to be a reasonable fellow, but let’s examine that range. 2 kWh/GB is a bit lower than the Swedish average for 3G intensity, but it’s in the same ballpark, so let’s call it 3G. That means he’d need to pair that intensity with data flows for 3G phones, but such data are not easily available.

The 19 kWh/GB is very high, and looks like the intensity of some mixture of 2G and 3G networks (recall that the Swedish intensity for 2G in 2010 was 37 kWh/GB). The reference that Mr. Mills used in footnote 1 (A.T. Kearney) gives that 19 kWh/GB number for 2011 in an infographic on p.69 of their report (incorrectly listed as kW/GB) and then again in an infographic on p.85. It is paired with an estimate of 33.6 kWh/GB for 2009.

Page 85 of the A.T. Kearney report refers to this report from the GSMA. The GSMA document describes their sample of 34 cellular systems as containing 16 from developed economies and 18 from emerging markets. That makes it almost certain that they are talking about a mixture of 2G and 3G technologies, which would be much slower and vastly less efficient than 4G. The size of the intensities that report presents (33.6 kWh/GB in 2009 and 27.2 kWh/GB in 2010) confirms that conclusion.

The 19 kWh/GB figure is nowhere to be found in the referenced study, so its origins are a mystery, but we can reverse engineer it in a crude way. If we assume that all the developed economies in the GSMA sample use 3G and all the developing nations use 2G, I calculate an average electricity intensity of 21.5 kWh/GB, which is very close to the 19 kWh/GB at the top of Mr. Mills’ range (I use the 2G and 3G electricity intensities from the 2010 Swedish data for this purpose).
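One way to carry out that crude reverse-engineering is sketched below. The weighting is my own assumption for illustration (a simple average weighted by operator count); it lands near 21 kWh/GB, in the same ballpark as the 21.5 kWh/GB figure above (whose exact weighting isn’t specified) and close to the top of Mr. Mills’ range:

```python
# Crude reverse-engineering of the ~19 kWh/GB figure. Assumption (mine):
# all 16 developed-economy operators in the GSMA sample run 3G and all
# 18 emerging-market operators run 2G, weighted by operator count.
# Intensities are the 2010 Swedish values from Malmodin et al.

developed_ops, emerging_ops = 16, 18  # GSMA sample composition
intensity_3g, intensity_2g = 2.9, 37.0  # kWh/GB

avg = (developed_ops * intensity_3g + emerging_ops * intensity_2g) / (
    developed_ops + emerging_ops
)
print(round(avg, 1))  # ~21 kWh/GB, near the 19 kWh/GB top of Mills' range
```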

As quoted in the Time article, the Breakthrough Institute did a calculation (which was endorsed by Mr. Mills) where they applied the 19 kWh/GB electricity intensity to the highest data flows from the survey I describe above (1.6 GB/month). This is also a mistake, because the 19 kWh/GB (wherever it came from) is more representative of 2G/3G technology, while the data flows are representative of the highest users on the 4G network.

2. Estimating the embodied energy of the cellular network

On p. 40, the CBC report cites a report called “Cellular Networks with Embodied Energy, IEEE Network.” I could find no report with this name online, but I did find Humar et al. (2011), which I assume is the one to which Mr. Mills is referring.

This report describes embodied energy for cellular base stations, which account for a major part of the electricity used by the cellular network. Figure 3 in that article (on p. 45, reproduced below) shows that embodied energy is about half of the total lifetime operating energy for a base station. This means that the annualized contribution of embodied energy to our calculation should be about half that of the operating energy. Using our recalculated operating energy, that yields 7.2 kWh plus 3.6 kWh, for a total of 10.8 kWh/year.
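In numbers, the embodied adjustment works out to:

```python
# Adding base-station embodied energy to the corrected operating figure.
# Per Humar et al., embodied energy is roughly half of lifetime operating
# energy, so the annualized embodied share is about half the operating share.

operating_kwh = 7.2               # corrected network operating energy per phone
embodied_kwh = operating_kwh / 2  # ~half of operating, per Humar et al.

network_total = operating_kwh + embodied_kwh
print(round(network_total, 1))    # 10.8 kWh/year
```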

3. Estimating the embodied energy of tablets and smart phones

I know of only one source for the embodied energy of PDAs, and it’s an old one (Toffel and Horvath 2004). Since Mr. Mills gave no source for his claim, I have no way to evaluate it in detail. The Apple data sheets give only carbon dioxide equivalent emissions (CO2e), and a large part of the CO2e burden of manufacturing is not energy related. I will leave that more detailed investigation for another day, once I hear back from colleagues who have conducted studies of this type.

The embodied greenhouse gas emissions of a cell phone appear to be about half of those for a tablet, based on the Apple data sheets, which implies that the energy use associated with manufacturing is almost certainly lower for the phone than for the tablet (but Apple doesn’t split out the energy data). This result makes sense, because the phone is so much smaller than the tablet, and uses much less material. At a minimum, we can adjust Mr. Mills’ estimate for a tablet by a factor of two for a cell phone, yielding 50 kWh/year (but that is almost certainly an overestimate, given that a large fraction of greenhouse gases in manufacturing are from chemicals in the manufacturing process, not energy use).

Adding 50 kWh/year to 10.8 kWh/year from the previous section yields 61 kWh/year for a smart phone, which is 9% of Mr. Mills’ estimate of 700 kWh/year. My result is almost certainly an overestimate, but even if it isn’t, there’s no way an iPhone uses as much electricity as a refrigerator (or two, as Mr. Mills claims). And don’t forget, Mr. Mills underestimated the electricity used by a refrigerator. If we correct his claim to reflect actual electricity use for new refrigerators (let’s call it 550 kWh/year, to give him the benefit of the doubt), two refrigerators would use 1100 kWh/year, and the final result I calculated would be only 5.5% of the total for two refrigerators, or a factor of 18 lower than the fridges.
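Putting the whole recalculation together:

```python
# Final comparison: corrected network total plus an upper-bound embodied-
# energy figure for the phone itself, compared to Mills' 700 kWh/year claim
# and to two new refrigerators at 550 kWh/year each.

network_kwh = 10.8     # operating + embodied network energy per phone
phone_embodied = 50.0  # half of Mills' 100 kWh/year tablet figure; likely high

phone_total = network_kwh + phone_embodied   # ~61 kWh/year
share_of_mills = phone_total / 700           # vs Mills' 700 kWh/year
share_of_fridges = phone_total / (2 * 550)   # vs two fridges at 550 kWh/year each

print(round(phone_total))              # ~61 kWh/year
print(f"{share_of_mills:.0%}")         # ~9% of Mills' estimate
print(f"{share_of_fridges:.1%}")       # ~5.5% of two fridges
print(round((2 * 550) / phone_total))  # fridges use ~18x more electricity
```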

Table 1 summarizes the results of my recalculations.

Summary

Just as happened last time, Mr. Mills has made attention-getting claims that don’t stand up to scrutiny. He cherry picks numbers to achieve his intended results, and his report has vague or non-existent references (but lots of footnotes). This appears to be a deliberate attempt to convey an air of scholarly care while at the same time obfuscating his methods, but I’ll let readers draw their own conclusions from the issues I describe above.

One way to support the claim that the electricity use associated with an iPhone is as large (or twice as large) as a refrigerator is to combine the high electricity intensity of 2G/3G cell phone networks with the largest plausible data downloads from 4G networks (as the Breakthrough Institute has done). Another way is to use 3G electricity intensities and exaggerate data flows by a factor of 12 (as Mr. Mills has done). In either case, the electricity intensity and data flow numbers are inconsistent and incomparable, so the results are nonsensical. When corrected for more sensible assumptions (including an increase in the electricity use of new refrigerators to reflect the best current data), Mr. Mills is about a factor of 18 too high in his estimates of electricity use associated with a smart phone, assuming we take the written statement on p.3 of the CBC report literally.

The big story here is why the media is paying any attention to this report at all. Mr. Mills proved more than a decade ago that he is not a reliable source on the issue of electricity used by information technology, and his recent work simply confirms this. Unfortunately, it also confirms what seems to be an inability of most media outlets to report sensibly about technical topics, in part because of the pressure to generate attention-getting headlines, regardless of their veracity. In my view, this sorry episode does not bode well for our ability as a society to deal with complex issues like climate change in the 21st century unless we change the way media reporting is conducted on technical issues.