
Two recent publications provide corroborating evidence that LEED-certified buildings, on average, do not save primary energy. One of these looks at energy consumption for 24 academic buildings at a major university. The other looks at energy consumption by LEED-certified buildings in India. In both cases there is no evidence that LEED-certification reduced energy consumption.

The study of academic buildings is found in the article entitled “Energy use assessment of educational buildings: toward a campus-wide sustainability policy” by Agdas, Srinivasan, Frost, and Masters, published in the peer-reviewed journal Sustainable Cities and Society. These researchers looked at the 2013 energy consumption of 10 LEED-certified academic buildings and 14 non-certified buildings on the campus of the University of Florida at Gainesville. They appear to have considered site energy intensity (site EUI) rather than my preferred metric, source energy intensity. Nevertheless their conclusions are consistent with my own — that LEED-certified buildings show no significant energy savings as compared with similar non-certified buildings. This is also consistent with what has now been published in about eight peer-reviewed journal articles on this topic. Only one peer-reviewed article (Newsham et al.) reached a different conclusion — and that conclusion was rebutted by my own paper (Scofield). There are, of course, several reports published by the USGBC and related organizations that draw other conclusions.

The second recent publication comes out of India. The Indian Green Building Council (IGBC) — India’s equivalent of the USGBC — of its own accord posted energy consumption data for 50 of some 450 LEED-certified buildings. Avikal Somvanshi and his colleagues at the Centre for Science and Environment took this opportunity to analyze the energy and water performance of these buildings, finding that the vast majority of these LEED-certified buildings were underperforming expectations. Moreover, roughly half of the 50 buildings failed even to qualify for the Bureau of Energy Efficiency’s (BEE) Star Rating (India’s equivalent of ENERGY STAR). The results were so embarrassing that the IGBC removed some of the data from its website and posted a disclaimer discounting the accuracy of the rest. In the future the IGBC will no doubt follow the USGBC’s practice of denying public access to energy consumption data while releasing selected tidbits for marketing purposes.

How long will the USGBC and its international affiliates be afforded the privilege of making unsupported claims about energy savings while hiding their data?


There is this standing joke about the three great American lies: 1) “the check is in the mail”; 2) “of course I will respect you in the morning”; and 3) well … let me skip the last one. I think it is time to add a fourth lie to the list — “this green project will lower energy use.”

In my last post I mentioned that my home town of Oberlin, OH recently purchased new, automatic loader trash/recycling trucks and spent an extra $300,000 so that three of them included fuel-saving, hydraulic-hybrid technology. Town leaders claimed these trucks would save fuel and reduce carbon emissions. Simple cost/benefit calculations using their cost and fuel savings figures showed that this was an awful investment that would never pay for itself (in fuel savings) and that the cost per ton of carbon saved was astronomical.

A few weeks ago I requested from the City fuel consumption data for the first six months of operation of the new trucks. The City Manager and Public Works Director, instead, asked me to wait until after their July 6 report to City Council on the success of the new recycling program. They both assured me that fuel usage would be covered in this report. I was promised access to the data following their presentation.

Last Monday, in his presentation to Council, the Public Works Director highlighted data which showed that for the first six months of operation the City recycled 400 tons — as compared with the 337 tons it had recycled in the comparable period prior to acquisition of the new trucks. This represents a 19% increase in recycling. Unfortunately there was no mention of fuel usage or savings.

Yesterday I obtained fuel consumption data from the Public Works Director for Oberlin’s new garbage/recycling trucks, along with comparative fuel data from previous years using the old trucks. The new trucks are on track to use 2,000 gallons MORE diesel fuel annually than the old trucks did. That’s right — not less fuel, but MORE fuel. This is a 19% increase in fuel usage. Gee, what a surprise!

Soon the spin will begin. City Administrators will point out that fuel usage would be even worse were it not for their $300,000 investment in hybrid technology. They will point out that the increased fuel usage is due to the new automatic-loading technology included in these trucks (though they failed to mention any expected increase in fuel usage when the project was being sold to the public) — technology which enabled the use of larger recycling containers and the improvement in recycling. What they will fail to tell us is that they could have achieved the same increase in recycling using the older-style trucks without automatic loaders.

This is the second recent City project for which the public has been misled regarding expected energy savings. The first was the LEED-certified Fire Station renovation. This green building was supposed to save energy. It, of course, is bigger and better than the building it replaced — oh yes, and it uses more energy. But the increase in energy use wasn’t as much as it might have been because it was a green building. Now we have the same result for the trash and recycling trucks.

Oberlin College is in the process of constructing a new, green hotel — called the “Gateway Project” as it will usher in a new era of green construction. But people should understand that this new green hotel will use more energy than the old hotel — it will be bigger and better, and its energy use won’t be as big as it might have been — and this should make us feel good.

And in the next few months Oberlin residents will be asked to approve additional school taxes to construct new, green, energy-efficient public school facilities. But don’t be surprised when these new facilities actually use more energy than did the old ones. Don’t get me wrong — they will be more energy efficient than the old facilities, but they will be bigger and better — and use more energy.

This is the new lie — that our new stuff will use less energy than our old stuff. But it isn’t true. Fundamentally we want bigger and better stuff. People like Donald Trump just build bigger and better stuff and proudly proclaim it. But that isn’t palatable for most of us — we feel guilty about wanting bigger and better stuff. So instead we find a way to convince ourselves that our new stuff will be green, it will lower carbon emissions, it will make the world a better place — oh, and yes, it will be bigger and better.

We need our lies to make us feel good about doing what we wanted to do all along. Don’t get me wrong — sometimes the check is in the mail and sometimes the green project does save energy. But more often than not these lies are offered for temporary expediency. And, of course, I really will respect you in the morning.

Yesterday the Wall Street Journal published an article by Greg Ip which summarized the findings of an economic study conducted by Michael Greenstone, Meredith Fowlie and Catherine Wolfram. (Their original paper is entitled “Do Energy Efficiency Investments Deliver? Evidence from The Weatherization Assistance Program.”) These researchers looked at the actual energy savings and costs of a specific Weatherization Assistance Program (WAP). What they found was that the homes that took advantage of the WAP achieved only about 40% of the energy savings that engineering calculations had projected. When they compared the actual savings (not estimated savings) to the costs they concluded 1) that the investments would never pay for themselves (i.e., the value of the energy saved over 16 years was less than the amount spent on the energy efficiency investments), and 2) that the amount of money spent per metric tonne of carbon saved (over these 16 years) is $338/tonne — about 10X more than estimates for the long-term cost to society of solving the carbon emissions problem.

This article caught my attention for two reasons. First, this simply illustrates again the large gap between measured energy savings and those estimated by promoters of energy efficiency programs. In particular, I have seen this over and over with green buildings. All the data I have analyzed show that, on average, LEED-certified buildings do not achieve the energy savings that their designers predict. Many organizations pride themselves on their portfolio of green buildings yet the fact is, these buildings consume no less primary energy than other buildings. Society will not arrest climate change with this approach — even though it leads to all kinds of green awards.

But the second reason this caught my attention is due to the parallel these investments have with what is going on in my community of Oberlin, OH. The Oberlin City Council has made a commitment to make the City climate-positive (I guess it is like giving 110% effort). Apparently all divisions of the City are instructed to act in accordance with the City’s Climate Action Plan. The City’s Municipal Power Company has contracted with Efficiency Smart to promote energy efficiency programs for its customers. Efficiency Smart reports to the City on how much energy its programs have saved — savings that are based on projected estimates not measurements.

A year ago the City had the opportunity to purchase new garbage/recycling trucks. The City spent an extra $300,000 in order to include hydraulic-hybrid, fuel-saving technology in these trucks. The City Public Works Director estimated the diesel fuel savings to be 2,800 gal annually. At a cost of $3.75/gal this represents an annual return of $10,500 on a $300,000 investment. Since the trucks are expected to last only 10 years the investment will never pay for itself.
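The payback arithmetic can be laid out explicitly. All figures below are those quoted above — the fuel savings are the City’s estimate, not a measurement:

```python
# Simple payback check for the hybrid-truck premium, using the City's own
# estimated (not measured) figures.
investment = 300_000   # extra cost of the hydraulic-hybrid option ($)
fuel_saved = 2_800     # estimated diesel savings (gal/year)
fuel_price = 3.75      # assumed diesel price ($/gal)
lifetime = 10          # expected truck life (years)

annual_return = fuel_saved * fuel_price      # $10,500 per year
lifetime_return = annual_return * lifetime   # $105,000 over the trucks' life
simple_payback = investment / annual_return  # ~29 years -- trucks last only 10

print(annual_return, lifetime_return, round(simple_payback, 1))
```

Even ignoring the time value of money, the premium returns barely a third of its cost over the trucks’ expected life.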

What about the carbon savings? If you work through the math you find that the reduced carbon emission (associated with less fuel usage) comes at a cost of about $600 per ton of CO2. This is equivalent to $2,400 per metric tonne of carbon saved. It was an utterly foolish decision to spend money this way. And this decision was based on projected savings. In a few months we will see how much fuel the trucks have actually used.
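For readers who want to check the unit conversion, here is a sketch using standard mass ratios (44/12 for CO2 relative to its carbon content, and 907.185 kg per short ton):

```python
# Convert a cost per short ton of CO2 into a cost per metric tonne of carbon.
cost_per_ton_co2 = 600           # $/short ton CO2, from the estimate above

co2_per_carbon = 44.0 / 12.0     # mass of CO2 per unit mass of carbon
tons_per_tonne = 1000 / 907.185  # short tons in one metric tonne

cost_per_tonne_c = cost_per_ton_co2 * co2_per_carbon * tons_per_tonne
print(round(cost_per_tonne_c))   # ~2425, i.e. roughly $2,400/tonne carbon
```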

City of Oberlin refuse truck

Don’t get me wrong. I am an advocate for energy efficiency that leads to real, cost-effective savings. But there must be a cost/benefit analysis. We cannot afford to throw money away on schemes that yield so little return. And we cannot base our decisions on “projected” savings. I like the way Wal-Mart approaches energy efficiency. Perform the up-front calculation to find the projected savings. If these look good, retrofit a couple of stores and measure the actual savings. If the trial study confirms the savings — roll out the same changes to all the other stores. If not, move on.

- It claims that the model data contain no buildings less than 5,000 sf in size.

- With regard to the elimination of buildings < 5,000 sf, the EPA writes, “Analytical filter – values determined to be statistical outliers.”

- The cumulative distribution for this model, from which ENERGY STAR scores are derived, is said to be fit with a two-parameter gamma distribution.

All of the above statements/descriptions are false. The filters described by the EPA do not produce an 82-record dataset, and the dataset that is produced does not have the properties (min, max, and mean) described in Figure 2 of the EPA’s document. And a regression using the EPA’s variables on the dataset obtained using their stated filters does not produce the results listed in Figure 3 of the EPA’s document. In short, this EPA document is a work of fiction.

I have given the EPA the opportunity to supply facts supporting their claims by filing three Freedom of Information Act (FOIA) requests: the first (EPA-HQ-2013-00927) for the list of 1999 CBECS IDs that correspond to their 82-building dataset, the second (EPA-HQ-2013-009668) for the alpha and beta parameters of the gamma distribution that fits their data, and the third (EPA-HQ-2013-010011) for documents justifying their exclusion of buildings <5000 sf from many models, including Medical Offices. The EPA has closed the first two cases, indicating it could not find any documents with the requested information. Seventeen months after the filing of the third request it remains open, and the EPA has provided no documents pertaining to the Medical Office model. The EPA is publishing claims for which it has no supporting documents and that I have demonstrated are false. The details of my analysis are posted on the web and were referenced in my ACEEE paper.

In November 2014 the EPA corrected errors in other Technical Methodology documents yet it saw no need to correct or retract the Medical Office document. Why is it so hard for the EPA to say they messed up?

It is common for scientists to correct mistakes by publishing “errata” or even withdrawing a previously published paper. No doubt EPA staff once believed this document they have published was correct. But how is it possible the EPA remained unaware of the errors while it continued to publish and even revise this document for nearly a decade? How can the EPA continue to publish such false information six months after it has been informed of the errors?

Is the EPA lying about its Medical Office building model? I cannot say. But it is clear that the EPA either has total disregard for the truth or it is incompetent.

If these folks worked for NBC they would have to join Brian Williams on unpaid leave for six months. Apparently the federal government has a lower standard of competence and/or integrity.

On January 28, 2015 the District of Columbia published the second year of energy benchmarking data collected from private buildings. This year’s public disclosure applies to all commercial buildings 100,000 sf and larger while last year’s public disclosure was for all buildings 150,000 sf or bigger. Data published are drawn from the EPA’s ENERGY STAR Portfolio Manager and include building details such as gsf and principal building activity along with annual consumption for major fuels (electric, natural gas, steam), water, and calculated greenhouse gas emissions (associated with fuels). Also published are annual site EUI (energy use intensity) and weather-normalized source EUI metrics, commonly used to assess building energy use.

DC commercial buildings continue to be exceptionally efficient. The median reported ENERGY STAR® score for private commercial buildings in the District was 74 out of 100—well above the national median score of 50.

Buildings increased in efficiency from 2012 to 2013. Also, overall site energy use went up by 1.5% among buildings that reported 2012 and 2013 data. However, when accounting for weather impacts and fuel differences, the weather-normalized source energy use for the same set of buildings decreased by 3% in 2013.

These claims are simply unjustified.

In particular, consider the second point — that 2013 source energy used by DC buildings is 3% lower than it was in 2012 — demonstrating improved energy efficiency. This claim is based on weather-normalized source energy numbers produced by the EPA’s Portfolio Manager. The problem is that the EPA lowered its site-to-source energy conversion factor for electricity from 3.34 to 3.14 in July 2013 — a 6% reduction. Because of this simple change, any building that has exactly the same energy purchases for 2013 that it did in 2012 will, according to Portfolio Manager, be using 4-6% less source energy in 2013 (depending on the amount of non-electric energy use). In other words — the District finds its buildings used 3% less source energy in 2013 than in 2012 when, in fact, by doing nothing, all US buildings saved 4-6% in source energy over this same time frame.
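A minimal sketch of this accounting artifact, using the two electricity factors quoted above; the 1.05 factor assumed here for natural gas is illustrative:

```python
# Apparent source-energy "savings" created purely by the EPA's July 2013
# change in its site-to-source factor for electricity, for a building whose
# actual energy purchases are identical in both years.
OLD_ELECTRIC = 3.34   # pre-July-2013 site-to-source factor for electricity
NEW_ELECTRIC = 3.14   # post-July-2013 factor
GAS = 1.05            # assumed factor for natural gas (illustrative)

def apparent_savings(electric_fraction):
    """Fractional drop in computed source energy with unchanged site energy."""
    gas_fraction = 1.0 - electric_fraction
    old = electric_fraction * OLD_ELECTRIC + gas_fraction * GAS
    new = electric_fraction * NEW_ELECTRIC + gas_fraction * GAS
    return (old - new) / old

print(round(apparent_savings(1.0) * 100, 1))  # all-electric: ~6%
print(round(apparent_savings(0.5) * 100, 1))  # 50/50 electric/gas: ~4.6%
```

A building that did nothing at all thus appears, on paper, to have cut its source energy by 4-6%.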

It is said that “a rising tide lifts all boats.” In this case the Washington DC boat did not rise quite as much as other boats.

More seriously, such small differences (1% – 3%) in average site or source energy are not resolvable within the statistical uncertainty of these numbers. The standard deviations of the 2012 and 2013 mean site and source EUI for DC buildings are too large to rule out the possibility that such small changes are simply accidental rather than reflective of any trend. Scientists would know that. Politicians would not — nor would they care, if it makes for a good sound bite.
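To illustrate the point, here is a back-of-envelope significance check; the building count, mean, and standard deviation below are hypothetical round numbers for illustration, not the District’s actual statistics:

```python
import math

# Can a 3% year-over-year change in mean source EUI be resolved?
# All numbers below are hypothetical round figures for illustration.
n = 400          # buildings reporting in both years (assumed)
mean_eui = 180   # mean source EUI, kBtu/sf/yr (assumed)
std_eui = 80     # standard deviation across buildings (assumed)

std_error = std_eui / math.sqrt(n)   # uncertainty in each year's mean
change = 0.03 * mean_eui             # the claimed 3% shift

# Uncertainty in the difference of two independent means:
sigma_diff = math.sqrt(2) * std_error

# The shift is less than one standard error of the difference -- far short
# of the ~2-sigma threshold usually required to call a change significant.
print(round(change / sigma_diff, 2))
```

Pairing the buildings year over year would tighten the uncertainty somewhat, but the qualitative point stands: differences of a few percent sit within the noise.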

Let me now address the other claim. It may well be true that the median ENERGY STAR score for district buildings is 74. I cannot confirm this — but I have no reason to doubt its veracity. But there are no data to support the assumption that the median ENERGY STAR score for all commercial buildings is 50. All evidence suggests that the national median score is substantially higher — in the 60-70 range, depending on the building type. My recent analysis shows that the science that underpins these ENERGY STAR scores is wanting. ENERGY STAR scores have little or no quantitative value and certainly DO NOT indicate a building’s energy efficiency ranking with respect to its national peer group — despite the EPA’s claims to the contrary.

The claim that the median score for US buildings is 50 is similar to making the claim that the median college course grade is a “C.” Imagine your daughter comes home from college and says, “My GPA is 2.8 (C+), which is significantly higher than the (presumed) median grade of 2.0 (C). You should be very proud of my performance.” The problem is that the actual median college grade is much closer to 3.3 (B+). It’s called grade inflation. It’s gone on for so many years that we all know the median grade is not a “C.” Until recently ENERGY STAR scores were mostly secret — so the score inflation was not so apparent. But the publication of ENERGY STAR scores for large numbers of buildings as a result of laws such as those passed in Washington DC has removed the cloak — and the inflation is no longer hidden.

ENERGY STAR scores are no more than a “score” in a rating game whose ad hoc rules are set by the EPA in consultation with constituency groups. It seems to have motivational value, and there is nothing wrong with building owners voluntarily agreeing to play this game. But like fantasy football, it is not to be confused with the real game.

A few weeks ago NYC released energy benchmarking data for something like 15,000 buildings for 2013. 9,500 of these buildings are classified as “Multifamily Housing” — the dominant property type for commercial buildings in NYC. While data from Multifamily Housing buildings were released by NYC last year, none included an ENERGY STAR building rating, as the EPA had not yet developed a model for this type of building.

But a few months ago the EPA rolled out its ENERGY STAR building score for Multifamily Housing. So this latest benchmarking disclosure from NYC includes ENERGY STAR scores for 876 buildings of this type. (Apparently the vast majority of NYC’s multifamily buildings did not qualify to receive an ENERGY STAR score — probably because the appropriate parameters were not entered into Portfolio Manager.) Scores span the full range, some being as low as 1 and others as high as 100. But are these scores meaningful?

Earlier this year I published a paper summarizing my analysis of the science behind 10 of the EPA’s ENERGY STAR models for conventional building types including: Offices, K-12 Schools, Hotels, Supermarkets, Medical Offices, Residence Halls, Worship Facilities, Senior Care Facilities, Retail Stores, and Warehouses. What I found was that these scores were nothing more than placebos — numbers issued in a voluntary game invented by the EPA to encourage building managers to pursue energy efficient practices. The problem with all 10 of these models is that the data on which they are based are simply inadequate for characterizing the parameters that determine building energy consumption. If this were not enough the EPA compounded the problem by making additional mathematical errors in most of its models. The entire system is built on a “house of cards.” The EPA ignores this reality and uses these data to generate a score anyway. But the scores carry no scientific significance. ENERGY STAR certification plaques are as useful as “pet rocks.”

Most of the above 10 models I analyzed were based on public data obtained from the EIA’s Commercial Building Energy Consumption Survey (CBECS). Because these data were publicly available these models could be replicated. One of the models (Senior Care Facilities) was based on voluntary data gathered by a private trade organization — data that were not publicly available. I was able to obtain these data through a Freedom of Information Act (FOIA) request and, once obtained, confirmed that this model was also not based on good science.

Like the Senior Care Facility model, the EPA’s Multifamily Housing ENERGY STAR model is constructed on private data not open to public scrutiny. These data were gathered by Fannie Mae. It is my understanding that a public version of these data will become available in January 2015. Perhaps then I will be able to replicate the EPA’s model and check its veracity. Based on information the EPA has released regarding the Multifamily ENERGY STAR model I fully expect to find it has no more scientific content than any of the other building models I have investigated.

One of the problems encountered when building an ENERGY STAR score on data that are “volunteered” is that they are necessarily skewed. Put more simply, there is no reason to believe that the data submitted voluntarily are representative of the larger building stock. ENERGY STAR scores are supposed to reflect a building’s energy efficiency percentile ranking as compared with similar buildings, nationally. When properly defined, one expects these scores to be uniformly distributed in the national building stock. In other words, if you were to calculate ENERGY STAR scores for thousands of Multifamily Housing Buildings across the nation, you expect 10% of them to be in the top 10% (i.e., scores 91-100), 10% in the lowest 10% (i.e., scores 1-10), and so on. If this is not the case then clearly the scores do not mean what we are told they mean.
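This expectation is easy to demonstrate with a simulation. The sketch below builds a 1-100 percentile score from a hypothetical reference sample (all distributions here are invented for illustration; the metric is framed so that higher is better): buildings drawn from the same population score uniformly, while a self-selected, better-performing population piles up at the top.

```python
import random
from bisect import bisect_left

random.seed(42)

# Hypothetical reference sample used to define a 1-100 percentile score.
reference = sorted(random.gauss(100, 15) for _ in range(10_000))

def score(x):
    """Percentile score (1-100) of x relative to the reference sample."""
    pct = bisect_left(reference, x) / len(reference)  # fraction scoring below x
    return min(100, int(pct * 100) + 1)

# Buildings drawn from the SAME population: scores come out uniform,
# so about 10% land in the top decile (scores 91-100).
same_pop = [score(random.gauss(100, 15)) for _ in range(10_000)]
top_same = sum(s > 90 for s in same_pop) / len(same_pop)

# A self-selected, better-performing population: scores pile up at the top.
skewed = [score(random.gauss(110, 15)) for _ in range(10_000)]
top_skewed = sum(s > 90 for s in skewed) / len(skewed)

print(top_same)    # close to 0.10
print(top_skewed)  # well above 0.10
```

If the reference sample is not representative of the population being scored, the scores stop being percentiles of anything.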

Meanwhile, it is interesting to look at the distribution of ENERGY STAR scores that were issued for the 900-or-so Multifamily Housing facilities in NYC’s 2013 benchmarking data. A histogram of these scores is shown below. The dashed line shows the expected result — a uniform distribution of ENERGY STAR scores. Instead we see that NYC has far more low and high scores than expected, and relatively fewer scores in the mid-range. 24% of NYC buildings have ENERGY STAR scores ranging from 91-100, more than twice the expected number. And 31% of its buildings have scores 1-10, more than 3X the expected number. Meanwhile only 12% have scores ranging from 41 to 90. We expect 50% of the buildings to have scores in this range.
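Tallying the quoted percentages against the uniform expectation makes the mismatch explicit:

```python
# Observed share of NYC Multifamily ENERGY STAR scores (as reported above)
# vs. the uniform distribution a true national percentile ranking implies.
observed = {"1-10": 0.31, "41-90": 0.12, "91-100": 0.24}  # from the disclosure
expected = {"1-10": 0.10, "41-90": 0.50, "91-100": 0.10}  # uniform scores

for band in observed:
    ratio = observed[band] / expected[band]
    print(f"scores {band}: observed {observed[band]:.0%}, "
          f"expected {expected[band]:.0%} ({ratio:.1f}x expected)")
```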

Of course it is possible that New York City just doesn’t have many “average” Multifamily Housing buildings. After all, this is a city of extremes — maybe it has lots of bad buildings and lots of great buildings but relatively few just so-so buildings. Maybe all the “so-so” buildings are found in the “fly-over states.”

I subscribe to the scientific principle known as Occam’s Razor. This principle basically says that when faced with several competing explanations for the same phenomenon, choose the simplest explanation rather than more complicated ones. The simplest explanation for the above histogram is that these ENERGY STAR scores do not, in fact, represent national percentile rankings at all. The EPA did not have a nationally representative sample of Multifamily Housing buildings on which to build its model, and its attempt to compensate for this failed. Until the EPA provides evidence to the contrary — this is the simplest explanation.

The thrust of my presentation was to discuss what we know about primary energy savings and reductions in greenhouse gas emissions for LEED-certified buildings. Despite the fact that there are roughly 11,000 U.S. commercial buildings certified before Jan. 1, 2013 under LEED New Construction (NC), Core and Shell (CS), Existing Buildings (EB:OM), and LEED for Schools — all LEED programs that address whole-building energy use — we have published data from just 2% of these buildings. This paltry amount of data was mostly gathered through voluntary submissions by building owners willing to share their energy data. You can bet that such data are skewed toward the better-performing buildings.

And even so, the data available show that, on average, LEED-certified buildings show no significant source energy savings or reduction in GHG emission relative to comparable, non-LEED buildings. That was the thrust of my presentation.

Note that promoters of LEED certification continue to claim energy savings — but these claims are based on design projections not actual performance measurements. For instance, promoters of Ohio’s Green schools claim 33% reduction in energy use. But there has never been a study of energy used by Ohio’s LEED-certified schools to demonstrate this assumed savings. Such claims of energy savings are based on “faith” not “fact.”


Professor of Physics at Oberlin College. I was originally trained as a condensed matter experimentalist. In the last 15 years my research has focused on photovoltaic devices, PV arrays, wind energy, energy efficiency, and energy use in buildings.