Energy Trends Insider
http://www.energytrendsinsider.com
Energy industry news & analysis

20 Billion Barrels Of Oil Were Not Discovered In The Permian Basin
http://www.energytrendsinsider.com/2016/11/27/20-billion-barrels-of-oil-was-not-discovered-in-the-permian-basin/
Robert Rapier | Sun, 27 Nov 2016

Some readers may have noticed that I haven’t been posting as many articles here as I have in the past, but that’s simply because I am busy keeping up with deadlines elsewhere. I have an article due once a week for Forbes, and two weekly articles and one longer biweekly article for Investing Daily. Add in the occasional articles I do for other sites, and I am writing around 200 articles a year with firm deadlines. On top of that, I have a regular day job as an engineer, and this year has been exceptionally busy.

This column doesn’t have a firm deadline. It’s a place I can write when everything else is caught up. But lately those other commitments have been taking up most of my spare time, and I have been lucky to get one column posted a month here. So, that’s the reason my posting frequency here has declined.

I have had some people tell me that they don’t like dealing with the ads on the Forbes site, and they have asked if I could repost some of my Forbes articles here. I am allowed to do that after they have appeared exclusively at Forbes for a few days. So today, I want to reproduce a modified version of one that got pretty good traffic at Forbes, and has gotten a lot of attention in the press.

————–

Earlier this month the U.S. Geological Survey (USGS) announced the largest estimate of continuous oil that it has ever assessed. The area assessed is in the Permian Basin, a region that has been producing oil continuously for nearly 100 years. Below I will get into the specifics of what this news means, but the mainstream media mostly interpreted it as CNN did in “Mammoth Texas oil discovery biggest ever in USA.” That headline reflects a fundamental misunderstanding of what the news entails.

The Permian Basin lies underneath western Texas and southeastern New Mexico. The geology of the Permian Basin is rich and complex, both horizontally and vertically. The Permian Basin has commercial accumulations of oil and gas in stacked layers, at depths ranging from 1,000 feet to more than 25,000 feet. The greater Permian is made up of several subsidiary basins, the largest of which are the Midland and the Delaware.

The new USGS assessment is of the Wolfcamp shale in the Midland Basin portion of Texas’ Permian Basin. The new USGS survey estimates that there are 20 billion barrels of undiscovered, technically recoverable oil in the Wolfcamp alone. Quoting from the USGS news release:

“The fact that this is the largest assessment of continuous oil we have ever done just goes to show that, even in areas that have produced billions of barrels of oil, there is still the potential to find billions more,” said Walter Guidroz, program coordinator for the USGS Energy Resources Program. “Changes in technology and industry practices can have significant effects on what resources are technically recoverable, and that’s why we continue to perform resource assessments throughout the United States and the world.”

It is important to understand what this assessment actually means. This oil has been assessed as an “undiscovered resource.” This scientific assessment means the forecasters have a certain degree of confidence that the oil is there. For this particular assessment, the 50% confidence level is that there are at least 20 billion barrels there. The study further estimates that there is a 95% chance that there are at least 11 billion barrels there, and a 5% chance that there are at least 31 billion barrels there.
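One way to see how those three numbers bracket the estimate is to fit a distribution to them. The sketch below is my own illustration, not the USGS method: it fits a lognormal curve, a common assumption for resource sizes, to the reported F50 and F95 values. The implied F5 comes out around 36 billion barrels rather than the reported 31, which shows the assessed distribution has a thinner upper tail than a pure lognormal fit would suggest.

```python
import math

# Reported USGS fractiles for the Wolfcamp assessment (billion barrels):
f95, f50 = 11.0, 20.0   # 95% and 50% chance of at least this much
reported_f5 = 31.0      # 5% chance of at least this much

# Illustrative lognormal fit using only F50 and F95 (an assumption for
# illustration, not the USGS methodology). For a lognormal, the median is
# exp(mu), and the F95 value sits 1.645 sigma below the median in log space.
mu = math.log(f50)
sigma = (math.log(f50) - math.log(f95)) / 1.645

implied_f5 = math.exp(mu + 1.645 * sigma)  # = f50 * (f50 / f95) by symmetry
print(f"Implied F5 under a lognormal fit: {implied_f5:.1f} billion barrels")
# The reported F5 of 31 is lower, i.e. the assessed upper tail is thinner.
```

The point of the exercise is only that F95/F50/F5 are percentiles of one probability distribution, not three competing estimates.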

However, the fact that the assessment refers to the “resource” means that they are estimating the technically recoverable oil in place. This says nothing of the economics of recovering this oil. The amount that would be economically worthwhile to recover at prevailing commodity prices — which would be classified as “proved reserves” — will be a smaller subset of the assessed amount. It would even be zero at a sufficiently low oil price. This is merely an attempt by the USGS to estimate the amount of oil that could be extracted over time if cost was not a concern.

But given the history of the Permian Basin, it’s a pretty safe bet that there is still a lot of oil left to produce there. The Permian Basin began producing oil in 1921. The Texas side of the Permian has already produced nearly 30 billion barrels of crude, as well as 75 trillion cubic feet (Tcf) of natural gas. Permian crude oil production has more than doubled since 2010, largely as a result of hydraulic fracturing in six low-permeability formations: Spraberry, Wolfcamp, Bone Spring, Glorieta, Yeso and Delaware.

According to the Energy Information Administration’s (EIA) most recent Permian Region Drilling Productivity Report, the Permian is presently producing 2 million barrels per day (bpd) of oil and 7.3 billion cubic feet per day (Bcf/d) of natural gas. This accounts for more than 23% of current U.S. crude oil production, and exceeds the combined oil output of the Bakken and the Eagle Ford:

Source: Energy Information Administration.

I have heard some characterize this news as a new oil discovery. It is not. This new USGS assessment is merely an attempt to put some framework around how much oil may exist in one of several producing formations within the Permian.

But to the critics who would wave this assessment away as much ado about nothing, I would remind them that Permian oil production has more than doubled in the past six years. That is significant, and there is still a lot of oil to be produced there. So while you are on solid ground when correcting anyone who calls this a new discovery, you shouldn’t underestimate the Permian Basin.

Ethanol From Carbon Dioxide Is Still A Losing Proposition
http://www.energytrendsinsider.com/2016/10/27/ethanol-from-carbon-dioxide-is-still-a-losing-proposition/
Robert Rapier | Thu, 27 Oct 2016

If I told you that I had created a process to extract pure gold from seawater, you might deem it an amazing accomplishment. If I issued a press release stating these facts, it very well could go viral.

In fact, the oceans do contain an estimated 20 million tons of dissolved gold, worth close to a quadrillion dollars at the current spot market price. But you may have noticed that I have omitted a very important fact.

I haven’t mentioned how much it costs to produce a troy ounce of gold using the process I have designed. That seems like an important detail, so I explain that the production cost is only $50,000 or so per ounce (an ounce of gold today is worth about $1,265), but I am sure that with enough investment dollars — and maybe a few government subsidies — I can get that cost down to something more reasonable. (This is how we subsidize some advanced biofuels whose production costs are an order of magnitude above what could be considered economical.)

Readers immediately understand the problem. You don’t spend more to produce something than you can sell it for. But change the equation to energy instead of money and people suddenly forget that lesson. Or they fail to recognize that this is what is taking place.

That brings me to the point of today’s article, one I’m forced to reiterate often: in the world of energy as in most others, there is no free lunch.

Earlier this month a research paper was published by the Department of Energy’s Oak Ridge National Laboratory (ORNL) called “High-Selectivity Electrochemical Conversion of CO2 to Ethanol using a Copper Nanoparticle/N-Doped Graphene Electrode.” The paper reports on some truly interesting science, and the researchers were measured and cautious in their conclusions.

But something got lost in translation as media outlets sought to portray this as a “holy grail,” “game changer,” “major breakthrough” or “solution to climate change.” The benefits, one story said, were unimaginable. Part of the problem, in my opinion, is that the press release from the Department of Energy was titled Scientists Accidentally Turned CO2 Into Ethanol.

The word “accidental” plays into the misconception people have of how science is done. Many take the romantic view that game-changing, eureka discoveries are merely awaiting the next lucky accident, so when they read this headline the translation becomes something like “New Discovery Solves Climate Change.”

That’s because the public loves its energy miracles. People love the idea of a car that can run on water, or the car that gets 400 miles per gallon (which, of course, GM and Ford suppressed), or the magic pill you can pop in your tank that greatly enhances fuel efficiency. So it isn’t surprising that this kind of story goes viral (in notable contrast to the articles debunking these viral stories).

In order to understand what’s really going on, let’s consider a fundamental principle of thermodynamics.

If you burn something containing a combination of carbon, hydrogen, and oxygen — e.g., gasoline, ethanol, wood, natural gas — that combustion reaction is going to produce heat, carbon dioxide and water. These are the combustion products.

It is possible to reverse the combustion reaction and convert that water and carbon dioxide back into fuel. But you have to add heat. A lot of heat. How much? More than you can get from burning the fuel in the first place. No new catalyst, and no discovery, accidental or otherwise, can get around that fundamental issue without overturning scientific laws observed and confirmed over 150 years.

Given that, what can we say immediately about this process? Going back to the fundamentals of thermodynamics, we can say, without a doubt, that the process consumes more energy than it produces. In other words, producing 1 British thermal unit (BTU) of ethanol requires consuming more than 1 BTU of energy up front (which generates CO2 emissions if that energy is fossil-derived). The resulting 1 BTU of ethanol would then ultimately be burned, re-releasing the captured CO2. The net effect is the emissions associated with more than 2 BTUs of combustion per BTU of ethanol produced. Or, to be blunt, unless the process can be run on excess renewable or nuclear power (more on that below), converting carbon dioxide into ethanol would actually worsen net carbon dioxide emissions.
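To make the bookkeeping concrete, here is the same argument as a sketch with an assumed input/output ratio. The 1.5 figure below is purely my placeholder, not a published number for the ORNL process; thermodynamics only tells us the ratio must exceed 1.

```python
# Back-of-the-envelope energy balance for CO2-to-ethanol. The 1.5 input
# figure is an ASSUMPTION for illustration; thermodynamics guarantees
# only that it is greater than 1.
energy_in_per_btu_ethanol = 1.5  # BTUs of input energy per BTU of ethanol

# If the input energy is fossil-derived, generating it emits CO2. Burning
# the ethanol then re-releases the CO2 that was captured to make it. So the
# total combustion "charged" against each BTU of ethanol is:
emissions_btus = energy_in_per_btu_ethanol + 1.0

print(f"Emissions equivalent to burning {emissions_btus:.1f} BTUs "
      f"per BTU of ethanol")
assert emissions_btus > 2.0  # holds whenever the input exceeds 1 BTU
```

Whatever the true input ratio turns out to be, the total can never fall below 2 BTUs of combustion per BTU of ethanol when the electricity is fossil-derived.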

Now the researchers involved certainly know this. They actually acknowledged in the paper that the process is unlikely to be economically viable. To my knowledge they haven’t intentionally misled anyone.

But the public has been misled in the retelling of the story. I have heard this research presented as “an efficient way of removing carbon dioxide from the atmosphere.” No, that’s not at all what the researchers claimed. They claimed a Faradaic efficiency in the process of 63%. In other words, 63% of the electricity used in the process was utilized in the reaction. They further said that 84% of what was produced was ethanol. That’s the “high-selectivity” part of the title.
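Taken together, the two figures bound how much of the input charge actually ends up as ethanol. A trivial calculation, treating the product of the two reported percentages as the combined yield (a simplification, since the two figures are defined over slightly different bases):

```python
faradaic_efficiency = 0.63  # fraction of charge utilized in the reaction
selectivity = 0.84          # fraction of the product that is ethanol

# Rough combined figure: fraction of the charge ending up in ethanol.
charge_to_ethanol = faradaic_efficiency * selectivity
print(f"Roughly {charge_to_ethanol:.0%} of the charge ends up in ethanol")
```

So roughly half the electricity drives the desired product, which is genuinely good electrochemistry, but it says nothing about the overall energy balance.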

But that says nothing at all about the energy consumption required to remove carbon dioxide from the atmosphere so it can participate in this reaction. That is an enormous energy cost because carbon dioxide exists at only 400 parts per million in the atmosphere. Or in the case of passive removal (which is what plants do by means of photosynthesis), the process is very slow.

The high Faradaic efficiency and selectivity also say little about the overall energy required to turn purified carbon dioxide into purified ethanol. We already know that it’s more than the energy contained in the ethanol; it could be far more, with correspondingly more carbon dioxide emissions.

There is a way that a process like this that is an energy sink could be viable, and that would be if you had cheap, surplus energy that might otherwise be wasted. For example, if a wind farm or nuclear plant produced far more electricity than the grid could handle, you could envision dumping the excess power into such a process. That could in theory reduce carbon dioxide emissions, but there are a lot of caveats that would warrant a longer discussion. Such an intermittent process brings up its own set of issues, and then there’s the question of whether that would really be the best use of the surplus energy.

The bottom line here is that if someone presents a scheme for turning air, water, or carbon dioxide into fuel, it necessarily consumes more energy than it produces. It is an energy sink.

Now, I need to get back to processing ocean water, just as soon as I finish writing this grant proposal for the process.

Where Hubbert Went Really Wrong On Peak Oil
http://www.energytrendsinsider.com/2016/09/16/where-hubbert-went-really-wrong-on-peak-oil/
Robert Rapier | Fri, 16 Sep 2016

If you happen to be interested in the topic of “peak oil”, you almost certainly know the name M. King Hubbert. While you may know that Hubbert is widely credited with accurately predicting the peak of U.S. oil production, you may not know the full context of his predictions — which are legendary in peak oil circles.

The history of the scientific study of peak oil dates to the 1950s, when Hubbert, a Shell geophysicist, reported on studies he had undertaken regarding the production rates of oil and gas fields. In a 1956 paper, Nuclear Energy and the Fossil Fuels, Hubbert suggested that oil production in a particular region would approximate a bell curve, increasing exponentially during the early stages of production before eventually slowing, reaching a peak when approximately half of a field had been extracted, and then going into terminal production decline.

A peak in oil production (that is, the maximum rate of production, after which output in a field, country, or the world as a whole begins to decline) is at the core of the peak oil issue. A country is said to have peaked, or reached peak oil, after it becomes apparent that oil production in the region is steadily declining year after year.

Hubbert’s fame in peak oil circles comes primarily from the assertion that he accurately predicted the 1970 U.S. peak. Because of this prediction, Hubbert is widely regarded among peak oil adherents as a visionary. He has been called an oracle and a prophet. A recently published article — What Hubbert And Pickens Got Right About Oil, And What’s Next — recounts the uncanny accuracy of his prediction.

The truth, however, is much more nuanced. Hubbert got a lot of things tremendously wrong, and his much-heralded 1970 prediction contains a large caveat of which most people are entirely unaware. Here is what his 1956 paper actually stated.

Hubbert estimated that the ultimate potential reserve of the Lower 48 U.S. states and offshore areas was 150 billion barrels of oil. Based on that reserve estimate, the 6.6 million barrels per day (bpd) extraction rate in 1955, and the fact that 52.5 billion barrels of oil had been cumulatively produced in the U.S. already, Hubbert estimated that oil production in the U.S. would reach maximum production in 1965. That was his base prediction. He wrote “the curve must culminate at about 1965 and then must decline at a rate comparable to its earlier rate of growth.” Hubbert illustrated this 1965 peak in his paper:

Source: Nuclear Energy and the Fossil Fuels by M. King Hubbert

As shown in the illustration, Hubbert projected a U.S. oil production peak in 1965 at an annual production rate of about 2.8 billion barrels, or 7.7 million barrels per day (bpd). However, note that there is another curve rising above and extending beyond the 1965 peak. This was Hubbert’s “contingency case.” He calculated that if the U.S. oil reserve was 200 billion barrels, peak production would occur in 1970, a delay of five years from his base case. However, he indicated skepticism about the reserve being that high, noting that this would imply “an amount equal to eight East Texas oil fields” beyond the 150 billion barrel estimate. Nevertheless, if the U.S. reserve was as high as 200 billion barrels Hubbert estimated a 1970 U.S. oil production peak at 3 billion barrels, or 8.2 million bpd. Oil production in the U.S. did in fact peak in 1970, albeit at 9.6 million bpd.

While Hubbert’s prediction was in the ballpark, those who cite him don’t seem to be aware that his “perfect” 1970 prediction was based on a secondary case about which he expressed skepticism, and it was about 15% too low on the production rate. Hubbert’s base case — a prediction made in 1956 of a 1965 peak — was off by 5 years and was 20% too low. Or to put it another way, his base case at that time was that U.S. oil production would peak in 9 years, but it actually peaked after 14 years and at about 25% higher production than he projected.
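Those percentages are easy to verify from the annual figures in Hubbert’s paper; a quick check, with a helper function of my own:

```python
# Checking the unit conversions and error figures behind Hubbert's two cases.
def annual_bbl_to_mbpd(billion_bbl_per_year: float) -> float:
    """Convert billions of barrels per year to millions of barrels per day."""
    return billion_bbl_per_year * 1e9 / 365 / 1e6

base_case = annual_bbl_to_mbpd(2.8)    # 1965 base-case peak, ~7.7 million bpd
contingency = annual_bbl_to_mbpd(3.0)  # 1970 contingency peak, ~8.2 million bpd
actual_1970 = 9.6                      # actual 1970 U.S. peak, million bpd

for label, mbpd in [("Base case", base_case), ("Contingency", contingency)]:
    shortfall = (actual_1970 - mbpd) / actual_1970
    print(f"{label}: {mbpd:.1f} million bpd, {shortfall:.0%} below actual")
```

The shortfalls relative to the actual peak come out to roughly 20% for the base case and 14-15% for the contingency case, matching the figures in the text.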

My point here is to address his oil production predictions based on what he actually wrote. Still, as someone who frequently makes predictions, I will say that his predictions about U.S. oil production were pretty good. They weren’t prophetic, or nearly as exact as many peak oil adherents claim. But they were in the ballpark.

Yet when we look at what he had to say about global production and natural gas production, his predictions were way off the mark. He arrived at his figure for the world’s ultimate conventional oil production by comparing a number of published estimates, settling on 1.25 trillion barrels. We now know that this estimate was far too low. But based on it, Hubbert projected that the global peak in crude oil production would occur around the year 2000 at 34 million bpd. In reality, crude oil production in 2000 was more than twice as high, at about 75 million bpd. Further, while conventional crude oil production did flatten around 2005, more than a decade later there is no evidence that it has begun to decline. (Overall global production has continued to grow, primarily because of the rise of shale oil production.) So this was a big miss.

Hubbert’s defenders will argue that he only really missed the date of the conventional crude oil peak by 5 years. But his methodology specifies a peak and decline, and that is not what we have seen. In fact, until conventional crude begins to decline in earnest, we really don’t know how far off the mark his peak-2000 prediction may be. The longer conventional production holds steady (or even modestly increases), the further off the mark his prediction becomes.

In any case, 34 million bpd was not remotely in the ballpark of the production rate in 2000. And Hubbert noted that “variations of this assumed maximum rate will advance or retard the date of the culmination.” If you plug in the actual production rates instead of the much lower rates he assumed, his estimated global peak moves earlier, likely into the 1980s. So the cumulative miss over time is huge. By 2016 his curve shows production having fallen for 16 years from the 2000 peak, when in reality conventional production is up a lot since 2000, and has advanced somewhat since 2005.

Hubbert’s natural gas estimates also underestimated production volumes. Using the same methodology he used for oil, he projected a 1970 peak in U.S. natural gas production at a rate of about 38 billion cubic feet per day (Bcf/d). In reality, 1970 production in the U.S. was 57.6 Bcf/d, and would grow to 59.5 Bcf/d in 1973. (Today’s production rate of 74 Bcf/d is a result of an explosion of shale gas production, which Hubbert didn’t consider).

His estimates of coal production follow a similar pattern. He predicted a peak in global coal production of just over 6 billion metric tons per year around the year 2150. His global coal production estimate for ~2016 was about 4 billion metric tons per year. In reality, coal production has been over 8 billion metric tons in 3 of the past 4 years.

Why were Hubbert’s predictions consistently low? One reason is that even though he allowed for improvements in recovery techniques, he underestimated their impact. Indeed, “eight East Texas oil fields” were discovered as a result of improved drilling and recovery techniques.

But the biggest issue was simply the underestimate of reserves. U.S. conventional oil production has already surpassed Hubbert’s stretch case for what could be ultimately recovered. Globally, by the year 2000 the world had already produced about a trillion barrels of conventional oil. But the remaining conventional reserve at that time — 1.3 trillion barrels — was still greater than his total estimate of 1.25 trillion barrels made in 1956.

Conclusions

While Hubbert’s defenders are correct that his predictions applied to conventional oil production, as someone who actively participated in peak oil debates a decade ago I can attest that essentially nobody believed that unconventional oil production would ramp up fast enough to prevent a global decline in oil production. The “conventional” part of Hubbert’s prediction only began to get a lot of attention once it was clear that total oil production was still growing steadily. Thus, the impact of unconventional oil on global oil supplies was underestimated. To the extent that Hubbert’s technique has been applied to provide quantitative estimates of future oil supply, or the timing of peak oil supply — it has failed. At the end of the day, the oil markets don’t care whether the oil is conventional or unconventional. What matters is how much is being produced and at what price. (The environmental issues are another matter entirely).

However, this should not be considered a repudiation of Hubbert’s work. As someone who is concerned about future oil supplies, I have found the ideas that Hubbert put forward useful in understanding qualitatively what’s going on. His work also spawned tremendous awareness about the issue of peak oil and resource depletion in general. To reiterate, the purpose of writing this is not to diminish Hubbert’s novel ideas, but rather to strip away the mythology and put his work in proper context. There are certainly things to be learned from Hubbert’s work, but the more it is idolized and treated as prophecy, the less useful it becomes.

U.S. Gasoline Demand Surges To New Record
http://www.energytrendsinsider.com/2016/08/19/u-s-gasoline-demand-surges-to-new-record/
Robert Rapier | Fri, 19 Aug 2016

In last month’s Short Term Energy Outlook (STEO), the Energy Information Administration (EIA) said that it now expects record U.S. gasoline consumption this year:

Motor gasoline consumption is forecast to increase by 130,000 b/d (1.5%) to 9.29 million b/d in 2016, which would make it the highest annual average gasoline consumption on record, beating the previous record set in 2007 by 0.1%. The increase in gasoline consumption reflects a forecast 2.5% increase in highway travel (because of employment growth and lower retail gasoline prices) that is partially offset by increases in vehicle fleet fuel economy.

This projected increase follows several years of lower gasoline demand that resulted from persistently rising gasoline prices over the past decade. From 2002 to 2012 the average retail price of gasoline rose nearly every year, from an annual average of $1.39/gal in 2002 to $3.68/gal in 2012. Consumers responded to these higher prices in multiple ways, which cumulatively led to falling gasoline demand. Some even suggested that U.S. gasoline demand had permanently peaked, as a result of more fuel efficient vehicles and increasing adoption of electric vehicles (EVs). We can now say those predictions were premature.

Gasoline prices have fallen over the past two years. With the oil price collapse that began in the second half of 2014, the average retail price of gasoline fell to $3.44/gal in 2014 and then plunged to $2.52/gal in 2015. The average retail price fell to under $2.00/gal earlier in 2016, and is on pace to average even lower this year than in 2015.

As recently as February of this year the EIA was forecasting that this year’s gasoline demand would be below the 2007 peak, but demand has surged since then. In fact, I have analyzed the EIA’s estimates for Product Supplied – Finished Motor Gasoline and found that the average for the first six months of 2016 was the highest ever recorded for the first half of a year at 9.38 million bpd. The previous record for the first six months of a year was in 2007 at 9.30 million bpd. Following that, July’s average gasoline demand of 9.75 million bpd was the highest monthly demand ever recorded by the EIA.

Some have suggested that this higher level of gasoline demand is a reflection of higher U.S. exports of gasoline, and not actual U.S. demand. While it is true that the U.S. is exporting about 400,000 bpd of gasoline, according to the EIA that is already factored into the calculation of “Product Supplied.” Quoting from the EIA’s glossary:

Product supplied: Approximately represents consumption of petroleum products because it measures the disappearance of these products from primary sources, i.e., refineries, natural gas-processing plants, blending plants, pipelines, and bulk terminals. In general, product supplied of each product in any given period is computed as follows: field production, plus refinery production, plus imports, plus unaccounted-for crude oil (plus net receipts when calculated on a PAD District basis) minus stock change, minus crude oil losses, minus refinery inputs, and minus exports.

So the definition already takes both gasoline exports and gasoline imports into account. And in fact, the U.S. is currently importing over 800,000 bpd of gasoline — far more than we export.
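The glossary balance can be restated as a simple function; the parameter names and toy numbers below are mine, for illustration only:

```python
def product_supplied(field_production, refinery_production, imports,
                     unaccounted_crude, stock_change, crude_losses,
                     refinery_inputs, exports, net_receipts=0.0):
    """EIA 'Product Supplied' balance, per the glossary definition above.

    net_receipts applies only when computing on a PAD District basis.
    All quantities must share the same units (e.g., thousand bpd).
    """
    return (field_production + refinery_production + imports
            + unaccounted_crude + net_receipts
            - stock_change - crude_losses - refinery_inputs - exports)

# Toy numbers, not EIA data: note that exports are subtracted and imports
# added, so both trade flows are already "factored in".
demand = product_supplied(field_production=100, refinery_production=9500,
                          imports=800, unaccounted_crude=0, stock_change=50,
                          crude_losses=0, refinery_inputs=0, exports=400)
print(demand)
```

Because exports appear with a minus sign in the balance, gasoline shipped abroad cannot inflate the “Product Supplied” figure.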

Thus, the bottom line is that despite the growth of EVs and an increase in the overall fuel efficiency of cars on U.S. roads, gasoline demand has risen to record levels.

————–

When I built my electric bicycle back in 2007, I had been waiting for a battery that was less volatile than what had been available. I didn’t want to risk having a fireball under my seat. Tesla traded volatility for power density.

I think electric cars are great for all kinds of reasons, which is why I bought one in 2011. But like any car, they are not created equal, and as marketers begin the process of differentiating them to get us to buy them, that inequality will grow and diversify as it has for conventional cars. And for any fellow electric car enthusiasts out there who think electric cars are going to make a significant dent in carbon emissions in the foreseeable future, read Robert Rapier’s article on that subject. Even a strongly biased study by the UCS shows that electric cars, on average, presently produce about half of the emissions of conventional cars in a cradle-to-grave analysis. Eliminating fossil fuels instead of nuclear from our energy mix will improve that over time.

2007 cell phone photo of Hummer and Cherokee

Way back in 2007 when I was blogging for Grist, I took a picture with my cell phone (note the low resolution) of a Hummer parked next to a Cherokee. I drove a Cherokee at the time. I wrote a short blog post about it titled Not all SUVs are created equal:

…I spotted a yellow Hummer parked next to a yellow Cherokee (the original SUV) the other day. The contrast was startling. Status seeking has a natural tendency to escalate. You know the end of a fad is near when it finally spawns a ridiculous monstrosity like the Hummer.

That’s right. I actually saw a Hummer pulling a trailer with stuff in it yesterday. Although stunned, I recovered in time to get a shot of his trailer as he pulled away from the transfer station. Coincidentally, I was also pulling a trailer on my bike (also visible in the lower right hand corner). We smirked at one another as we passed.

He (me) accepts that a hybrid such as a Prius (shown being assembled in Toyota City, Japan) would use more energy during its lifecycle, given that CNW has placed its lifecycle at 109,000 miles while giving the SUV 197,000 miles.

In my article I pointed out that it was the assumption that a Prius would only be driven 109,000 miles that made it worse than a Hummer. You would have been lucky to get that out of a Pinto. I was highly skeptical that a Prius, with assistance from an electric motor, would last no longer than a Pinto. Earlier this year Consumer Reports listed the Prius as one of the 10 Best Cars to Get to 200,000 Miles and Beyond.

Wife’s 2016 Prius getting its first oil change

The Hummer brand has been discontinued. A lot has changed in the decade since I wrote those articles; human nature, not so much.

As one might have predicted, a similar dust-to-dust study showed up in 2013 suggesting that the Tesla was as bad for climate change as a conventional car. I didn’t get into that one, but I did take a quick look at the Union of Concerned Scientists (UCS) study (which may or may not be better than the other study), which, of course, concluded the opposite. The UCS conclusion:

For all Americans, charging the average new EV produces far fewer global warming pollutants than driving the average new gasoline car.

Although not mentioned in the study, another conclusion that can be drawn is that when it comes to carbon emissions, a Prius is as good or considerably better than an electric car in most states.

Tesla Model X that lives in my neighborhood

I was also surprised to find, using the data in the UCS study, that according to their own assumptions and calculations the Tesla sedan would emit 36% more emissions than the Leaf over its lifetime, a bigger gap than the one between an average car and a standard pickup truck. They were also careful not to point this out anywhere in the study. (1) And please, don’t shoot the messenger. It’s time we stopped comparing apples to oranges.

If a Hummer is less environmentally benign than a Cherokee, then a Tesla Model X is less environmentally benign than a Leaf.

(1) Standard 2WD pickups sold in 2016 average 18.7 mpg. The average car sold in 2015 gets 25 mpg. 2016 2WD standard pickups therefore emit 34% more carbon emissions per mile than the average car.
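The footnote’s arithmetic follows from CO2 per mile scaling as 1/mpg for the same fuel; a quick check:

```python
# Per-mile CO2 scales as fuel burned per mile, i.e. 1/mpg, for the same fuel.
pickup_mpg = 18.7  # 2016 standard 2WD pickups (figure from the footnote)
car_mpg = 25.0     # average 2015 car (figure from the footnote)

extra = (1 / pickup_mpg) / (1 / car_mpg) - 1  # = car_mpg / pickup_mpg - 1
print(f"Pickups emit about {extra:.0%} more CO2 per mile")
```

The ratio of mpg figures is all that matters here, since per-gallon CO2 is essentially fixed for gasoline.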

————–

I was recently invited to attend the first annual Clean Energy Forum, hosted by Energy Northwest in Richland, Washington, which included a tour of the Columbia Generating Station.

The Tour

We were greeted at the security gate by three polite security guards who inspected the bus and checked our photo IDs against a list. This level of security isn’t unique to nuclear power stations. You would have to go through a similar procedure to take a tour of Hoover Dam. We also had to leave our cell phones on the bus (which would also be the case should you ever get the chance to take the highly recommended Boeing Everett factory tour).

Next, we had to pass through metal detectors very similar to the ones I had to walk through at the airport. Between the airports and tour, I passed through metal detectors four different times on this trip.

We were given radiation dose badges (to document that exposure levels were well below any amount that could possibly affect health).

We saw the control room mock-up where crews are trained to staff the real control room. They ran through a simulated core shutdown triggered by an earthquake (complete with a shaking floor and emergency lighting), which took only a few seconds. The room looked complex, but I doubt it had many more gauges, lights, and switches than you would find in a 747 cockpit (between 365 and 970 of them, depending on the model). Because a control room does not have to fly, its gauges and switches were quite large and widely spaced in comparison.

Photo of 747 Cockpit National Air and Space Museum

The rest of the tour was pretty much like any other tour through an industrial facility: pipes, pumps, noise. The highlight for me was looking down onto the top of the nuclear core. So much electricity from so little space.

We were also shown a cooling pool filled with used fuel. Divers are sometimes hired to do maintenance in these pools. The taxi driver who took me back to the airport explained that just three feet of water would shield the diver from radiation. More from my taxi driver later.

Sketch of Used Nuclear Fuel Cooling Pool

I was hoping to see the dry cask storage area and maybe take a walk around a cooling tower, but no such luck. These structures are located outside and ambient temperatures were approaching 100 degrees F. Too hot for a long stroll in the sun. Consider reading James Conca’s article: America’s Heat Wave No Sweat For Nuclear Power.

Below is an animation showing how the Columbia power station works. It’s pretty much like any other thermal power station (solar thermal, geothermal, coal), except for the source of heat energy that makes the steam.

Although this power station is over three decades old (middle-aged for a nuclear power station), its cost of production has dropped 20% since 2009, in part because improvements have allowed it to produce 70 more megawatts of power. It also recently set a record by producing power continuously, without stoppage, for just short of two years.

Small Modular Reactors (SMR)

NuScale Power gave a short presentation. NuScale is a company developing small modular nuclear power stations that will be built in a factory and trucked to a construction site for assembly. The most appealing aspect of these, to me, is their potential to be used at the brownfield sites of closed coal power stations, which already have transmission lines, sources of cooling water, etc.

Their simplicity is also very appealing: in the event of losing all external power, the reactor cools off passively, without moving parts (pumps, valves, etc.).

Correcting Public Misconceptions

In light of recent actions by antinuclear groups, Energy Northwest has been taking a more active role in correcting some misconceptions:

This was a “clean” energy forum, not a nuclear energy forum. Energy Northwest also owns and operates hydro, wind, and solar stations; its Twitter page is titled “Green energy from nuclear, wind, hydro, solar.” A quick Google search for the terms green energy and clean energy finds that nuclear fits those definitions as well as hydro, wind, and solar do. The definitions sometimes also require the source to be renewable, but that’s cheating: renewable is not a synonym for clean or green, and contrary to what antinuclear organizations want you to believe, renewability is not the overarching concern right now.

With the exception of biomass and biofuels, nuclear, wind, solar, and hydro are all low- or zero-emission sources, which, in my humble opinion, is the term we should be using instead of vague ones like renewable, clean, and green. Better yet, how hard is it to simply list the sources you’re talking about (nuclear, wind, solar, and hydro)?

Polls have shown that 93% of the people who live in Richland favor nuclear power. Oddly enough, polls have also shown that these same people think only one in five other people favor nuclear energy. In other words, they think they’re surrounded by people who are antinuclear. But that turns out to be yet another misconception. From a 2015 national poll:

The survey finds near-unanimity on the value of energy diversity. Ninety-six percent of Americans believe it is important to maintain energy diversity; 76 percent consider it very important to do so. Similarly, 86 percent of Americans say “we should take advantage of all low-carbon energy sources, including nuclear, hydro and renewable energy, to produce the electricity we need while limiting greenhouse gas emissions.”

My Conclusions

The antinuclear Seattle City Council (hundreds of miles away from Richland) recently passed a unanimous resolution opposing nuclear energy.

I could be wrong, but there appears to be a slight cultural divide between the two communities:

Snapshot of body-painted bicyclists I took at this year’s Seattle Fremont Solstice Parade

My taxi driver also told of being asked by visitors whether he was afraid to swim in the Columbia River with all of the radiation leaking into it. His response was that any contamination would be diluted to harmless levels by the river, and that any leaks would be coming from liquid waste stored in old underground tanks left over from the production of military weapons, not from the operation of the nuclear power station. It was heartening to meet a taxi driver from Richland who was better informed about nuclear waste at the Hanford reservation than both Bill Nye the Science Guy and the Seattle City Council.

The immediate concern for some U.S. nuclear power stations (but not all) is weathering the historically low price of natural gas. This could be done via a price on carbon or a modest subsidy per unit of energy. And in fact, New York has just led the way in doing the latter, with a modest subsidy for some nuclear stations that amounts to a de facto price on carbon (both ideas reduce the use of natural gas).

Lisa: If I’m going to bail the country out, I’ll have to raise taxes, but in my speech I’d like to avoid calling it a “~~painful emergency tax~~ carbon tax.”

Milhouse: What about, “~~colossal salary grab~~ a nuclear subsidy?”

Lisa: See, that has the same problem. We need to soften the blow.

Milhouse: Well, if you just want to ~~out-and-out lie~~ tell the truth… Okay, we could call it a “~~temporary refund adjustment~~ zero-emissions credit.”

Lisa: I love it.

Milhouse: Really? What else do you love, Lisa?

Lisa: Fiscal solvency.

Milhouse: [disappointedly] Oh. Yeah, me too.

Why I Am Skeptical Of Electric Vehicles
http://www.energytrendsinsider.com/2016/07/29/why-i-am-skeptical-of-electric-vehicles/
Fri, 29 Jul 2016 15:06:25 +0000 | Robert Rapier

Before you start furiously typing out a retort, hear me out. First, I want to make it clear what I am not skeptical about. I am not skeptical about electric vehicles (EVs) continuing to grow rapidly for the foreseeable future. Indeed, I believe that will happen — although growth has slowed in the U.S. in recent years.

I am also not skeptical that EVs make sense for many people. Indeed, I would buy one myself if I could justify it economically, but I have only put about 5,000 miles on my car in the past two years, so it’s hard to justify any sort of premium that could be paid off by fuel savings.

I am also not skeptical that EVs will get cheaper, and that improvements in batteries will extend their range. I believe tomorrow’s EV will be much better than today’s.

So far, so good. On these three points, I am on the same page with the most rabid EV enthusiast. But I am extremely skeptical about one thing.

I am skeptical that EVs are going to make any dent in our oil consumption in the foreseeable future.

Let me explain why, by first examining global crude oil demand growth over the past three decades. In the 32 years since 1984, global crude oil demand has increased by 36 million barrels per day (bpd) – an average increase of 1.1 million bpd per year:

Year-over-year crude oil demand declined in only 3 of those 32 years, and in each case bounced back to the historical growth rate very quickly. Further, the average annual increase since 2010 has been well above the historical average at more than 1.5 million bpd per year.

Of course that’s history, which merely tells us that the long-term trend in oil consumption has been upward for a long time. Consumption continues to grow because growth is being driven by developing countries; demand in developed countries has been falling (although U.S. gasoline demand is at a record high this year). But that graph admittedly doesn’t necessarily tell us about the future, so we have to look for examples that may give some insight into it.

I first give you Norway. Following years of very generous subsidies for EVs, Norway has the largest fleet of plug-in EVs per capita in the world. Norway’s growth rate for EVs has been higher than that of any other country, averaging an amazing 110% per year for the past seven years:

One would expect a decline in Norway’s oil consumption given those trends. After all, Norway is surrounded by members of the European Union (EU), where demand for oil since 2008 is down 14% (primarily in response to much higher oil prices). Nearby countries like Denmark (-14%), Sweden (-16%), and Finland (-21%) all had big declines.

But not Norway. Norway’s consumption has trended slightly higher while all the countries around it experienced double-digit declines in petroleum demand since 2008.

Some may immediately note that Norway’s consumption has been relatively flat for several years, but keep in mind that demand was declining across the developed world in response to $100/bbl oil. So what happened in Norway? Shouldn’t demand there have declined at least as much as in countries that didn’t have explosive EV growth?

The reason the huge growth in electric vehicles didn’t translate into a reduction in demand in Norway is that it is set against a backdrop of a rising population and a growing fleet of vehicles on the roads (as is the case worldwide). The problem is that the conventional car fleet is adding cars faster than EVs are:

It is also important to note that Norway is adding a lot of diesel engines to the fleet, another factor that helps explain the flattening of its oil demand. As the graph shows, since 2008 Norway added about 300,000 diesel and gasoline cars to its roads, while despite the explosive growth in EVs the EV total over the same period is only about 80,000 cars. And Norway’s explosive EV growth rate is starting to slow as the country scales back its generous subsidies.
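The gap between relative and absolute growth is easy to illustrate. The fleet sizes and growth rates below are hypothetical round numbers, not Norway’s actual figures; the point is only that triple-digit percentage growth on a tiny base can still add fewer cars than low single-digit growth on a large base:

```python
# Hypothetical fleets: a tiny EV base growing 110%/yr vs. a large
# conventional fleet growing just 2%/yr (illustrative numbers only).
ev_fleet, ice_fleet = 1_000, 2_000_000
for year in range(7):
    ev_fleet *= 1 + 1.10    # 110% annual growth
    ice_fleet *= 1 + 0.02   # 2% annual growth

print(f"EV fleet after 7 years:  {ev_fleet:,.0f}")   # ~180x its base
print(f"ICE fleet after 7 years: {ice_fleet:,.0f}")  # added ~297,000 cars
```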

Consider that in the U.S., from 2014 to 2015, new car sales of conventional internal combustion vehicles increased from 16.5 million to 17.5 million. Yet EV sales in the U.S. actually decreased from 122,438 to 116,099. In other words, they have a very long way to go to even dent the growth in conventional new car sales, much less make an actual reduction in the fleet.

This is essentially the problem with most projections that assume EVs will soon take a big bite out of oil consumption. The world currently consumes over 90 million barrels per day (bpd) of crude oil, and that number is growing by more than 1 million bpd each year. Most projections fail to account for this growth, which is driven by a growing population, more people driving, and so on, just as happened in Norway. A recent Bloomberg article made this very mistake by assuming that fantastic EV growth rates could displace 2 million bpd globally by 2023, and that this could crash oil prices. The problem is that even in the unlikely event that EVs displaced 2 million bpd of petroleum demand by 2023, global crude oil demand might still be 5 million bpd higher than it is today instead of 7. Bloomberg assumed it would be 2 million bpd lower than today, again ignoring growth (and the reasons for that growth).
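The arithmetic behind that last point, using the round figures from the text (roughly 1 million bpd of baseline demand growth per year, and Bloomberg’s assumed 2 million bpd of displacement):

```python
# Net 2023 demand change if EVs displace 2 million bpd, against
# baseline growth of ~1 million bpd per year (figures from the text).
years = 2023 - 2016
baseline_growth = 1.0 * years   # million bpd added without any EV effect
ev_displacement = 2.0           # million bpd displaced (Bloomberg's assumption)
net_change = baseline_growth - ev_displacement
print(f"Demand in 2023 vs. today: {net_change:+.0f} million bpd")  # +5, not -2
```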

In any case, according to data at Inside EVs, in the U.S. 2016 EV sales year-to-date (YTD) are only about 16% higher than YTD sales a year ago. That would project to maybe an additional 20,000 EVs sold in the U.S. to reach nearly 140,000 for the year. Again, that’s against the backdrop of 2015 sales of 17.5 million conventional cars, which was up a million cars from the previous year.

Globally, EV sales are running 43% ahead of last year’s pace. That’s far behind Norway’s blistering pace, which failed to reduce oil consumption, and well behind the 60% growth rate the Bloomberg article assumed in projecting a 2 million bpd drop in demand by 2023. At a lower growth rate of 45% (still unreasonably high in my view), EVs don’t impact 2 million bpd of demand until 2028. That’s another 5 years of demand growth for oil, but also, importantly, another 5 years of depletion of existing fields. Oil demand won’t continue to grow forever, because ultimately depletion will catch up and force prices much higher. In that case, what happens isn’t the price crash that Bloomberg predicted; it’s the exact opposite.

We certainly need EVs, but I haven’t seen anyone put together a credible mathematical case that they will even arrest the growth in oil demand over the next decade. Inevitably, they rely on faulty assumptions of fantastic EV growth rates and zero growth for oil — which is contrary to our observations. That’s why I am skeptical. If you project out far enough then indeed you can see EVs making a dent, but that’s far further into the future than proponents like to admit, and oil prices are likely to be much higher — not lower — when that happens.

Opening The Floodgates On Clean Energy Deployment In The U.S.
http://www.energytrendsinsider.com/2016/07/27/opening-the-floodgates-on-clean-energy-deployment-in-the-u-s/
Wed, 27 Jul 2016 23:40:53 +0000 | Elias Hinckley

The biggest constraint on renewable energy growth in the U.S. is the availability of tax equity to support project investment. There is not nearly as much tax equity investment as is needed to finance and build all of the renewable energy projects in development, so the pace of project financing and construction is being severely constrained. Many new investors will begin to enter this tax equity space in pursuit of outsized returns with virtually no risk, created by a significantly undersupplied investment market. These new tax investors will usher in a period of unprecedented growth in the construction of renewable energy projects.

The Strange Market of Tax Equity Investing

Investment in renewable energy comes from three sources:

(1) Project equity – the investment that actually owns the clean energy facility, taking on the operating risk and the long-term value of the asset. There are plenty of investors willing to take part (or all) of this investment.

(2) Debt – generally traditional project lending, and as with project equity there are plenty of lenders – big banks, small banks, private debt funds – ready to lend to all kinds of renewable energy projects. For these traditional sources of project financing, project risks are increasingly well understood and, provided there is enough project revenue to cover debt repayment, this money is readily available.

(3) Tax equity – this third and vital source of capital consists of investments in the project that are repaid primarily through tax credits and other tax savings to the tax equity investor. There simply is not currently enough tax equity to support the pace of growth in renewable power development in the U.S.

To the surprise of most people in the clean energy industry, in December 2015 Congress approved long-term, stable, multi-year extensions of the two primary economic supports for renewable electricity generation projects: the Production Tax Credit (which primarily supports wind power, but can also be used for biomass, geothermal and other clean energy projects) and the Investment Tax Credit (which primarily supports solar). These tax credits (as well as some other tax benefits) are the primary federal tools in place to support the development and deployment of clean energy in the U.S., and are typically the single largest incentive available to one of these projects.

While the absolute value of the tax benefits supporting a solar or wind project is substantial, actually realizing that value can be extremely challenging for the owner/developer of one of these projects. Investments in electric generating equipment are appealing because the underlying equipment has a relatively long life, and these assets have historically provided relatively stable, if modest, long-term returns. The challenge with tax credit-based subsidies arises because long, stable returns don’t match up well with getting fair value for front-loaded tax incentives (the ITC is generally realized in year one as a dollar-for-dollar reduction in taxes paid, but can be carried forward 20 years). The owner/operator of the renewable generation assets typically does not have enough tax liability in the early years to use the incentives (and for many owners or investors – electric co-ops, municipal power companies, pension funds, etc. – there may never be an opportunity to use them). And the longer the use of the credits or deductions is delayed, like any investment return, the less they are worth.
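The time-value point is just discounting. A minimal sketch, with an assumed 8% discount rate (the rate is illustrative, not from the article):

```python
def credit_present_value(face_value, years_delayed, discount_rate=0.08):
    """Today's value of a tax credit that can't be used until later.
    The 8% discount rate is an illustrative assumption."""
    return face_value / (1 + discount_rate) ** years_delayed

for n in (0, 5, 10, 20):
    pv = credit_present_value(1.00, n)
    print(f"$1.00 credit used in year {n:2d} is worth ${pv:.2f} today")
```

A credit that can’t be used for 20 years is worth only about a fifth of its face value under this assumption, which is why a taxpayer who can use the credit now is willing to pay for it.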

For reasons deep in the byzantine psychology of Congress, tax credits can’t technically be sold by the party that creates them to another taxpayer that can use them. Despite this prohibition on sales, it is widely understood that not allowing the transfer of tax credits creates huge problems (arbitrary differences in value between taxpayers and non-taxpayers, undermined investment in the exact activity the subsidy is meant to spur, etc.), so there are ways to move the value of these tax benefits to a taxpayer that can use them effectively and efficiently: structured investments and shared ownership in the project, in which a tax investor makes an investment in exchange for the value of the tax benefits, and the capital of any other investors or lenders is returned using the cash value created by the project (cash flow or sale proceeds).

These structured tax investments, used to mirror the sale of tax credits, are how tax equity investments are made. If a project can’t attract a tax equity investment, it is generally understood that the project is not economically viable, won’t get financed and won’t get built. In the current market many – quite likely the vast majority of – renewable energy projects in development cannot attract the necessary tax investment, not because they are not good, economically viable projects, but simply because there is not enough tax equity in the market.

There are a very limited number of tax investors currently active in the U.S. clean energy market, and as a result a very limited amount of tax equity available for projects. This undersupply of these special investment dollars puts an absolute limit on how many projects can be financed and built each year.

Outsized Returns and a Huge New Market

The undersupply of tax equity also means that the market for tax equity doesn’t operate naturally; it has allowed tax investors to be far more selective and conservative in choosing projects than they could be in a fully supplied investment market. Returns are outsized relative to risk. Tax equity returns – heavily front-loaded and typically insulated from much of the operational risk – commonly run higher than the returns for equity investors, who are first in line to absorb project losses. This is exactly backwards from what would be expected in a normally operating market (and backwards from how tax investment returns work in real estate tax investing).

With returns larger than the associated risk, it would be natural for new investors to flood the market, but that hasn’t happened. For more than a decade the renewable development industry has expected an inflow of taxpaying corporations to arrive and add liquidity to the tax equity market. New investors have been slow to enter for a few reasons. The continuity of these tax credits has been unstable because of cycles of congressional infighting, and the lack of stable long-term extensions has made potential investors cautious about investing in the knowledge necessary for these investments. Energy investing can be complicated. The necessary tax structuring is complicated and the related accounting is complex. Transaction costs can be high, and poorly planned investments can have a short-term negative impact on reported earnings.

In 2015 U.S. corporations paid $344 billion in income taxes, while total tax equity investment in the renewable power sector was reported to be $11.5 billion. So there is plenty of potential investment available. (There was more than $1.5 trillion in individual income tax paid during this same period, but tax investments by individuals are subject to a special set of passive-loss rules that make such investments extremely rare and require sophisticated professional guidance.)

Long-term tax credit extensions, coupled with huge growth projections for renewables, have the industry again looking for an inflow of new tax investors from financial institutions and, eventually, a waterfall of corporate tax equity. This is starting to happen: anecdotally, several new investors have emerged looking for guidance on whether to enter the market and how best to do it. The potential for several years of very strong returns on investment will help new investors overcome concerns about complexity and perceived risk.

This new inflow of tax equity will be the foundation for a huge uptick in financed and constructed renewable energy projects. As new investment comes into the market, several non-core renewable energy markets – rooftop solar for commercial and industrial buyers, corporate buyers of large utility-scale renewables, and projects with merchant power risk – will all see easier access to project financing. Countless projects that wouldn’t otherwise get built will, unleashing a period of tremendous growth for clean energy.

There are significant rewards for those that can move fast into this tax investment market. Because of the undersupply of tax equity, returns for these investments remain outsized relative to risk. Additionally, as more tax equity flows into the market, knowing how the transactions work will allow early entrants to better control transaction time and cost, helping these investors find continued prosperity in an increasingly competitive market.

Midyear Prediction Check
http://www.energytrendsinsider.com/2016/07/11/midyear-prediction-check/
Mon, 11 Jul 2016 21:40:39 +0000 | Robert Rapier

In January of this year, as I do every year, I made several energy predictions for the upcoming year (see My 2016 Energy Predictions). Now that half the year is in the books, I thought it might be a good idea to check in and see how these predictions are tracking.

As a reminder, I strive to make predictions that are specific, measurable, and preferably actionable. If forecasts are broad and vague, one can almost always declare victory. I would also remind readers that my predictions are based on what I believe will happen, which isn’t the same thing as predicting what I want to happen. My desire for a particular outcome has absolutely no bearing on a prediction. I am simply trying to accurately gauge the most likely outcome.

Here are the predictions, along with an update through the first half of the year.

1. U.S. oil production will suffer an annual decline for the first time in eight years.

I noted that I had the most confidence in this particular prediction. The only way I could foresee this one going awry was if oil prices made a strong recovery to $60/bbl very early on — which I didn’t expect to happen.

In this case, I will just let the graphic speak for itself:

The numbers are only available through April of this year, but production has been falling pretty steadily now since last April. This still looks like the first time in eight years that year-over-year U.S. crude oil production will decline.

2. The closing price of the front month West Texas Intermediate (WTI) crude contract will reach $60/bbl in 2016.

When I wrote that prediction, the price of WTI was $36.14/bbl. The price would have to gain 66% for the prediction to be proven right. I noted that this was a very aggressive prediction.

I also noted that I would grade this one on a curve due to its aggressiveness: I would consider it a complete fail if prices failed to crack $50/bbl this year, a level still 38% above the price at the time I made the prediction.
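Both percentage thresholds in this prediction follow directly from the $36.14 starting price:

```python
# Gains required from the WTI price at the time of the prediction.
start = 36.14  # $/bbl in early January 2016
for target in (50.0, 60.0):
    gain = (target / start - 1) * 100
    print(f"${target:.0f}/bbl requires a {gain:.0f}% gain")  # 38% and 66%
```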

The price of WTI thus far has closed as high as $51.23/bbl, in early June. Despite a recent headline-making prediction by analyst Gary Shilling that the price will still fall to $10/bbl (a prediction he has been repeating for over a year), we have almost certainly seen the lows for this cycle.

However, there is also still resistance to the upside. There is a lot of shut-in production that will begin to come online if prices head toward $60, and the Brexit vote has fanned concerns of a stronger dollar and weaker demand. Further, global inventories are still at very high levels. Considering that oil has already moved up some $15/bbl since I made this prediction, it isn’t a huge long shot at this point that it will be realized.

3. U.S. natural gas production will suffer an annual decline for the first time in 11 years.

In 2015, natural gas production averaged 2.252 trillion cubic feet (tcf) per month. Through April of this year that number has declined ever so slightly, to 2.246 tcf/month. 2015 saw two months above 2.3 tcf, while thus far in 2016 there have been no months above that level.
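The margin here is razor thin; using the monthly averages above:

```python
# Year-over-year change in average monthly natural gas production,
# using the figures cited in the text.
avg_2015 = 2.252       # tcf per month, full-year 2015 average
avg_2016_ytd = 2.246   # tcf per month, through April 2016
change_pct = (avg_2016_ytd / avg_2015 - 1) * 100
print(f"Change so far: {change_pct:.2f}%")  # about -0.27%
```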

This one is too close to call, but it is tracking according to the prediction at this point. On the other hand, natural gas prices have advanced strongly since January, and that may spur more production in the second half of the year.

4. The energy sector, as measured by the Energy Select Sector SPDR ETF (XLE), will rise by at least 15%.

Despite my attempt to always make predictions that are specific and measurable, I realized after the fact that I failed to make an important distinction for this one: I didn’t specify whether the XLE needed to rise this much for the full year. The XLE has already crossed the 15% mark in 2016, which is impressive considering that it was already in negative territory when I made the prediction.

However, since then it has pulled back a bit and now stands at +12.3% year-to-date. So instead of declaring victory on this prediction, allow me to add the important qualifier I inadvertently left off in January: I expect the XLE to finish the year up 15%.

As I said when I made this prediction “fundamentals will ultimately win out, and I think we will see the industry’s prospects start to improve before the end of 2016.” That has certainly happened, but a lot can still happen in the 2nd half of the year.

5. Hillary Clinton will win the 2016 presidential election.

This prediction actually generated a lot of angry emails and comments on Facebook. I had people call me a moron for being so naive as to believe Hillary would beat Bernie Sanders, and I have had people tell me in no uncertain terms that “Donald Trump will win in a landslide.”

Meanwhile, Clinton has since dispatched Sanders (with many of those Sanders defenders invoking various conspiracy theories rather than acknowledging the defeat), and she is tracking well ahead of Trump in national polls.

I previously noted that the reason for this prediction is that it has significant energy policy implications. And I will once again remind readers that I predicted Clinton would win for one simple reason: I believe she will win, and I expect President Hillary Clinton to pursue energy policies similar to those pushed by the Obama Administration.

Conclusions

My midterm grades are looking pretty good at this point. Things are always obvious in hindsight, but I got a lot of push-back on these predictions when I made them. I even heard from a Commander in the Indian Navy who said that my 2nd and 4th predictions “defy the current trends and assessment by other studies.”

For the second half of the year, I feel all of these are on track to be correct, with the greatest uncertainty in the prediction that oil prices will reach $60/bbl this year. That price is on the way, to be certain, but with inventories still as high as they are, it could push into 2017.

Melomys rubicola has been declared extinct. Had it been something like a fuzzy koala or panda instead of a rat, the world might have taken more notice, but maybe not: a Google search on the topic goes over 20 pages deep. This seems to have struck a nerve.

It’s possible that an undiscovered, genetically identical population exists somewhere else; it’s not unheard of for a species declared extinct to show up again. But if it had been on that tiny island off the coast of Papua New Guinea long enough for speciation to occur, then it is extinct, because to repopulate someplace else a pregnant female would have needed to leave the island and establish itself elsewhere, and that is extremely unlikely.

There have been some dubious claims of extinctions caused by climate change, as one would expect, and I’m sure there will be many more. But little by little, the real extinctions will arrive.

I poked around on the internet for critiques of this announcement and found three, two of which were not worth bothering with (one confused the ozone problem with climate change), so I settled on the one at Energy Matters, an excellent blog that is on my regular reading list. The analysis provided on this particular topic is characteristically thorough, but not thorough enough to convince me.

Roger Andrews found that there has not been an increase in the number or intensity of cyclones in that area since 1969. He also looked up the tide gauge records for the area since 2000 and ran a crude best-fit line through them, estimating that the ocean level in that part of the world may have risen only about 2.5 inches since 2000. His conclusion was that because the highest point on the island is about nine feet (even with the seasonal fifteen-inch increase in sea level during cyclone season), sea level rise since 2000 would not have made much difference. And according to the authors’ explanation, he’s right. Temperature changes are likely the main driver, not sea level rise.
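A “crude best fit line” through a tide gauge record is just a least-squares linear fit. A minimal sketch (the monthly readings below are synthetic, generated from an assumed 4 mm/yr trend plus noise, not the actual gauge data Andrews used):

```python
import numpy as np

# Synthetic monthly tide-gauge anomalies (mm) over 16 years, for
# illustration only -- not the actual gauge record Andrews analyzed.
rng = np.random.default_rng(0)
months = np.arange(16 * 12)
true_trend_mm_per_yr = 4.0  # assumed trend used to generate the data
readings = true_trend_mm_per_yr * months / 12 + rng.normal(0, 30, months.size)

# A least-squares line through the noisy record recovers the trend.
slope_per_month, intercept = np.polyfit(months, readings, 1)
print(f"Fitted trend: {slope_per_month * 12:.1f} mm/yr")
```

With enough data points, the fitted slope lands close to the underlying trend even when individual readings are dominated by tidal and weather noise, which is why a simple fit like this is a reasonable first look at a gauge record.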

So there you have it. The demise of Melomys rubicola had nothing to do with temperature, rainfall or sea level rise. The animal was a victim of storm surges that progressively destroyed its habitat.

This is where Andrews lost me. The researchers are the ones who stated that Melomys rubicola was the victim of repeated storm surges over the last decade that progressively destroyed its habitat. Given time, this is how it will end for other island species.

No evidence – not even a climate model – is presented to support the claim that these storm surges had anything to do with increasing atmospheric CO2.

But the report does present evidence. Keep in mind that CO2 levels not seen in 800,000 years have led to warmer temperatures, which have in turn led to rates of sea level rise not seen in millennia (all three of which are measured, not modeled). And contrary to the tidal and cyclone data presented by Andrews, the repeated storm surges at the island over the last decade were obviously severe enough to eventually wipe out a population that had been there for at least 1.7 centuries. On pages 24 through 26 of the report, the authors list several severe surge events that occurred in the vicinity over this time frame and discuss how they are likely linked to climate change:

The increase in cyclonic activity on the east coast of Queensland since 2003 has been attributed to an alteration in the occurrences of El Niño and La Niña events under the influence of the Inter-decadal Pacific Oscillation (J.J. Callaghan, Appendix H in Harper 2013). An analysis of three decades of data from across the entire Pacific Ocean basin determined that occurrences of coastal erosion and flooding are most closely tied to the El Niño/Southern Oscillation, with the Southern Hemisphere, including Australia, experiencing more severe conditions during La Niña due to increases in cyclonic activity, wave energy and sea surface elevation (Barnard et al. 2015). The Torres Strait also experiences higher sea levels during La Niña years, whereas lower sea levels occur during El Niño years (Suppiah et al. 2010). Clearly, the damaging impacts exerted on coastal areas by the changing weather regimes are being driven by climatic oscillations (Barnard et al. 2015). The trend towards a strengthening in the intensity of La Niña conditions until at least 2012 has been linked to climate change, specifically the increase in global mean temperature (L’Heureux et al. 2013), with the frequency of extreme La Niña events predicted to increase (almost doubling) with greenhouse warming during this century (Cai et al. 2015).

The overall concern with climate change is that the contribution from anthropogenic sources has increased the rates of change from geologic time scales to time scales that will affect our grandchildren. This would all be a moot point if there were not also a hypothesis that humanity can slow the change by ending fossil fuel use and the destruction of carbon sinks.

A population repeatedly devastated by increasing levels of seawater inundation and vegetation loss will one day fail to recover. That’s how extinction generally happens. A population shrinks for some set of reasons to the point that it can’t reproduce fast enough to recover from losses normally incurred by things like drought, disease, predation, annual inundation, etc.
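The rate argument above can be made concrete with a toy model. Everything in the sketch below is invented for illustration (the growth rate, the fraction lost per surge, the surge frequency); the only point is that the same disturbance which a population can absorb when it is rare becomes fatal when it recurs faster than the population can recover.

```python
# A toy, deterministic sketch of the rate argument: a population grows a
# fixed fraction each year but loses a fixed fraction to each storm surge.
# All parameter values are HYPOTHETICAL; only the qualitative behavior
# (recovery rate vs. loss rate) is the point.
def simulate(pop, growth, surge_loss, surge_interval, years):
    """Return the population after `years`, with a surge every
    `surge_interval` years; 0 means the population went extinct."""
    for year in range(1, years + 1):
        pop *= 1 + growth              # annual recovery/growth
        if year % surge_interval == 0:
            pop *= 1 - surge_loss      # storm surge knocks the population down
        if pop < 2:                    # below a breeding pair: effectively extinct
            return 0
    return int(pop)

# Surges once a decade: the population recovers between hits and persists.
stable = simulate(pop=100, growth=0.10, surge_loss=0.60, surge_interval=10, years=100)
# The same surges every two years: losses outpace recovery and it collapses.
doomed = simulate(pop=100, growth=0.10, surge_loss=0.60, surge_interval=2, years=100)
print(stable, doomed)
```

With identical surges and identical growth, only the interval between hits differs, yet one run persists for the full century and the other collapses within about a decade. That is the sense in which extinction here is a rate problem.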

I found myself in the middle of a feeding frenzy when I made the mistake of asking why storm surges didn’t drive it to extinction millennia ago. What follows are some responses to my remark in the comment field:

Because this little sand spit probably wasn’t there millennia ago …Bramble Cay is effectively a sand bar in the estuary of the Fly River …While coral atolls have a self-regulating mechanism that maintains them at or around sea level sand spits are notorious for coming and going … The Cay may not have existed in the past.

Photo of Bramble Cay from Gizmodo, taken by Natalie Waller

The hypothesis that Bramble Cay is an ephemeral sand bar that comes and goes in a river estuary seems unlikely, considering that the river mouth is about 40 miles away across open ocean. I also found three sources stating that Bramble Cay is primarily composed of rock. So we can throw out the ephemeral-sand-bar hypothesis, and with it the suggestion that the mammal had not been there long enough for speciation to occur.

Alternately the species was wiped out by storm surges in the past, and repopulated the Cay when it was carried down on vegetation by the Fly River in flood.

The odds that this one species (rather than some other species) has repeatedly repopulated this cay seem very low to me. It was more likely a one-time event. And to ice that cake, from the official report:

…the Bramble Cay melomys population possessed only one mtDNA genotype, suggesting that a single colonisation event took place on the cay

Several commenters thought it was important to demonstrate, with links to sources, how a mammal could have arrived there by riding on drifting vegetation. Why they thought that mattered, I have no idea; that’s a well-documented phenomenon. How it got there is irrelevant. How long it has been isolated there is what matters. A specimen was collected in 1845, which suggests it had been surviving there for centuries.

From another commenter:

It isn’t a reasonable hypothesis when you have no evidence to back up the claim only supposition and conjecture. Do we know that the creature has lived there for millennia? If so, has it always been present on that island? Sea levels have been rising for considerably longer than we have been pumping CO2 into the atmosphere.

We know it has been there for centuries. As for changing sea levels, see chart below. And of course it hasn’t always been present on that island. Tortoises have not always been present on the Galapagos Archipelago.

Graph adapted from Real Climate

…and from another commenter:

From what I can gather, the melomys is common on islands in the region and is therefore not even extinct. I’d have thought it almost certain that the Melomys are thriving in PNG. Where to hell do these biologists come from? Do they think the Melomys evolved from sea slugs on Bramble cay?

There are about 13 species of melomys (compare that to the six surviving species of tiger, out of the nine that existed just a short time ago).

I found the following comment somewhat appalling:

I live in a declared koala habitat where there are no koalas. Tabby (and Rover) can clean out more native fauna in a night than a year of floods or droughts. Not that our modern activist researchers would ever obscure such an obvious causative factor just to get an appreciative moo from the herd.

The fact that our choice of pets is destroying species far faster than climate change at this time is irrelevant.

The following was a reasonable comment but if he was really so interested, why didn’t he take thirty seconds to Google the answer like I did?

Sure would be interesting to know what characteristics separate this rodent from similar species on the PNG mainland.

Another commenter crafted a 462-word essay in response, which I parse below. He accepts that the climate is warming but thinks it will be a good thing for nature. Although this view contradicts the position of some of the other commenters (that climate change will have no meaningful impact), they all seemed fine with his idea that it is already having a meaningfully positive impact. The article author thanked him for what he considered an “excellent response.”

Despite the name you don’t know much about species. Do you?

His ability to extract so much information from my short comment and moniker is a gift on par with that of Sherlock Holmes. But his insult skills need honing ;)

This is life at the edge. A small patch of land appears, and gets quickly colonized by a few species. Through founder effect, genetic drift, and specific conditions they quickly diverge, but their life is precarious. If conditions become slightly worse the population is wiped out, and this is not a loss to the parent species. This is evolution in action, and has nothing to do with us.

His statement above is accurate until the second half of the last sentence. The hypothesis is that rapidly rising ocean levels, driven by warmer temperatures that result from the destruction of vast carbon sinks and the industrial discharge of ancient fossil carbon back into the atmosphere, have accelerated the conditions that “become slightly worse [until] the population is wiped out.” It is an old and well-known observation that, statistically speaking, island species have been and will continue to be the first to go.

…this is not a loss to the parent species.

I don’t see the relevance of that, unless he’s arguing that we should not strive to preserve subspecies because the parent species still exists, be it melomys, zebra, or Galapagos tortoise.

And the main cause is invasive species …Agricultural management and climatic change are the major drivers of biodiversity change in the UK

True, but those particular straw-man arguments are also irrelevant to the discussion. There are many causes of extinction; reducing any of them would be a good thing.

If we discount these island species that we are losing, the idea of a mass extinction becomes silly.

Right: the idea that we might be causing a mass extinction is silly. He’s now arguing that human activity has not accelerated extinction rates and will not continue to do so. I’ve read every book E.O. Wilson has written (including The Superorganism), and I have his latest on my shelf. So if I have to pick between biologists, I hope this commenter pardons me if I side with one of the greatest since Darwin (many in Darwin’s time also thought his theories were bunk, and some still do, come to think of it) on the question of a sixth extinction event.

As a biologist I am concerned that instead of dedicating our efforts to the protection of wild populations and ecosystems all over the planet, as we have been doing in the developed world, we dedicate the money to fight a climate change that it is having surprisingly little effect on the biology

I’m with him on that one. In fact, when you look at hydroelectric dams, the damage done to bat and raptor populations by improperly sited wind farms, the usurpation of desert tortoise habitat by solar thermal projects, etc., one could hypothesize that we are accelerating damage to ecosystems.

On the other hand, the billions being wasted in Germany in an attempt to decarbonize without nuclear are not going to be handed over to conservation organizations. And I suspect a significant number of people would shiver at the sight of a melomys (which looks pretty much like any other rat) and wonder why we would bother to save it.

…and most of it [effects of climate change] positive …Climatic change has had a wide range of impacts on species, with more species impacted positively than negatively in the short-term at least.

Assuming extinctions are going to result (and they are), he is essentially trading biodiversity for biomass. And because this is a rate problem, the phrase “in the short term at least” is all-important. It’s almost as if he threw that line in as an afterthought when, at some level of consciousness, he realized his logic chain was missing a link.

This was totally predictable. An increase in temperatures produces an increase in energy and water and together with an increase in CO2 produces more productive ecosystems. Some species might respond negatively to the changes, but most species will respond positively.

In the big picture, every species on the planet today will eventually go extinct; the vast majority of species that ever lived already have. Certainly, within a given boundary at a given time you can find a net positive. Species evolve to survive in a given environment, but change the environment fast enough and a species will be driven to extinction. An increase in biomass at the expense of biodiversity is exactly what biologists are hoping to avoid. In past climate changes, some species survived by shrinking in numbers and holing up in environmental strongholds (a few remaining mountain rainforests, for example) until a favorable change in the environment let them expand again. Other species became much more common as their environments expanded. Many others went extinct. But we have overrun the planet; many species today have nowhere to run and nowhere to hide. The number of extinctions will be huge this time around, and very fast-paced on a geologic time scale.

Anthropogenic effect on species is greatly negative, but not due to climate change.

But earlier he said, “Some species might respond negatively to the [climate] changes …” And as with the sentence above, most of the rest of his comment was a string of straw-man arguments; I was not arguing about which effect is greater at this point in time.

And finally, another commenter posited the following hypothesis:

May be a cat drifted out there and has since drifted on. As ridiculous as my suggestion is, it makes more sense than their explanation.

So, in the end, other than the drifting-cat scenario, the hypotheses presented in the comment field for why a mammal that had been on this cay for centuries is no longer there all had fatal flaws. As was the case with the ivory-billed woodpecker and the passenger pigeon, time will tell whether it is truly extinct.