RCP8.5 figures prominently in the most alarming of future climate scenarios. I am wondering if RCP8.5 is too extreme, in the sense that it may be impossible, given constraints on recoverable fossil fuel supply.

In particular, there is a fairly large number of papers arguing that assumptions about coal are incorrect: (list of papers courtesy of LK).

Wang et al. (2017); The implications of fossil fuel constraints on climate change projections — A supply side analysis [link]. From the Abstract:

Climate projections are based on emission scenarios. The emission scenarios used by the IPCC and by mainstream climate scientists are largely derived from the predicted demand for fossil fuels, and in our view take insufficient consideration of the constrained emissions that are likely due to the depletion of these fuels. This paper, by contrast, takes a supply-side view of CO2 emission, and generates two supply-driven emission scenarios based on a comprehensive investigation of likely long-term pathways of fossil fuel production drawn from peer-reviewed literature published since 2000. The potential rapid increases in the supply of the non-conventional fossil fuels are also investigated. Climate projections calculated in this paper indicate that the future atmospheric CO2 concentration will not exceed 610 ppm in this century; and that the increase in global surface temperature will be lower than 2.6 °C compared to pre-industrial level even if there is a significant increase in the production of non-conventional fossil fuels. Our results indicate therefore that the IPCC’s climate projections overestimate the upper-bound of climate change. Furthermore, this paper shows that different production pathways of fossil fuels use, and different climate models, are the two main reasons for the significant differences in current literature on the topic.

Ritchie and Dowlatabadi (2017) explain that RCP8.5 contains a ‘return to coal’ hypothesis, requiring increased per-capita coal use based on systematic errors in coal production outlooks. Ritchie and Dowlatabadi conclude that RCP8.5 should not be used as a benchmark for future scientific research or policy studies.

It looks like substantial technological advances would be needed to tackle the less recoverable resources. Given the climate change and environmental quality issues, how realistic is it to expect these technological advances to be pursued — rather than pursuing greener energy technologies?


So my main questions are:

Should RCP8.5 be used as a ‘worst case’ emissions scenario?

Is RCP8.5 impossible based upon the knowledge we currently have?

RCP8.5 may be useful for climate research, for considering processes in a substantially altered environment.

However, many ‘catastrophic’ impacts of climate change don’t really kick in at the lower CO2 concentrations, and RCP8.5 then becomes useful as a ‘scare’ tactic.

Re “Should RCP8.5 be used as a ‘worst case’ emissions scenario?”: based on the evidence cited, RCP8.5 “could” be used as a “worst case scenario” only if one wants a way-out-in-left-field projection that is unrealistic and based on faulty assumptions. The intention, of course, is to force movement away from clean carbon-based fuels, including natural gas, and, coincidentally, to significantly harm economic growth and the well-being of society. Shouldn’t “worst case scenarios” be realistic and within the realm of possibility? This clearly is not. Why is an unrealistic “worst case” scenario even shown? Answer: it is a storyline meant to alarm, raise fear levels, and provoke anti-fossil-fuel political action. Scientists who make up the storyline are unethical.

– lower trade flows, relatively slow capital stock turnover, and slower technological change;
– less international cooperation than the A1 or B1 worlds: people, ideas, and capital are less mobile, so technology diffuses more slowly than in the other scenario families;
– international disparities in productivity, and hence income per capita, are largely maintained or increased in absolute terms;
– development of renewable energy technologies is delayed, and the technologies are not shared widely between trade blocs;
– delayed land use improvements for agriculture, resulting in increased pollution and increased negative land use emissions until very late in the scenario (close to 2100);
– a rebound in population growth, resulting in a human population of 15 billion in 2100; and
– a 10-fold increase in the use of coal as a power source and a move away from natural gas as an energy source.

So it assumes that virtually all the recent technological, economic and demographic trends will be reversed. Not exactly a set of assumptions I would base a future plan on.

I don’t see any way to diminish the impact, unless you want to break into the offices of the NYT, tie everyone up, and insert an honest, fact-based article on the so-called worst case scenario onto the front page of the next edition.

I think its impact will be diminished by normal means. Willis Eschenbach has something up at WUWT. Roger Pielke has tweeted about it. Be that as it may, we’re still waiting on more counterpoints to it. We still have whatever Senate we have for the next 2 years to not do anything.

Of course it doesn’t stop with RCP8.5. What happens is that the upper bounds of that scenario get used as the worst case. In NZ, the government’s coastal hazards and climate change guidance recommends an RCP8.5+ scenario for managing sea level rise.

However one has to say that for policy purposes it is the uncertainty that is important, and to that end any mention of RCP8.5 should properly be followed by a mention of (say) RCP2.6. That way people start to appreciate the range, and the limitations of scenario studies.

Nearly all the world has falling fertility rates.
Nearly all the world’s developed nations have less than replacement fertility.
Nearly all the world’s developed nations have declining work force.
Nearly all the world’s developed nations have declining per capita CO2 emissions.
Most of the world’s developed nations have declining total CO2 emissions.

I am not sure that RCP8.5 matters except as a model parameter to produce all kinds of scary projections. At the level of Paris accord and pressure to reduce fossil fuel usage, the 450 scenario and now the 430 scenario provide the pressure points.
The IPCC fix is in. They assert that holding the increase in GMT to 1.5 °C requires CO2 to stay below 430 ppm.
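For a rough sense of the numbers being argued over, the standard logarithmic forcing approximation can convert a CO2 concentration into a warming estimate. This is only a back-of-envelope sketch: the 5.35 coefficient is the common Myhre et al. approximation, and the per-doubling sensitivities (1.8 °C transient-like, 3.0 °C equilibrium-like) are illustrative assumptions, not settled figures.

```python
import math

# Back-of-envelope sketch only. The 5.35 coefficient is the common
# Myhre et al. logarithmic forcing approximation; the per-doubling
# sensitivities passed in below are illustrative assumptions.
ALPHA = 5.35      # W/m^2, logarithmic forcing coefficient
F_2XCO2 = 3.7     # W/m^2 forcing for a doubling of CO2
C0 = 280.0        # ppm, approximate pre-industrial CO2

def warming(c_ppm, sensitivity_per_doubling):
    """Warming estimate (deg C above pre-industrial) for a CO2 level in ppm."""
    forcing = ALPHA * math.log(c_ppm / C0)
    return sensitivity_per_doubling * forcing / F_2XCO2

for c in (430, 610):
    print(f"{c} ppm: {warming(c, 1.8):.1f} C (transient-like), "
          f"{warming(c, 3.0):.1f} C (equilibrium-like)")
```

Under these assumptions, 430 ppm yields roughly 1.1 to 1.9 °C depending on the sensitivity used, and 610 ppm (the Wang et al. ceiling quoted earlier) roughly 2.0 to 3.4 °C. The spread illustrates how much the headline number turns on the assumed sensitivity rather than on the concentration alone.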

I would like to add to my previous comments over the last five years (which are based on my background in the oil and gas industry and conversations with individuals who studied coal reserves) that the RCP8.5 CO2 emissions pathway has been dealt a blow by the 2018 International Energy Agency report, which predicts peak use of oil products by autos around 2025. I don’t necessarily agree with that prediction, but it would seem the US agencies working on these climate reports should understand that their work ought to be tied to agencies and work outside the narrow IPCC world.

Also, when you see figures reporting oil production approaching 100 million barrels of oil per day, do remember they may include liquids such as biofuels, synthetic oil from natural gas, and natural gas liquids. CO2 emissions for these components are not the same as they are for crude oil and condensate, which are fed into refineries to make products such as gasoline, diesel, kerosene, heavy diesel, and asphalt. Asphalt produced in some refineries is fed into units which yield coke and unstable liquids which can be hydrogenated to make fuels. So the full accounting requires a detailed understanding of the processing options.

And therefore the emissions from burning “oil” can be overestimated. I have seen nothing in papers, IAM manuals, articles, or reports that explains how these emissions are estimated. I worked for a large multinational, and our internal guidelines for the estimates we prepared for the European Union were more detailed than anything I’ve seen in the “climate science environment”.
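To illustrate why lumping all liquids together can overstate emissions, here is a minimal sketch of the accounting. Both the supply mix and the per-barrel emission factors are rough illustrative assumptions (biofuels are often booked as near-zero net CO2), not figures from any cited inventory:

```python
# Illustrative only: the supply mix and per-barrel CO2 factors below are
# rough assumptions for the sake of the arithmetic, not official figures.
FACTORS_T_PER_BBL = {
    "crude_and_condensate": 0.43,  # refined products, fully combusted
    "natural_gas_liquids": 0.30,   # lighter molecules, less carbon per barrel
    "biofuels": 0.00,              # often booked as near-zero net CO2
    "gas_to_liquids": 0.43,        # combustion similar to crude products
}

MIX_MBD = {  # hypothetical mix summing to ~100 million barrels per day
    "crude_and_condensate": 82.0,
    "natural_gas_liquids": 13.0,
    "biofuels": 2.5,
    "gas_to_liquids": 2.5,
}

def daily_emissions_mt(mix, factors):
    """Daily CO2 in million tonnes for a given liquids supply mix."""
    return sum(mix[k] * factors[k] for k in mix)

blended = daily_emissions_mt(MIX_MBD, FACTORS_T_PER_BBL)
naive = sum(MIX_MBD.values()) * FACTORS_T_PER_BBL["crude_and_condensate"]
print(f"blended: {blended:.1f} Mt/day vs all-crude assumption: {naive:.1f} Mt/day")
```

With these made-up numbers, treating every “oil” barrel as crude overstates daily emissions by several percent; the exact gap depends entirely on the factors, which is the point about needing the processing detail.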

Economics Is Not A Science Because It Doesn’t Use the Scientific Method

Charles A. S. Hall discusses the faults of the “Dismal Science”

“Economics is not a science because it doesn’t use the scientific method”
“ Don’t tell me dollars. Tell me energy. Because Dollars are only a lien on energy. That’s all they are”
“Encourage us not to teach fairytales in economics classes. We teach a million young people fairytales in our Economics classes”
“I had a wonderful talk at our biophysical economics meeting last week. And the speaker was an historian. He said the discovery of the 2nd law of thermodynamics absolutely transformed chemistry first, then physics, then all of the.. geology.. all of the sciences.. ecology.. Except one.. Economics.” http://tinyurl.com/mbqowln

Keep up the good comments Fernando.
Charles Hall efforts to use EROI are an attempt to address systemic faults in our underlying system of measurement (economics)

Thanks David for the more thorough references.
We have an Economic system that is dysfunctional and not in accord with Bio-Physical reality. Instead of addressing this issue directly, TPTB are giving us this pseudo-scientific CAGW agenda as a “patch job” to try to mitigate some of the underlying deficiencies of our Economics system.
So they are screwing up “Science” with this Post-Modern CAGW “Séance” instead of fixing the dysfunctional “Dismal Science”. Meh!!
Demonizing Carbon as CO2 is a Proxy for HC Depletion, for those who dare not address HC Depletion directly. Real pollution issues have good technical solutions, which is why it was necessary to demonize C as CO2 if one wanted to use an environmental Hoax as a Proxy for the HC Depletion issue that one didn’t want to address directly.
(Yes, there is a real reason to migrate off Fossil fuels. That is because we are depleting them. !! )
cheers
brent
(old/former downstreamer)

Economically the world is locked into a growth cycle – despite any and all reservations and interventions. A high growth planet brings resources to solve people and environment problems. The clearest way to economic growth is markets – and the biggest risk is market mismanagement.

Climate science is overdue for a shakeup. Starting perhaps with the next globally coupled climate shift due in a 2018 to 2028 window. The kaleidoscope hypothesis – shake it up and a new pattern emerges. If you have not heard of this – I guess it will come as a surprise. Although I do keep pointing out that this is a double edged sword. But whatever happens in climate seems irrelevant to rational social and economic policy – whatever relevance deprivation syndrome that causes in climate scientists.

Smart money is on trade and innovation in stable economies. Stability requires a disciplined approach to markets, government regulation and spending, and reserve bank interest rates. Economic growth – like a living system – requires energy. It is just as vital for life as water. Manage interest rates, moderate spending and taxes, prudential regulation, credibly balanced budgets, fair laws – and steady if not exactly spectacular economic growth keeps – as much as possible – a steady hand at the economic levers. Whole economies can collapse if these things get out of hand. That does less than nothing for global peace and development.

Smart money is on energy growth – strongly in non-OECD countries. And this is devoutly to be welcomed. Continuing the economic miracle of billions emerging from the direst poverty through free markets and cheap and reliable energy is a humanitarian priority massively outweighing any massively dubious contingent consideration of climate change. For now that means gas and ultra clean coal technologies. Advanced nuclear is the front runner in emerging technologies.

We will transition to 21st century energy within decades – the creative destruction of capitalism will make the transition breathtakingly fast. To misunderstand this is to misunderstand the core of capitalism – as they inevitably do.

With COP21 in Paris in late 2015 – the world has definitively chosen to access whatever energy resource is needed to facilitate growth and development. COP21 locked in an increase in energy emissions of 3.7 billion metric tons by 2030. If you are looking for ‘solutions’ to emissions, they will have to come from elsewhere. But as electrical and heat emissions are a mere 25% of the total, there are opportunities to harvest low-hanging fruit, with the many benefits – unrelated to AGW – that brings.

Carbon sequestration in soils and ecosystems has major benefits in addition to offsetting anthropogenic emissions from fossil fuel combustion, land use conversion, soil cultivation, continuous grazing and cement manufacturing. Restoring soil carbon stores increases agronomic productivity and enhances global food security. Increasing the soil organic content enhances water holding capacity and creates a more drought tolerant agriculture – with less downstream flooding. There is a critical level of soil carbon that is essential to maximising the effectiveness of water and nutrient inputs. Global food security, especially for countries with fragile soils and harsh climate such as in sub-Saharan Africa and South Asia, cannot be achieved without improving soil quality through an increase in soil organic content. Wildlife flourishes on restored grazing land helping to halt biodiversity loss. Reversing soil carbon loss in agricultural soils is a new green revolution where conventional agriculture is hitting a productivity barrier with exhausted soils and increasingly expensive inputs.

Increased agricultural productivity with what are the most modern farming practices, increased downstream processing and access to markets build local economies and global wealth. Economic growth provides resources for solving problems – conserving and restoring ecosystems, better sanitation and safer water, better health and education, updating the diesel fleet and other productive assets to emit less black carbon and reduce health and environmental impacts, developing better and cheaper ways of producing electricity, replacing cooking with wood and dung with better ways of preparing food thus avoiding respiratory disease and again reducing black carbon emissions.

The highest priority goal this century is to build prosperous and resilient communities in vibrant landscapes – from the ground up, rather than the top-down global government model preferred by pissant progressives. And AGW will all come out in the wash.

Robert – you make a crucial point, which is related to the original question:

“Is RCP8.5 an impossible scenario?”

I make a similar comment elsewhere in this thread, but reading yours helps me clarify it.

The problem with RCP8.5 – or ANY – “climate” forecast is that it, and they, are models that are far too narrow to forecast *anything* with the accuracy implied in political or media circles.

They exclude too many “exogenous” factors that are the true drivers of long term environmental, social, and financial outcomes.

The true scientists dealing with these models will always express the outputs as ranges of uncertainty, not line or point forecasts.

By the way, the initiator of this blog is a hero of mine – because she has courageously introduced independent thinking into powerful political and economic cartels – which is not an easy thing to do.

So my comment here is not critical, it is intended to help expand thinking and policy scope.

As I mentioned elsewhere, I have considerable experience with climate regulation (air pollution, CAFE, etc) – and have seen exactly how these models get interpreted, and applied in back rooms in the US, EU and elsewhere.

For example, I lived through the political identification of single “enemy pollutants” in regulations – from sulphur, to nitrogen, to carbon, etc. See history of Clean Air Act.

These were almost classic regulatory “movie scripts” in which the “enemy” substance was identified, and then simple solutions (like CAFE) were cast as the hero coming to the rescue.

Climate models appear complex, but they are simple in strategic structure.

The “Enemy” is identified as the output, and then relatively few simple inputs are identified as though they were “levers” to pull to weaken the “Enemy” a century from now.

“Kill CO2” is the current script.

Lots of data and calculations, but a very simple script.

The problem with all these “script models” is that they too often ignore the many much larger changes happening around them that will determine the “environment” 50-100 years from now.

My professional work involves traveling the real world seeking new innovations that can scale to global proportion – through “demand pull” versus “command push”.

I look for things – like reliable cars (Toyota) and simple communications (the internet) – that can be “pulled” into existence because they create both social AND commercial value for millions of people.

Not things that must be “pushed” by central commanders using coercive power. Most environmental regulatory failures have happened because small cartels of political/commercial “commanders” try to “push” simple solutions on complex societies.

The electric vehicle was invented more than 140 years ago, and it has been pushed many times by companies (GM) and governments, but it has always failed, and is still struggling.

Why? EVs require massive changes in energy infrastructure, and for a century gasoline was much easier and cheaper to get into a motor than electricity – for both society and industry.

Mobile communication expanded to literally 7 billion humans faster than any modern innovation because used cellphones were affordable to people living on $1 per day, and it allowed them to do the same things the richest people on earth can do. (I traveled the world for 10 years documenting this “user pull” phenomenon.)

So…

All scenarios like RCP8.5 and their corresponding single-variable “push” strategies – “reduce CO2” – will be wrong, and cannot be implemented as currently conceived.

What WILL affect the climate 100 years from now?

Here are some of the ingredients scientists and politicians should be looking at, if they want to facilitate a better planetary climate:

1. China has the largest manufacturing economy on Earth and in human history. This is likely to remain important for decades. Their energy consumption will be far more important than that in the US. China’s population density, and their “command” government suggests they can deploy experimental climate technology faster and more broadly than US and EU. Chinese climate scientists have already defined “desert crust” as perhaps the most important natural form of carbon sequestration, for example. See how much of world is “desert”.

2. China will soon be the largest monetary economy on Earth. It is taking over the US lead now, and is organizing finance in developing parts of the world that US/EU ignore. Chinese finance is likely to determine the “accuracy” of environmental forecasts.

3. India is now – perhaps – the largest “technology services” economy on Earth. What we call “call centers” are the hubs of technology service workers now organized worldwide. India is likely to be the second largest economy during the RCP forecast period.

AND- technology services do not pollute at the same level as classic physical manufacturing, especially if the communication tech is fueled with solar and wind.

4. Notice that most of this non US/EU future development is CLOSER TO THE EQUATOR. That means the economic and technical development of the RCP forecast period is likely to need less energy per human participant, as compared to the “Northern Empires” that dominated past centuries.

5. In a related vein, notice that passive geothermal energy is perhaps the most non-polluting form of energy. Notice that “adobe” and its derivatives are perhaps the most prolific building material on Earth – especially nearer the equator.

6. Notice that both China and India have significant future economic/tech development efforts in Africa, LATAM – and other similar regions – where forms of passive geothermal energy reduce the high pollution inputs to the economy, and the climate.

7. And, even in the “Northern Empires” notice the potential for passive geothermal. Ironically mining companies, who must maintain their “tailings” for decades, have noticed they can extract passive geothermal energy from these tailing piles. Some are experimenting with using this energy to create “carbon neutral” housing by removing heating/cooling from the houses.

Etc…Etc…Etc.

This may sound crazy, or too simplistic…

….but look at the variables in the massive climate models. And think carefully about the simplicity of “CO2” as the only villain we should be chasing.

This blog was established – wonderfully – to broaden the thinking and vision of climate sciences.

And one of its main – globally valuable contributions – is that it has created a safe space for vision and critical thinking.

My purpose in making this comment is to encourage the brilliant, and courageous climate scientists to add more of the “gemba” to their assessment of climate models and the future of the climate.

Toyota has sold more than 10,000,000 economical and environmentally sound hybrid vehicles – at a profit.

They can be easily recycled, and their parts re-used.

Toyota has therefore done more than most governments and scientists to take the fear out of our climate future.

They did this by traveling the “gemba” – not just looking at blueprints in an office.

What are we meant to do with that? Climate models and the intertwined biological and physical Earth system are chaotic. The future is unpredictable – a limitation of science at the most fundamental level.

“Sensitive dependence and structural instability are humbling twin properties for chaotic dynamical systems, indicating limits about which kinds of questions are theoretically answerable. They echo other famous limitations on scientist’s expectations, namely the undecidability of some propositions within axiomatic mathematical systems (Gödel’s theorem) and the uncomputability of some algorithms due to excessive size of the calculation.” James McWilliams

In the real now – however – there are readily identifiable pollutants – particulates, nitrous oxide, sulfur, mercury, excess nutrients, birth control hormones in rivers or cocaine metabolites in waterways below Milan, etc. The solutions are both technological – high efficiency low emission coal generation, precision farming, water sensitive urban design, rotational grazing and a host of other technologies across sectors and gases and aerosols – and cultural. The future of electric cars – btw – depends on linear generator or battery technology. Until then EVs are complex and expensive. Toyota was working on a linear generator. The future of the EV is more like a desktop computer than a conventional car. Parts are plug and play – including the motors – and the whole thing can be custom built in local micro-factories. Potentially strong, safe, light, simple, cheap and with spectacular performance.

But it is also about bottom up management of the global commons. There is another Japanese word – Iriai – meaning to enter the joint use of resources. “The Iriai are here to help and give understanding to people.”

Emissions of carbon dioxide are problematic in the complex dynamical Earth system – it may push the planet past a tipping point with unforeseeable consequences. But the pragmatic response is far from a narrow focus on wind and solar power – or indeed geothermal at this stage.
The latter – while cheap – has limited accessible availability and deeper hot dry rock is a little more difficult. Advanced, small, modular nuclear seems clearly a front runner for a major 21st century energy source.

A. Think more broadly about environmental problems, and potential solutions, than the limits imposed by academically certified “climate models”.

B. Think about the subliminal biases in how models are constructed.

For those who do scenario work for industrial development or large scale industry change, the following principles are clear:

1. Any “scenario” approach that starts with today, and projects trends into the future – limits the strategists to the narrow variables of “things we know now”, versus “unpredictable solutions we might discover by visualizing a future unconstrained by current doctrine”.

2. The most successful application of scenario strategy is to visualize desired long term outcomes, and then work BACKWARD from those visions, to discover what we need to start today to LEARN how to reach the desired future.

(The “approved” climate models are almost exclusively the former. And they are anything but “chaotic”. Which is why they cannot accurately model the chaos that has always been the crucible of large scale system change.)

Maybe this is worth a larger conversation.

On your electric vehicle comments.

The most sustainable, scalable social and commercial innovations are almost always the simplest, “low” tech innovations…
….that don’t require massive INFRASTRUCTURE CHANGE.

“Car” was “metal horse” that used the existing global horse road infrastructure

Google “search” was amazingly simple code that scaled beyond comprehension by borrowing a tiny fraction of the “internet” that was pulled into scale by “chaotic” individual adoption of almost free “internet code” on millions of personal devices.

The simplest shift to electric vehicles will almost certainly not be the most advanced battery or drive technology.

It will probably follow the current largest, simplest electric transport infrastructure in the World.

“Solving “unsolvable” problems like climate change are rarely done via central planning.”
I would counter that by saying many environmental problems are solved by central planning, also known as regulations. It doesn’t work if left to free market principles. The CO2 problem is analogous to an environmental problem. The consequences of warming are complex, but the amount of warming per emissions is not. On the principle of stabilizing the climate, the path is clear, if not the specific actions.

Think about how the global commons can be managed best – and it is not by government fiat. Here’s an example of government ‘success’ in environmental conservation – and an alternative bottom up ‘polycentric’ approach.

And existing infrastructure and technology was the point of the linear generator. A highly efficient generator using any of a number of liquid fuels to drive super-efficient electric “Proton” motors delivering a peak power of 70 kW to a wheel – that’s 280 kW in the 4WD version. Supercar performance that ICE cars cannot keep up with. This is far from electric trikes or traction-engined buses. I researched the purchase of a new car earlier this year. The Suzuki Vitara 1.3L turbocharged diesel met her specs and rang her bell. It’s a great little car – goes like the clappers, and we have got 19.1 km from a liter of fuel. But I suspect it is already a dinosaur.

Geothermal energy is cheap where it is available – but limited to regions where magma is close to the surface. Hot dry rocks are warmed by nuclear decay – but you need stable geology and a 4 km ‘hose’.

And climate models and the Earth system itself are indubitably chaotic systems in the mathematical and scientific sense – which is where it counts. Although they are of course different chaotic systems. The Earth system is a spatio-temporal chaotic system where models have only a temporal dimension. See Tomas’ post for the difference.

AGW – to distinguish it from vigorous natural variability – is far from an ‘insoluble problem’ – unless you are looking at the wrong problem as #jiminy does.

My background is in engineering and hydrology – with a lot of experience in rainfall/runoff and hydrodynamic modelling – with a Masters in Environmental Science. My core skill is in biogeochemical cycling: the transport of substances through biota and the abiotic environment from land to open oceans. I have environmentally planned, managed and monitored projects up to $10B in value. Scenario development to define events and to assign and manage risk – again for the same projects. I am an award-winning designer of integrated water systems – integrating water sources, stormwater and sewage in urban, mining, industrial and agricultural settings, using technologies from the simplest to the most advanced water treatment systems available. I have written policy and law for coastal management. But what have I done lately ffs?

Robert – responding to your comment below (I obviously can’t figure out how to get responses in linear sequence.)

Was not questioning anyone’s skill or accomplishments.

Just observing

I was defining “chaos” to mean what happens when the “completely unknown” confronts existing systems that people think they “understand”.

Like the enduring chaos that still derives from the Trinity Site explosion – on all dimensions, including scientific and mathematical.

I get that there are forms of what one might call “computational chaos” in various wonderful human *imaginative* disciplines – like mathematics, science, and forecasting the future.

I am not trying to claim any expertise, although I have traveled millions of miles of industrial, water, food, energy, and other ecosystems documenting their “network effects” – and learning from some amazing people like the workers in the Toyota and Honda supply chains.

I view my role as a sort of facilitator – a “nudge” – trying to de-stabilize established thinking and giving smart, often powerful people permission to be “wrong” in public….to help create entrepreneurial vision and opportunity.

A personal story about chaos, and the importance of stupidity in scientific research:

Early in my life I was documenting the industrial costs of changing global automotive supply chains to build the “cost/benefit” models that are still in US/EU fuel economy and emissions models. The “economic practicability” standard in the law.

I did this with a small team, including some of the leading experts at MIT.

We had permission to wander through hundreds of component and material plants, so we were building these cost/benefit models from real machines and people upward.

For several years we were schooled in the most sophisticated central-command control logistics and engineering systems the US and EU had to offer.

We also saw – exactly – how US government cabinet-level officers, and staffers at Congress used these central command and control logistics models to build the “cost/benefit” models still used today to determine central command environmental policy – like the subsidies used to stimulate electric vehicle sales.

Then we went to Toyota.

At that time the western technology and scientific expert viewpoint was that Japanese industry was crude, unscientific, and that Japan could imitate, but could not innovate.

Their plants had no sophisticated automation or computers.

After spending lots of time with Japanese managers, they sort of adopted me, and let me wander their supply chains freely, with “coaches” who made me stare at specific processes, for long times, so I could “sense” how very different their industrial “controls” operated.

One day we were in a high volume engine plant, that had very little automation, and suddenly my coach dragged me to the inbound loading dock and showed me an inbound kanban “box” that had been sitting there for more than 30 minutes or so, without that small batch of parts being pulled into the just-in-time engine assembly line.

He guided me to a waiting car, and we drove through traffic for about 20 minutes to the component plant, where these engine parts were made.

He then brought me to a machine in the parts factory that stood completely silent, and asked me to look at it for a while.

I’m a pretty educated person…staring at a small machine that is not running, with no explanation from my coach.

Sort of an industrial meditation.

Then it hits me like a ton of bricks.

The ABSENCE of an empty kanban box back at the parts plant meant the unreturned box they had sent maybe two hours earlier had instantly adjusted the complex supply chain…for an engine plant making more than 1 million units per year…

Chaos that forced me to drop all my education and experience, and visualize the entire global Toyota system in one flash.

Toyota and Honda are some of the few global companies that have maintained positive operating cash flow and employment for 40 years, while most other auto companies have gone bankrupt, or had hidden bankruptcies covered with massive gobs of tax bailouts.

I have never forgotten that moment – and years later that silent machine helps me understand how modern genomics and the internet work.

When I tried to explain this back in the US to the highest level scientists and engineers in the EPA and DOT, they did not get it.

They argued that the Japanese can imitate but they cannot innovate – using “evidence” of the superiority of US engineering and science.

I see the same central command, central expert, big money certainty operating in the environmental “regulatory industry” today.

The politically certified climate models are similar to the mental models US experts held with respect to the superiority of US and EU engineering and manufacturing.

In my layperson judgement, there is no way a group of fewer than 10,000 humans can forecast what will happen in the Earth’s complex environment with enough precision to build corrective industrial systems that will still be accurate years from now.

I will bet that EV’s, which have been around for 140 years, will probably scale in the massive, simple, local-human-run scooter ecosystem in China…

….before they will scale with no environmental damage in the US or EU – with massive-battery Teslas.

But EV’s are just technology. Like anything else they will succeed when there is a market. A sexy product at the right price. The technology requires simpler, cleaner and cheaper batteries and a way of extending the range efficiently. Many people are working on it.

If you look at my ‘Changing our to the environment’ you will find a detailed methodology for getting away from command style environmental regulation – which is failing – in a specific jurisdiction.

But let’s go back to the source for ideas applicable – inter alia – in health, policing and water, fish and forest management.

“Lorenz was able to show that even for a simple set of nonlinear equations (1.1), the evolution of the solution could be changed by minute perturbations to the initial conditions, in other words, beyond a certain forecast lead time, there is no longer a single, deterministic solution and hence all forecasts must be treated as probabilistic. The fractionally dimensioned space occupied by the trajectories of the solutions of these nonlinear equations became known as the Lorenz attractor (figure 1), which suggests that nonlinear systems, such as the atmosphere, may exhibit regime-like structures that are, although fully deterministic, subject to abrupt and seemingly random change.” Julia Slingo and Tim Palmer – both in the high echelons of climate modeling. But you will have to work that out for yourself; I’m afraid I can’t help you there. It is the third great idea in 20th-century physics – along with relativity and quantum mechanics – so worth the effort.
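The sensitivity to initial conditions that Slingo and Palmer describe is easy to demonstrate numerically. Here is a minimal sketch, assuming the standard Lorenz (1963) parameters and a simple forward-Euler integrator; the step size and run length are illustrative choices, not anything from the quoted paper:

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz (1963) system by one forward-Euler step."""
    x, y, z = state
    return state + dt * np.array([
        sigma * (y - x),        # dx/dt
        x * (rho - z) - y,      # dy/dt
        x * y - beta * z,       # dz/dt
    ])

# Two initial conditions differing by one part in a million.
a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-6, 0.0, 0.0])

max_sep = 0.0
for _ in range(3000):                 # integrate to t = 30 model time units
    a, b = lorenz_step(a), lorenz_step(b)
    max_sep = max(max_sep, float(np.linalg.norm(a - b)))

print(f"maximum separation: {max_sep:.2f}")  # far larger than the 1e-6 offset
```

The tiny initial offset grows until it saturates at the scale of the attractor itself, which is exactly why forecasts beyond a certain lead time must be treated as probabilistic rather than deterministic.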

As for markets existing only in the mind of classic economists – I just checked with my GP, the chemist, my local fish and chip shop and Coca Cola. They don’t agree with you either.

And my Paris-Dakar entry is all parts available from existing global suppliers – I checked. I am waiting for a couple of technologies to mature. Like 3D printed cars.

But what started me thinking down this track was Steampunk Girl’s post of a 1931 Bugatti Type 51 Dubos Coupe. That would work too. So poised – so elegant – so sexy. Both Steampunk Girl and the Bugatti.

… pissant progressives and skeptic curmudgeons endlessly bickering over the same tired old climate talking points… 20 years later the hockey stick can still incite the same arguments, although the passion has dimmed; that may just be old age.

When we drill down, what does this mean?
“The report, prepared with the support and approval of 13 federal agencies, and with input from hundreds of government and non-governmental experts, provides a comprehensive look at how climate change will impact the United States.”

“Chapters are centered around Key Messages, which are based on the authors’ expert judgment of the synthesis of the assessed literature.”

“Participating experts must grapple with many sources of uncertainties, evaluating their implications. Sources of uncertainty can include, for instance, measurement error in observations, incomplete understanding of Earth system dynamics, and alternative approaches for modeling impacts, responses, and possible societal development trajectories. Participating experts also must consider appropriate generalizations across results, summaries understandable for and relevant to decision-makers, and the social and cultural dimensions of assessment itself. Across such domains, effectively capturing, distilling, and conveying balanced overviews of understanding and uncertainties is not simple.”

There is far from complete understanding of Earth system dynamics. Societal development trajectories – Flip a coin. A ten sided one. Measurement error exists going back further than 150 years. “…is not simple.” is an understatement. So we can trust them, and start spending money. Because that’s what it’s going to take. That is the message. Spend money to save us.

Apologies if this has already been pointed out, but I think there are a couple of things to bear in mind. The RCPs are really concentration/forcing pathways, not emission pathways. Given uncertainties in the uptake of our emissions by the natural sinks, there isn’t a single emission pathway for a given RCP. Hence, it is possible that we could follow something close to RCP8.5 even if we follow an emission pathway typically associated with a lower RCP pathway. Also, as far as I’m aware, the association between RCPs and emission pathways does not typically include carbon cycle feedbacks. It’s possible that – as we warm – there could be emission from natural sources (such as permafrost emission) which, again, means that we could follow something close to an RCP8.5 pathway even if we follow a lower emission pathway. Although I think RCP8.5 is indeed probably unlikely, I don’t think it is sufficiently unlikely that we should ignore the possibility that we could follow it.

I agree that there isn’t a single emission pathway for a given RCP. However, since the AF has been declining, and with the upcoming cooling it will decline even more, it is likely that we could follow something close to RCP4.5, for example (or lower), even if we follow an extreme BAU emission scenario.
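For readers unfamiliar with the AF mentioned above: the airborne fraction is the share of emitted carbon that stays in the atmosphere rather than being absorbed by land and ocean sinks. A rough sketch of the calculation, using the standard conversion of roughly 2.13 GtC of atmospheric CO2 per ppm; the growth-rate and emission numbers below are round illustrative values, not a dataset:

```python
# ~2.13 GtC of carbon in the atmosphere corresponds to 1 ppm of CO2.
GTC_PER_PPM = 2.13

def airborne_fraction(ppm_growth_per_yr, emissions_gtc_per_yr):
    """Fraction of emitted carbon that remains in the atmosphere."""
    return (ppm_growth_per_yr * GTC_PER_PPM) / emissions_gtc_per_yr

# Illustrative round numbers: ~2.4 ppm/yr growth, ~11 GtC/yr total emissions.
af = airborne_fraction(2.4, 11.0)
print(round(af, 2))  # roughly half of emissions; the rest goes into sinks
```

Whether this fraction rises or falls over time is exactly the uncertainty being argued about: a declining AF means sinks are keeping up, and concentrations track below what a given emission pathway would naively imply.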

When the IPCC set a target forcing equal to 8.5 watts per square meter, they had to explain how such forcing could be achieved. A group led by Nordhaus used the DICE integrated assessment model to build the emissions profile needed to achieve the target forcing. The DICE team found that achieving such an extreme forcing using DICE (which includes assumptions on everything which can impact forcing) required they pour huge amounts of CO2 and methane into the atmosphere.

The DICE model has simple and coarse fossil fuel and alternate energy modules (refer to the manual), which fail to account for the economic impact of ever-increasing fossil fuel prices and ever-decreasing replacement technology costs. This led to the team having to recarbonize the economy, meaning that what they label as “oil” reaches an absurd peak in the second half of the 21st century, and the gap is then filled by coal (which seems to appear out of nowhere).

The large amounts of CO2 the DICE model is forced to put in the atmosphere leads to reduced carbon sink efficiency (the model chokes the carbon sinks) and this allows CO2 concentration to rise and trigger the consequent rise in water vapor.

My concern with this pathway is the concatenation of absurd assumptions they used and the subsequent labeling as “business as usual” by IPCC officials at press conferences.

If you want to reach the 8.5 watts per square meter forcing by claiming that bacteria will eat Siberia or any other phenomenon, then this should be outlined and backed up, so that the community can judge its likelihood. But arm-waving isn’t going to be enough to convince the world when the objective is spending a huge amount of money on an energy conversion which seems quite impractical with today’s technology.

Given the certainty that oil and gas and coal will eventually cost too much for poor countries to afford them, it’s clear we do need to get cracking developing nuclear power and other technologies, and also pursue several avenues of geoengineering research (in spite of the ideological and medieval-like objections to researching what MAY be needed in 30 years if you happen to be right).

Good point ATTP, carbon cycle responses are a major source of uncertainty — they were one big reason Hansen’s 1988 “business as usual” temperature prediction to Congress was so far off from what actually happened.

That said, the uncertainty goes both ways — as happened with Hansen, we could get more emissions than expected under worst-case conditions, and still get concentrations far below RCP8.5. Greening has so far proved stronger than increased release but it’s not known whether that relationship will reverse, flatten, or perhaps even accelerate, as the recent airborne fraction trend pointed out by edimbukvarevic implies it might.

Speaking of worst-case, that has long-term implications for the survival of the human race, too — if the CO2 residence times are short enough and the true ECS value small enough, this interglacial may not be radically extended as today’s fashionable theories propose, and the eventual consequences for modern civilization might be worse than large-scale nuclear war.

My expertise is in mapping out and understanding large industrial and commercial systems, on the ground, worldwide.

This includes understanding complex supply-demand-recycling systems in transportation, food, housing, energy, communications, and other major industries.

I have significant experience in the politics of environmental science and the forecasting that supports environmental regulation.

I was on the team that created the Corporate Average Fuel Economy (CAFE) regulations. When Congress passed this law they, and their scientific experts, neglected to include the costs of the capital equipment in factories that would be required to meet the new fuel economy regulations.

They included only the incremental component costs of, say, fuel injectors that replaced carburetors.

This meant Congress and their economist/scientist experts were missing several tens of billions of dollars of manufacturing equipment required to convert the automotive fleet.

Using actual data from more than 5,000 factories, two colleagues and I established the metrics required to convert large systems to new levels of fuel efficiency.

These metrics and processes remain in the law today.

So what?

Almost all of the climate models, and more importantly the models of industry and consumer response to climate “solutions,” emulate this rather striking error of omission Congress committed in early “climate” (fuel economy and emissions) forecasts.

Current climate models also underestimate the flexibility of existing technologies to change dramatically, and quickly, should such dire climate change forecasts ever become reality.

For example, over the past 15 years I have traveled the world, with small teams, mapping out the complex mine-materials-factory-sale-use-scrap-recycling cycle of electric vehicles.

The technology of lithium batteries looks so simple in the scientific models of electric vehicles.

It looks radically different, and less environmentally benign, when standing in a “salar” of LATAM and documenting how much water has to be used, in the desert, to extract the comparatively tiny amount of lithium now fueling the minuscule numbers of EV’s being sold.

The world will soon have more than 2 billion petrol-fired vehicles in its fleet.

Gasoline cars sold today will not completely exit the fleet for 23-25 years.

The “replacement rate” for these 2 billion vehicles is about 2-4% per year.

This is the implicit target of environmental scientists who argue for electric transportation.

But scientists who do not understand how to make cars, or the size of the global fleet, tend to grossly overestimate how quickly EV’s can expand.
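The fleet-turnover arithmetic behind these numbers is easy to check. A sketch, assuming a constant fleet of about 2 billion vehicles and that every new sale replaces an existing vehicle (both simplifications; real fleets grow and vehicles retire on a distribution of ages):

```python
def years_to_turn_over(fleet_size, sales_per_year):
    """Years to replace an entire fleet at a constant sales rate,
    assuming every sale replaces an existing vehicle."""
    return fleet_size / sales_per_year

FLEET = 2_000_000_000            # ~2 billion vehicles, the figure cited above

for rate in (0.02, 0.04):        # the 2-4% per-year replacement rates cited
    sales = FLEET * rate
    print(f"{rate:.0%} replacement: {years_to_turn_over(FLEET, sales):.0f} years")
```

At a 2-4% replacement rate, even if every single new sale were an EV, full turnover takes 25-50 years, which is why today's petrol cars linger in the fleet for decades.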

They also tend to completely ignore the environmental damage to the world’s water supply if EV’s *do* manage to replace a large share of these 2 billion petrol-fired cars.

If EV’s reach high volume, then the easy “water flooding” extraction of lithium will run out of capacity, and then battery makers will have to resort to “hard rock” mining of lithium, which requires blasting, chemical extraction, and high energy consumption in refining battery materials.

These environmental insults are not included anywhere in the IPCC models of the future.

By the way, the IPCC models also miss important details like the e-waste in large batteries. Tesla batteries, for example, represent more than 1,000 pounds of e-waste per car, and, despite rosy promotional materials, there are no currently viable means to recycle this kind of waste from hundreds of millions of vehicles.

So – climate models not only make many assumptions about the climate, they contain many implicit assumptions about easy and fast conversion of technologies offered as “cures” to climate change.

Scientists also often ignore the resiliency of the current transport, energy, food, and other large ecosystems – should step-function environmental threats appear.

Example:

Americans drive more than 3 TRILLION miles per year.

That is what consumes fuel. Not the mpg sticker on the vehicle.

As proven in the oil shocks of the 1970’s – if American drivers are presented with sudden shocks – they quickly travel fewer miles, without changing anything else in the economy.

This means that if we do get sudden climate shocks, and 2 billion drivers drive fewer miles/km per month, they dramatically cut energy use.

Or if they buy a new petrol car only one size smaller, en masse, they can meet all of the transport fuel reduction goals in many of the IPCC-style future scenarios.
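The arithmetic here is worth making explicit: total fuel use is miles driven divided by fleet fuel economy, so cutting miles and improving mpg are both powerful levers. A sketch, using the ~3 trillion miles figure from above and an assumed 25 mpg fleet average (the mpg figure and the 10% changes are illustrative assumptions, not data):

```python
def annual_fuel_gallons(miles, mpg):
    """Annual fuel consumption for a fleet driving `miles` at `mpg` average."""
    return miles / mpg

MILES = 3e12      # ~3 trillion vehicle-miles per year, the figure cited above
MPG = 25.0        # assumed fleet-average fuel economy

base = annual_fuel_gallons(MILES, MPG)
less_driving = annual_fuel_gallons(MILES * 0.90, MPG)   # 10% fewer miles
better_mpg = annual_fuel_gallons(MILES, MPG * 1.10)     # 10% better economy

print(f"baseline:            {base / 1e9:.0f} billion gallons")
print(f"10% fewer miles:     saves {(base - less_driving) / 1e9:.1f} billion gallons")
print(f"10% better economy:  saves {(base - better_mpg) / 1e9:.1f} billion gallons")
```

Note that a 10% cut in miles driven saves slightly more fuel than a 10% improvement in fuel economy, and it requires no new vehicles or factories at all, which is the resiliency reservoir the comment is pointing at.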

SO:

The current debates, and step-function, tipping point, forecasts like the one getting publicity this week –

– are missing the many obvious, and significant, non-extreme reservoirs of resiliency in the massive human/nature ecosystem we all experience.

And they often miss the significant, total life cycle, insults to the global environment that can result from “solutions” like EV’s, housing, and social behavior.

Modern science is so good, it proves itself “70% wrong” all the time.

See the history of “junk DNA” and its transformation to the “microbiome” for an example of the “accuracy” of long term scientific forecasts.

Thanks, the vast majority of political and media comment is completely oblivious of this, and yet in Australia many influential people who might otherwise appear intelligent insist that we do not question the IPCC et al but pursue huge emissions reductions in order to possibly slightly limit temperature increases which might in fact prove net beneficial.

Should RCP8.5 be used as a ‘worst case’ emissions scenario?
No.
It is business as usual.
It is actually the average.
A better worst case, not the worst, would be to double the use of fossil fuels in the same time frame.
Now that would be scary. So scary no one even admits it is possible, because it would seem too ridiculous.
So where is RCP 15?
–
Is RCP8.5 impossible based upon the knowledge we currently have?
No.
But the evidence says it is not happening.
By the way, that opening graph shows the divergence of the scenarios from a common spot much later than when they were originally conceived. Was it modified to show divergence at a later date than the original?
If so it would be wrong as well.

The Fourth National Climate Assessment (NCA4) once again highlights the need for concise but substantive explanations that the bulk of the population easily can understand to help them make political and other decisions. Following is my explanation.

Antarctic ice core analysis demonstrates that climate change mainly is due to recurring 100 thousand year cycles, not due to human effects. We are nearing the peak of a warming cycle that has not yet reached previous maximum levels (when humans were not around). That Earth is warming (if it is), therefore, is not surprising … and is unrelated to human-caused warming.

No evidence exists that human impact is significant. Equilibrium Climate Sensitivity (ECS), the arbitrary factor used in climate models to multiply the small effect of humans, actually is measured at less than half the level theorized to cause dangerous warming by humans. The Intergovernmental Panel on Climate Change continues to use an overstated ECS in its models.

Those who advocate that humans are driving calamitous global warming must provide real evidence (using an explanation, like the above, that an eight-year-old can understand). ”Consensus” is not evidence (the “97%” is actually much less anyway; most of the 97% simply did not reject the possibility of human-caused global warming but also did not conclude significant human-caused warming exists). The scientific and academic community once had a consensus that eugenics, acid-caused stomach ulcers and many other now-disproven examples were settled science. Acceptance of and actions based on those so-called ”truths” resulted in widespread damage to humankind. The same could happen from misguided climate-related actions.

carlholm777, we blogged here critiquing the Fourth National Climate Assessment (NCA4) first draft about 16 months ago. There were several basic climate science facts that they seemed to get wrong. For example, they projected higher temperature extremes, both higher highs and lower lows. The theoretical physics, as well as the data record, show that the latter is unphysical. The diurnal temperature range recorded by land stations has been narrowing significantly over the last century. The daily minimums have been rising significantly faster than the highs. This is also true for the monthly and annual highs and lows. Also, the cooler areas of the planet, particularly the NH higher latitudes, are where the lows are getting less harsh. Now, this doesn’t help the Greenland ice sheet or the overall ocean temp, I admit. We do have global warming.

As much as I’m concerned about future use of resources and technological investment, the integrity of science reporting concerns me at least as much. Some, I feel, see it as responsible to always err toward the worst projection when reporting. But if I go to my doctor to have a problem checked out, I don’t want shaded reporting that is thought to be “better for me.” I want the actual best judgement. And if it’s serious, I want a second opinion from a different hospital that knows it is hired only for the opinion and would not do the surgery, to remove possible bias.

We need to all realize that the brand of science is as important a resource as any other.

AMEN!! The 97% is a “protective wall” that masks reality. If you don’t toe-the-97%-line you will never get published or funded. The truth is out there. Understanding the oceans is the key. We only have some decent oceans profiles starting in 1983. Far from what we need considering how much oceans cover this planet.

I agree that we shouldn’t really assign probabilities to scenarios. My understanding is that RCP8.5 will indeed play a smaller role in the next Assessment Report than it has in previous Assessment Reports, which I think makes sense given that it has probably become less likely. However, this doesn’t mean – in my view – that we should now ignore it, since it is still possible that we could end up following such a concentration pathway (especially if we’re too quick to dismiss this possibility).

HAS,
The issue here is that it’s very difficult to have a good way of assessing what we will do on decadal timescales. I think we can say things like “RCP2.6 is now virtually impossible” or “RCP8.5 is less likely now than it once was”, but I don’t think it is really possible to say how likely RCP8.5, for example, is in terms of a percentage.

“I would add that often you can apply probabilities to assumptions, and scenarios are just the product of them.”

Yes, if the scenarios were built bottom-up (like SRES used to be), then you might construct PDFs for the assumptions and even update them over time. But RCPs, I believe, were constructed top-down and are “consistent with” a variety of assumptions, so I'm not sure how I’d go about it. Bottom-up – say, starting with population projections, then economic development, then carbon use – I suppose we could build scenarios with probabilities that are traceable to assumptions and updateable over time.

van Vuuren et al., “The representative concentration pathways: an overview”, Climatic Change (2011), and a companion paper by Riahi et al. on RCP8.5, give the detail. The concentration was chosen to represent the higher end of what was current in the literature at the time, and in this sense one can draw from RCP8.5 inferences about the then state of climate modelling, if not the real world. The pathway is then built up, bottom-up, from assumptions, so assessment of the likelihood of this realisation can be made (and using IPCC parlance, one would say their joint likelihood is “impossible”). This doesn’t mean there mightn’t be another way.

HAS,
Okay, yes, I agree you can have a rough idea of how likely some scenario is. I was more getting at that it’s quite difficult to really assess what decisions we’re likely to make, and also what events might occur that will influence these outcomes. As I think I already said, RCP2.6 is now probably no longer possible, and RCP8.5 is probably less likely than it was previously.

Now, given that, the question for people who want to use 8.5 W/m2 in their risk analysis is “show a set of likely (or fairly unlikely, etc.) circumstances that will generate it”.

I think one problem might be is that physical scientists tend to distance themselves a bit from political and societal factors. They’re simply trying to assess what happens if we, for example, end up with a change in forcing of 8.5W/m^2. They need to be guided by those who study the likely political/societal factors that will influence these pathways. As I understand it, the next set of pathways will focus more on lower forcing pathways, but will still consider an RCP8.5-like pathway.

I’m not sure I agree that you can say it is just a mismatch between political interests and those of the physical scientists. The RCP/SSP work is specifically produced for the IPCC that has a purpose in the political domain. It is applied research.

My sense is that in fact scenario work actually fails on both accounts. Good political management of risks and uncertainties is to understand those (eg the potential distribution of sea level rise is more important than an extreme), the drivers, and the potential ways to manage the risks, and that isn’t to take action today as if a 100 year slowly evolving, very uncertain, outcome is certain. But this is what the scenario work encourages.

From the physical science end, it has also never been clear to me what value these scenarios add. I can understand modelling the detailed state of the atmosphere on short time frames to better understand the dynamics, but 100-year projections using the same techniques? But putting that aside, if one wants to test 8.5 W/m2, then just assume it, and if one wants to study/model the drivers of forcings, then just do that.

I’m not sure I agree that you can say it is just a mismatch between political interests and those of the physical scientists.

I’m not saying that there is a mismatch in this way. I’m suggesting that physical scientists are mostly interested in how a physical system will respond to a perturbation. If the perturbation is coming from another physical system, then they could determine how likely such a perturbation is. If it’s from our emissions, then that becomes very dependent on societal/political choices that we will make.

To be clear, I’m not saying that we can’t have any idea how likely a particular scenario is, I’m just suggesting that it can be difficult to assign a robust probability. Even if we did, it wouldn’t necessarily be fixed. Something unexpected could happen in the future that could make a particular scenario more, or less, likely.

Be really careful on this — recoverable reserves are a function of price and technology. If oil and coal were 2-3x more expensive than today, that would (somewhat paradoxically) drive ever greater investments in technology to recover them.

When the AEC estimated total thorium reserves in 1999, they very wisely said “1,000 to 150,000 years” because the reality is no one knows, until prices rise, just how much of anything is out there that is economically recoverable.

We are probably going to keep burning fossil fuels until they are around 10X more expensive than today, because the pivot away from them is going to be just incredibly capital-intensive, perhaps humanity’s largest industrial engineering effort to date. At that point, the biofuels may retake the lion’s share of the liquid energy market, while fission replaces coal, and perhaps space solar and fusion could fill in some big gaps, but no one really knows. We do know the low-hanging fruit problem isn’t unique to fossil fuels, even if (for instance) nuclear power costs are much less sensitive to fuel prices.

Talldave2, it appears the world economy begins to slow down when oil prices get too high. The latest estimate I’ve seen for this price limit is in the $120 to 130 per barrel range in 2017 dollars. The high price impacts poor countries much harder than rich countries, thus while you may be able to afford $10 per liter for diesel, that price isn’t going to work for Pakistanis.

fernandoleanme — True, but remember, we are talking about where prices end up in 82 years. By then, median world GDP per capita is likely to exceed that of many OECD countries in 2018. Look how much it changed in the US since 1950: https://fred.stlouisfed.org/series/A939RX0Q048SBEA

Talldave, the way I see it by 2050 oil will be incredibly expensive. This may be one reason why so many forecasters stop at 2040. I’m not worried about 2100, the problem is more immediate. Also, it’s important to focus on poor nations. Ask a Jamaican economist what will happen to Jamaica over the next 30 years given THEIR growth prospects, population trends, and governance, if oil product prices climb gradually to a point where oil is sold at $150 per barrel equivalent in 2018 dollars.

fernandoleanme — Well, most people 32 years ago thought oil would be incredibly expensive by now, if not entirely exhausted. So no one really knows.

But we do know prices are a function of demand and supply. As prices rise, several things will happen: more investment in oil production, more investment in efficiency, increased substitution. The key to RCP8.5 (on the emissions side, at least) is that oil consumption probably won’t fall off a cliff until average extraction costs get near the highest price that can be sustained at the incomes and efficiency of consumers, both of which will be rising.

Jamaica can probably endure $150 in 2050 without major trouble — their incomes should have more than doubled by then. Of course, as Venezuela proves, even if you’re sitting on a ton of oil you can still starve.

Well, I’m not sure we are very good at predicting the future 80 years out. The main wild card is technology, which will probably surprise everyone. A breakthrough is possible in longevity research. We could get a lot better at nuclear power or pumped-water energy storage, just to give 2 examples. My guess is that things will not be nearly as bad as predicted by official bodies like the IPCC, just because the historical track record of such things is that they are usually very pessimistic compared to the reality.

… and for about 20 years I have been arguing that the only thing we know for certain is that the future will surprise us; and that therefore our focus should be on increasing our capacity to deal with whatever future occurs. Instead, the reverse has been adopted, emissions reduction policies which reduce our capacity and flexibility.

As aTTP wrote, we have emissions, and we have the variability of natural sinks. Then we have the variability of what sensitivity is according to the IPCC. A mitigation approach walks into both of these uncertainties, and the fact that they are uncertain might stand between any mitigation we do and knowing the outcome of that, which in a lot of fields is an important thing. Mitigation policies which cost X are of unknown value. And coming in from the field of economics, there is some future value that I don’t find a lot of reason to go with, because of more uncertainties. So as far as what the report says, since I haven’t heard of any recent breakthroughs in these two areas, I really don’t know. And you give all this to the politicians, and that just makes things worse: there’ll be more pork in those policies than you can shake a stick at. Probably the least bad ideas we have are more fracking for natural gas, and nuclear power. Natural gas probably doesn’t need much of a subsidy, but a few more ports and pipelines would be nice. And nuclear power by its nature is concentrated and easier to keep an eye on when subsidies are involved. But of course both of these are tied up by old grudges and the environmentalists, whom the leftists can’t kick out of their party, not even to save the planet. Meaning they don’t care about the planet.

The supply-side scenarios are not particularly rosy either. They say 610 ppm leads to 2.6 C by 2100, but that would be a transient warming heading towards 3+ C later, consistent with the ECS near 3 C that they assume. Even then, the end of the century sees rapidly declining fossil fuel usage while energy needs continue to grow, implying rapidly rising fossil-fuel costs. Their assumption must be that a significant non-fossil replacement will soften this economic blow. The only difference from UN-driven policies is that the non-fossil replacement occurs more slowly in their scenario. Both have reducing per-capita CO2 emissions after the 2030s, one out of supply and one out of mitigation. The difference is not in the timing of the downturn but in the rate of reduction after 2030.
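The quoted numbers can be sanity-checked with the standard simplified CO2 forcing formula F = 5.35 ln(C/C0). The ECS of 3 C matches the assumption described above; the ~0.8 transient-to-equilibrium ratio used below is my own illustrative assumption, not a figure from the paper:

```python
import math

F_2XCO2 = 5.35 * math.log(2)   # ~3.7 W/m^2 of forcing per CO2 doubling

def equilibrium_warming(ppm, ecs=3.0, c0=280.0):
    """Eventual warming at a given CO2 level, scaling ECS by the forcing
    relative to one doubling (simplified formula, CO2 forcing only)."""
    forcing = 5.35 * math.log(ppm / c0)
    return ecs * forcing / F_2XCO2

eq = equilibrium_warming(610.0)   # eventual warming at 610 ppm
transient = 0.8 * eq              # assumed fraction realized by 2100
print(round(eq, 1), round(transient, 1))
```

With ECS = 3 C, 610 ppm implies roughly 3.4 C of eventual warming, of which something like 2.7 C would be realized by 2100 under the assumed ratio, which is consistent with the paper's "2.6 C transient, 3+ C later" framing.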

Methane hydrates are so difficult to extract they can’t compete with renewables and nuclear. Remember, those of us who predict a limit to fossil fuel recovery see two crossing curves: one rises versus time as fossil fuels become harder to extract, the other drops as renewables and nuclear power become cheaper. If renewables and nuclear don’t emerge as viable replacements, then the world population has to drop because we won’t have enough energy. We can’t sustain a country like Bangladesh or Mexico unless we find a reasonable source of non-fossil energy.

But we believe that the other side will be a lot of producers disappearing. So for example we believe that China will be decreased sharply if not disappeared after five years from today. And other countries will continue every day to disappear as countries producing oil. Nineteen years from today, Russia will have declined heavily if not disappeared with 10 million barrels. So comparing the rise of the demand for oil and the disappearing supplier, Saudi Arabia needs to supply more in the future. So we don’t believe that there is any risk in that area for Saudi Arabia. (https://www.bloomberg.com/news/articles/2018-10-05/saudi-crown-prince-discusses-trump-aramco-arrests-transcript)

Electric vehicles, PV, and distributed generation will make a huge dent in CO2 growth, or even produce a reduction (not that CO2 makes much difference). Almost ALL the data is cooked. Sea level change is really a moderate 8”/century, a straight-line increase, not curved upward like CO2.

There are extremely crude assumptions made in all the supply side analyses so I wouldn’t put too much faith in them, though it does seem likely that technological advancements will be required in order to fully satisfy RCP8.5 coal consumption. But then, technological advancements will be required to meet growing energy demand no matter the source, so it’s not clear why there would be a special barrier preventing such advancement in coal extraction.

You do make the argument that new coal extraction technologies won’t be pursued because of climate change and other environmental issues, but there are three major issues I can see with you presenting that idea here:

1) They’re already being pursued. We’ve literally just witnessed shale oil and gas tech revolutions in the last decade. You really think people aren’t lining up to do a similar thing with vast unproven coal resources?

2) The idea assumes strong mitigation policies will be in place globally, but RCP8.5 is a no-policy scenario. No-one, least of all the IPCC, is suggesting it would happen in a scenario where we actually implement strong policies against fossil fuel consumption, so a line of reasoning which requires mitigation doesn’t make sense when assessing inherent plausibility of RCP8.5.

3) And, linked with 2, most of those who comment here (and perhaps yourself) advocate against such preventative measures. There are many people trying to dismiss potential impacts via assumptions which require that we follow the very mitigation policies which they advocate against. The phrase is “having their cake and eating it too”.

The estimates aren’t “extremely crude”. I don’t worry about preventive measures to control temperature because the forthcoming energy crisis is much more serious. However, I do support geoengineering research to understand feasibility, costs and risks.

Have you read some of the literature? Taking Wang et al. 2016 as an overall literature summary, probably the most important passage in the paper in terms of acknowledging uncertainty in these matters is relegated to a figure caption:

Note: forecasts for coal production are especially uncertain. While undoubtedly large volumes of coal exist as resources, particularly in thin and deep seams, and also as non-bituminous coal, it is not well established how much of these resources constitute recoverable resources under any realistic scenarios on the future price of coal, and on technology levels. It is recognized that increased automation, and use of in-situ recovery methods, can, at least in principle, access large quantities of the coal currently considered uneconomic to extract.

Trouble is that their analysis doesn’t take this uncertainty into account at all. Instead it simply relies on estimates from the papers, which make assumptions like current proven reserve estimates representing the maximum amount which could ever be used. That is an extremely crude assumption. One obvious issue looking at the Wang et al. synthesis of coal production forecasts is that at least half of the contributory estimates forecast peak levels which were already exceeded before Wang et al. 2016 was published.

I have an estimate that each 2000 GtCO2 we don’t emit by 2100 saves about 1 C of warming. We could emit anywhere from 2000-8000 GtCO2 by 2100, so that is a 3 C range, staying below 2 C only at the low end.
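The estimate above is a linear warming-per-cumulative-emissions (TCRE-style) rule. A minimal sketch, taking the comment’s figure of roughly 1 C per 2000 GtCO2 as an assumption (published TCRE estimates vary):

```python
# Linear warming-per-cumulative-emissions rule, using the comment's figure
# of ~1 C per 2000 GtCO2 (an assumption; published TCRE estimates vary).

DEG_C_PER_GTCO2 = 1.0 / 2000.0

def warming(cumulative_gtco2):
    """Approximate warming (C) attributable to cumulative emissions."""
    return cumulative_gtco2 * DEG_C_PER_GTCO2

low, high = warming(2000), warming(8000)
print(low, high, high - low)  # ~1 C to ~4 C: the 3 C range in the comment
```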

“It’s a fraud really, a fake,” he says, rubbing his head. “It’s just bullshit for them to say: ‘We’ll have a 2C warming target and then try to do a little better every five years.’ It’s just worthless words. There is no action, just promises. As long as fossil fuels appear to be the cheapest fuels out there, they will continue to be burned.”

I agree with him that the proposed (or already tried) mitigation policies are ineffective and just worthless words. I also agree that as long as fossil fuels appear to be the cheapest fuels out there, they will continue to be burned. The “stronger measures” are impossible, and not needed either.

But even if they were possible, I think there is no strong evidence that human emissions significantly affect the atmospheric concentrations, nor that the concentrations significantly affect the global climate.

How about where Hansen ends with “I think we will get there because China is rational,” Hansen says. “Their leaders are mostly trained in engineering and such things, they don’t deny climate change and they have a huge incentive, which is air pollution. It’s so bad in their cities they need to move to clean energies. They realise it’s not a hoax. But they will need co-operation.” Do you agree or does it make your head explode?

“There is a positive note to end on, however. Global emissions have somewhat stalled and Hansen believes China, the world’s largest emitter, will now step up to provide the leadership lacking from the US. A submerged Fifth Avenue and deadly heatwaves aren’t an inevitability.

“I think we will get there because China is rational,” Hansen says. “Their leaders are mostly trained in engineering and such things, they don’t deny climate change and they have a huge incentive, which is air pollution. It’s so bad in their cities they need to move to clean energies. They realise it’s not a hoax. But they will need co-operation.”

Positive note? CO2 emissions are as relevant as H2O emissions – not at all!

Leadership? What bullshit! The US reduced more than anybody else, not intentionally of course.

From what you are quoting, you would prefer Paris to be more effective and for China to be reducing coal use. But then, away from your quotes, you seem either to hold the opposite opinion or not to care about Paris and China.

It is surely reasonable/sensible to bound the problem with worst case (RCP8.5) and best case (RCP2.6) scenarios. The problem is describing RCP8.5 as ‘business as usual’. Indeed the problem is assigning any probability to any scenario in 2018 through to 2100. Imagine in 1918 defining a pathway for CO2 emissions through to 2000. Would you anticipate the Second World War? China’s one child policy? The contraceptive pill? The rise of mass car/air travel? Perhaps the real issue is ‘should global energy/climate policy be based on the worst case scenario?’ And that’s just a matter of opinion.

There are a number of quite solid additional peer-reviewed coal TRR studies in addition to those noted. I discussed them in the coal chapter of the ebook Gaia’s Limits. In particular, the Patzek (Uppsala) and Rutledge (Caltech) papers suggest RCP8.5 is literally impossible. Dave Rutledge’s papers and presentations are available online at his Caltech website.


The results suggest that growth and globalization scenarios are not only undesirable from the environmental point of view, but also not feasible. Furthermore, regionalization scenarios that do not abandon the current GDP-growth focus would set the grounds for a pessimistic panorama from the point of view of peace, democracy and equity.

Most of these analyses miss the feature of modern history that new technologies can spring up and change the world view on a topic in as little as ten years.

In the US alone, the production of Dry Natural Gas DOUBLED in the 13 years from September 2005 to August 2018 — driving prices down and shifting many power plants to use cheaper natural gas.

What this means for the future, we can not know. That’s the trouble with the future.

US CO2 emissions have been falling since 2005 as Natural Gas replaced coal for electrical energy production.

Can we count on new technologies to change the playing field? No. But we must allow for them. The most likely new technology will be an electrical storage scheme (cheap, fast-in/fast-out, high volume): a “battery”. Such a breakthrough would ramp up the renewable sector and leave “burning stuff” production to act as the grid maintenance/leveler.

Nuclear could be forced down the throats of countries with centralized, autocratic governments — think China, Russia, EU — using existing technologies to build fast-build, small, self-cooling power stations capable of powering single cities.

Either change — one technological, the other social — could alter the energy production landscape significantly.

While a small group of folks debate, for years, the details of models that can never be proven in advance…

…we have very obvious, systemic changes in the human-technology combination that show it is much better to focus on the art of the possible: scalable, attractive innovations that are already influencing the baseline assumptions in academically approved climate models.

Example:

In less than 15 years the number of mobile devices in the world rose from a few million to 7 billion. Really. Except in war-torn Africa, the per capita penetration is more than 90%. In Africa a $2 used mobile phone is rented out to as many as 1200 non-phone owners. See Grameen Phone for similar deployment in SE Asia 15 years ago.

That means a large portion of daily human activity shifted from physical activity to electronically mediated social activity, which is rapidly shedding the capital-intensive industries of the past century.

Example: what might be called “the University of YouTube” is rapidly replacing brick-and-mortar education all over the world. No more driving to big school buildings (both energy-consumptive), but rapid spread of knowledge and modern technology to the poorest regions on Earth.

Finance is shifting equally rapidly. See the Aadhaar card in India: a uniform ID given to more than 1.2 billion people that is biometrically validated and can carry currency to the poorest. India is experimenting with sending electronic currency to hundreds of millions of “street people” because the government required that the ID be given to people with no home or address.

We in the former imperial world still assume that India and China will follow the discriminatory industrial/technological pace of the colonial era.

Picture the shifts in “Anglo” style institutions in such vast, rapid “developing world” deployment of the most advanced technology humanity has at its disposal.

Similar changes in healthcare – what we once called “medical tourism” – are expanding across Asia and LATAM. Expat Indian workers are sending small amounts of money back to India – providing health insurance for their families, and villages.

The list of such innovations is very long.

But the major takeaway is that a few thousand scientists, using guesstimates of the future based on a century of autos, domestic/commercial heating, and classical regulation, are probably missing the major shifts happening as they debate the details of an imagined future.


Here is another example: The Hanford reactor started producing plutonium for the nuclear bomb in 1944. The first nuclear power reactor was sending power to the grid in 1954 (it was in Russia). Just 10 years!

Here is what I conclude from the FOURTH NATIONAL CLIMATE ASSESSMENT (all-caps theirs) report:

Let us assume, like the authors, that RCP8.5 is realistic (in whatever terms we define ‘realistic’ we must assume it is sufficiently real to them to be included in conjunction with other stuff that they insist should most definitively be regarded as real).

The bottom-line assessment is the projected economic impact on the USA (which in itself also takes into account world impacts) going out to 2099. The relevant table is in Chapter 29, Key Message #3 (‘Estimates of Direct Economic Damage from Temperature Change’) (https://nca2018.globalchange.gov/chapter/29/#key-message-3)

If I read this correctly, the upshot is that the RCP8.5 pathway will hit GDP at the turn of the century by somewhere between 6 and 15 percent (90% confidence interval) with a median of around 11%.

Following M Anderson, I don’t believe for one second that the authors of this report can make credible economic predictions 80 years into the future. However, taking their claims at face value, what do they really mean?

Projecting this number 80 years into the future gives rise to a Real GDP of 1,720.18, i.e., around 15 times what it is today.

An 11% hit is suddenly not looking all that bad! Indeed, by any sane assessment being anything even close to 15 times (plus or minus even 15%) better off seems like something other than an economic and human catastrophe! Given the upside of the enormous potential bounties that the future holds based on what we know from the past (with climate change impacts not affecting things much one way or the other), perhaps it would be best to focus our efforts on absolutely ensuring we continue along the same path for the next 80 years that we have for the last 80 years.
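The compound-growth arithmetic behind that conclusion can be made explicit. A minimal sketch, assuming a real growth rate of about 3.4% per year (inferred here only to roughly reproduce the “around 15 times” figure; the comment does not state the rate it used):

```python
# Compound-growth arithmetic behind the "around 15 times" figure. The 3.4%
# real annual growth rate is inferred to roughly reproduce that factor; the
# comment does not state the rate it actually used.

growth_rate = 0.034   # assumed real annual growth
years = 80
hit = 0.11            # median RCP8.5 damage estimate cited from the table

multiple = (1 + growth_rate) ** years
print(round(multiple, 1))              # ~14.5x today's GDP
print(round(multiple * (1 - hit), 1))  # ~12.9x even after an 11% hit
```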

If RCP8.5 is the worst case scenario – what do we do with it? Climate models and the intertwined biological and physical Earth system are chaotic. The future is unpredictable – a limitation of science at the most fundamental level. And this is the dominant Earth system science paradigm – very little regarded or understood more generally.

“Sensitive dependence and structural instability are humbling twin properties for chaotic dynamical systems, indicating limits about which kinds of questions are theoretically answerable. They echo other famous limitations on scientist’s expectations, namely the undecidability of some propositions within axiomatic mathematical systems (Gödel’s theorem) and the uncomputability of some algorithms due to excessive size of the calculation.” James McWilliams

Are we headed for an Earth system tipping point, as Will Steffen et al. 2018 – http://www.pnas.org/content/115/33/8252 – suggest is possible? Or is there an underlying intrinsic component of recent warming that will counter greenhouse gas warming in future?

Climate will shift again – in the decade to come if it is not happening now. Where to, how fast and by how much is impossible to foretell. But dramatic changes – whatever the cause – in as little as a decade are possible. In global hydrology and biology at the least.

“Collective human action is required to steer the Earth System away from a potential threshold and stabilize it in a habitable interglacial-like state. Such action entails stewardship of the entire Earth System—biosphere, climate, and societies—and could include decarbonization of the global economy, enhancement of biosphere carbon sinks, behavioral changes, technological innovations, new governance arrangements, and transformed social values.” Steffen et al. 2018

Stabilization is impossible. The latter is the cultural battleground in a war of values – the outcome of which will determine the course of global civilization – and the scientific enlightenment – in the 21st century.

Robert I. Ellison: “Where to, how fast and by how much is impossible to foretell.”

So, basically, we don’t really know how to properly diagnose the current or future climate, let alone ‘fix’ it.

However, we have a pretty good understanding of the kinds of things likely to f**k our economy, and those less likely to do so based on considerable experience. Not a perfect understanding, of course, but we are certainly in a better position to distinguish dumb sh*t from the credible and know and have access to the requisite tools in economics versus, say, climate science.

Isn’t the prudent route, therefore, to control what we can, and on that basis hope to be as best prepared as possible to deal with those things we can’t control? After all, if one finds oneself drowning in sh*t, being able to afford to buy a canoe and paddle is to be in a profoundly different predicament to not being able to do so.

You assume too much. The probability of major shifts is low this century – but the consequences severe. This makes it an extreme risk that should not be ignored. But the pragmatic response that takes into account both humanitarian and environmental considerations is something entirely different.

The most unlikely assumption in RCP8.5 is a stagnation in tech during the next 80 years. That is a useful assumption in a worst case scenario, but has no precedent in the modern era (i.e. the past several centuries).

It is an especially odd assumption to take seriously with so many signs that another industrial revolution is beginning — in computing, materials science, genetics, and energy. To name just one example, fusion R&D has passed an important milestone: large investments from the private sector (including venture capitalists). They are experienced at evaluating the potential of new tech, and have short time horizons compared to government R&D.

Here are 15 companies active in this area. Most will fail. Perhaps all will fail. Only one need succeed to change the future.

“a stagnation in tech […] has no precedent in the modern era. [….]To name just one example, fusion R&D has passed an important milestone.”

The critical thing in your analysis, which you hint at, but do not make explicit, is that technological progress is certain, but the direction is entirely unknown – not only in terms of where the breakthroughs will occur, but also in terms of which will be important and why. The future will unfold, of course, but the few who predicted correctly will have done so for the wrong reasons and so it can be put down to luck (exactly the same luck enjoyed by those who predicted wrongly).

This perspective is of critical importance, and prevents one falling into the trap of Jim D, which is that technology will provide. It WILL provide, but we don’t know what! He looks to past developments and concludes that we should roll out windmills and solar panels galore because technological providence will fill the glaring gaps. It may, but it may not, and Jim D doesn’t know one way or the other, any more than you, I or anyone else. However, having erected the windmills he has no option other than faith. Of course, he considers “science will provide” to be a scientific claim, rather than an article of faith, because the word ‘science’ appears in it.

There are choices in direction. There is the dumb direction of finding new ways to exploit fossil fuel resources like methane hydrates or Arctic Sea oil, which will allow us to continue fossil fuels as the main energy source for decades. Or there is the sensible, given what we know already, direction of non-fossil sources. Not only that, but it needs to happen soon to prevent a crunch coming from the supply-side direction or the climate-change direction. Both of these drive mitigation policies.

Of course you are right. Unfortunately, you will always choose the former rather than the latter, while being entirely convinced you have things the other way round. I suspect this arises from a lack of experience of action in the real world, which is what it takes to discover what is dumb and what is not.

The most unlikely assumption in RCP8.5 is a stagnation in tech during the next 80 years.

That is not an RCP8.5 assumption. It’s an assumption (or, arguably, the result of assumptions) in one particular scenario which produces RCP8.5 level emissions. Other RCP8.5 level scenarios assume huge advancements in tech.

“This paper summarizes the main characteristics of the RCP8.5 scenario. The RCP8.5 combines assumptions about high population and relatively slow income growth with modest rates of technological change and energy intensity improvements, leading in the long term to high energy demand and GHG emissions in absence of climate change policies.”

The scientists creating the RCPs described them in terms of the paths to get there. Without showing conditions that can produce those end states, RCPs would be of no public policy value. These descriptions allow rough estimation of their probability.

It doesn’t disagree. That’s a paper describing a particular scenario, which is described as “the RCP8.5 scenario” because it was used to generate the RCP8.5 emissions for CMIP5 modelling. Just look at the SSP5, which is an RCP8.5 scenario from the current generation – entirely different assumptions. Or A1FI from SRES. RCP8.5 level but assumed massive technological developments.

If I have the numbers right, that graph of CO2 emissions vs. the RCP scenarios is misleading.

Compare the RCP Databank with Table 6 in the “Global Carbon Budget 2017.” Estimated total CO2 emissions in 2016 were 10.0 to 12.4 GtC (11.2 mean). That puts it roughly between the 2016 estimates for RCP2.6 and RCP8.5 (interpolating from the given numbers for 2010 and 2020). RCP2.6 (usually described as “impossible”) is the second-highest CO2 emissions scenario until 2030 (the RCP Databank gives only one number per decade).

The graph in the U.S. National Climate Assessment obscures this period of high interest by showing only the 200-year span. I suspect that most people will look at the graph and conclude, as Dr. Curry did, that “we are currently on track for RCP8.5.”
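Since the Databank gives only one value per decade, the comparison described above requires interpolating between decadal anchors. A minimal sketch of that step (the endpoint values below are placeholders for illustration, not the actual Databank numbers):

```python
# Linear interpolation between decadal RCP Databank anchors. The endpoint
# values below are placeholders for illustration; substitute the actual
# Databank numbers for each scenario.

def interpolate(year, y0, v0, y1, v1):
    """Linearly interpolate a scenario value between two decadal anchors."""
    frac = (year - y0) / (y1 - y0)
    return v0 + frac * (v1 - v0)

# e.g. a scenario rising from 10.0 GtC/yr in 2010 to 11.5 GtC/yr in 2020
print(interpolate(2016, 2010, 10.0, 2020, 11.5))  # ~10.9 GtC/yr in 2016
```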

‘And Then There’s Physics’ wrote: “I don’t think it is really possible to really say how likely RCP8.5, for example, is in term of a percentage.”

I’m not sure about that. See van Vuuren et al 2011: ‘The representative concentration pathways: an overview’ (DOI 10.1007/s10584-011-0148-z). It states that “RCP8.5 leads to a forcing level near the 90th percentile for the baseline scenarios”, where baseline scenarios are those not involving greenhouse gas mitigation or stabilization. Also that “RCP8.5, in contrast, is a highly energy-intensive scenario as a result of high population growth and a lower rate of technology development”.

Its Fig. 3 shows that RCP8.5 is above the 90th percentile for primary energy consumption until 2090. Its Fig. 4 shows that RCP8.5 is above the 90th percentile for energy intensity through 2100. Its Fig. 9 shows that RCP8.5 is close to or above the 98th percentile for CO2, CH4 and N2O concentration from 2050 through 2100, and its Fig.10 shows RCP8.5 to be close to the 98th percentile for radiative forcing from 2040 through 2100.

It is clear that in no way is RCP8.5 representative of “business as usual”. Rather, it should be viewed as a worst case scenario, with a probability of at very most 10% in the absence of any mitigation, and almost zero in the light of the level of mitigation that nations have now committed to.

Adding to your comment: the population assumption in RCP8.5 is 12 billion in 2100, near the 80% probability line at roughly 12.2 billion in the Gerland et al. 2014 probabilistic forecast. Many experts consider that forecast to be almost impossible.

That paper used two methods to estimate long-term economic growth: time series estimation using historical data and experts’ guesses. While a useful and fun exercise, neither has any proven skill. As such, they have no role in the public policy process.

That is especially so since RCP8.5 assumes something not in the historical record of the period they examined (1900–2010): 80 years of tech stagnation. Their stats are from a 110-year period of rapid tech progress, from which they’re extrapolating a radically different era.

I would like to put some figures on what it means to reach 600 ppm CO2.
One ppm of CO2 equals about 7.5 Gt of CO2, which is close to 2 Gt of pure carbon.
To reach 600 ppm from our present 400 ppm, you need to emit an amount equivalent to 400 ppm of CO2, because only about half of the emitted amount stays in the air.
That means you have to burn about 800 Gt of carbon, in whatever fuels it exists.
Now you can discuss whether that is possible relative to the known reserves of fossil fuel.
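That arithmetic can be checked directly, using the comment’s own rounded factors (~2 GtC per ppm and a ~50% airborne fraction, both approximations):

```python
# Checking the ppm-to-carbon arithmetic with the comment's rounded factors:
# ~2 GtC of emissions per ppm of atmospheric CO2, and ~50% of emitted CO2
# remaining airborne. Both factors are rough approximations.

GTC_PER_PPM = 2.0        # gigatonnes of carbon per ppm of CO2
AIRBORNE_FRACTION = 0.5  # share of emissions that stays in the atmosphere

def carbon_needed(ppm_rise):
    """GtC that must be burned to raise atmospheric CO2 by ppm_rise."""
    return ppm_rise * GTC_PER_PPM / AIRBORNE_FRACTION

print(carbon_needed(200))  # 800.0 GtC to go from 400 to 600 ppm
```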

Good point Sven! Additionally, as fossil fuels become harder and more expensive to extract, and photovoltaic cells and batteries become cheaper and more efficient, all of a sudden fossil fuel will fade away, similar to the demise of steam power on railroads. Note the size of Tesla’s battery factories https://en.wikipedia.org/wiki/Gigafactory_1 and their solar panel development, and that of China.

Here’s Dr Rosling’s audit of Human progress from poor and sick to wealthy and healthy 1810 to 2010. This BBC video only takes 4 minutes of your time, but you may learn something. Boy the Industrial Rev and fossil fuels sure were terrible, NOT.

And here’s Dr Rosling’s TED talk trying to dispel the ignorance about the modern world. The so-called educated elite are no better than chimps at the zoo. Just watch the first 5 minutes to understand the problem.

It is clear that in order to be properly informed, you must ignore the MSM and you must educate yourself with primary data.

It is the same with climate. Educating and informing is not about publishing in academic journals. It is about disseminating the data that is agreed to be sufficiently robust as to be beyond question, highlighting where ongoing data collection is attempting to address current questions, and acknowledging honestly and humbly where ignorance still exists.

Here are my three questions:

1) Would you fundamentally change the world based on the data of a three year old? (Climate data is three ‘climate years’ old)
2) Would a nation that prioritised the extinction of 60 million bison to destroy the culture of indigenous peoples be your first choice to steward the earth?
3) If you had to choose between real-life experience of farmers and computer modelling of urban geeks, what would you choose to rely on?

“In considering ‘worst case’ climate change impacts, we first need to assess the realistic worst case for global carbon emissions.”
How can there be a realistic worst case?
Semantically incorrect.
Whichever realistic worst case one chooses, one can always make it a little bit worse without losing realism, surely.

“Worst case” is a term in risk management. It refers to the “worst case scenario” in the range of scenarios considered. A specific worst case scenario can be “useful” or “inappropriate” — by being too likely or too unlikely.

Now as to “we first need to assess the realistic worst case for global carbon emissions.”
There is an assumption that business as usual is the worst case.
Why?
It should be the median position.
After all we can dig up and burn more or less fossil fuel than business as usual.
With increased ability to mine, and increasing products available from fossil fuels, there is an expectation that use per person should increase.
Considering that only 1 in 5 people benefit fully from fossil fuel use, there is a possible 80% increase in the number of people who might wish to avail themselves of it.
Which means that RCP8.5, despite Judy’s question, should really be considered merely a middle-of-the-road scenario.
“The recently published U.S. National Climate Assessment shows that we are currently on track for RCP8.5.”

There are a lot of old fossils [joke] here pontificating about the use of fossil fuels going down and the world’s ability to extract coal resources going down.
If people want them badly enough they will get to the more difficult coal seams, plus the ones not yet used or even known about.
All the way down the East Coast of Australia, only 1% has so far been dug up and sold to India, China etc.
Who knows how much more lies in the shallow sea depths of that coastline.
Plus natural gas everywhere. I have no concerns re RCP8.5 being unrealistic, only not realistic enough.

Bear in mind that coal was once plants which were once CO2, water, and sunlight. Similar with other carbon containing minerals like chalk cliffs, limestone, and marble which were once part of living plants and animals which fell to the bottom of the ocean when they died, a process which continues to this day.

As to ATTP and Mosher with their “scenarios are not predictions” comments: a scenario is a prediction. We do this, that will happen. Another semantic cop-out on their part from the fact that temperatures are not following the CO2 increase. What a calamity, what shall we do? Go into denial is a good starting spot.
If they were in tune, you know they would become predictions.
Enough rants.

How can such models ever be useful?
The facts are, CO2 is not a ‘greenhouse’ gas, it has a logarithmic response that is pretty much already saturated, and the whole AGW idea is a scam – at no point in history or prehistory has CO2 ever caused temperature changes; the evidence of CO2 following temps is indisputable, except in the climate models that have yet to produce a useful, let alone an accurate, prophecy. (You cannot call what the priests of AGW do actual predictions.)

Why give even the slightest legitimacy to such manipulations of reality when they are stealing $millions that could be used for real science?
I’ve just had a discussion online about letting the true believers get away with saying climate change – if we do not point out, every single time, that they really mean AGW, we hand them the field to play as they wish.

We already see the craziness they planned by rebranding AGW to climate change – Warming makes it colder and could even cause an Ice Age!

I think we need to very carefully deny the left ANY share of the field and make sure we always point out their lack of scientific reality. Discussing AGW models, even as it is done in this post, gives them an avenue to claim it’s just a disagreement of theories, instead of showing the reality that they are doing politics, not science of ANY form.

‘The best estimate of industrial-era climate forcing of black carbon through all forcing mechanisms, including clouds and cryosphere forcing, is +1.1 W/m2 with 90% uncertainty bounds of +0.17 to +2.1 W/m2. Thus, there is a very high probability that black carbon emissions, independent of co-emitted species, have a positive forcing and warm the climate. We estimate that black carbon, with a total climate forcing of +1.1 W/m2, is the second most important human emission in terms of its climate forcing in the present-day atmosphere; only carbon dioxide is estimated to have a greater forcing…’ Bond, T. C., et al., 2013: Bounding the role of black carbon in the climate system: A scientific assessment. Journal of Geophysical Research: Atmospheres, 118, 5380–5552, doi:10.1002/jgrd.50171

Co-emitted species?

“Compiling all the data, we show that solar-absorption efficiency was positively correlated with the ratio of black carbon to sulphate. Furthermore, we show that fossil-fuel-dominated black-carbon plumes were approximately 100% more efficient warming agents than biomass-burning-dominated plumes. We suggest that climate-change-mitigation policies should aim at reducing fossil-fuel black-carbon emissions, together with the atmospheric ratio of black carbon to sulphate.” http://www-cas.ucsd.edu/personnel/vram/files/pr176.pdf

“We must make the building of a free society once more an intellectual adventure, a deed of courage. What we lack is a liberal Utopia, a programme which seems neither a mere defence of things as they are nor a diluted kind of socialism, but a truly liberal radicalism which does not spare the susceptibilities of the mighty (including the trade unions), which is not too severely practical and which does not confine itself to what appears today as politically possible…” F. A. Hayek

Liberal in the classic sense of course.

Australia’s Paris commitment is to a 50% reduction in per capita emissions – and a 65% decrease in carbon intensity – adding to the efficiency gains that started with the first oil shock in the 1970s. Much better than China.

Is the following announcement based on this RCP8.5?
“In its first major update on climate change in almost 10 years, the UK Met Office has warned of significant temperature rises in the decades ahead.
The UK Climate Projections 2018 study is the most up to date assessment of how the UK may change over this century.
It says that under the highest emissions scenario, summer temperatures could be 5.4C hotter by 2070.”

The RCP8.5 questions posed by Dr. Curry are ones I repeatedly encounter. My conclusion is that not only should RCP8.5 be eliminated from consideration for policy making, but it is also useless for adaptation planning (not a “worst case”). When I have presented downscaled RCP8.5 model projections to clients eager to understand their regional climate future, the numbers are either met with silence or simply dismissed as ridiculous. Then they ask for a realistic forecast.

Most of the posted comments have dealt with the innumerable considerations encountered in a bottom-up approach to GHG forecasting. Just one of those is whether there are limits on a return-to-coal scenario (which is a highly questionable proposition inconsistent with energy technology evolution, thereby disqualifying it from consideration). In my forecasting experience, thrashing about in such details of bottom-up constructs without a top-down constraint is a never-ending, misleading, and ultimately irreconcilable pursuit; so that trying to find a useful “worst case” future (if you can even define what that means) will be inconclusive.

I suggest that development of an objective, quantitative forecasting approach not subject to selective bottom-up assumptions can more successfully establish a baseline or ‘most-likely’ forecast, from which a range of logical future CO2 concentrations can then be explored. Any proposal of a significantly deviating scenario would then require quantification and coherent justification. We have 60 years of rigorous CO2 measurements to which fundamental forecasting-science principles can be applied (I continue to be appalled at how consistently climatology ignores them). Atmospheric gas transitions have been underway for many decades, and the net of all sources, sinks and sub-processes is quantified in an instrumental record that is a remarkably rich basis for forecasts of the future.

I extend an invitation to my presentation at the upcoming AMS Annual Meeting in Phoenix AZ:
Paper number 4B.5: “Addressing Climate Change Uncertainties Arising from Greenhouse Gas Scenarios – Can a More Bounded Range of Future CO2 Concentration be Identified?”
9:30am, Tuesday, January 8, 2019
https://ams.confex.com/ams/2019Annual/meetingapp.cgi/Paper/350740

A preview of findings from the paper:
Forecasting from the Mauna Loa record gives a CO2 concentration in year-2100 of 654 ppm (~70 ppm lower using the South Pole record), with a long-term Mauna Loa CO2 ceiling of 840 ppm.
RCP8.5 stipulates an average global CO2 concentration in year-2100 of 936 ppm and a long-term ceiling of 1962 ppm. RCP6.0 is 670 ppm in yr-2100; RCP4.5 is 538 ppm in yr-2100. So, the findings fall between RCP4.5 and RCP6.0. Reaching RCP8.5 CO2 levels is shown to be mathematically illogical and not at all “business as usual”.

By the way, apply the Lewis-Curry TCR finding of 1.33 degC/2xCO2 to this CO2 forecast and you’ll see we are headed for 1.5-2.0 C of temperature change by 2100 (half has already happened). There aren’t many things which will change that. (My empirical climate sensitivity after deconvolving modes of natural variability is 1.45 degC/2xCO2, presented at AMS-2016.)
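That back-of-envelope calculation can be sketched with the conventional logarithmic forcing relation; the 280 ppm pre-industrial baseline below is my assumption, not stated in the comment:

```python
import math

def warming(tcr, c_final, c_initial=280.0):
    """Approximate transient warming (degC) for a CO2 change, assuming
    logarithmic forcing: dT = TCR * log2(C_final / C_initial)."""
    return tcr * math.log2(c_final / c_initial)

# Lewis-Curry TCR of 1.33 degC per doubling, 654 ppm forecast for 2100
print(round(warming(1.33, 654.0), 2))   # -> 1.63 degC above pre-industrial
```

The result lands inside the 1.5-2.0 C range quoted above once the choice of baseline and the difference between transient and equilibrium response are allowed for.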

It is reported that the US National Climate Assessment was compiled by “more than 300 scientists.” I read somewhere that only a small fraction were scientists. Does anyone have a reference to the group’s make-up? Thanks.

Short of running out of fossil fuel, is it even possible to estimate the probability of different forcing pathways that depend on human choice? What pathway should we choose?
In retrospect the one climate variable that has been the most “predictable” has been the concentration of CO2 in the atmosphere, i.e. the Keeling curve. Since 1958 the yearly trend has been a quadratic. Superimposed on this quadratic has been a gradually increasing seasonal trend. From 1958 until the end of 2015, when the Paris Agreement was signed, the average deviation from this trend was zero with a standard deviation of 0.79 ppm. That’s not just multi-decadal. That’s almost 60 years over which “business as usual” plus natural feedbacks and Henry’s law have produced an incredibly precise trend! How could human development have been that precise? When will the trend start to bend in the downward direction? The following graph shows an extrapolation of the pre-Paris trend.
Atmospheric CO2

Since 2015 the average deviation of measurements from the pre-Paris trend has been a positive 1.5ppm. This is a very small change, but it is the opposite of downward.
Deviation from the trend since Paris Agreement.
Eventually the curve must start to bend down. Is that point in the future predictable? Will future levels of atmospheric CO2 depend only on human emissions?
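The quadratic-plus-seasonal description above can be illustrated with a short fit; the series below is synthetic stand-in data with assumed coefficients, not the actual Mauna Loa record:

```python
import numpy as np

# Synthetic Keeling-style series: quadratic growth + seasonal cycle + noise.
t = np.arange(0, 60, 1 / 12.0)                 # 60 years of monthly samples
rng = np.random.default_rng(0)
co2 = (315 + 0.8 * t + 0.0125 * t**2           # assumed quadratic trend
       + 3 * np.sin(2 * np.pi * t)             # assumed seasonal cycle
       + rng.normal(0, 0.4, t.size))           # measurement noise

a, b, c = np.polyfit(t, co2, 2)                # recover the quadratic trend
resid = co2 - np.polyval((a, b, c), t)         # seasonal cycle + noise remain

print(f"trend: {c:.1f} + {b:.2f} t + {a:.4f} t^2")
print(f"residual std dev: {resid.std():.2f} ppm")
```

Because the seasonal cycle averages to zero over whole years, the quadratic fit recovers the planted coefficients, and the residual's standard deviation is dominated by the seasonal term, which is the same decomposition the comment applies to the real record.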

The most significant human choice, by far, is how many children we have.

Right now the fertility rate is below 2.1, the level-population rate, in many countries, and falling in nearly every country. If this continues there will be fewer persons on Earth in 2100 than there are now, and they will be much older. China will lose at least 300 million, and much more if they cannot achieve the two-child rate very soon. Japan will lose 40 million of its 120 million population by 2060. Russia, Portugal and other countries are also losing population right now.

The population will never reach 12 billion, as RCP8.5 appears to assume, and likely will not reach 10 billion before starting to fall around 2055-2060. This will be a major driver of emissions. Further, the average age worldwide is going to go up significantly, reducing the per-capita use of energy. China already has more persons aged 40 to 60 than 0 to 20.

hswildman –
Good observations. Yes, it has been an incredibly precise trend for six decades, and should therefore serve as the starting basis for future projections. While it looks quadratic there are actually a few formulations that can describe it. It starts exponential and then evolves into another form that I will present at AMS (see my post a couple days ago). Applying the math shows that RCP8.5 is illogical. The ‘bend down’ you are looking for can actually be projected from the existing data series and comes mid-century (2040-2080). You are correct that it must inevitably come because global population growth rate has already turned the corner downwards, and population is the primary macro-driver.

A question is whether or not there are enough available coal reserves.
There are.

The question is whether or not inexpensively extractable coal is sufficiently close to major energy consumers to offset the transportation costs.

On this question coal fails. The largest reserves of inexpensively extractable coal are in Wyoming. A ton of coal that costs $10 to extract in Wyoming ends up costing $80/ton by the time it gets to a power plant in China, or for that matter a power plant in the Southeastern US. (Hence the decision to build new nuclear power in the Southeastern US; low natural gas prices were unexpected.)
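As a worked version of that delivered-cost arithmetic: only the $10 extraction cost and the $80 delivered figure come from the comment; the split between rail and ocean freight below is an assumed illustration.

```python
# Illustrative delivered-cost breakdown for Wyoming coal shipped to Asia.
mine_cost = 10.0      # $/ton, Wyoming (Powder River Basin) extraction
rail_haul = 30.0      # $/ton, assumed rail to a Pacific export terminal
ocean_freight = 40.0  # $/ton, assumed port handling plus sea transport

delivered = mine_cost + rail_haul + ocean_freight
print(delivered)      # 80.0 $/ton at a power plant in China
```

The point stands regardless of the exact split: transport multiplies the mine-mouth cost several times over, which is what lets $80/ton nuclear-competitive economics apply.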

Nuclear power in China competes well financially against $80/ton coal.
Construction costs are a bit higher in Europe so coal still does well at the $100/ton price.

A reasonable person would expect that China’s baseload coal plants will be transitioned to intermediate-load plants as new nuclear power comes online in China in the 2020s, and eventually to seasonal peakers as 4th-generation nuclear comes online in the 2030s.

RCP 8.5 fails because it makes the assumption that inexpensively extractable coal is evenly distributed in the world. It is not. In China and Europe, coal extraction effort is measured in man-hours per ton. In Wyoming and Australia, coal extraction effort is measured in tons per man-hour.

I’ll give Steve Bloom the vegetative-collapse path as a possibility, only because I could totally see some stupid policy to try to stop warming triggering something like that. Also, economic malaise leading to war and triggering ecological disaster is very possible.

drforecaster-
I wish I could attend your AMS talk. I commend your effort to reduce the gap between best-case and worst-case scenarios by providing a “more constrained range of logical projections”. An 8.5:2.6 ratio is too large for planning purposes. I’m looking forward to the ‘bend down’ arriving in 2040. With respect to your 2016 paper, could you comment on the sources of the natural modes you used to deconvolve global temperatures to obtain a climate sensitivity of 1.45? I realize that other greenhouse gases should be included and that this is too simplistic, but if I subtract 2.44·log2(CO2/CO2_ref) from the NASA temperature record I obtain a rather flat residue (~0.14 °C std. dev.) with major periodicities of about 19.7, 9.2, 4.8, 3.6, and 2.3 years, which I assume are “natural modes”, maybe ocean. Why isn’t climate sensitivity closer to 2.44?
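The subtraction described above can be sketched as follows; the series here are synthetic stand-ins for the NASA temperature and CO2 records, with a ~19.7-year oscillation planted so the residual has something to reveal:

```python
import numpy as np

# Synthetic anomaly series: a CO2-driven component with S = 2.44 exactly,
# plus a planted multi-year oscillation standing in for a "natural mode".
years = np.arange(1880, 2019, dtype=float)
co2 = 290 + 0.0075 * (years - 1880) ** 2           # rough synthetic CO2 rise
temps = 2.44 * np.log2(co2 / co2[0])               # anomalies if S were 2.44
temps += 0.1 * np.sin(2 * np.pi * (years - 1880) / 19.7)   # natural mode

# The subtraction in question: remove the assumed CO2 component
residual = temps - 2.44 * np.log2(co2 / co2[0])
print(round(residual.std(), 3))    # ~0.07 degC: only the oscillation survives
```

A flat, small-variance residual of this kind is exactly what the comment reports; the open question it raises is whether the remaining periodicities are all short, or whether longer modes are hiding in them.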

The natural variability mode you seek is not among those you’ve identified. Look for spectral components with periodicity greater than a decade, since those are what modulate the temperature anomaly series enough to appreciably affect its level from time to time, including now during the ‘hiatus’ (yes, it persists). The short-period variations (of a few years) that you see are simply transients which come and go without much overall consequence (e.g., El Niños, volcanoes).

To the logarithmic expression you wrote, add a Fourier series of all frequency components. The component you are missing has a 63-year periodicity that I’ve estimated from the Hadley Centre HadCRUT4 global temperature anomaly series. A data series at least twice the length of the cycle period is needed for analysis, and HadCRUT4 goes back to 1850. To reveal what is going on, the high-frequency components (<10-yr period) should be suppressed, and there are various smoothing filters with decadal cutoffs to accomplish that (simple running averages still retain high frequencies). The result reveals local minima and maxima around 1879, 1909, 1943, 1974, and 2005. The ~63-year cycle is forcing those, and it perturbs your sensitivity derivation. There have been numerous spectral analyses of temperature anomalies in the literature revealing this primary component, with a secondary one of period ~20 years. Since the 1990s (e.g., Schlesinger & Ramankutty, 1994) numerous papers have reported the 60-70 year cycle that goes back centuries (even millennia) in paleoclimate evidence (including even, ironically enough, Delworth & Mann 2000).

Many presentations have noted the “stair-step” manner in which twentieth-century temperatures increased. That is what you get when the 63-year cycle is superimposed on the underlying anthropogenic trend. If natural variability is not subtracted out, you get the erroneous climate sensitivity number you found. The 63-year cycle is the strongest of all variability modes. When I have subtracted it and re-filter-smoothed the series, the ~20-year cycle is evident, with a period closer to 21 years by autocorrelation analysis. But it is only ~30% the strength of the primary component and makes little difference to an empirical climate sensitivity value (1.43 vs 1.45 °C/2xCO2). So, try just approximating the primary mode with a sinusoidal cycle of strength A/√2, where amplitude A ~ 0.16 °C, period = 63 years, and phase set by a maximum at any of 1879, 1943, or 2005 (I prefer the latter). Subtract it from your anomaly series and re-derive the sensitivity value.
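A minimal sketch of that subtraction, using a toy linear-trend-plus-cycle anomaly series rather than real data (the 0.008 °C/yr trend is purely illustrative):

```python
import numpy as np

A, period, t_peak = 0.16, 63.0, 2005.0   # amplitude (degC), years, year of maximum
years = np.arange(1850, 2019, dtype=float)

def natural_mode(t):
    """Approximate 63-year natural-variability cycle, peaking at t_peak."""
    return A * np.cos(2 * np.pi * (t - t_peak) / period)

anoms = 0.008 * (years - 1850) + natural_mode(years)   # toy anomaly series
corrected = anoms - natural_mode(years)                # subtract the mode

slope = np.polyfit(years, corrected, 1)[0]
print(round(slope, 4))   # -> 0.008: the underlying trend is recovered
```

With the cycle removed, a sensitivity fit against log2(CO2) would see only the underlying trend, which is the mechanism behind the 1.43-1.45 °C/2xCO2 figures quoted above.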

So, what’s with the 63-year cycle? Taking dozens of papers together it appears to me that the Atlantic Ocean has a resonant frequency in overturning circulation which, through teleconnections also modulates the Pacific. Just to mention a few of the papers: Dr. Curry (and collaborator Wyatt) described the stadium wave imprint; Barcikowska et al 2017 (Journal of Climate 30(2)) provided interesting insight to teleconnections. Can/should we use this cycle in temperature forecasts considering the volume of revealing research, the long-term paleoclimate evidence, and that only another 1.3 cycles brings us to end-century? Yes.

For the sake of completeness: the RCP 8.5 scenario does not end in 2100, nor is 8.5 W/m2 its equilibrium forcing; that reaches 12 W/m2 in 2250!
The complete science-fiction scenario ends in 2500, after the world has done nothing to limit global warming for 300 consecutive years…