It is curious in these days of much talk of rapid ice decline in the Arctic that the minimum extent is the same as it was 9 years ago.

It’s that time of year when the minimum ice extent in the Arctic is reached. One common way to look at it is to pick a particular month and fit a straight line. Fig 1 starts in 1979 and shows the ice extent going down and down, prompting claims of an ice-free Arctic sometime in the near future. The decline seems precipitous until one considers that, by this measure of ice extent, it amounts to about 10% in 35 years! Important, certainly, but not as dramatic as the graph suggests. With graphs like these one needs to step back and consider the context, for it does not show what it appears to.

Between 1979 and 2015 – the years covered by the graph – atmospheric CO2 levels increased substantially, from 340 ppm to 400 ppm. To put that into context, the increase from 1960 to 1979 was just 25 ppm. Yet Fig 1 shows that during this unprecedented increase the gradient – the rate of decline – of the sea ice loss remained constant. In other words, the addition of almost 20% more CO2 to the atmosphere did not change the behaviour of the sea ice at all. If one were being strict, based only on the Arctic ice data and the CO2 record, one would have to conclude that there is no correlation between Arctic sea ice extent and atmospheric CO2 levels! Surely one might have expected that the more CO2 in the atmosphere, the greater would be the so-called polar amplification effect, and the faster the rate of sea ice loss.

As I wrote when looking at last year’s data, the declining Arctic ice cover has been one of the most powerful images of climate change, yet many who follow the debate don’t look too hard at the data. This results in superficial reporting that conveys none of the complexities of the situation and, as such, is poor science communication.

Last year a suggestion (which had been made before) that Arctic ice was more resilient than was thought prompted much discussion but little media coverage, despite the research being published in Nature Geoscience by Tilling et al. (2015) under the title “Increased Arctic sea ice volume after anomalously low melting in 2013.” The headline finding was that the volume of Arctic sea ice increased by about a third after an unusually cool summer in 2013. Reports went on to say that the unusual growth continued in 2014 and more than compensated for the losses of the three previous years. Overall it was concluded that changes in summer temperatures in the Arctic have a greater impact on the ice than was thought.

With the data for 2016 now in it is time to look again at the claims of an “ice pause.” Fig 2 shows the latest situation using one measure of sea ice extent.

This year’s minimum was reached on day 254 (September 10th) of the year (nothing unusual). The minimum ice extent was also nothing unusual at 4.1 million km2, not the lowest and about the same as 2007. Some media reports portrayed this as the second lowest (behind the anomalous year of 2012) and mentioned its comparison with 2007 without making the obvious comment that it was curious in these days of much talk of rapid ice decline in the Arctic that the minimum extent was the same as it was 9 years ago!

Here is the minimum extent since 2007 (millions of sq km) and it is obvious there is no general decrease in minimum ice extent, by this measure, between 2007 and 2016 – ten years! Did anyone run the headline that the Arctic minimum ice extent has shown no significant change in the past decade? The case can be made that the behaviour of the Arctic ice cover has changed from the declining years of 1998 – 2007.

Feedback: david.whitehouse@thegwpf.com

The ability to make accurate predictions is a hallmark of good science. Predictions by environmental doom-mongers have proven to be wrong time and time again. No wonder most people no longer believe them.

If we do not reverse global warming by the year 2000, “entire nations could be wiped off the face of the earth by rising sea levels”, warned Noel Brown, a director of the United Nations Environment Programme, in 1989.

It is common cause that sea levels have been rising ever since the start of the Holocene at the end of the last Ice Age, about 11,700 years ago. Throughout the 20th century, tide gauge data has shown this rise to be fairly steady at about 1.5mm/year, and largely unaffected by changes in temperature or atmospheric carbon dioxide levels. Since 1993, satellite altimetry has determined a fairly constant sea level rise of just over 3mm/year. However, it is far from clear whether this represents an acceleration or an artefact of how sea level is measured with respect to surrounding land.

A 2016 paper by Australian scientists Albert Parker and Cliff Ollier suggests that the altimetry record suffers from errors larger than its trends, and “returns a noisy signal so that a +3.2 mm/year trend is only achieved by arbitrary ‘corrections’”.

“We conclude that if the sea levels are only oscillating about constant trends everywhere as suggested by the tide gauges, then the effects of climate change are negligible,” they write, “and the local patterns may be used for local coastal planning without any need of purely speculative global trends based on emission scenarios. Ocean and coastal management should acknowledge all these facts. As the relative rates of rises are stable worldwide, coastal protection should be introduced only where the rate of rise of sea levels as determined from historical data show a tangible short term threat. As the first signs the sea levels will rise catastrophically within a few years are nowhere to be seen, people should start really thinking about the warnings not to demolish everything for a case nobody knows will indeed happen.”

Clearly, history proved Noel Brown wrong.

In 2002, George Monbiot urged the rich to give up meat, fish and dairy, writing: “Within as little as 10 years, the world will be faced with a choice: arable farming either continues to feed the world’s animals or it continues to feed the world’s people. It cannot do both.”

In 2002, 908-million people worldwide suffered hunger. Ten years later, that number had declined to 805-million, according to the UN Food and Agricultural Organisation. Because of continued population growth, this nominal decrease represents a much larger decline in the prevalence of undernourishment, from 18.2% of the world’s population in 2002 to 14.1% in 2012. Hunger remains steadily on the decline. Famines, once so common, are rare nowadays.

Clearly, history proved George Monbiot wrong.

In 2008, the US television channel ABC promoted an apocalyptic “documentary” called Earth 2100, hosted by Bob Woodruff. The film cites a host of scientists, including such perennial alarmists as James Hansen, formerly head of Nasa’s Goddard Institute for Space Studies, and John Holdren, the US science czar who in the 1970s thought population control might be necessary to ward off mass starvation (see Prophets of doom in high places).

The show depicts the world at various times in the future, leading up to a collapse of civilisation “within this century, and perhaps your lifetime”. By 2015, it said, agricultural production would be dropping because of rising temperatures and the number of malnourished people “just continually grows”. We’ve already seen that the latter prediction proved to be false. Agricultural output also remains on a strong upward trend worldwide, and most of that is because of rising productivity, and not a rise in land use, irrigation, labour or other capital inputs.

A carton of milk would cost $12.99 by 2015, the film said, and a gallon of fuel would cost over $9. In reality, milk cost $3.39 and fuel cost $2.75 in 2015. Much of New York and surroundings would be inundated by rising sea levels, they said. Below is their 2008 map, and a current satellite view of New York from Google Maps. I’m no aerial surveillance analyst, but I don’t see any evidence that half of New York is under water.

Clearly, history proved Bob Woodruff and his famous scientific sources wrong. [...]

The ability to make accurate predictions is a hallmark of good science. Conversely, making false predictions implies either that a statement is not based on science, or that it is based on bad science.

So why is it that we continue to believe environmental doom-mongers, even though history has proven them wrong time and time again?

Oh, wait. It turns out we don’t believe them. Almost 10-million voters in a huge United Nations poll have ranked climate action dead last out of 16 concerns – below even the phone and internet access that all respondents would have enjoyed anyway if they were able to complete the online poll.

Those are the fruits of constant, shrill, exaggerated alarmism. Get proven wrong by history often enough, and four out of five ordinary people will stop believing you. And good for them, too. Perhaps environmentalists should mind the environment instead of trying to rule the world.

Actual growth in South East Asian fossil fuel consumption over the last twenty years, and projections of a near doubling of demand by 2040, indicate that developing nations won’t undermine their own development objectives in favour of decarbonisation.

On the 22nd of September, at the 34th Association of South East Asian Nations (ASEAN) Ministers of Energy Meeting (AMEM) in Myanmar, Dr Fatih Birol of the International Energy Agency (IEA) met energy ministers in the 5th AMEM-IEA Dialogue, a meeting that resulted in the following joint statement:

The ASEAN Ministers and the IEA Executive Director discussed current energy trends and their impact on the ASEAN region. In particular, the Ministers acknowledged the changing dynamics of the oil and gas markets, including the prolonged low oil price environment, the rebound in global oil and gas production unlocked by upstream technologies in oil and shale gas, and reduction in new investments in upstream exploration as well as volatilities arising from geopolitical factors and long-term market rebalancing. The Ministers looked forward to further analysis by the IEA, which will help ASEAN Member States in developing appropriate energy policies and strategies to respond to the changing energy landscape. ASEAN Ministers also welcomed the IEA’s new “open doors” policy towards deeper and wider-ranging collaboration with emerging economies in Southeast Asia.

While this is diplomatic language, it is hard to believe that Dr Birol will have been pleased with such a cheap-oil-and-gas response to his own presentation, in which he emphasized both “the shift in energy demand growth to Asia” and “the importance of taking advantage of [the] decreasing cost of renewable energy and energy efficiency.”

Far from wanting to restrain their consumption, the ASEAN ministers seem much more concerned with securing fossil fuels at advantageous terms, a pragmatism that should come as no particular surprise, least of all to Dr Birol. Indeed, read in the context of last year’s IEA South East Asia Energy Outlook special report Dr Birol’s speech sounds more like a plea of desperation than a realistic hope. It is a plea, however, that the energy ministers politely brushed aside. In their shoes Dr Birol would probably have done the same. He certainly knows the facts.

Reporting empirical growth data, the IEA observed that primary energy consumption in the ASEAN states has nearly trebled since 1990, growing from 233 million tonnes of oil equivalent (mtoe) a year in 1990 to 386 mtoe in 2000, and then by a further 50% to 594 mtoe in 2013. (For comparison, the UK’s annual primary energy consumption in 2015 was about 200 mtoe, down from 240 mtoe in 2005; see Chart 1.3 in the Digest of United Kingdom Energy Statistics, 2016.)

Southeast Asia’s rapidly rising energy demand is daunting in itself, but the outlook is still more remarkable. In the central, but arguably rather optimistic, New Policies Scenario, the IEA projects demand to increase by 80% up to 2040, reaching 1,070 mtoe per year. The bulk of this increase is provided by fossil fuels, which increase their share from 74% (437 mtoe) in 2013 to 78% (838 mtoe) in 2040.

While renewables grow, from 156 mtoe in 2013 to 223 mtoe in 2040, their share actually declines from 26% to 21%.
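The apparent paradox – renewables growing in absolute terms while their share shrinks – follows directly from the arithmetic. A minimal sketch, using only the mtoe figures quoted above from the IEA's New Policies Scenario, recomputes the shares:

```python
# Recompute the percentage shares quoted in the text from the IEA's
# mtoe figures (2013 actuals vs the 2040 New Policies Scenario).
def share(part_mtoe, total_mtoe):
    """Return a share as a whole-number percentage."""
    return round(100 * part_mtoe / total_mtoe)

demand_2013 = 594    # mtoe, ASEAN primary energy demand, 2013
demand_2040 = 1070   # mtoe, projected demand, 2040

fossil_2013, fossil_2040 = 437, 838  # mtoe
renew_2013, renew_2040 = 156, 223    # mtoe

print(share(fossil_2013, demand_2013))  # → 74
print(share(fossil_2040, demand_2040))  # → 78
print(share(renew_2013, demand_2013))   # → 26
print(share(renew_2040, demand_2040))   # → 21
print(round(100 * (demand_2040 / demand_2013 - 1)))  # → 80 (% growth)
```

Renewables grow by about 43% in absolute terms, but total demand grows by about 80%, so the renewables share falls even as the quantity rises.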

The IEA’s table is so striking that I reproduce it here in its entirety:

This is the fundamental truth behind the Paris agreement, echoing through the ASEAN ministers’ response to the IEA: aspirations are one thing, demands for affordable energy in the real world are another.

The “debate” over climate change — which is really, in Matt Ridley’s terms, a “war” absent any real debate — has potentially done grave harm to the scientific enterprise.

The question of whether climate change is produced by anthropogenic global warming (henceforth AGW) has triggered an increasingly contentious confrontation over the conduct of science, the question of what constitutes scientific certainty, and the connection between science and policymaking. In a world in which we seek to understand complex, multifaceted phenomena such as climate (and to extract from this knowledge appropriate policy responses) the enduring epistemological question arises: What do we know? Logical inquiry might be expected to help resolve this knowledge problem (Hayek 1945) but is confounded by the assertion that the “science is settled,” by condemnation of those who disagree as “deniers,” and even by proposals that they be prosecuted as RICO offenders. There is increasing talk on the left— and even among Democratic state attorneys general and the highest levels of the Obama administration—of criminalizing the very effort to rebut the climate change orthodoxy (Gillis and Schwartz 2015, Moran 2016).

What could have been a fruitful, albeit perhaps contentious debate over decisionmaking when addressing highly complex phenomena has degenerated into a prolonged contest. While recognizing the problems attending denial of climate change, our purpose here is to elucidate the limitations of the now-dominant view. We ground this view within a Kuhnian framework and suggest the limitations of that framework in understanding the uncertainties of climate change and policies that flow from it. Kuhn (1962) points to an often-repeated process whereby scientific paradigms become locked in and resist challenges to their validity because knowledge production is socially controlled and deeply invested in the political currents of the day.

Power relationships and vested interests have frequently played a critical role in determining what acceptable science is or is not. In contemporary parlance there is historical lock-in and path dependence: once there is commitment to a particular body of knowledge that relates to a particular course of action, the costs of change increase over time and even if one wishes to move to a different path, it is difficult to do so. This is not to say that it is impossible for dissenters from the standard accepted approach to get their views expressed in the standard academic journals, but it is clearly more difficult. Moreover, consistent with the concept of path dependency (Greif and Laitin 2004, Arthur 1989), once a scientific paradigm becomes locked in, it becomes increasingly difficult to challenge the status quo in the accepted scientific outlets, at least until challenges to the orthodoxy of the day become so compelling they cannot be ignored.

To be sure, sometimes change does take place in a relatively smooth fashion, as when Lavoisier’s description of oxygen led to the abandonment of Becher’s phlogiston theory of combustion. At other times, where long-held doctrine is at stake, the conflict over new ideas becomes brutal: Galileo was tried by the Inquisition, found guilty, and spent the rest of his life under house arrest. In all cases, time is involved and supporting facts must be provided before a new paradigm gains acceptance. Both Wegener’s 1915 theory of continental drift and Milankovitch’s 1912 theory of the relationship of climate cycles to earth-sun geometry were dismissed for many decades until new evidence was provided—the Wilson-Morgan-Le Pichon-McKenzie evidence for plate tectonics that was codified in 1965–67 and the Hays-Imbrie-Shackleton spectral analysis of ice core data that reinforced the idea of orbital forcing in 1976 (Hays, Imbrie, and Shackleton 1976). [...]

Conclusion

As awareness of the uncertainties of global warming has trickled out, polling data suggests that the issue has fallen down the American public’s list of concerns. This has led some commentators to predict “the end of doom,” as Bailey (2015) puts it. In light of this, it seems odd to keep hearing that “the science is settled” and that there is little, if anything, more to be decided. The global warming community still asks us to believe that all of the complex causal mechanisms that drive climate change are fully known, or at least are known well enough that we, as a society, should be willing to commit ourselves to a particular, definitive and irreversible, course of action.

The problem is that we are confronted by ideologically polarized positions that prevent an honest debate in which each side acknowledges the good faith positions of the other. Too many researchers committed to the dominant climate science position are acting precisely in the manner that Kuhnian “normal science” dictates. The argument that humanity is rushing headlong toward a despoiled, resource-depleted world dominates the popular media and the scientific establishment, and reflects a commitment to the idea that climate change represents an existential or near-existential threat. But as Ellis (2013) says, “These claims demonstrate a profound misunderstanding of the ecology of human systems. The conditions that sustain humanity are not natural and never have been. Since prehistory, human populations have used technologies and engineered ecosystems to sustain populations well beyond the capabilities of unaltered natural ecosystems.”

The fundamental mistake that alarmists make is to assume that the natural ecosystem is at some level a closed system, and that there are therefore only fixed, finite resources to be exploited. Yet the last several millennia, and especially the last two hundred years, have been shaped by our ability—through an increased understanding of the world around us—to exploit at deeper and deeper levels the natural environment. Earth is a closed system only in a very narrow, physical sense; it is humanity’s ability to exploit that ecology to an almost infinite extent that is important and relevant. In other words, the critical variables of creativity and innovation are absent from alarmists’ consideration.

In that sense, there is a fundamental philosophical pessimism at work here—perhaps an expression of the much broader division between cultural pessimists and optimists in society as a whole. Both Deutsch (2011) and Ridley (2015b) view much of the history of civilization as being the struggle between those who view change through the optimistic lens of the ability of humanity to advance, to solve the problem that confronts it and to create a better world, and those who believe that we are at the mercy of forces beyond our control and that efforts to shape our destiny through science and technology are doomed to failure. Much of human history was under the control of the pessimists; it has only been in the last three hundred years that civilization has had an opportunity to reap the benefits of a rationally optimistic world view (see Ridley 2010).

Yet the current “debate” over climate change—which is really, in Ridley’s (2015a) terms, a “war” absent any real debate—has potentially done grave harm to this scientific enterprise. As Ridley documents, one researcher after another who has in any way challenged the climate orthodoxy has met with withering criticism of the sort that can end careers. We must now somehow return to actual scientific debate, rooted in Popperian epistemology, and in so doing try to reestablish a reasonably nonpolitical ideal for scientific investigation and discovery. Otherwise, the poisoned debate over climate change runs the risk of contaminating the entire scientific endeavor.

Using automatic text generation software, computer scientists at Italy’s University of Trieste created a series of fake peer reviews of genuine journal papers. The research team was able to influence the peer review process in one in four cases by throwing fake reviews into the mix.

I’m guessing that the number of Power Line readers who take in the academic series Lectures in Computer Science doesn’t approach zero, but is zero, so we might put it in scientific notation this way: PLR(N=0)~0. In which case you would have missed this gem:

Abstract: Peer review is widely viewed as an essential step for ensuring scientific quality of a work and is a cornerstone of scholarly publishing. On the other hand, the actors involved in the publishing process are often driven by incentives which may, and increasingly do, undermine the quality of published work, especially in the presence of unethical conduits. In this work we investigate the feasibility of a tool capable of generating fake reviews for a given scientific paper automatically. While a tool of this kind cannot possibly deceive any rigorous editorial procedure, it could nevertheless find a role in several questionable scenarios and magnify the scale of scholarly frauds.

A key feature of our tool is that it is built upon a small knowledge base, which is very important in our context due to the difficulty of finding large amounts of scientific reviews. We experimentally assessed our method with 16 human subjects. We presented to these subjects a mix of genuine and machine generated reviews and we measured the ability of our proposal to actually deceive subjects’ judgment. The results highlight the ability of our method to produce reviews that often look credible and may subvert the decision.

Using automatic text generation software, computer scientists at Italy’s University of Trieste created a series of fake peer reviews of genuine journal papers and asked academics of different levels of seniority to say whether they agreed with their recommendations to accept for publication or not.

In a quarter of cases, academics said they agreed with the fake review’s conclusions, even though they were entirely made up of computer-generated gobbledygook — or, rather, sentences picked at random from a selection of peer reviews taken from subjects as diverse as brain science, ecology and ornithology.

“Sentences like ‘it would be good if you can also talk about the importance of establishing some good shared benchmarks’ or ‘it would be useful to identify key assumptions in the modeling’ are probably well suited to almost any review,” explained Eric Medvet, an assistant professor in Trieste’s department of engineering and architecture, who conducted the experiment with colleagues at his university’s Machine Learning Lab...

Mixing the fake reviews with real reviews was also likely to distort decisions made by academics by making weak papers appear far stronger thanks to a series of glowing reviews, the paper found.

The research team was able to influence the peer review process in one in four cases by throwing fake reviews into the mix, it said.
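The core trick the reports describe – assembling a plausible-sounding review from generic sentences sampled out of real reviews – can be sketched in a few lines. This is only an illustration of the idea, not the Trieste team's tool (which is built on a curated knowledge base); the corpus and function names here are invented:

```python
import random

# Toy corpus of generic review sentences, of the kind quoted above.
# A real corpus would be harvested from genuine reviews across fields.
REVIEW_SENTENCES = [
    "It would be good if you can also talk about the importance of "
    "establishing some good shared benchmarks.",
    "It would be useful to identify key assumptions in the modeling.",
    "The related work section should be expanded to cover recent results.",
    "The experimental evaluation is interesting but should report variance.",
]

def fake_review(n_sentences=3, seed=None):
    """Assemble a 'review' from randomly sampled generic sentences."""
    rng = random.Random(seed)  # seedable for reproducibility
    return " ".join(rng.sample(REVIEW_SENTENCES, n_sentences))

print(fake_review(seed=42))
```

Because each sentence is field-agnostic, the output reads as superficially relevant to almost any paper – which is exactly why mixing such text into a stack of genuine reviews can shift an accept/reject judgment.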

The London-based Global Warming Policy Forum is a world leading think tank on global warming policy issues. The GWPF newsletter is prepared by Director Dr Benny Peiser - for more information, please visit the website at www.thegwpf.com.


Welcome to Breaking Views

Breaking Views brings you expert commentary on topical political and policy issues. The views expressed are those of the author alone. The blog is administered by the New Zealand Centre for Political Research, an independent public policy think tank at NZCPR.com - register for the free weekly NZCPR newsletter HERE.