Paul Beckwith
Part-time professor, PhD student (abrupt climate change), Department of Geography
Location: University of Ottawa, in the hub next to the university bookstore

Description:

Not a typical January in Ottawa: 10 degrees C for several days one week, -30 the next, followed by 10 the week after that. Why?

Normally the high-altitude jet streams that circle the planet flow predominantly from west to east with little waviness. Weather is cold and dry northward of the jets (Arctic-sourced air) and warm and wet southward (sourced from the moist tropics and oceans). Now, and moving forward, the jets are extremely wavy, and as the crests and troughs of the waves sweep by us each week we experience massive swings in temperature. The extreme jet waviness is due to a very large reduction in the equator-to-Arctic temperature gradient, caused by an exponentially declining Arctic reflectivity as sea-ice and snow cover collapse (which greatly amplifies Arctic temperatures). Additional amplification is occurring due to rapidly rising methane concentrations sourced from sea-floor sediments and terrestrial permafrost.

Observed changes will accelerate as late-summer sea ice completely vanishes from the Arctic within a few years. The largest human impacts will be food supply shortages and increases in the severity, frequency, and duration of extreme weather events.

Tuesday, January 29, 2013

"Our greatest concern is that loss of Arctic sea ice creates a grave threat of passing two other tipping points -- the potential instability of the Greenland ice sheet and methane hydrates. These latter two tipping points would have consequences that are practically irreversible on time scales of relevance to humanity." [1] ....."We are in a planetary emergency." [2] - World renowned climate scientist Dr. James Hansen

The scientific community must be commended for its efforts to convey to the world the reality of climate disruption caused by carbon dioxide (CO2) emissions. That world is now grappling with the politics of whether effective reductions can be achieved in time. But there appears to be a new danger emerging from the Arctic which threatens to accelerate such disruption beyond the reach of any meaningful control. Cutting edge researchers in the field are observing large plumes of methane rising from the shallow seabeds. [3] Others are discovering heightened levels through airborne measurement. [4] According to the most recent Intergovernmental Panel on Climate Change (IPCC) report, methane is a global warming gas no less than 72 times more powerful than CO2. [5]

In the course of working on a documentary [6] on this super greenhouse gas and the frightening prospect that it is beginning to thaw and be released to the atmosphere in the Arctic, the author has encountered a highly disturbing "disconnect". On the one hand, there are highly eminent scientists - such as James Hansen - warning that the situation in the Arctic could well lead to the crossing of an "irreversible tipping point". On the other hand, such warnings are not finding their way into the major scientific reports which government policy makers will use to chart their response. They cannot be found in either the draft of the new IPCC report or the draft of the U.S. National Climate Assessment.

As global climate disruption begins to enter the realm of "tipping points of no return", humanity is coming face to face with a moral crisis inextricably linked to the physical crisis. Scientists are first and foremost human beings. If information is discovered that points to a real possibility that a given situation can abruptly escalate into an existential threat to human survival, there is a profound moral responsibility to issue a loud and unambiguous warning.

There are now several indicators that the factors which could generate such a threat are indeed lining up in the Arctic - such as the ice collapse and the loss of solar reflectivity that will only accelerate further Arctic warming. At this time however, other than a handful of notable exceptions, the scientific community as a whole is utterly failing to issue such warning. This essay is an attempt to grapple with what might be the systemic reasons for such failure. Although the author is not a scientist and addresses the issue from outside that frame of reference, reasons are provided for why such may be an advantage rather than a disadvantage.

The climate science community around the world is performing a tremendous service to humanity. As climate disruption continues to escalate and the threat to our society becomes more grave, its members have worked long hours - in many cases on their own time - to gather the relevant data. As only one example, documentary-related exchanges between the author and scientists working in the Arctic make clear that much personal hardship and sacrifice are being endured in order to conduct such research.

It is well known that in preparation for their work, scientists are taught to exercise great caution in reporting their findings and never stray beyond that for which there is incontrovertible evidence. Even when it appears evidence is present, it is the time-honored tradition of science to still submit any conclusions drawn from such to their peers for review.

In almost every case, this strict methodology has well served the public interest. It has filtered out errors and made solid information available to the public and policy makers. But in the arena of global climate disruption, humanity is now facing something unique and quite unparalleled by any other issue. There is a point in the process of climate de-stabilization where colossal natural forces can be unleashed which are capable of developing their own unstoppable momentum and spiralling well beyond the reach of human control. When such a point is crossed, we have the already mentioned "irreversible tipping point".

Though a term used frequently in discussions on climate, its full meaning and magnitude have rarely been taken to heart. Far too often, it is simply another "buzzword" dropped into an article and treated only in the most superficial way. Indeed, such usage seems almost to "anesthetize" us to the horrific reality it points toward. In truth, the crossing of some kinds of tipping points can lead to the crushing of our entire civilization on no less of a scale than nuclear war. The devastation can be so sweeping that the concept of "adaptation" becomes meaningless. "Irreversible" refers to the brutal fact that once humanity allows this process to become triggered, there will be no chance to go back, no chance to learn from our mistakes and correct them.

Of the several tipping point scenarios which are possible, one considered especially frightening is the prospect of triggering an abrupt and large scale methane release in the shallow seabeds along Arctic coastlines. The entire climate debate has been dominated by a discussion of humanity's contribution to the problem - which has been the emission of carbon dioxide since the beginning of the industrial age. What science has discovered is that nature has its own vast storehouse of ancient carbon trapped in the ice of the polar regions.

The scenario of most concern to methane "specialists" is what's known as the "runaway feedback" reaction. As described by Dr. Ira Leifer of the Marine Science Institute at the University of California: "A runaway feedback effect would be where methane comes out of the ocean into the atmosphere leading to warming, leading to warmer oceans and more methane coming out, causing an accelerated rate of warming in what one could describe as a runaway train."[7] A cycle would be initiated which feeds upon itself and therefore becomes unstoppable.
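Leifer's "runaway train" is, in physical terms, a positive feedback loop whose gain exceeds one. A deliberately minimal toy model (the gain values below are illustrative assumptions, not measured climate sensitivities) shows the qualitative difference between a damped feedback and a runaway one:

```python
def feedback_series(gain, steps=20, pulse=1.0):
    """Toy linear feedback: an initial pulse of warming releases methane
    that causes `gain` times as much further warming, and so on."""
    total, increment = 0.0, pulse
    history = []
    for _ in range(steps):
        total += increment
        increment *= gain  # warming triggered by the previous increment
        history.append(total)
    return history

damped = feedback_series(0.5)    # settles near twice the initial pulse
runaway = feedback_series(1.2)   # grows without bound: the "runaway train"
```

With a gain below one, the geometric series converges and the system finds a new equilibrium; above one, each cycle releases more methane than the last and the total warming diverges.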

When one looks at the history of the most devastating "wipeouts" of life on earth - such mass extinction events as the PETM or the end-Permian - it is sobering to learn that large scale release of methane has been pointed to as a "probable cause". World-renowned climate science pioneer Dr. James Hansen relates methane to the PETM extinction event:

"There have been times in the earth's history when methane hydrates on the continental shelves melted and went into the atmosphere and caused global warming of six to nine degrees Celsius, which is 10 to 18 degrees Fahrenheit." [8]

The end-Permian extinction was the most colossal mass extinction event, wiping out over 90% of the life forms on earth. According to paleontologist Michael Benton - considered by some to have written the definitive book on this event (When Life Nearly Died):

"Normally, long-term global processes act to bring greenhouse gas levels down. This kind of negative feedback keeps the Earth in equilibrium. But what happens if the release of methane is so huge and fast that normal feedback processes are overwhelmed? Then you have a "runaway greenhouse"...... As temperatures rise, species start to go extinct. Plants and plankton die off and oxygen levels plummet. This is what seems to have happened 251 million years ago."[9]

While devastation on this level is inherently difficult to grasp, one attempt to convey such is provided in the documentary "Miracle Planet". [10] Though absolute certainty on causation may not be attainable, just the possibility that our society may be triggering a force with this kind of power is mind-boggling enough.

Could anything of this unspeakable magnitude be triggered by thawing methane in the Arctic? Again Ira Leifer: "The amount of methane that’s trapped under the permafrost and in hydrates in the Arctic areas is so large that if it was rapidly released it could radically change the atmosphere in a way that would be probably unstoppable and inimical to human life."[11] Hansen adds: "It is difficult to imagine how the methane clathrates could survive, once the ocean has had time to warm. In that event a PETM-like warming could be added on top of the fossil fuel warming."[12] Dr. Hansen - along with Arctic ice and methane experts - addresses this issue in more depth in a documentary co-produced by the author. [13]

Methane plumes rising from the seafloor

In examining the key reports being made by the scientific community - such as drafts for the new IPCC document and the U.S. National Climate Assessment - one might expect there to be dire warnings about a potential "point of no return" if these forces are unleashed. We do find a discussion of the various consequences of climate disruption that are hitting right now - Arctic and glacial melt, extreme weather, more powerful hurricanes and storms, increases in drought, food shortages, wildfires, and flooding. But where is the discussion of what these symptoms of disruption are leading to? One of the most frightening spectres looming over humanity - the tipping point of a methane "runaway" - is completely ignored.

How can this be possible? Several factors may be combining. As stated earlier, scientists are trained to only make statements based on "hard evidence". In the case of a potentially abrupt methane runaway, it is not possible to pinpoint a specific moment in time when such may be initiated. It cannot be stated with certainty whether this will happen in 2017, 2027, or 2037. Without this ability to pinpoint and quantify, the response of science has been to simply not address it.

Secondly, scientists are human. All of us have great difficulty in truly facing and absorbing the full implications of a complete collapse of human society and even a wiping out of most or all life on the planet. It is human to utilize methods of psychological denial to block out such a staggeringly horrific threat to our collective existence. In this instance, scientists are no different. Unless there is "absolute proof" staring us in the face, the overwhelming tendency is to push such thoughts out of our consciousness so we can "get on with our day".

A third factor is that the current methodology for the reporting of climate science is fundamentally flawed and dysfunctional in regard to the challenge at hand. The single most important such report is that issued by the Intergovernmental Panel on Climate Change (IPCC). It is based on a consensus process and an intensively time-consuming level of peer review. Under normal circumstances, such would be seen as positives. But in a situation where humanity may come under severe threat in the very near future, these reports are essentially looking backward at where science has been for the past seven years rather than reporting the cutting-edge trend lines. A glaring example is how completely the IPCC missed in predicting the speed at which the Arctic would melt.

The tremendous danger with this situation is that by the time any kind of "absolute proof" is gathered, it will very likely be too late to stop the conditions bringing on the dreaded runaway reaction. An unspeakably terrifying process will already have been set into motion and humanity at that point will be helpless to stop it. Temperatures on earth could eventually skyrocket to a level where mass famine is initiated.

What are the potential solutions to these terrible problems? One would be procedural. A special section of such reports should be dedicated to communicating the work of those scientists whose research is on the cutting edge of dealing with potentially huge climate impacts and yet still "in progress" in terms of gathering the relevant data. For example, even though the precise timing of a methane runaway cannot be predicted, there should be a report on the trend lines and the extent to which the conditions that could bring on a runaway are manifesting. If those conditions are lining up to a considerable extent, then an appropriate warning should be issued.

A second corrective step is more related to basic philosophy and morality. At the Earth Summit in Rio in 1992, representatives of the world's nations agreed to apply the precautionary principle in determining whether an action should go forward to prevent irreversible damage to the environment. Principle #15 of the Rio Declaration [14] states that:

"In order to protect the environment, the precautionary approach shall be widely applied by States according to their capabilities. Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation."

In a further treatment of the meaning of the precautionary principle, it has been stated that there must be

"a willingness to take action in advance of scientific proof of evidence of the need for the proposed action on the grounds that further delay will prove ultimately most costly to society and nature, and, in the longer term, selfish and unfair to future generations."[15]

A methane runaway in the Arctic more than meets the criteria of being a "threat" which can bring about "irreversible damage". Due to its potential to create an unstoppable wave of continually rising temperatures capable of initiating something as horrendous as a mass extinction event, its irreversible damage could be of the most frightening magnitude imaginable. If humanity failed to recognize this danger and allowed its occurrence, such would most certainly constitute an irreparable crime against future generations. The most fundamental tenet of human morality demands that in such a unique situation the scientific community act on the basis of the precautionary principle and issue the appropriate warning.

In his powerful book on the ethics of nuclear war, Jonathan Schell wrote: "To kill a human being is murder, but what crime is it to cancel the numberless multitude of unconceived people?"[16] In the words of the ecological ethicist David Orr: "Climate destabilization, like nuclear war, has the potential to destroy all human life on Earth and in effect 'murder the future' ... Willfully caused extinction is a crime that as yet has no name."[17]

Is it possible that the very pillar of science which has served our society so well - the uncompromising demand for incontrovertible "evidence" - has in this unprecedented current crisis become a dangerous obstruction? Is it possible that this requirement of absolute "proof" is creating a perceptual blindness that could pave the way for the most horrendous suffering in the history of civilization?

To the scientists who may read this essay, an appeal is made in the name of our collective humanity to truly confront and grapple with the meaning of the term "irreversible" and weigh the potentially horrific consequences of silence. It is no violation of scientific "objectivity" to look at trend lines and determine whether their continued trajectory might well carry our civilization over the cliff. And if this possibility is there, is there not a profound moral obligation under the precautionary principle to issue a loud and unambiguous warning to humanity?

Before arriving at an answer, the reader - and especially any member of the scientific community - is invited to view a powerful film that is simply entitled "HOME". [18] In this artistic masterpiece, images of life on earth convey beyond the reach of words the incredible magnificence of what will be lost if climate disruption is allowed to escalate into an unstoppable "wipe-out". It also describes the methane lurking in the Arctic as a "climatic time bomb". Please watch and please speak out before it is too late. As James Hansen says: "We are in a planetary emergency."

The researchers observed more methane in the water under the sea ice than in the air above the sea ice. They conclude that the sea ice collects and holds the methane in places close enough to the surface for the methane to be consumed through photochemical and biochemical oxidation. In other words, sufficient light can reach the spots where methane assembles underneath the ice for the methane to be consumed by biological processes.

Change of CH4 mixing ratio over time in the chamber on under-ice water. In CBS1–CBS3, CH4 mixing ratios fluctuated in the first 1–2 h and then increased, suggesting significant emission of CH4. The absence of an increasing trend in CBS4 might be due to the relatively short sampling time (only 2 h), during which the CH4 mixing ratio had not yet started increasing.

A study by the Alfred Wegener Institute led by Marcel Nicolaus has found that where melt water collects on the ice, far more sunlight and therefore energy is able to penetrate the ice than is the case for white ice without ponds. The consequence is that the ice is absorbing more solar heat, is melting faster, and more light is available for the ecosystems in and below the ice.

As the sea ice has decreased in volume over the years, there now is mainly thin, first-year ice, extensively covered with melt ponds in the summer months, where once metre-thick, multi-year ice used to be. Additionally, the melt ponds have a different color, causing further albedo change. “Their colour depends entirely on how thick the remaining ice below the melt pond is and the extent to which the dark ocean beneath can be seen through this ice. Melt ponds on thicker ice tend to be turquoise and those on thin ice dark blue to black”, explains Dr. Marcel Nicolaus, sea ice physicist and melt pond expert at the Alfred Wegener Institute.

Graphic depiction of the amount of sunlight above and underneath the Arctic sea ice. The growing coverage of the ice by darker melt ponds increases the share of sunlight which passes through the sea ice. That means the space underneath the ice becomes brighter and warmer. Furthermore, less sunlight is reflected back into the atmosphere. Graphic: Marcel Nicolaus/Yves Nowak, Alfred Wegener Institute

Marcel Nicolaus explains: “The young thin ice with the many melt ponds does not just permit three times as much light to pass through as older ice. It also absorbs 50 per cent more solar radiation. This conversely means that this thin ice covered by melt ponds reflects considerably fewer sun rays than the thick ice. Its reflection rate is just 37 per cent. The young ice also absorbs more solar energy, which causes more melt. The ice melts from the inside out to a certain extent.”

Marcel Nicolaus adds: “The greater the share of one-year ice in the sea ice cover, the more melt ponds will form and the larger they will be. This will also lead to a decreasing surface albedo (reflectivity) and transmission into the ice and ocean will increase. The sea ice will become more porous, more sunlight will penetrate the ice floes, and more heat will be absorbed by the ice. This is a development which will further accelerate the melting of the entire sea ice area.”

These studies contain an important warning. As the sea ice gets thinner, more sunlight and therefore energy is penetrating the ice and getting absorbed. Initially, this will increase the growth of bacteria that break down methane collecting underneath the sea ice. Eventually however, as sea ice retreats further, there will be fewer opportunities for methane to be held underneath the sea ice and broken down by bacteria. Instead, more methane will then enter the atmosphere unaffected.

These are further feedbacks of sea ice retreat, in addition to the many feedbacks described in the Diagram of Doom. Sea ice is declining at an exponential pace. The big danger is that a huge rise of temperatures in the Arctic will destabilize huge amounts of methane currently held in the seabed. Comprehensive and effective action is needed now to avoid catastrophe.

How does the current situation in the Arctic compare to times back in history when temperatures were high, in particular the Eemian interglacial (130 000 to 115 000 years ago)? “Our data show that it was up to eight degrees Celsius warmer during the Eemian interglacial in North Greenland than today”, says project leader Prof. Dorthe Dahl-Jensen from the University of Copenhagen in a recent news release.

As has been described in earlier posts at this blog, the current speed of change is unprecedented in history, and this is destabilizing the Arctic and threatening to unleash huge amounts of methane from the seabed and escalate into runaway warming. Comprehensive and effective action is therefore desperately needed.

Views on this from other people follow below.

Paul Beckwith:

Basic premise about stability of Greenland ice sheet is wrong. In previous interglacials summers were much hotter but winters were much colder (more extreme seasonality); this helped maintain both sea ice and Greenland ice. CO2 and CH4 stayed within narrow bands much lower than now. Now, the reason for melt is completely different. GHGs much higher now; temps higher year round, so recovery less robust in winter. Before, both troposphere and stratosphere warmed; now the troposphere is warming like crazy and the stratosphere cooling. Lapse rates did not change much before; now the lapse rate is lower, so more warming higher up (recall the extreme melt at 3,100 m on the Greenland peak; in fact over the entire Greenland surface, well, 97% of it). Daily lows much higher now than before due to GHG trapping at night; not the case before. GHGs concentrated more at the pole since the tropopause is only 7 km high there (compared to CH4 emissions from wetlands near the equator, where the tropopause is 17 km high; over the Greenland summit it is only 4 km up to the tropopause). Also more black carbon now to kill snow/ice albedo. Not looking too good for the home team (us).

Also, with a warmer upper troposphere from the reduced lapse rate and a colder stratosphere now (not so in previous interglacials), it is no surprise that SSW (sudden stratospheric warming) events are occurring more frequently...

The image below was part of last year's post Polar jet stream appears hugely deformed. The image highlights that Arctic sea ice minimum volume in 2012 was only 19.3% of what it was in 1979. The background image, prepared by Wipneus, shows an exponential trend projecting a 2013 minimum of only 2000 cubic km of sea ice, with a margin of error that allows Arctic sea ice to disappear altogether this year, i.e. in less than six months' time.
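As a rough back-of-envelope illustration of the quoted numbers (the 1979 volume below is an assumed value, used only so the 19.3% ratio can be applied), even the simplest constant-rate exponential through the 1979 and 2012 endpoints projects a further drop for 2013; fits like Wipneus's, which weight the accelerating decline of recent years, project lower still, hence the 2000 cubic km figure:

```python
v_1979 = 16_900.0        # assumed 1979 minimum volume, cubic km (illustrative)
v_2012 = 0.193 * v_1979  # 19.3% of the 1979 value, as quoted above

# Constant annual decay factor implied by the two endpoints, 33 years apart.
decay = (v_2012 / v_1979) ** (1 / 33)

v_2013 = v_2012 * decay  # one more year at the same constant rate, ~3100 km3
```

A constant-rate fit is the gentlest possible exponential; any fit that captures the steepening of recent years lands below it.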

by Stephen Salter, University of Edinburgh, and Alan Gadian, University of Leeds.

Background

In 1990 John Latham [1] suggested that the Twomey effect [2] [3] could be used to slow or reverse global warming by increasing the reflectivity of clouds. Reflectivity depends on the size distribution of drops. For the same amount of liquid water a large number of small drops is whiter than a smaller number of big ones. Latham suggested the release of submicron drops of filtered sea water into the marine boundary layer below or near mid-oceanic stratocumulus clouds in regions where the concentration of cloud condensation nuclei is low and cloud drops are large. Evaporation would produce salt residues which are excellent cloud condensation nuclei. Turbulence would disperse them through the marine boundary layer. They would increase the number but reduce the size of drops in the cloud. Twomey suggested that, for many cloud conditions, a doubling of the number of nuclei would increase cloud top reflectivity by about 0.058.
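Twomey's figure of about 0.058 for a doubling follows from the standard approximation for the albedo susceptibility of a non-absorbing cloud, Δα ≈ α(1−α)/3 · ln(N₂/N₁); the cloud albedo of 0.5 used below is an illustrative assumption. The same expression reproduces the 0.168 reflectivity increase quoted later for raising nuclei from 50 to 375 per cubic centimetre:

```python
import math

def twomey_albedo_change(n1, n2, albedo=0.5):
    """Approximate change in cloud-top albedo when the cloud condensation
    nuclei concentration rises from n1 to n2 (non-absorbing cloud)."""
    return albedo * (1.0 - albedo) / 3.0 * math.log(n2 / n1)

doubling = twomey_albedo_change(100, 200)  # ~0.058, Twomey's doubling figure
hadley = twomey_albedo_change(50, 375)     # ~0.168, the stronger stimulus
```

Note the logarithm: each successive doubling of nuclei buys the same albedo increment, which is why very clean mid-ocean regions with low initial concentrations are the most attractive spray sites.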

Several independent climate models [4], [5], [6] show that the amount of spray that would be needed to reverse the thermal effects of changes since pre-industrial times is quite small, of the order of 10 cubic metres a second for the whole world. The thermal effects of double preindustrial CO2 concentration would still be manageable. Furthermore, the technique intercepts heat flowing from the tropics to the poles and so cools them no matter where the spraying is done. It should therefore be possible to preserve Arctic ice. Local control and rapid response may allow thermal protection of coral reefs. Design of wind-driven vessels and spray equipment is well advanced [7].

The aim of this proposal is to identify and quantify potential side-effects of marine cloud brightening. We want to produce an everywhere-to-everywhere transfer-function of spray quantity with regard to temperature, precipitation, polar ice, snow cover and vegetation using several leading climate models in parallel. This should especially show the times and places at which spraying should NOT be done. The technique involves changing the concentration of condensation nuclei at many spray regions round the world according to coded sequences unique to each region and correlating this sequence with model results at observing stations round the world. A first test on a set of 16 artificial changes with different magnitudes to a real 20-year temperature record showed that the magnitude of each change could be detected to 1% or 2% of the standard deviation. This is better than many thermometers. Confidence has been boosted by the PhD project carried out by Ben Parkes at Leeds who has shown that the effects on precipitation are bi-directional.
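The coded-sequence idea can be sketched as a small system-identification experiment (every number below is invented for illustration): give each spray region its own pseudo-random ±1 modulation code, mix the regions' unknown effects into one noisy "station record", and recover each region's effect by correlating the record against that region's code:

```python
import numpy as np

rng = np.random.default_rng(0)
n_regions, n_steps = 16, 2000

# Each spray region follows its own pseudo-random +/-1 modulation code.
codes = rng.choice([-1.0, 1.0], size=(n_regions, n_steps))

# Invented "true" sensitivities of one observing station to each region.
true_gain = rng.normal(0.0, 1.0, n_regions)

# The station record mixes every region's effect with weather noise.
record = true_gain @ codes + rng.normal(0.0, 5.0, n_steps)

# Correlating the record with each region's code isolates that region's
# contribution, because independent random codes are nearly orthogonal.
recovered = codes @ record / n_steps
```

Even with noise several times larger than any single region's effect, the recovered gains track the true ones closely, which is the sense in which the text's detection "to 1% or 2% of the standard deviation" is plausible given a long enough record.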

The technique may let us steer towards beneficial climate patterns if only the world community can agree what these are.

The differences between climate models may point to general model improvements for which there is plenty of room.

As well as humanitarian benefits the project may lead to better understanding of atmospheric physics and teleconnections.

Previous work

One of the early attempts at the identification of side effects was in 2009 by Jones, Haywood and Boucher of the Hadley Centre [8]. They picked three regions representing only 3.3% of the world ocean area and raised the concentration of cloud condensation nuclei to 375 per cubic centimetre everywhere in the regions from initial values of 50 to 300. The regions were off California, off Peru and off Angola / Namibia. These are labelled NP for North Pacific, SP for South Pacific and SA for South Atlantic in figure 1, top left. These areas usually have good conditions for cloud cover and solar input. Parts close to the coast have rather high nuclei concentrations. They are good but by no means the only suitable sites for cloud spraying. The increased nuclei concentration was held steady regardless of summer/winter, monsoons or the phase of the El Niño Southern Oscillation. The resulting global cooling for the separate regions was 0.45, 0.52, and 0.34 watts per square metre, giving a mean annual total of 1.31 watts per square metre. However, if all of the regions sprayed together all of the time, the 3.3% of ocean area would cool a little less, 0.97 watts per square metre. Even the lower amount of cooling would be a substantial fraction of the widely-accepted increase of 1.6 watts per square metre since preindustrial times.

Present global climate models are good at predicting temperature but are less accurate for precipitation, ice and snow. They cannot predict cloud cover, hurricanes or flood events. Climate change with no geo-engineering is already producing extreme floods in Pakistan and Queensland with droughts in South Australia, the Horn of Africa and the United States.

The Jones, Haywood and Boucher results show that albedo control can both increase and reduce precipitation far from the spray source, even in the opposite hemisphere. Spray from California (NP), shown in the top right of the figure, can nearly double rainfall in South Australia. Angola/Namibia (SA) gives a useful increase, lower left, in Ethiopia, Sudan and the Horn of Africa. But most attention was given to the 15% reduction over the Amazon. Perhaps Brazilians watching recent television footage of dying children in Ethiopia and Sudan would be glad to have their own rainfall reduced to 2000 mm a year when necessary.

Figure 1. The separate effects of the spray regions in Jones Haywood and Boucher 2009.

Figure 2. The combined effect of all three spray sources of figure 1. This slide appeared on its own with no indication of the spray regions used and could imply that Amazon drying is the result of spray anywhere.

If all three regions in figure 1 spray simultaneously and continuously we get the result in figure 2. The combination is not the sum of the parts. The reduction in the Amazon is there but less marked. There are useful increases in Australia and in the Horn of Africa. The reduction in precipitation in South West Africa caused by the South Atlantic spray region has vanished. Jones et al. did not test other source positions, spray rates or seasonal variations relative to the monsoons.

More recent work by Gadian and Parkes at Leeds [9] used the coded modulation of the nuclei concentration of 89 spray sources of roughly equal area round all the oceans. They then correlated the individual sequences with the resulting weather records round the world. The modulation was done by multiplying or dividing initial nuclei concentration values by a factor chosen initially as 1.5. Because of the logarithmic behaviour of the Twomey equation this alternation should have had a low overall effect. The factor of 1.5 is a much weaker stimulus than an increase of 50 to 375 nuclei per cubic centimetre which would increase reflectivity by 0.168.

Figure 3. An example of the effect of all spray regions on two places in the Amazon. Drying from spray in the South Atlantic as predicted by the Hadley Centre is evident but could easily be countered by spray from many other regions, especially from south of the Aleutians.

The results in figure 3 show that, as well as the spray sources used by Jones et al., there are many other spray sources which will either increase or reduce precipitation in the Amazon. The two regions in the Amazon basin are shown black. Red shows sites which would increase precipitation at the black site and blue shows a reduction. The Amazon increases from the red spray sources off California and Peru are in agreement with the 2009 Hadley Centre result. The strongest blue in (b) off Namibia and the weaker blue off Angola in (a) are also in agreement. But the great majority of spray sites, particularly the one in (b) off Recife, show increases in the Amazon precipitation. The analysis will show maps like these for every observing station of interest. This could amount to many hundred maps depending on the resolving power of the climate models.

It is also possible to show the transfer function of each spray site on target regions on land all round the world. Figure 4 shows a sweep of spray sources along the east sides of the North and South Atlantic. There are alternating effects in South America and Australia. Spraying between the English Channel and Labrador has little effect in the Amazon or Australia but the next region south increases rain in both. The Atlantic coast off Mauritania further increases Amazon precipitation, gives weaker precipitation in eastern Australia but dries the west. A block from Liberia to Nigeria has little effect on either the Amazon or Australia but is close to where hurricanes begin. Angola confirms the Hadley Centre drying of the south Amazon but not the north. Namibia reverses this. Spray off the Cape of Good Hope increases rainfall in both regions of the Amazon but the effect fades as we spray from further south. Spray further south increases rain in the Indian sub-continent and Japan.

The maximum swings are 0.0006 mm per day for each percentage variation of the initial nuclei concentration. This means that for the 100% nuclei increase needed to give a reflectivity increase of 0.057, the annual precipitation change would be 0.0006 x 365 x 100 mm = 21.9 mm per year. This is much smaller than the precipitation changes indicated by the Hadley Centre, but the size of the individual spray regions is also somewhat smaller. The Hadley Centre increase from 50 nuclei per cubic centimetre in a clean spray region to 375 is a stimulus 15 times stronger, giving a reflectivity increase of 0.168. The offshore edge of the Hadley test regions would have presented an impossibly steep slope of nuclei concentration. Perhaps climate systems react just as badly to sharp changes as engineering components under stress.
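Both figures in this paragraph can be checked directly; a minimal sketch using the same condensed Twomey relation:

```python
import math

# Maximum observed swing per percentage point of nuclei variation.
swing_per_percent = 0.0006  # mm per day

# Annual precipitation change for a 100% nuclei increase.
annual_mm = swing_per_percent * 365 * 100
print(round(annual_mm, 1))  # 21.9 mm per year

# A 100% increase is a doubling, and ln(2)/12 gives the quoted
# reflectivity rise of about 0.057.
print(round(math.log(2) / 12, 4))  # 0.0578
```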

The full Parkes thesis can be downloaded from [9].

Figure 4. A sweep of spray regions along the east side of North and South Atlantic show cyclical effects on precipitation in South America and Australia. Ben Parkes’ work provides 89 such maps.

The symbols in figure 5 show the scatter of precipitation results at an observing station in the Arabian region from 8 runs with different sequences from 89 spray sites. Blue bars show standard deviations. A low scatter implies reliable operation of the technique but is not universal. While the general trend is towards slightly more precipitation, there are changes in both directions with less scatter in the wetter direction.

Figure 5. If the results of perturbations from separate runs with different code sequences show a large scatter we can deduce that the technique is not working well for that combination of source and observing station.

Work programme

Because of the poor representation of precipitation in global climate models and in the absence of any better prediction method, we want to use a multiple approach with at least five different climate models driven by different research groups attempting the same jointly agreed objectives but with some freedom to follow interesting results. Suggestions for the central questions which should be tackled by all groups are as follows. They must be debated and approved but then adhered to.

Correlation lag. The changes to weather are not immediate so we should include a time lag between the release and the autocorrelation period. There may also be several time lags with different durations. We can make good estimates by choosing a plausible guess for the response period, driving all or a subset of spray sources in unison to add a sinusoidal component at that period to the nuclei concentration, running the climate model, subtracting the mean offset at each observing station round the world and multiplying the mean response by the sine and cosine signals. This will produce two offset means. The tangent of the phase lag of the response at each observing station will be the cosine offset divided by the sine offset. Repeating the process for various periods will allow the choice of correlation lag for each observing station. The amplitude and phase of the response as a function of period will give an interesting insight into the important climate system processes but does not allow the separation of effects from individual spray sources as is possible with coded modulation.
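The sine and cosine multiplication described above is standard lock-in (quadrature) detection. A minimal sketch on a synthetic response whose lag is known in advance; the 40-day period and eighth-period lag are invented for illustration:

```python
import math

def phase_lag(response, period, dt=1.0):
    """Estimate the phase lag of a periodic response by multiplying it
    (mean removed) by sine and cosine at the driving period.
    A negative result means the response lags the drive."""
    n = len(response)
    mean = sum(response) / n
    s = c = 0.0
    for i, r in enumerate(response):
        w = 2 * math.pi * (i * dt) / period
        s += (r - mean) * math.sin(w)
        c += (r - mean) * math.cos(w)
    # tangent of the lag is the cosine offset divided by the sine offset
    return math.atan2(c, s)

# Synthetic climate response: sinusoid delayed by an eighth of a period.
period, lag = 40.0, 40.0 / 8
signal = [math.sin(2 * math.pi * (t - lag) / period) for t in range(400)]
print(round(phase_lag(signal, period), 3))  # -0.785 rad, i.e. pi/4 lag
```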

Coded modulation sequences. Random number generators can, by chance, produce short groups with abnormal auto-correlation. Andrew Jarvis at Lancaster can give sequences without these. When God made random sequences He made a great many so we can all use different ones but it will be interesting to compare results of the same sets of sequences in different models.
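The screening can be sketched as rejecting candidate coin-toss sequences that contain abnormally long repeated runs; this particular filter is an illustrative assumption, not Jarvis's actual method:

```python
import random

def sequence_ok(seq, max_run=6):
    """Reject sequences with abnormally long runs of repeats
    (the max_run threshold is an illustrative choice)."""
    run = 1
    for a, b in zip(seq, seq[1:]):
        run = run + 1 if a == b else 1
        if run > max_run:
            return False
    return True

def make_sequence(length, seed):
    """Draw candidate coin-toss sequences until one passes the filter."""
    rng = random.Random(seed)
    while True:
        seq = [rng.randint(0, 1) for _ in range(length)]
        if sequence_ok(seq):
            return seq

seq = make_sequence(100, seed=1)
print(len(seq), sequence_ok(seq))  # 100 True
```

Seeding the generator gives each modelling group its own reproducible sequence, so that results of the same sets of sequences can be compared across models.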

Change-over period. The encoding sequence can be seen as a series of coin tosses. Each toss decides whether the spray/not spray mode should be reversed or left alone. If the coin is tossed too frequently then the weather system will not have time to respond. But, if the intervals are too long, the length of the computer run needed to get a reasonably low scatter will be expensive. Perhaps initially the very shortest change-over period should be about the time for which reliable forecasts can be made, perhaps ten days. At each possible change-over point there will be a 50% chance of no change, a 25% chance of two ‘no-change’ events in a row, a 12.5% chance of three, and so on. Carbon emissions vary over a weekly cycle and the release of decay gases and di-methyl sulphide from seaweed can be related to the 28 day tidal cycle so we must avoid being phase-locked to these periods. Parkes used a change-over period of 10 days for computational efficiency and a case can be made for 20 days. We should later extend the changeover period but not to the point where too many changes spread across a monsoon period.
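The coin-toss encoding with a fixed change-over interval can be sketched as follows; the 10-day interval follows Parkes:

```python
import random

def spray_schedule(n_tosses, changeover_days=10, seed=0):
    """Build a daily on/off spray sequence: at each change-over
    opportunity a coin toss decides whether to reverse the current
    spray/not-spray mode or leave it alone."""
    rng = random.Random(seed)
    mode, days = 1, []
    for _ in range(n_tosses):
        if rng.random() < 0.5:
            mode = 1 - mode           # reverse the mode
        days.extend([mode] * changeover_days)
    return days

sched = spray_schedule(36, changeover_days=10)  # roughly one year
print(len(sched))  # 360 days
```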

Monsoon season. All the work so far has used continuous spray through the year. It might occur to even a naïve engineering person that the monsoon seasons could possibly have an effect on patterns of precipitation and evaporation. This means that we should do separate correlations and calculate separate transfer functions according to the monsoon phase. If the technique shows promise and computing time is available we may be able to resolve transfer functions down to monthly levels provided that we can get resources to allow the use of high resolution models.

Spray amplitude. The susceptibility of an ocean spray region is a function of low cloud, clean air, incoming solar energy and perhaps the wind to drive spray vessels and disperse spray. Large spray volumes in what initially appear to be regions of high susceptibility will reduce that susceptibility. A lower dose over a wider region will be more effective. Multiplying and dividing the initial nuclei concentration value by 1.5 was a quite small perturbation but we do not know that it is the best choice. A sweep over multiplying and dividing amplitudes of 1.25, 1.5, 2, 3 and 4.5 will help us choose the best spray amplitude(s) for later work.
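Under the condensed Twomey relation (reflectivity change equals the natural log of the concentration ratio divided by 12), the proposed sweep corresponds to the following peak-to-peak reflectivity swings between the multiplied and divided states:

```python
import math

# Peak-to-peak swing between N0*f and N0/f is ln(f*f)/12 = 2*ln(f)/12.
for f in (1.25, 1.5, 2, 3, 4.5):
    print(f, round(2 * math.log(f) / 12, 4))
```

The sweep therefore spans swings from roughly 0.04 to 0.25, bracketing the Hadley Centre figure of 0.168.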

Spray asymmetry. The Twomey results can be condensed into an equation which says that the change in reflectivity is 1/12 of the natural log of the ratio of nuclei concentrations. The log term led to the decision to multiply and divide initial nuclei concentration values by a constant rather than the usual addition of some chosen amount. It was intended to cancel the mean thermal effects in other regions. However it may be that the multiplier should not be exactly equal to the divider. We need to establish the numbers that minimise interference between regions. A possible method might be to use the results of the sinusoidal modulation. Any departure from a sinusoidal wave form produces harmonics which can be detected by multiplying the signal, minus its mean, by the sine and cosine of 2, 3, 4 etc. times the fundamental.
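Harmonic detection of the kind described, multiplying the mean-removed signal by sines and cosines at multiples of the fundamental, can be sketched on synthetic waveforms; the square wave here simply stands in for a non-sinusoidal modulation:

```python
import math

def harmonic_amplitude(signal, k, period):
    """Amplitude of the k-th harmonic by sine/cosine correlation of the
    mean-removed signal."""
    n = len(signal)
    mean = sum(signal) / n
    s = c = 0.0
    for t, y in enumerate(signal):
        w = 2 * math.pi * k * t / period
        s += (y - mean) * math.sin(w)
        c += (y - mean) * math.cos(w)
    return 2 * math.hypot(s, c) / n

period = 40
sine = [math.sin(2 * math.pi * t / period) for t in range(400)]
square = [1.0 if t % period < period // 2 else -1.0 for t in range(400)]

# A pure sinusoid has no harmonics; a square wave departs from the
# sinusoidal form and shows a strong third harmonic.
print(round(harmonic_amplitude(sine, 3, period), 3))
print(round(harmonic_amplitude(square, 3, period), 3))
```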

Spray concentration profile. While time constraints forced Parkes to use blocks of spray with sharp edges it would be more realistic to have smoother variations of nuclei concentration, perhaps with the bell-shaped Gaussian concentration.
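The contrast between a sharp-edged block and a Gaussian taper can be sketched as follows; the distances and widths are illustrative assumptions:

```python
import math

def gaussian_profile(x_km, centre_km, width_km):
    """Bell-shaped nuclei enhancement, tapering smoothly to zero."""
    return math.exp(-((x_km - centre_km) ** 2) / (2 * width_km ** 2))

def block_profile(x_km, centre_km, half_width_km):
    """Sharp-edged block of the kind Parkes was forced to use."""
    return 1.0 if abs(x_km - centre_km) <= half_width_km else 0.0

# Enhancement factor across a 2000 km transect.
for x in range(0, 2001, 500):
    print(x, block_profile(x, 1000, 500),
          round(gaussian_profile(x, 1000, 300), 3))
```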

Number and position of spray sources. The Parkes choice of 89 spray regions was not made with any great confidence. We wanted at least two across the narrow section of the Atlantic. Parkes started with equal areas but then divided them round the Caribbean and either side of Iceland because of the current patterns. Some climate models show strange alternations either side of the equator in the Pacific. There is no need for spray regions to have equal areas provided that we can give each an appropriate weighting. There is no need for everyone to use the same regions provided that research result maps (discussed later) can give a common presentation. Individual selections should be encouraged and results merged to avoid blocky results.

Regions might be merged if there is little difference between their susceptibilities or divided if differences between adjacent neighbours are large provided that the spray regions are large enough to produce a consistent forcing over several grid points.

Region grouping. It is well known that climate patterns all over the world are affected by temperature differences across the South Pacific, not always to advantage. It will be interesting to drive the cloud nuclei concentration differentially either side of the Pacific with code sequences of each side in unison in a number of coherent ways. Three obvious ones are first an equal 50/50 east/west split with a sharp divide, secondly a linear ramp with concentration depending on distance either side of the midline and thirdly a blend of positive and negative Gaussian distributions. Other Boolean combinations of spray regions can be chosen but with the risk that this could lead to a combinatorial explosion of possibilities and so we need careful planning.
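The three differential drive patterns can be sketched as weighting functions of distance x (km, east positive) from the Pacific midline; the scale constants are illustrative assumptions:

```python
import math

def sharp_split(x_km):
    """Equal 50/50 east/west split with a sharp divide at the midline."""
    return 1.0 if x_km > 0 else -1.0

def linear_ramp(x_km, half_width_km=5000.0):
    """Weight proportional to distance from the midline, clipped."""
    return max(-1.0, min(1.0, x_km / half_width_km))

def gaussian_blend(x_km, offset_km=3000.0, width_km=1500.0):
    """Blend of a positive and a negative Gaussian either side."""
    def g(centre):
        return math.exp(-((x_km - centre) ** 2) / (2 * width_km ** 2))
    return g(offset_km) - g(-offset_km)
```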

Tactical spraying. There is no need for spray rates to be preordained and fixed for a whole experiment. For example if we see that surface temperatures in the Pacific are forming an el Niño or la Niña pattern and we know the cooling power of world spray sites, even ones far from the Pacific, we can drive them so as to increase or reduce the Southern Oscillation. The spray can be in phase with the temperature anomaly or its rate of change or even at some other phase angle. The orientation of the jet-stream waves might be a powerful indicator. A force opposing change of position of a system, i.e. a spring, will increase its oscillation frequency. Control engineers know that very small amounts of damping (a force opposing velocity), or its opposite, can have very large effects on the growth or decay of oscillations. We like error sensors and actuators with a high frequency-response and low phase-shift. Tropospheric cloud albedo control has an attractively rapid response – a few days compared with stratospheric sulphur at low latitudes which is about two years. With sufficiently high resolution we may also detect early signs of hurricane formation.
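The damping point can be illustrated with the amplitude envelope of a lightly damped oscillator; the numbers are chosen only for illustration:

```python
import math

def envelope(zeta, omega=1.0, t=200.0):
    """Amplitude envelope exp(-zeta*omega*t) of the damped oscillator
    x'' + 2*zeta*omega*x' + omega^2 * x = 0."""
    return math.exp(-zeta * omega * t)

# A damping ratio of just +/-1% decides between a sevenfold decay and a
# sevenfold growth after about 32 cycles (t = 200, omega = 1).
print(round(envelope(+0.01), 2))  # 0.14 (decays)
print(round(envelope(-0.01), 2))  # 7.39 (grows)
```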

Ganging up. We may learn something about the climate system by using independent spray patterns to identify all the spray regions which have the same effect on one observation station, such as drying Queensland, and then driving them in unison. We then reverse the selection to all the spray regions which increase Queensland precipitation.

Map projections. The site http://egsc.usgs.gov/isb/pubs/MapProjections/projections.html gives a useful selection and explanation of map projections. All projections of a solid globe to a flat plane involve some distortions but we can choose between distorting area, direction, shape or distances at various places in the map. The Mercator projection is very common but produces gross distortion of east/west distances and areas at the high latitudes which are now seen to be of very great importance to climate change. For polar areas the Lambert azimuthal equal-area projection looks best. We can tilt this projection in other directions so that several images can show the whole world with acceptable distortion. The obvious starting one would be six Lambert azimuthal views, two from the poles and four from the equator at longitudes of 0, 90, 180 and 270 degrees and an option to set any other latitude and longitude for any other view. It can be very useful to have a transparent layer of one parameter laid over another but this will need coordination of page layout. Six 90 mm diameter circles on a 100 mm pitch can fit neatly on one page of A4 or letter page with room for arrows to adjacent balloons. If necessary we can fit 12 on an A3. A single 180 mm circle can be used to show finer detail. The modelling teams must consider the question carefully, come to a joint view and then stick to the common decision and page scale.

Solid modelling packages (e.g. SolidWorks) are increasingly common for engineering design and several offer free viewing software to let customers spin images of engineering components about any axis. A spinning image can give a good presentation of complex three-dimensional shapes. It should be possible to modify such software to give surface colours with 10 saturation levels and add text labels to regions of a spinning sphere.

Mapping contours. Result maps need to show the magnitude and slope of at least temperature, precipitation, evaporation, ice and snow cover. While a continuous rainbow spectrum looks beautiful and gives a superficial impression of work done, it is almost useless at providing any numerical information beyond the position of a peak. There are some meteorological result maps with colour allocations that are particularly unhelpful, for example the one below.

Figure 6. How not to display the results of a climate model. The lead author of the paper from which this figure was taken agrees with me but was unable to challenge official policy. No names no pack-drill. Result format for this project may be dictatorial but will be more intelligent.

The area of ‘no change’ is between the lighter buff colour and the darker green. It covers a large fraction of the map. The polarity of the contour gradient is not obvious where light green moves to cyan or the darker buff moves to orange. Light green is a stronger effect than dark green. Numbers on the colour code bar refer to the borders not the middle of contours. Readers may confuse this map for precipitation with another map for temperature which uses the same colour set.

The right presentation of results can reveal the reasons for the most peculiar phenomena. We must make it as quick and as easy as possible for lazy, tired, non-technical readers to see effects with the minimum of mental decoding effort even when they are looking at a great many different maps.

The first requirement is that areas with effects that are below the level of statistical significance or the middle of the range should be white. Either side of this region there should be just two colours with increasing saturation. Red and blue would be intuitive for temperature with green and brown for river runoff. The male human eye can reliably distinguish 10 saturation levels provided regions are in contact with sharp edges. (Females have higher discrimination.) This gives a range of 20 steps (more than most result maps) plus a white central zero for the mean or the anomaly reference. The steps give an obvious direction of gradients. Adults, babies, birds and many animals can count up to five in an instant ‘analogue’ way. If we have thin black contour lines between the lowest five saturation steps and thin white lines between each of the top five colours we can avoid getting lost. We can also include black or white text numbers to show the contour value and total area of the contour region. An example is shown below.
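A mapping from result values to this scale (a white centre band with ten saturation steps on either side) can be sketched in Python; the significance threshold and full-scale values are illustrative assumptions:

```python
import math

def contour_step(value, significance, full_scale, steps=10):
    """Signed saturation step: 0 is the white no-significant-change
    band, +1..+10 and -1..-10 are the two colour ramps."""
    if abs(value) < significance:
        return 0
    step = min(steps, math.ceil(steps * abs(value) / full_scale))
    return step if value > 0 else -step

# Temperature anomaly example: 0.1 C significance, 2 C full scale.
print(contour_step(0.05, 0.1, 2.0))   # 0   -> white
print(contour_step(0.7, 0.1, 2.0))    # 4   -> fourth red step
print(contour_step(-1.95, 0.1, 2.0))  # -10 -> deepest blue
```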

Astronomers developed a very powerful technique to detect changes in the position or brightness of astronomical objects. Two images of star fields containing thousands of objects, taken at different times but with exactly the same magnification, would be shown alternately at intervals of about one second. A change in any one of them would be immediately apparent. We can adapt this for use with PowerPoint images to detect small differences in maps of model results provided that we can standardise the presentation format across all groups. We can also flicker through a sweep of many small changes in time or nuclei concentration.

Reports should have verbose comments and complete meta-data close to each map with minimum risk of confusion between them. This means frequent repetition and no jumps to other pages, or even other journals, as is sometimes done. The clarity of caption wording should be tested on naive readers rather than written in the intimidatory style of many journals.

Results can be presented as absolute values or as anomalies from some agreed reference baseline. We must agree on a small selection of baselines such as preindustrial, 1960-90, present, 2 x preindustrial CO2 or one of the, now discredited, IPCC scenarios. We must also be able to show instantly the differences between sets of results from different codes or institutions such as Pacific North-Western minus Hadley Centre. We must be able to combine results with various weightings from different teams. This will require an agreement between the teams of what the format should be, followed by their obedience to the agreement. It will be important to get advice from good information technology experts.

Blockiness. Because of pressure of time the Parkes thesis results were presented as blocks with sharp edges which are unusual in meteorology except for either side of mountain ranges or places like the Cape of Good Hope. We should agree on a method to produce smoothly blended curves for both spray concentration regions and results.

Naming of sea areas. We must make it easy for people to know which of many possible spray regions are being discussed even if they are not the same as the original Parkes 89. One way is to pick a spot at the centroid of an ocean and then describe a spray region by its bearing and distance from that centroid. Two digits are enough to identify runways at airports and most regions will be larger than 100 kilometres or a few model grid points so, for example, we could use a description such as South Pacific 18, 30 for a spray region 3000 kilometres south of the ocean centroid.
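The runway-style encoding, bearing in tens of degrees and distance in hundreds of kilometres, can be sketched as:

```python
def region_label(ocean, bearing_deg, distance_km):
    """Runway-style label: bearing in tens of degrees (two digits),
    distance in hundreds of kilometres from the ocean centroid."""
    return f"{ocean} {round(bearing_deg / 10):02d}, {round(distance_km / 100)}"

# 3000 km due south (bearing 180) of the South Pacific centroid:
print(region_label("South Pacific", 180, 3000))  # South Pacific 18, 30
```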

Numerical data. If we want daily results of 4 parameters affected by 100 spray regions for 500 different observing stations over 20 years with two-byte precision we will have to access about three gigabytes per run, and ensembles of runs, models and extra parameters will push the archive towards a terabyte. Requirements are unlikely to shrink. We want this to be made freely accessible to anyone. We need verbose and intuitive labels and selection filters with the same look and feel from all modelling groups. Subsets should be available in widely used formats agreed by all teams, such as netCDF and GrADS. People should normally supply numerical results in a common, agreed and widely-used format. If other formats have to be used then the teams should provide conversion software or do the conversions themselves on request.
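The storage estimate follows directly from the numbers quoted; a back-of-envelope sketch:

```python
# Daily values: 4 parameters x 100 spray regions x 500 stations,
# over 20 years, at two-byte precision.
params, regions, stations = 4, 100, 500
days = 365 * 20
bytes_per_run = params * regions * stations * days * 2
print(bytes_per_run / 1e9)  # about 2.9 GB for a single run; ensembles
                            # of runs, models and extra parameters
                            # multiply this toward the terabyte scale
```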

Carbon dioxide variation. We should first test the technique with an agreed level of atmospheric greenhouse gases. If we can establish confidence in the coded modulation technique we can later experiment with changes to gas concentrations. Obvious ones are pre-industrial, double pre-industrial, ramped rises at various rates, methane burps and even the effects of plausible rates of CO2 removal. However access to present real observations will be useful and so there is a strong case for using present day gas concentrations unless there is a sudden need for work on methane burps.

Political information. Models will be able to produce results for several climate parameters of the effect of all the spray regions at places all round the world. There are many ways in which we could select target regions. One obvious one is the present political boundaries of 193 UN Member States. If a climate model has a resolution of one degree, each cell has a side of 111 kilometres. It takes about eight points to draw a convincing sine wave so the size of result contour will be larger than the smaller countries. This means that some of the smallest ones will have to be grouped. This could be a matter requiring some delicacy. For large or elongated countries we can subdivide the area such as either side of a mountain range or north and south for countries in the Sahel. This would allow each country to choose which spray regions and times would give it the maximum benefit with the least dis-benefit to others. It might then be possible to understand and maximise the winner-to-loser ratio and even to decide on compensation. It is defeatist to assume that the outcomes will inevitably be unfavourable. The results of any pair of parameters predicted by each climate model for each country can be shown by plotting the model name on a map with, say, the vertical coordinate being temperature and the horizontal coordinate being precipitation. A close clustering of results from different models will add confidence. We must resist the temptation to place models in rank order. The objective should be model improvement and good science can come by sometimes testing opposites.

Specific Questions

There is a grave risk of a combinatorial explosion suppressing the detection of differences between climate models and so initially we must agree on as many test conditions as possible. The following are suggestions for debate.

What spray rates, change-over periods, concentration profiles and correlation delays should be used for commonly-agreed experiments?

Measurement of the phase and amplitude response to allow choices of correlation lags.

Maps of spray site susceptibility, defined as the annual change of each result parameter per unit of spray volume.

As above with spraying adjusted with a selection of correlations linked to the monsoon seasons. Even if the computer models cannot detect the onset of a monsoon we can use historic records to pick dates.

Investigation of tactical spray rate variation.

Results for groups of spray regions working in unison or subtle harmony especially with Trans-Pacific amplification and attenuation of el Niño / la Niña oscillations.

Design of a world-wide spray plan to cool the planet with the maximum winner-to-loser ratio.

Identification of the strengths and weaknesses of the various climate models leading to suggestions for improvement.

Chaos

Objectors to cloud albedo control have argued that the climate system is chaotic and so nothing can be done to direct it. There have been a great many phenomena such as planetary motions, chemical reactions, the incidence of disease and the motions of sea waves which were thought to be chaotic by the leading thinkers of the day. But Kepler showed that elliptical planetary orbits followed rules more precise than any man-made machinery. Mendeleev produced his periodic table and was able to predict the properties of hitherto unknown elements. Pasteur developed germ theory. Test tanks can now produce complex sea states with repeatability of a few parts per thousand. An oscilloscope signal from any backplane connector of a computer appears to be an entirely random string of zeros and ones but is in fact one of the most highly defined sequences that we can produce.

A favourite demonstration of chaotic behaviour uses the fall of sheets of paper from above the demonstrator’s head. The smallest increase of the angle of incidence between the paper and the apparent airflow produces a pitching moment to increase its value up to the moment of stall. Sheets will be scattered over a wide area. But if a sheet of paper is folded in the form of a paper dart to increase stiffness, and a weight is added to the nose, then the area enclosing its falling positions is greatly reduced. Clearly the magnitude of chaos is variable and can be affected by small changes to engineering design. The scatter could be further reduced if the falling item was fitted with optical systems driving control surfaces. We could call it a GBU 12 Paveway bomb which has an accuracy of about one metre despite chaotically random cross winds. Similarly we could fit video cameras and hinge actuators to the nails of Galton’s bagatelle board.

There may really be systems such as turbulence and subatomic physics which are genuinely chaotic. But if we believe that systems which we cannot at present understand are chaotic then we remove completely our chances of scientific discovery. The common factor is that very small changes, like the angle of incidence of a sheet of paper, are amplified. This means that a small amount of input energy applied intelligently can produce large changes in output energy. That is just what we need to control the very large amounts of energy in the planetary climate. Apparent chaos implies the possibility of success. Coded modulations could give valuable insights into the climate system as well as saving world food supplies.

Conclusions

The world can be compared to a vehicle with free-castor wheels which is rolling down a hill with increasing gradient. A few passengers are warning that there may be a cliff edge somewhere ahead. Some are suggesting that there might just be time to design and fit brakes, steering and even a reverse gear. Others advise that the slope ahead might level off and so brakes and steering would be a waste of money. Some objectors complain that the passengers could never agree on the best direction to steer. Some are close to claiming that God wants humanity to drive over the cliff edge and that it is wrong to interfere with divine intentions.

We could also consider the climate system as a piano in which the spray regions are the keys, some black some white, on which a wide number of pleasant (or less unpleasant) tunes could be played if a pianist knew when and how hard to strike each key.

Thursday, January 24, 2013

Below a combination of images produced by Dr. Leonid Yurganov, comparing methane levels between January 1-10, 2012 (below left), and January 1-10, 2013 (below right).

The 2013 image shows worryingly high levels of methane between Norway and Svalbard, an area where hydrate destabilization is known to have occurred over the past few years. Even more worrying is the combination of images below. Methane levels came down January 11-20, 2012, but for the same period in 2013, they have risen.

On January 21, 2013, as shown on the image below, methane levels of up to 2234 ppb were recorded at 892 mb. Further examination indicated that this was caused by large releases of methane above North-Africa, apparently associated with the terrorist attack on the natural gas plant in Algeria. An image without data is added underneath, to better distinguish locations on the map.

It didn't take long for even higher methane levels to be reached above the Arctic. The NOAA image below shows methane levels of up to 2241 ppb above the Arctic at 742 mb on January 23, 2013.

Below a NOAA image with temperature anomalies for January 7, 2013, when a huge area of the Arctic experienced anomalies of over 20 degrees Celsius, including a large area close to Svalbard.

Temperatures change daily, as the wind changes direction and as sea currents keep the water moving around the Arctic Ocean. For an area close to Svalbard, the recent 30-day temperature anomaly is over 20 degrees Celsius, as shown on the NOAA image below, and this indicates persistently high temperature anomalies for that area.

Monday, January 21, 2013

“We, the people, still believe that our obligations as Americans are not just to ourselves, but to all posterity. We will respond to the threat of climate change, knowing that the failure to do so would betray our children and future generations. Some may still deny the overwhelming judgment of science, but none can avoid the devastating impact of raging fires, and crippling drought, and more powerful storms. The path towards sustainable energy sources will be long and sometimes difficult. But America cannot resist this transition; we must lead it. We cannot cede to other nations the technology that will power new jobs and new industries – we must claim its promise. That is how we will maintain our economic vitality and our national treasure – our forests and waterways; our croplands and snow-capped peaks. That is how we will preserve our planet, commanded to our care by God. That’s what will lend meaning to the creed our fathers once declared.”

There is a rapid and accelerating decline of Arctic and Far North snow and summer albedo cooling, with the Arctic summer sea ice past tipping point. Several potentially catastrophic (to huge human populations and all future generations) Arctic changes are happening decades ahead of model projections. This is potentially a United States and world food security emergency, and an Arctic methane feedback planetary emergency. The pace is outstripping the capacity of the international published climate modeling science to assess the risks. The climate science assessment process is unable to rapidly assess all the risks and the combined risk of all risk factors.

In 2008 and again in 2012, after the large drops in summer sea ice extent, James Hansen made a public statement that the world is in a state of planetary emergency. Starting in 2006 John Holdren presented the scientific evidence that we are beyond dangerous interference with the climate system and challenged the world to prevent catastrophic interference.

The problem is that the summer sea ice cover, Arctic frozen methane and world food security are not projected by the assessments to be a serious problem for many decades. This despite the fact that for many years scientists have warned that the loss of the Arctic summer sea ice cover would result in a large boost in warming, and that this would cause the release of the vast Arctic stores of methane to start. Scientists call the Arctic summer sea ice cover the air conditioner of the entire Northern hemisphere.

We are seeing a multi-year heat and drought situation affecting the world’s top Northern hemisphere food producing regions of the U.S., Russia and China. Increasing drought affecting these regions is projected, but the situation developing right now may be due to the loss of Arctic albedo affecting the weather of the normally temperate climate zone of the Northern hemisphere, on top of sustained direct greenhouse gas warming.

All Arctic sources of global warming vulnerable methane are emitting more methane with the amplified increasing Arctic warming. This includes the destabilization of the methane that is contained in the form of hydrates and free gas in the Arctic seabed. Atmospheric methane is now on a renewed sustained increase - this time due to planetary methane emissions.

Without getting a rapid risk assessment of this situation there seems no hope of any measures to address it. The UN climate negotiations for world emissions reductions are on hold till 2020. Emissions have never been higher and are increasing and the only plan is to burn more fossil fuels and of the worst kind.

Leading climate experts say we are now committed to a warming of over 2C, probably 3C and possibly 4C. We are on track for 6C by 2100. Even an all-out emergency-scale response would not see any reduction of atmospheric GHGs for many decades.

We call for an urgent high-level risk assessment to capture all these adverse trends and situations. An Arctic climate risk assessment is needed to address the unprecedented risks that are threatening the security of the U.S., the Northern hemisphere and the world at large, and the well-being of both current and future generations.

The above call for a high-level risk assessment was initiated by Dr. Peter Carter of the Climate Emergency Institute. Please try and improve this message and see that it finds its way to the people who need to see this and take action, including leading climate scientists, doctors, politicians and those holding public office positions with a duty of care.

Videos

Global temperatures are rising fast. In the Arctic, temperatures are rising even faster (interactive charts below and right). For 2010 and 2011, NASA recorded anomalies of over 2°C at higher latitudes (64N to 90N), with anomalies of over 3°C at latitudes 79N and 81N in 2010.

For November 2010, anomalies of 12.5°C were recorded at latitude 71N, longitude -79 (Baffin Island, Canada). At specific moments in time and at specific locations, anomalies can be even more striking. As an example, on January 6, 2011, temperature in Coral Harbour, located at the northwest corner of Hudson Bay in the province of Nunavut, Canada, was 30°C (54°F) above average.