Sea level rise: what’s the worst case?

Well, I hope you are not overdosing on the issue of sea level rise. But this paper is somewhat different, a philosophy of science paper. Sort of how we think about thinking.

I would appreciate any comments, as well as suggestions as to which journals I might submit to. I have two in mind, but am open to suggestions (and I may need backups).

Thanks in advance for your comments.

Sea level rise: What’s the worst case?

Abstract. The objective of this paper is to provide a broader framing for how we bound possible scenarios for 21st century sea level rise, in particular how we assess and reason about worst-case scenarios. This paper integrates climate science with broader perspectives from the fields of philosophy of science and risk management. Modal logic is used as a basis for describing construction of the scenario range, including modal inductivism and falsification. The logic of partial positions and strategies for speculating on black swan events associated with sea level rise are described. The rapidly advancing front of background knowledge is described in terms of how we extend partial positions and approach falsifying extreme scenarios of 21st century atmospheric CO2 concentrations, warming and sea level rise. The application of partial positions and worst-case scenarios in decision making strategies is described for examples having different sensitivities to Type I versus Type II errors.

Introduction

Sea level rise is an issue of significant concern, given the large number of people who live in coastal regions. The concern over sea level rise is not so much about the 20 cm or so that global mean sea level has risen since 1900. Rather, the concern is about projections of 21st century sea level rise based on climate model simulations of human-caused global warming.

Scientists and policy makers using projections of sea level rise are susceptible to making both Type I and Type II errors. An overestimation of a given impact is a Type I error (i.e., a false positive), while an underestimation of the impact is a Type II error (false negative). While we do not yet know the outcome of 21st century sea level rise, and hence Type I and II errors are correctly regarded as potential errors, we can assess errors in reasoning that lead to potential Type I or II errors.

The Intergovernmental Panel on Climate Change (IPCC) assessments have focused on assessing a ‘likely’ range (>66% probability) in response to different emissions concentration pathways. Brysse et al. (2013) argue that the IPCC consensus-building process has effectively resulted in a focus on avoiding Type I (false-positive) errors. A case in point is the assessment of sea level rise in the IPCC AR4 (2007). The AR4 deliberately neglected dynamic ice sheet melt in its projections of future sea level rise because future rates of dynamic ice sheet melt could not be projected with any confidence – a Type II error.

Curry (2011, 2018a) raises a different concern, that the climate change problem has been framed too narrowly, focusing only on human-caused climate change. In the context of this framing, the impacts of long-term natural internal variability, solar variations, volcanic eruptions, geologic processes and land use are relatively neglected as a source of 21st century climate change. This narrow framing potentially introduces a range of both Type I and II errors with regards to projections of 21st century climate change, and leaves us intellectually captive to unchallenged assumptions.

Oppenheimer et al. (2007) contend that the emphasis on consensus in IPCC reports has been on expected outcomes, which then become anchored via numerical estimates in the minds of policy makers. Thus, the tails of the distribution of climate impacts, where experts may disagree on likelihood or where understanding is limited, are often understated in the assessment process. Failure to account for both Type I and Type II errors leaves a discipline or assessment process in danger of misrepresentation and unnecessary damage to society and human well-being.

In an effort to minimize Type II errors regarding projections of future sea level rise, there has been a recent focus on the possible worst-case scenario. The primary concern is the potential collapse of the West Antarctic Ice Sheet, which could cause 21st century global mean sea level rise substantially above the IPCC AR5 (2013) likely range of 0.26 to 0.82 m. Recent estimates of the maximum possible global sea level rise by the end of the 21st century range from 1.5 to 6 meters (as summarized by Le Cozannet et al., 2017; Horton et al., 2014). These extreme values of sea level rise are regarded as extremely unlikely, or so unlikely that we cannot even assign a probability. Nevertheless, these extreme, barely possible values of sea level rise are now becoming anchored as outcomes that are driving local adaptation plans.[1]

Reporting the full range of possible outcomes, even if unlikely, controversial or poorly understood, is essential for scientific assessments for policy making. The challenge is to articulate an appropriately broad range of future scenarios, including worst-case scenarios, while rejecting impossible scenarios.

This paper integrates climate science with broader perspectives from the fields of philosophy of science and risk management. The objective is to provide a broader framing of the 21st century sea level rise problem in context of how we assess and reason about worst-case scenarios.

Searching for black swans

Projections of future sea level rise are driven by climate-model generated projections of surface temperature in response to scenarios that increase atmospheric greenhouse gases. What type of climate change or sea level rise events, not covered by the current climate assessment reports, could possibly occur?

Potential surprises relative to background knowledge are often referred to as ‘black swans.’ There are two categories of black swan events (e.g. Aven and Renn, 2015):

Events or processes that are completely unknown to the scientific community (unknown unknowns).

Known events or processes that were ignored for some reason or judged to be of negligible importance by the scientific community (unknown knowns; also referred to as ‘known neglecteds’).

Efforts to avoid surprises begin with a fully imaginative consideration of possible future outcomes. Two general strategies have been employed for articulating black swan events related to climate change:

Statistical analysis of probability distributions, with emphasis on the fat tails.

Physically-based scientific speculation on the possibility of high impact scenarios, even though we can neither model them realistically nor provide an estimate of their probability.

2.1 Dismal theorem and fat tails

In a seminal paper, Weitzman (2009) articulated the dismal theorem, implying that the evaluation of climate change policy is highly sensitive to catastrophic outcomes, even if they occur with vanishingly small, but fat-tailed,[2] probability. The dismal theorem contrasts sharply with the conventional wisdom of not taking seriously extreme temperature change probabilities, because such probability estimates are not based on hard science and are statistically insignificant.

Weitzman argued that probability density function (PDF) tails of the equilibrium climate sensitivity (ECS), fattened by structural uncertainty in a Bayesian framework, can have a large effect on cost-benefit analysis. Weitzman’s analysis was based on the IPCC AR4 (2007) assessment that ECS was ‘likely’ (>66% probability) to be in the range 2 to 4.5°C with a best estimate of 3°C, ‘very unlikely’ (<10% probability) to be less than 1.5°C, and that values substantially higher than 4.5°C could not be excluded. Proceeding in the Bayesian paradigm, Weitzman fitted a Pareto distribution to these values, resulting in a fat tail with a 0.05% probability of ECS exceeding 11°C and a 0.01% probability of exceeding 20°C.
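
The effect of the tail index on such exceedance probabilities can be sketched with a minimal Pareto calculation. All parameter values below (the attachment point, its exceedance probability, and the tail indices) are illustrative assumptions, not the actual fit in the paper cited above:

```python
def pareto_exceedance(x, x_m=4.5, tail_prob=0.15, alpha=5.0):
    """P(ECS > x) for a Pareto tail attached at x_m degC, where
    P(ECS > x_m) = tail_prob and alpha is the tail index.
    All parameter values are illustrative assumptions."""
    return tail_prob * (x_m / x) ** alpha

# Smaller alpha = fatter tail = far more probability beyond 11 and 20 degC.
for alpha in (3.0, 5.0, 8.0):
    p11 = pareto_exceedance(11.0, alpha=alpha)
    p20 = pareto_exceedance(20.0, alpha=alpha)
    print(f"alpha={alpha}: P(ECS>11C)={p11:.1e}  P(ECS>20C)={p20:.1e}")
```

Because a Pareto tail decays polynomially rather than exponentially, even very large ECS values retain non-negligible probability; this is what makes expected-loss calculations under the dismal theorem so sensitive to the tail.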

Subsequently, the IPCC AR5 (2013) modified their assessment of ECS, dropping the lower bound of the ‘likely’ range to 1.5°C, and to 1.0°C for the ‘very likely’ range (>90% probability), and more clearly defining the upper range with a 10% probability of exceeding 6°C. Most significantly, the IPCC AR5 stated that no best estimate for equilibrium climate sensitivity could be given, because of a lack of agreement on values across assessed lines of evidence and studies. While intuitively it might seem that a lower lower-bound would be good news, Freeman et al. (2015) considered a family of distributions using the AR5 parameters and found that both the lowering of the lower bound and the removal of the best estimate actually fatten the ECS tail.

Annan and Hargreaves (2006) and Lewis and Curry (2015) have criticized high values of ECS derived from estimated PDFs, owing to unjustified assumptions and inappropriate statistical methods. The uncertainty surrounding ECS is not intrinsic to ECS itself, but rather arises from uncertainties in the parameters used to calculate ECS, e.g. external forcing data and magnitude of ocean heat uptake.

Curry (2018a) argues that given the deep uncertainty surrounding the value of climate sensitivity, we simply do not have grounds for formulating a precise probability distribution. With human-caused climate change, we are trying to extrapolate inductive knowledge far outside the range of limited past experience. While artificially-imposed bounds on the extent of possibly ruinous disasters can be misleading (a Type II error), statistical extrapolation under conditions of deep uncertainty can also be misleading (a Type I error).

2.2 Physically-based scenario generation

Rather than sampling from a probability distribution, physically-based scenario generation develops different possible future pathways from coherent storylines that are based on particular assumptions.

Formally, each possible future can be regarded as a modal sentence (Betz, 2009), stating what is possibly true of our climate system. Betz articulates two general methodological principles that may guide the construction of the scenario range: modal inductivism and modal falsificationism. Modal inductivism states that a certain statement about the future is possibly true if and only if it is positively inferred from our relevant background knowledge. Modal falsificationism further permits creatively constructed scenarios to be accepted as long as the scenarios cannot be falsified by being incompatible with background knowledge. Modal inductivism is prone to Type II errors, whereas modal falsification is prone to Type I errors.

Betz (2009) argues that modal inductivism explains the controversy surrounding the conclusions in the IPCC AR4 regarding sea level rise (e.g. Oppenheimer et al. 2007). The AR4 summary statement anticipated a likely rise in sea level of 18-59 cm by the year 2100. This result was derived from climate model-based estimates and did not include the potential for increasing contributions from rapid dynamical processes in the Greenland and West Antarctic ice sheets. Although the AR4 recognized the possibility of a larger ice sheet contribution, this possibility is not reflected in its main quantitative results. Betz argues that the possible consequences of rapid ice-dynamical changes were not included because there was no model that could infer positively the ice-dynamical changes.

2.2.1 Modal inductivism: scenario generation by climate models

The IPCC Assessment Reports provide projections of future climate using global climate models that are driven by scenarios of future greenhouse gas emissions. Limitations of the IPCC projections of future climate change are described by the IPCC AR5 (2013; Sections 11.3.1, 12.2.3). Internal variability places fundamental limits on the precision with which future climate variables can be projected. There is also substantial uncertainty in the climate sensitivity to specified forcing agents. Further, simplifications and parameterizations induce errors in models, which can have a leading-order impact on projections. Also, models may exclude some processes that could turn out to be important for projections.

Apart from these uncertainties in the climate models, there are three overarching limitations of the climate model projections employed in the IPCC AR5 (Curry, 2018a):

The scenarios of future climate are incomplete, focusing only on emissions scenarios (and neglecting future scenarios of solar variability, volcanic eruptions and multi-decadal and longer term internal variability).

The ensemble of climate models does not sample the full range of possible values of ECS, covering only the range 2.1 to 4.7 °C and neglecting values between 1 and 2.1 °C (values between 1.5 and 2.1 °C fall within the IPCC AR5 likely range).

The opportunistic ensemble of climate model simulations used in the IPCC assessment reports does not provide the basis for the determination of statistically meaningful probabilities.

In summary, existing climate models provide a coherent basis for generating scenarios of climate change. However, existing climate model simulations do not produce decision-relevant probabilities and do not allow exploration of all possibilities that are compatible with our knowledge of the basic way the climate system actually behaves. Some of these unexplored possibilities may turn out to be real ones.

2.2.2 Modal falsification: alternative scenario generation

Smith and Stern (2011) argue that there is value in scientific speculation on policy-relevant aspects of plausible, high-impact scenarios, even though we can neither model them realistically nor provide a precise estimate of their probability.

When background knowledge supports doing so, modifying model results to broaden the range of possibilities they represent can generate additional scenarios, including known neglecteds. Simple climate models, process models and data-driven models can also be used as the basis for generating scenarios of future climate. The paleoclimate record provides a rich source of information for developing future scenarios. Network-based dynamical climatology can also be used as the basis for generating scenarios. More creative approaches, such as mental simulation and abductive reasoning, can produce ‘what if’ scenarios (NAS 2018).

In formulating scenarios of future climate change, Curry (2011) raises the issue of framing error, whereby future climate change is considered to be driven solely by scenarios of future greenhouse gas emissions. Known neglecteds include: solar variability and solar indirect effects, volcanic eruptions, natural internal variability of the large-scale ocean circulations, geothermal heat sources and other geologic processes. Expert speculation on the influence of known neglecteds would minimize the potential for missing black swan events that are associated with known events or processes that were ignored for some reason.

The objective of alternative scenario generation is to allow for and stimulate different views and perspectives, in order to break free from prevailing beliefs. Construction of scenarios that provide plausible but unlikely outcomes can lead to the revelation of unknown unknowns or unknown knowns.

Scenario justification

As a practical matter for considering policy-relevant scenarios of climate change and its impacts, how are we to evaluate whether a scenario is possible or impossible? In particular, how do we assess the possibility of potential black swan scenarios?

Confirmation (verification) versus falsification is at the heart of a prominent 20th century philosophical debate. Lukyanenko (2015) argues that verification and falsification each contain contradictions and ultimately fail to capture the full complexity of the scientific process.

If the objective is to capture the full range of policy-relevant scenarios and to broaden the perspective on the concept of scientific justification, then both verification and falsification strategies are relevant and complementary. The difference between modal inductivism and modal falsificationism can also be thought of in terms of the allocation of the burden of proof. Consider a contentious scenario, S. According to modal inductivism, the burden of proof falls on the party claiming that S is possible. By contrast, according to modal falsificationism, the burden falls on the party denying that S is possible. Hence verification and falsification play complementary roles in scenario justification.

The problem of generating a plethora of potentially useless future scenarios is avoided by subjecting the scenarios to an assessment as to whether the scenario is deemed possible or impossible, based on our background knowledge. Further, some possible scenarios may be assigned a higher epistemic status if they are well grounded in observations and/or theory.

Under conditions of deep uncertainty, focusing on the upper bound of what is physically possible can reveal useful information. For example, few if any climate scientists would argue that an ECS value of 20 °C is possible. But what about an ECS value of 10 or 6 °C? We should be able to eliminate some extreme values of ECS as impossible, based upon our background understanding of how the climate system processes heat and carbon in response to gradual external forcing from increasing atmospheric carbon dioxide.

3.1 Scenario verification

Betz (2010, 2012) provides a useful framework for evaluating scenarios relative to their degrees of justification, and for evaluating outcomes against our background knowledge. A high degree of justification implies high robustness and relative immunity to falsification.

Below is a classification of future climate scenarios based upon ideas developed by Betz:

Climate model simulations are classified here as unverified possibilities. Oreskes (1994) has argued that verification and validation of numerical models of natural systems is impossible; however, this remains a matter of debate in the philosophy of science literature (e.g. Katzav, 2014), where it has been argued that some climate models may be regarded as producing verified possibilities for some variables (e.g. temperature).

The epistemic status of verified possibilities is greater than that of unverified possibilities; however, the most policy-relevant scenarios may be the unverified possibilities and the borderline impossible ones (potential black swans). Clarifying what is impossible versus what is possible is important to decision makers, and the classification provides important information about uncertainty.

As an example, consider the following classification of values of equilibrium climate sensitivity (overlapping values arise from different scenario generation methods and different judgment rationales):

1 to 2.7 °C: corroborated possibilities (observationally based)

2.1 to 4.7 °C: unverified possibilities (climate model range)

>4.5 to 10 °C: borderline impossible (for equilibration time scales of a few centuries)

>10 °C: impossible (for equilibration time scales of a few centuries)

There is a strongly verified anchor on the lower bound — the no-feedback climate sensitivity, which is nominally ~1 °C. Determination of ECS from observational data is represented by the Lewis and Curry (2018) analysis, the values from which are regarded as corroborated possibilities. The climate model range reported by the IPCC AR5, 2.1 to 4.7 °C, is classified as unverified possibilities. The borderline impossible range is open to dispute. Annan and Hargreaves (2006) argue for an upper bound of 4.5 °C. The IPCC AR5 placed a 10% probability on exceeding 6 °C. None of the ECS values cited in the AR5 extend much beyond 6 °C (one tail extends to 9 °C), although the AR4 cited several long-tailed distributions extending beyond 10 °C.

It is rational to believe with high confidence a partial position (Betz, 2012) that equilibrium climate sensitivity is at least 1 °C and lies between 1 and 2.7 °C, which encompasses the strongly verified and corroborated possibilities. This partial position, with its high degree of justification, is relatively immune to falsification. It is also rational to provisionally extend one’s position to include values of equilibrium climate sensitivity up to 4.7 °C (the range simulated by climate models), although these values are vulnerable to the growth or modification of background knowledge and to improvements in climate models, whereby portions of this extended position may prove to be false. This is why a rational proponent should be interested in adopting a partial position with a high degree of justification: such a position is highly immune to falsification and can be flexibly extended in many different ways when constructing a complete position.
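
The classification above can be captured in a small helper that maps an ECS value to its epistemic status. This is only a schematic encoding of the categories discussed in the text; where the quoted ranges overlap (2.1–2.7 °C and 4.5–4.7 °C), the sketch arbitrarily resolves in favor of the higher epistemic status:

```python
def ecs_epistemic_status(ecs_degC):
    """Map an ECS value (degC) to the epistemic categories used in the text.
    Overlapping ranges are resolved toward the higher status; the boundaries
    are the ones quoted in the text, not authoritative thresholds."""
    if ecs_degC < 1.0:
        return "below the strongly verified lower anchor"
    if ecs_degC <= 2.7:
        return "corroborated possibility"   # observationally derived range
    if ecs_degC <= 4.7:
        return "unverified possibility"     # climate model range (2.1-4.7 degC)
    if ecs_degC <= 10.0:
        return "borderline impossible"      # centuries-scale equilibration
    return "impossible"

print(ecs_epistemic_status(3.0))   # unverified possibility
print(ecs_epistemic_status(12.0))  # impossible
```

A partial position in this encoding corresponds to asserting only the first two branches with high confidence, while treating the "unverified possibility" branch as a provisional extension.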

If values beyond 10 °C are impossible, then the fat-tail values generated by Weitzman (2009) are impossible. Is it physically justified to eliminate extreme outcomes of ECS? Because of the nature of the definition of equilibrium climate sensitivity — a very long equilibration timescale and the possibility of very long timescale feedbacks — attempting to identify an impossible threshold may be an ill-posed problem. Because of the central role that ECS plays in the Integrated Assessment Models used to determine the social cost of carbon, this issue is not without consequence.

Even if it is impossible to falsify high values of ECS owing to ambiguities in the definition of ‘equilibrium,’ there are physical constraints on how rapidly temperature or sea level can change by 2100 in response to a doubling of CO2.

3.2 Scenario falsification and expert judgment

While scientific theories can never be strictly verified, they must be falsifiable. This leads to a corollary that predicted outcomes based on a theory of change should in principle be falsifiable.

How do we approach falsifying extreme scenarios? Extreme scenarios can be evaluated based on the following criteria:

Evaluation of the possibility of each link in the storyline used to create the scenario.

Evaluation of the possibility of the outcome, in light of physical constraints and the possibility of the inferred rate of change.

The first criterion is mechanistic: the individual processes and the links among them are evaluated. The second criterion is an integral constraint on the scenario outcome, related to the possibility of the outcome itself and the rate of change required to achieve it over a specified period.

Assessing the strength of background knowledge is an essential element in assessing extreme scenarios. Extreme scenarios are by definition at the knowledge frontier. Hence the background knowledge against which extreme scenarios are evaluated is continually changing, which argues for frequent re-evaluation of extreme scenarios.

Expert judgment plays a prominent role in the IPCC process. The multiple lines of evidence surrounding equilibrium climate sensitivity make the basis for the IPCC’s expert judgment relatively clear. Sea level rise is a much more complex situation, owing to its dependence on projections of ice sheet behavior, with relatively few lines of evidence and a great deal of uncertainty; hence sea level rise projections have been heavily dependent on expert judgment.

Issues surrounding the process of expert judgment are revealed in context of an expert elicitation on sea level rise conducted by Horton et al. (2014), which presented results of a broad survey of 90 experts. Gregory et al. (2014) criticized several aspects of the elicitation. The first criticism addresses the issue of ‘which experts?’ The respondents were a subset (18%) of the 500 experts whom Horton et al. identified; the other 82% could not be contacted, declined to respond, or supplied incomplete or inconsistent responses.

While overall the elicitation provided results similar to those cited by the IPCC AR5, Figure 2 of Horton et al. shows that several respondents placed the 83rd percentile for global mean sea level rise by 2100 under RCP8.5 higher than 2.5 m, i.e. more than 1.5 m above the AR5 likely range, with the highest estimate exceeding 6 m. Gregory et al. argue that such high values are physically untenable. They state that there is a large difference in rigor between the IPCC assessment and an expert elicitation: an elicitation is opaque, the respondents are not asked to justify their responses, and we cannot know how they arrived at their conclusions. The IPCC assessment process is designed to avoid Type I errors, whereas the elicitation produced several expert opinions that arguably make a Type II error.

Curry (2011a) argues that because of the complexity of the issues in climate science, individual experts use different mental models for evaluating the interconnected evidence. Biases can abound when reasoning and making judgments about such a complex problem. Bias can occur by excessive reliance on a particular piece of evidence, the presence of cognitive biases in heuristics, failure to account for indeterminacy and ignorance, and logical fallacies and errors including circular reasoning.

Research in cognitive psychology shows that powerful and sometimes subtle biases play a significant role in scientific justification. Tversky and Kahneman (1974) identified numerous cognitive biases that pervade both everyday and scientific thinking, and often compete with inductive and deductive forms of logical reasoning.

Sea level rise scenario verification and falsification

A comprehensive summary of recent sea level rise projections is provided by Horton et al. (2018). In assessing these projections for application to decision making, a broader framing of possible climate change scenarios is provided here, one that includes natural climate variability and geologic processes.

4.1 Scenario generation

Physically-based scenarios of future sea level change are derived from the following methods: extrapolation of recent trends, semi-empirical approaches based on past relationships of sea level rise with temperature, and process-based methods using models.

Sea level rise projections are directly tied to projections of surface temperature, which are based upon simulations from global climate models that are forced by different emissions scenarios.

Most assessments have focused on bounding the likely range (>66%). Since the IPCC AR5 was published in 2013, new scenario and probabilistic approaches have been used for 21st century sea level rise projections. However, these new projections are based on the same climate model simulations used in the IPCC AR5.

Of particular note: the NOAA Technical Report entitled Global and Regional Sea Level Rise Scenarios for the United States (NOAA, 2017) provides a range of global mean sea level rise scenarios for the year 2100. The worst-case upper-bound scenario for global sea level rise (the H++ scenario) is 2.5 meters by the year 2100. The lower bound scenario is 0.3 meters by the year 2100.

Here we critically evaluate both bounds: the worst-case scenario and the lower-bound best-case scenario.

4.2 Worst-case scenario

The worst-case scenario is judged to be the most extreme scenario that cannot be falsified as impossible based upon our background knowledge (Betz, 2010). Strategies for generating the worst-case sea level rise scenarios include: process modeling that employs the worst-case estimate for each component, estimates based on the deglaciation of the last ice age and the previous interglacials, and expert judgment.

Most of the recent estimates of the worst-case scenario for global sea level rise in the 21st century range from 1.5 to 3.0 meters, with the recent NOAA Report (NOAA, 2017) using a value of 2.5 meters. In the expert elicitation study of Horton et al. (2014), 5 of the 90 respondents cited a value exceeding 3 m, with the highest value exceeding 6 m. These totals imply rates of sea level rise as high as 50-100 mm/year by the end of the 21st century; for reference, the current global rate is about 3 mm/year. Are these scenarios of sea level rise by 2100 plausible? Or even possible?
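
The implied end-of-century rates can be checked with simple arithmetic. The sketch below assumes, purely for illustration, that the rate of rise accelerates linearly from today's ~3 mm/yr over the roughly 80 years to 2100; under linear acceleration the average rate is the mean of the initial and final rates:

```python
def end_rate_mm_per_yr(total_rise_m, years=80, r0_mm_per_yr=3.0):
    """End-of-period rate (mm/yr) if the rate of rise accelerates linearly
    from r0 and accumulates total_rise_m metres over `years`.
    Linear acceleration => average rate = (r0 + r_end) / 2."""
    total_mm = 1000.0 * total_rise_m
    return 2.0 * total_mm / years - r0_mm_per_yr

for total in (1.5, 2.5, 6.0):  # worst-case totals discussed in the text
    rate = end_rate_mm_per_yr(total)
    print(f"{total} m by 2100 -> end-of-century rate ~{rate:.1f} mm/yr")
```

Under this assumption, totals of roughly 2-4 m by 2100 require end-of-century rates in the 50-100 mm/yr range quoted above, while a 6 m total would require nearly 150 mm/yr.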

4.2.1 Worst-case storylines

Worst-case scenarios have been developed around story lines of irreversible reduction in ice mass of the Greenland and/or West Antarctic ice sheets. Worst-case scenarios for 21st century sea level rise have been developed in different ways: convening an expert committee to develop extreme scenarios (e.g. Katsman et al., 2011), conducting a large expert assessment survey (Horton et al., 2014), or combining process models or expert assessment of ice sheet contribution with climate model projections (e.g. Bamber and Aspinall, 2013).

The primary concern over future sea level rise in the 21st century is related to the potential collapse of the West Antarctic Ice Sheet (WAIS). For the WAIS, marine ice shelves and tongues that buttress inland, grounded ice are believed to be critical for the ice-sheet stability. Marine Ice Sheet Instability (runaway retreat of the ice sheet) could be initiated if the buttressing effect of this ice is lost from erosion by a warming ocean or altered circulation in coastal seas.

The most vulnerable region of the WAIS is the Amundsen Sea sector. Scenarios for increased ice discharge from this region have been articulated by Pfeffer et al. (2008), based on kinematic constraints on the discharge of glaciers. DeConto and Pollard (2016) introduced new instability mechanisms related to marine ice-cliff instabilities and ice-shelf hydrofracturing (rain and meltwater-enhanced crevassing and calving). Their high-end estimate exceeded 1.7 m of sea-level rise from Antarctica alone in 2100 under the RCP8.5 scenario.

The most extreme 21st century sea level rise scenarios from process-based models are reported by Schlegel et al. (2018), who assessed how uncertainties in snow accumulation, ocean-induced melting, ice viscosity, basal friction, bedrock elevation, and the presence of ice shelves affect the future sea level contribution from the Antarctic ice sheet. They found that an Antarctic ice sheet contribution to global mean sea level exceeding 1.2 m is achievable over the next century, but not likely, as this increase is tenable only in response to unrealistically large melt rates and continental ice shelf collapse. As an extreme worst case, a plausible combination of model parameters produced simulations of 4.95 m of sea level rise from the Antarctic ice sheet by 2100.

The highest scenarios elicited by Horton et al. (2014) – exceeding 6 m – predate these sophisticated ice sheet model simulations, and do not appear to be justified by process-based models, but rather by top-down semi-empirical methods that relate sea level to global mean surface temperature during the current and previous interglacials. Hansen et al. (2016) considered sea level and rates of sea level rise during the late Eemian (the previous interglacial, about 124,000 years ago) as justification for predictions of several meters of sea level rise in the 21st century.

Another possible storyline relates to newly discovered geothermal heat fluxes in the vicinity of the Greenland and Antarctic ice sheets (e.g. DeVries et al. 2017), although these processes have not yet explicitly figured into worst-case sea level rise scenarios.

4.2.2 Worst-case constraints

While associated with physically plausible mechanisms, the actual quantification of the worst-case scenarios for 21st century sea level rise remains highly speculative. As a check on scenarios developed from process models and/or more speculative methods, integral constraints on basic physical processes provide a rationale for potentially falsifying extreme scenarios.

Deglaciation following the last ice age provides an opportunity to examine the stability of marine ice sheets and possible rates of sea level rise. The most rapid deglaciation, Meltwater Pulse 1A (MWP-1A), occurred around 14.5 ka BP. Recent research by Deschamps et al. (2012) constrained this rapid melting to a period of ~340 years. The most probable value of sea level rise during this period was between 14 and 18 m, implying that the rate of sea level rise exceeded 40 mm/yr during the pulse. Two conflicting scenarios have been proposed for the source of MWP-1A – a northern scenario, with partial melting of the large North American and Eurasian ice sheets, and a southern scenario that points to an Antarctic source. If the northern scenario is correct, then MWP-1A is not a very useful constraint on possible 21st century sea level rise.
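The rate implied by these figures can be recovered with simple arithmetic; a quick sketch using only the numbers quoted above:

```python
# Rate of sea level rise implied by Deschamps et al. (2012) for MWP-1A:
# 14-18 m of rise over a melting interval of ~340 years.
duration_yr = 340
rise_low_m, rise_high_m = 14.0, 18.0

rate_low_mm_yr = rise_low_m * 1000 / duration_yr
rate_high_mm_yr = rise_high_m * 1000 / duration_yr

# ~41-53 mm/yr, i.e. the rate exceeded 40 mm/yr throughout the pulse
print(f"{rate_low_mm_yr:.0f}-{rate_high_mm_yr:.0f} mm/yr")
```

For comparison, the ~3 mm/yr satellite-era rate discussed in section 4.3 is more than an order of magnitude smaller.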

Additional insights of relevance to the current configuration of the ice sheets are provided by the last interglacial (~130 to ~115 ky ago; the Eemian). Kopp et al. (2009) estimated that the late Eemian sea level highstand very likely (95% probability) exceeded present values by 6.6 m, but was unlikely (33% probability) to have exceeded 9.4 m. Kopp et al. concluded that present ice sheets could sustain a rate of global sea level rise of about 56–92 cm per century for several centuries, with these rates potentially spiking to higher values for shorter periods. Kopp et al. inferred that achieving global sea level in excess of 6.6 m higher than present likely required major melting of both the Greenland and the West Antarctic Ice Sheets.

Rohling et al. (2013) provide a geologic/paleoclimatic perspective on the worst-case scenario for 21st century sea level rise by examining the past five interglacial periods. They investigated the natural timescales and rates of change of ice-volume adjustment to a disequilibrium state, relative to a forcing increase. Projected rates of sea level rise above 1.8 m by 2100 are larger than the rates at the onset of the last deglaciation, even though today's global ice volume is only about a third of that at the onset of the last deglaciation. Starting from present-day conditions, such high rates of sea level rise would require unprecedented ice-loss mechanisms without interglacial precedent, such as catastrophic collapse of the West Antarctic Ice Sheet or activation of major East Antarctic Ice Sheet retreat.

An alternative strategy for falsifying ice loss scenarios relates to identifying physical constraints on specific ice loss mechanisms. Pfeffer et al. (2008) falsified extreme scenarios based on kinematic constraints on glacier contributions to 21st century sea level rise. They found that a total sea-level rise of about 2 meters by 2100 could occur under physically possible glaciological conditions but only if all variables are quickly accelerated to extremely high limits. They concluded that increases in excess of 2 meters are physically untenable.

The most extreme process-based sea level rise scenarios (e.g. DeConto and Pollard 2016; Schlegel et al., 2018) are derived from linking atmospheric warming with hydrofracturing of buttressing ice shelves and structural collapse of marine-terminating ice cliffs in Antarctica. Prediction of 21st century contributions depends critically on uncertain calibration to sea level rise in the Pliocene (about 3 million years ago) and debated assumptions about the Antarctic contribution to sea level rise during the Eemian.

Worst-case scenarios for 2100 and collapse of the West Antarctic Ice Sheet are driven by the RCP8.5 greenhouse gas concentration scenario. An additional constraint on the worst-case sea level rise scenario is therefore an assessment of whether RCP8.5 is itself a possible scenario. RCP8.5 is an extreme scenario that may be impossible, given unrealistic assumptions and constraints on recoverable fossil fuel supply (e.g. Wang et al., 2016). Ritchie and Dowlatabadi (2017) explain that RCP8.5 contains a 'return to coal' hypothesis, requiring increases in per capita coal use that are based on systematic errors in coal production outlooks. Here, RCP8.5 is classified as borderline impossible.

Scenarios of 21st century sea level rise exceeding about 1.8 m require conditions without natural interglacial precedent. These worst-case scenarios require a cascade of events, each of which is extremely unlikely to borderline impossible, based on our current knowledge base. The joint likelihood of these extremely unlikely events arguably crosses the threshold to impossible.

How to rationally make judgments about the possibility of extreme scenarios remains a topic that has received too little attention.

4.3 Best case scenario

There has been much less focus on the possible best-case scenario, which is defined here as the lowest sea level rise for the 21st century that cannot be falsified as impossible based upon our background knowledge. Consideration of the best case is needed to provide bounds on future sea level rise. Further, verification/falsification analysis of the best case can provide important insights into uncertainty and the possible impacts of known neglecteds.

Parris (2012) recommends a lower bound of 0.2 m for 21st century global mean sea level rise, which is basically the observed rate of sea level rise during the 20th century. NOAA (2017) recommends that this value be revised upward to 0.3 m, because the global mean sea level rise rate as measured by satellite altimeters has averaged 3 mm/year for almost a quarter-century.

It is difficult to defend an argument that it is impossible for the 21st century sea level rise to occur at the same average rate as observed in the 20th century, especially since many if not most individual tide gauge records show no recent acceleration in sea level rise (e.g. Watson 2016).

Is it possible for global sea level to decrease over the 21st century? Kemp et al. (2018) provided an estimate of mean global sea level for the past 3000 years. There are several periods with substantial rates of sea level decline, notably 1000 to 1150 AD and 700 to 400 BC. Century-scale sea level decreases of the magnitude determined by Kemp et al. (about half the magnitude of the 20th century rate)[3] are not sufficient to completely counter the likely sea level rise projected by the IPCC AR5. Given the thermal inertia of the oceans and ice sheets, it is arguably impossible for global mean sea level to decrease on the time scale of the 21st century.

However, it is possible for 21st century sea level rise to be less than in the 20th century. Possible scenarios of solar variations, volcanic eruptions and internal variability associated with large-scale ocean circulations could combine to reduce the 21st century rate of sea level rise relative to the 20th century. The relative importance to sea level change of human-caused warming versus natural climate variability depends on whether equilibrium climate sensitivity is on the low end (<2 °C) or the high end (>4 °C) of the range of current estimates.

The recent acceleration in global mean sea level rise since 1993 is attributed to increased melting of the Greenland ice sheet (e.g. Chen et al. 2017). This acceleration in Greenland melt has been largely attributed to natural variability associated with large-scale ocean and atmospheric circulation patterns – the Atlantic Multidecadal Oscillation (AMO) and the North Atlantic Oscillation (NAO) (e.g. Hahn et al. 2018). A future transition to the cool phase of the AMO and/or the positive phase of the NAO would slow down (or possibly even reverse) the mass loss from Greenland, and hence slow down the rate of global sea level rise. Such a scenario for the Greenland mass balance is regarded as a corroborated possibility, since such a scenario occurred recently, during the 1970s and 1980s (e.g. Fettweis et al., 2013). In the 21st century, such a scenario would significantly reduce sea level rise, potentially for several decades, with the relative importance of this scenario for Greenland depending on whether equilibrium climate sensitivity to CO2 is on the low end or the high end of the range of current estimates.

An additional best-case scenario relates to the recent finding by Barletta et al. (2018) that the ground under the rapidly melting Amundsen Sea Embayment of West Antarctica is rising at a rate of more than 4 cm per year. This rise is acting to stabilize the West Antarctic Ice Sheet. Ice loss spurs uplift in the sea floor (isostatic rebound), which is occurring rapidly owing to low viscosity under the Amundsen Sea Embayment. Such processes have a strong direct impact on West Antarctic Ice Sheet evolution at the centennial time scale. Gomez et al. (2015) articulate a negative feedback process whereby the combination of bedrock uplift and sea surface drop associated with ice sheet retreat significantly reduces ice sheet mass loss.

4.4 Possibility distribution

Given the deep uncertainty associated with projections of 21st century sea level rise (e.g. Horton et al., 2018), a possibility distribution (e.g. Mauris 2011) provides a way to stratify the current knowledge base about the likelihood and possibility of a range of 21st century sea level outcomes. As an example, LeCozannet et al. (2017) constructed a possibility distribution and diagram for projections of 21st century sea level rise scenario outcomes.

Here, a possibility diagram is constructed (Figure 1) under different assumptions than used by LeCozannet et al. The variable U denotes the outcome for 21st century sea level change. U is cumulative, so that an 80 cm outcome must necessarily first pass through lower values of sea level rise. Values less than U therefore represent partial positions for U. The function π(U) represents the state of knowledge of U, distinguishing what is necessary and possible from what is impossible.

π(U) = 1: nothing prevents U from occurring; U is a completely possible value and may be regarded as necessary

π(U) = 0: U is rejected as impossible based on current background knowledge

Intermediate values of π(U) reflect outcomes whereby there would be no particular surprise if U does occur, or no particular surprise if U does not occur. Following the classification introduced in section 3.1, values of U are assigned the following values of π(U) (Figure 2):

These π assignments are based on justifications provided in previous subsections (see also Curry, 2018b); however, this particular classification represents the judgment of one individual. One can envision an ensemble of curves, using different assumptions and judgments. The point is not so much the exact numerical judgments provided here, but rather to demonstrate a way of stratifying the current knowledge base that is consistent with deep uncertainty.
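The structure of such a distribution can be sketched in code. The breakpoints and π values below are illustrative placeholders, not the judgments shown in Figure 1; the point is the non-increasing shape that the cumulative logic of partial positions imposes:

```python
# Illustrative possibility distribution pi(U) for cumulative 21st century
# sea level rise U (metres). All breakpoints are hypothetical placeholders,
# NOT the values assigned in Figure 1 of the paper.
def pi(u: float) -> float:
    """Degree of possibility that sea level rise reaches at least u metres."""
    if u <= 0.3:
        return 1.0   # nothing prevents this outcome; effectively necessary
    elif u <= 1.0:
        return 0.7   # verified/corroborated possibilities
    elif u <= 1.6:
        return 0.4   # unverified possibilities (speculative worst cases)
    elif u <= 2.0:
        return 0.1   # borderline impossible
    else:
        return 0.0   # rejected as impossible on current background knowledge

# Since U is cumulative, a higher outcome must first pass through every
# lower one, so pi must be non-increasing in U (the logic of partial positions).
levels = [0.2, 0.5, 1.2, 1.8, 2.5]
values = [pi(u) for u in levels]
assert all(a >= b for a, b in zip(values, values[1:]))
```

An ensemble of such curves, each encoding a different analyst's assumptions and judgments, could then be compared directly, as suggested above.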

The possibility distribution in Figure 1 does not directly map to a PDF — the level of uncertainty is such that there is no particular basis for selecting a median or mean value for some hypothetical PDF of future sea level rise. LeCozannet et al. (2017) argue that no single PDF can represent the whole range of uncertainty sources related to future sea-level rise.

While there is a great deal of uncertainty surrounding the possible impact of marine ice cliff instability in the 21st century, rejecting such a scenario risks a Type II error. Our background knowledge base will change in the future, and it is certainly possible that such a scenario will in the future be considered to have a greater likelihood. Based upon our current background knowledge, it is arguably more rational to reject the RCP8.5 concentration scenario than it is to reject the ice cliff instability scenario.

Decision making under deep uncertainty about sea level rise

The concepts of the possibility distribution, worst case scenarios, scenario verification and partial positions are relevant to decision making under deep uncertainty, where precautionary and robust approaches are appropriate. A precautionary appraisal is initiated when there is uncertainty. A robust policy is defined as yielding outcomes that are deemed to be satisfactory across a wide range of plausible future outcomes (e.g. Walker et al. 2016). Robust policy making interfaces well with possibilistic approaches that generate a range of possible futures. Worst-case scenarios are an essential feature of precaution.

These concepts are applied in general terms to two decision making challenges related to sea level rise that have different sensitivities to Type I and II errors:

Infrastructure siting in coastal areas: Type II errors are of the greatest concern

Tort litigation: Type I errors are of the greatest concern

5.1 Infrastructure siting in coastal areas

Consider a hypothetical decision related to siting of major infrastructure near the coast, such as an international airport or a nuclear power plant. For infrastructure siting decisions having a multi-decade lifecycle, a Type II error (underestimation of the impact) would have the most adverse consequences. In this case, it is arguably more important to assess the worst-case scenario than to assess what is likely to happen.

NOAA (2017) provides the following advice about scenarios in the context of robust decision making. First, define a scientifically plausible worst-case scenario as a guide for overall system risk and long-term adaptation strategies. Then define a central estimate or mid-range scenario as a baseline for shorter-term planning. This strategy assumes that adaptive management, e.g. including additional flood defenses such as sea walls, is feasible.

Considering the worst-case scenario is consistent with the precautionary principle. The precautionary principle is best applied to situations where the potential harm can be controlled by the decision maker. In this case, the actual siting of the infrastructure is completely controllable.

So for purposes of decision making regarding infrastructure siting, which worst-case scenarios should be considered? Guided by the possibility diagram in Figure 1, a prudent strategy is to select a provisional worst-case scenario of 1 to 1.6 m (a partial position), with a contingent strategy for adding additional flood defenses if needed.

Actual siting decisions involve a large range of factors not considered here (e.g., Hall et al., 2016), but this example of decision making arguably benefits from explicit consideration of the worst-case scenario.

5.2 Climate change litigation

Kilinsky (2008) argues for the theory that plaintiffs can prevail on claims arising from the threat of potential injury attributable to a failure to adapt to or prevent climate change. Allen (2003) discusses the challenges from a climate science perspective related to demonstrating liability for climate change.

In evaluating litigation claims that rely on projections of future climate change and sea level rise, it is instructive to consider the role that the strength of the knowledge base of future climate change and sea level rise might have in these claims. The standard legal definitions for evidentiary standards and burden of proof can be mapped onto the possibility classifications:

Credible evidence: evidence that is not necessarily true but that is worthy of belief and worthy of consideration –> unverified possibilities

Preponderance of the evidence, or balance of probabilities: greater than fifty percent chance that the proposition is true; more likely than not to be true –> verified possibilities

Clear and convincing evidence: highly and substantially more probable to be true than not –> corroborated possibilities

Beyond reasonable doubt: there is no plausible reason to believe otherwise –> necessary.
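The mapping above amounts to a simple lookup; a minimal sketch (the function name is illustrative, the terms follow the text):

```python
# Mapping of legal evidentiary standards onto the possibility
# classifications used in this paper (terms as listed above).
EVIDENTIARY_TO_POSSIBILITY = {
    "credible evidence": "unverified possibility",
    "preponderance of the evidence": "verified possibility",
    "clear and convincing evidence": "corroborated possibility",
    "beyond reasonable doubt": "necessary",
}

def possibility_class(standard: str) -> str:
    """Return the possibility classification for an evidentiary standard."""
    return EVIDENTIARY_TO_POSSIBILITY[standard.lower()]

print(possibility_class("Preponderance of the evidence"))
# verified possibility
```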

Based upon this classification, unverified possibilities and worst-case scenarios of sea level rise would not support such a tort case. The range of verified possibilities arguably defines the maximum projected sea level rise that would meet the standard of preponderance of evidence. The challenge in developing evidence for such a case is to demonstrate that the projections based on climate model simulations of global temperature change meet the standards of verified possibilities, with a high degree of justification and relatively immune to falsification, even if the verified scenario represents only a partial position.

Conclusions

The purpose of generating scenarios of future outcomes is that we should not be too surprised when the future actually arrives. Projections of 21st century sea level rise are associated with deep uncertainty and a rapidly advancing knowledge frontier. The dynamic nature of the knowledge frontier on worst-case sea level rise scenarios is highlighted by Kopp et al. (2017), who compared recent projections with past expert assessments. The objective of this paper has been to articulate a strategy for portraying scientific understanding of the full range of possible scenarios of 21st century sea level rise, with a focus on worst-case scenarios and the avoidance of Type II errors.

An argument for alternative scenario generation has been presented, to stimulate different views and perspectives. In particular, considering climate change to be solely driven by scenarios of future greenhouse gas emissions is arguably a framing error that neglects possible scenarios of future solar variability, volcanic eruptions, natural internal variability of the large-scale ocean circulations, and geothermal and other geologic processes.

A framework for verifying and falsifying future scenarios is presented, in the context of modal logic. A classification of future scenarios is presented, based on levels of robustness and relative immunity to falsification. The logic of partial positions allows for clarifying what we actually know with confidence, versus what is more speculative and uncertain.

A possibility diagram of scenarios of 21st century cumulative sea level rise that ranks the possibilities from necessary to impossible provides a better representation of the deeply uncertain knowledge base than a probability distribution, since no single PDF can represent the whole range of uncertainty sources related to future sea-level rise. Apart from the limits of necessary and impossible, the intermediate possibilities do not map to likelihood since they also include an assessment of the quality of the knowledge base.

Hence, the possibility diagram avoids classifying scenarios as extremely unlikely if they are driven by processes for which we have a low level of understanding.

The possibility diagram for sea level rise projections considers sea level rise outcomes as resulting from a cumulative process, whereby a higher sea level outcome must first pass through lower levels of sea level rise. Therefore, lower values of sea level rise represent a partial position for the higher scenario. Partial positions can discriminate between lower values for which we have greater confidence, and higher values that are more speculative.

The concepts of the possibility distribution, worst case scenarios, scenario verification and partial positions are applied here to two decision making challenges related to sea level rise that have different sensitivities to Type I and II errors. The possibility distribution interfaces well with robust decision making strategies, and the worst-case scenario with partial positions is an important factor in precautionary considerations.

The approach presented here is very different from the practice of the IPCC assessments and their focus on determining a likely range, and provides numerous new challenges to the scientific community. There are some efforts (e.g. Horton et al., 2018) to develop decision-relevant probabilities of future sea level rise as part of a science-based uncertainty quantification. The state of our current understanding of sea level rise is far from being able to support such probabilities. The possibility distribution provides a framework for better classifying our knowledge about sea level rise scenarios.

Dr Curry: in a case such as this, where the extreme sea level rise is so heavily dependent on ice shelf collapse, and/or large glaciers discharging ice at much faster rates, I would focus on the factors which lead to such events.

I’m not an expert in the Antarctic shelf temperature profiles or currents, but I understand the Antarctic circumpolar current has a slight negative surface temperature anomaly, and that the extra energy melting the ice is coming from deep water currents? In that case I would verify the model ensemble performance in this area, and what processes can increase the amount of energy being delivered.

Since this is the most worrisome impact of climate change, this type of work should be supplemented with a value of information analysis, which I’m sure will justify more data gathering in the Antarctic, including better measurements using automated weather stations, buoy strings located so they gather information under the seasonal sea ice, measurement of the speed and properties of deep water currents which input energy into the Antarctic system, and other items I’m sure you can think of. If we can spend several hundred million dollars taking Martian samples (which I support), we ought to have a 20 year program funded very generously to get better information.

An attempt to approach scientifically something deeply unscientific has a very small chance of changing anything. Science is the last thing the IPCC cares for. Humor might fare better, how about applying their approach to a New York horse manure problem, circa 1894?

on time scales of a century, not a lot of variation in ECS. part of the issue is that the concept of ECS is somewhat an artificial one, and it is difficult not to conflate this with natural variability (both in obs and models). I modified the statement in the .doc

I’m not sure how to send you a private message… is there a way to message you in private? We are working on producing a TV series on ice climbing and extreme sports around the world, starting in January 2019, and we are incorporating “Climate Change” in there… would you like to contribute to it? If so, let me know how we can chat in private.

My thoughts are that the world previously had more CO2 and a much warmer world before Antarctic placement and before the closing of the Panama Isthmus and before the deepening of the Southern Ocean and development of the Antarctic Circumpolar Current. The world got colder and colder and the CO2 got reduced and reduced as deep ocean cooling absorbed more CO2. That dynamic hasn’t changed; previous to humans adding CO2, the cooling oceans’ capacity generally controlled the content of CO2 in the atmosphere. The rivers of super dense cold water that come off Antarctica and down to the bottom of the ocean have to come up somewhere, reducing water vapour in the air and cooling the planet; that hasn’t changed. The top of the ocean warmed for reasons that aren’t quite clear to make this interglacial; it has always gone back to a glacial world. Maybe pushing rivers of irrigation into the atmosphere might keep it from happening, but I don’t see CO2 being a major factor, as previously the world had much more CO2 and was much warmer and it still moved in the cooling direction. Warming tipping points are not there… to me. Tipping points and pressure are always going to be on the cooling side until Antarctica moves or there is a gap for water to flow through the middle of North America and South America that would allow warmer water to stay at the top and develop the densities to warm the deep ocean. To me geology and plate tectonics have played a major role in how warm the planet is and aren’t given enough weight today.

An article from you about epistemology and SLR?!?!? I’m reading this immediately! More of these, please!

Great stuff, and perhaps monumental in terms of dealing with issues like this. Hopefully the Nobel Committee will see this. :-)

One quibble – sticking to centimeters would make things more consistent. Not that I can’t move the decimal point around, but it does require an extra step.

More substantially – aren’t there some feasible human-caused negative feedback scenarios with higher probabilities than some of the extreme scenarios?

For example, a sudden global consensus that maybe nuclear power isn’t that bad, plus a feasible carbon sequestration technology, might qualify somewhere around (un)verified possibility, and result in both a drop in future carbon usage and a drop in actual GHG concentrations?

The choice of units is a complex issue. My preference is to use mm, km and m (with the equivalent English conversions) for the following reasons:
These units always allow a simple-to-understand principal number; 100 mm is easier to understand than 0.1 m. They also reduce the number of units that a person has to know and negotiate. Further, conversion from one to the other is always a matter of moving the decimal place three places.

However, I understand that cm is in common usage, as is the liter (a cubic decimeter), etc. I am not sure how one can use units that make everyone happy. Chemists use CGS, engineers MKS, and so forth. It is very industry dependent.

I think if we can’t sustain emissions growth, we are likely to see an unexpected decline in concentrations. Biosphere uptake lags concentration growth substantially, so it will continue to increase. Hopefully the biosphere uptake doesn’t outcompete agricultural demand.

Ice core analysis indicates that the end of an interglacial is marked by a rapid rise in temperature to a sharp peak followed by a fairly rapid rate of cooling. Is there any data available that relates sea level to the temperature curve? If we knew where we are on the interglacial curve we may have a better idea how much sea level rise we have to plan for.

What happens after 2100? After the last glacial maximum, sea level eventually rose about 125 m for 5 °C of global warming. That’s an average long-term rate of 25 m/°C. Archer and Brovkin looked at several geologic periods and found an average of 18 m/°C (“The millennial atmospheric lifetime of anthropogenic CO2,” David Archer and Victor Brovkin, Climatic Change (2008) 90:283–297, DOI 10.1007/s10584-008-9413-1).

However, you do raise a point. Land ice was melting much faster than it currently is at the beginning of the century and during the previous several centuries. Sea level didn’t rise a lot (maybe sea surface temps stayed low and kept steric rise down).

Couple implications:

1. Glacial melt didn’t add a lot to sea level 1920s to 1950.
2. That large melt implies a large natural forcing.

DA,
After 2100?
From now to 2100 is 82 years.
82 years ago was 1936.
Which predictions made in 1936 are proving valuable today, in 2018?
There has been such great technological advance that little was predictable.
I was born in 1941, so I have some personal insight.
That tells me to stop this mad folly of the climate change catastrophe and concentrate on useful, productive science that advances the knowledge of humanity for its betterment.
At some later stage in your life, I predict that you will realize how wrong you were to push this wasted wheelbarrow, especially as you are doing little that is original. Then, you will feel sorrow and personal loss from part of a career wasted. Geoff.

Technologically and economically – and across sectors and gases – RCP4.5 is an achievable target. With reductions in black carbon and sulfate emissions easily achievable with existing cost competitive technology completing the picture. Makes a hell of a lot more sense than the impossible vagaries of a surface temperature target.

The slow draw down of CO2 in the atmosphere is the result of slow calcium weathering processes and subsequent complexation and sequestration of carbon – see the Archer article Appell referenced. Sequestration can be accelerated with needed – for many reasons – negative emission strategies in the land use and forestry sector. Conserving and restoring soils, grasslands, forests and wetlands and reclaiming deserts.

It’s a start – and a 21st century adventure begins with a single step.

Excellent work. It is very difficult to synthesize all the varied conclusions from sea level research.

Perhaps you addressed this and I missed it, but I have two quibbles, based on research from Cazenave 2014, “The rate of sea-level rise,” that you didn’t address.

First she wrote, comparing the period 1994-2002 with the period 2003-2011 of the observed slowdown: “GMSL [global mean sea level] time series from five prominent groups processing satellite altimetry data for the global ocean are considered. During recent years (2003-2011), the GMSL rate was significantly lower than during the 1990s (average of 2.4 mm/yr versus 3.5 mm/yr).” The IPCC suggests “3.2 [2.8 to 3.6] mm/year between 1993 and 2010”.

The IPCC’s estimates are averages of decadal acceleration and deceleration that obscure sea level rise oscillation, oscillations very similar to what Holgate 2008 reported happening every 20 years.

Second, AR4 suggests thermal expansion was responsible for 1.6 mm/year and AR5 suggests 1.1 mm/year. That argues that warming oceans have caused 30 to 50% of sea level rise. But other studies found a much lower thermal contribution.

They “Inferred steric sea level rate from (1) (∼0.3 mm/yr over 2003–2008) agrees well with the Argo-based value also estimated here (0.37 mm/yr over 2004–2008).”

Those estimates of thermal expansion’s contribution are 3 to 4 times lower than the IPCC’s, and suggest a much smaller ocean heat contribution. The differences between estimates also call into question how well understood the physics of thermal expansion really is. In part the differences may be due to subjective uses of varied glacial isostatic adjustments.

Hi Jim, agree that there is substantial uncertainty in the steric component, which you would think would be the most straightforward. Following Cazenave’s 2014 paper, SLR took a good jump with the El Nino. Missed Holgate’s 20 yr oscillation (much has been written on the 60 yr oscillation). Wunsch’s 20% from geothermal is astonishing; no one has been factoring that in. Also, in a recent week in review, I noted a new equation of state for sea water; I wonder how much difference that would make (I expect it to be significant).

Jim, not Judith, but I would point you to my deconstruction of Cazenave 2014 in the essay PseudoPrecision in my ebook Blowing Smoke. Her statistical methods got divorced from the physical reality underlying her hypothesis for why sat alt showed an SLR slowdown for several years. Her hypothesis was changed rainfall patterns in three major catchment basins, so more ground water retention, so less SLR. Problem is, two of her three basins have fully saturated soils, and the third did not get anywhere near enough rain to make up the difference – a separate bad paper deconstructed in the same essay.

Judith, nice paper and Fig 1 summary of a complex subject. Nothing of substance to add directly.
My own take is that your maximum unverified possibility of ~1.5 meters by 2100 is substantially too high, maybe by a factor of two, for three reasons you may wish to consider.
1. The Kopp paper’s first Eemian highstand ‘most likely’ 6.6 meters above modern sea level is also over 30 centuries. That averages about 22 cm/century. Granted, some of those 30 centuries could have been significantly faster. But even IF there was a sudden speedup tomorrow to his ‘fastest’ 55 cm/century, you still cannot get to 1 meter by 2100. And the paleoproxies show the Eemian peak temperature was about 2C higher in the tropics, and ice cores suggest as much as 6-8C higher in the polar regions.
2. The Amundsen Embayment WAIS SLR risk has a lot of misrepresentations and plainly faulty CAGW assumptions in the published literature. See for example my guest post Tipping Points calling out Rignot from JPL. The assumptional problems are similar to those explained in my Totten Glacier guest post. The Amundsen catchment basin is much larger than the Pine Island Glacier most at risk. There is literally no way that even if all of PIG went to sea by 2100, say thanks to underlying volcanoes, enough of the rest of that iceshed could creep seaward to replace it to get anywhere near the published estimates. And Rignot himself measured the creep rates, making his estimates impossible.
3. Greenland is bowl shaped, so most of its ice has to melt, not calve. Again, guest post Tipping Points has details and melt estimates. Very slow melt process. Like the Amundsen Embayment, even IF all the glaciers ‘outside the bowl’ somehow sloughed into the sea (highly unlikely), that is not enough ice to make a major difference. Remember, a gigaton of ice (~1 cubic km of ice) can raise sea level by only about 2.78 microns. The creep rates of the central Greenland ice sheet would not allow sufficient replacement ‘from inside to outside the bowl’ in a century to make a significant difference. Again, physical impossibility on top of highly unlikely unverified possibilities.
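The 2.78 microns/Gt figure above is easy to verify with back-of-envelope arithmetic; a minimal sketch, assuming a global ocean area of ~3.61e14 m² (a standard approximate value, not taken from the comment):

```python
# Sea level rise per gigaton of melted ice, assuming ~3.61e14 m^2 of ocean.
OCEAN_AREA_M2 = 3.61e14
MELTWATER_M3_PER_GT = 1e9  # 1 Gt of ice melts to ~1 km^3 (1e9 m^3) of water

rise_m_per_gt = MELTWATER_M3_PER_GT / OCEAN_AREA_M2
print(f"{rise_m_per_gt * 1e6:.2f} microns per Gt")  # ~2.77 microns
```

The exact value depends slightly on the ocean area used, which is why "about" rather than "exactly" is the right qualifier.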

My point is this. A lot of the seemingly reasonable scientific papers your excellent article relies upon at face value are neither very reasonable nor very scientific. The problem is even more difficult than you portray.

Hi Rud, the rationale for the big SLR contribution is marine ice cliff instability, something new. I think it is premature to rule this out, but it is highly speculative at this point. I allowed for the MICI scenarios that weren’t based on RCP8.5.

DeConto and Pollard claim that even if initiated in the 2nd half of the 21st century, the collapse could contribute this amount of SLR within the 21st century. Apparently they have a new paper under review with even larger values. Not sure what to make of it, but not ready to dismiss it as impossible.

Cannot disagree.
But for reference, the positive-AMO Greenland ‘high’ ice loss over 2000-2010 was ~2E2 Gt/year (1990-2000 it was more like 2E1 Gt/yr). Times another 81 years to 2100, if it persists as the AMO turns negative, gives ~1.6E4 Gt. A wee bit shy of 5.5E5— ‘just’ about one and a half orders of magnitude short.
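As a check on the arithmetic in the previous paragraph (a sketch, assuming the ~2.78 microns/Gt conversion and a constant 2E2 Gt/yr loss rate; the product comes to ~1.6E4 Gt):

```python
# Extrapolate the 2000-2010 Greenland loss rate to 2100 and compare with
# the ice required for ~1.5 m of sea level rise (~2.78 microns per Gt).
loss_rate_gt_per_yr = 2e2              # ~200 Gt/yr
years_to_2100 = 81
total_loss_gt = loss_rate_gt_per_yr * years_to_2100   # 1.62e4 Gt

rise_m_per_gt = 2.78e-6
ice_needed_gt = 1.5 / rise_m_per_gt                   # ~5.4e5 Gt
print(total_loss_gt, round(ice_needed_gt / total_loss_gt, 1))  # shortfall ~33x
```

Even taking the high decade’s loss rate as permanent, the extrapolated loss falls short of the 1.5 m requirement by a factor of roughly thirty.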
Hey, it’s your paper, and I already said it was very good as far as it went, taking the peer reviewed lit at face value. I know questioning face value opens a huge can of Mike Mann-like worms. But uncertainty is what it is.
Highest regards.

curryja | ” the rationale for the big SLR contribution is marine ice cliff instability, something new.”
One sometimes forgets how high the ice cliffs were in the past coming out of the ice age. And what the circumference of Antarctica was then. We are in a shrinking ice age. The amount of ice free to break off is limited. The damage in terms of rising sea level is much less. Ice has always been unstable. Some cliffs are always breaking off somewhere in a warming world but doing less and less damage.
Why did this not happen at some point in the last 2,000 or 10,000 years causing a marked SLR?
Ice cliffs break and fall off under their own weight, but only because of CO2? What earth-shattering (or ice-shattering) science that is.

“Not sure what to make of it, but not ready to dismiss as impossible.”

That seems fair given the Holocene meltwater pulses (interesting wiki for 1B; I always feel a great sense of relief when claims are immediately admitted to be greatly uncertain). Would need some dramatic evidence to believe their SLR claim is credible for today’s world, but it should be an interesting paper.

Judith Curry:
I appreciate your efforts to get people to think about worst-case sea level rise and the larger impact upon science to address the issue.

From the for-what-it’s-worth department, here is my assessment:

I am back to the “opportunity lost cost” question I had more than a decade ago and got dinged by Gavin for my efforts.

I start my thinking by framing the boundaries: what has occurred in the past. What has been the highest global temperature?
What has been the lowest global temperature?
What have been the highest sea levels?
What have been the lowest sea levels?
What has been the highest atmospheric CO2?
What has been the lowest atmospheric CO2?

How great is the difference between these boundary values and our present numbers? These differences form the basis of any subsequent calculations.

Without proposing a particular mechanism, what is the cost for society to reach each of these boundary values? When the dollar cost is believed to be astronomical, then the effort to address that scenario is worthless. We are then dealing with something “less than.” The next step is to determine how much money is available to achieve some portion of the “less than” value. After determining how much money society has to spend on climate related issues, the next step is to determine whether one goes for the “mitigation” scenario or the “adaptation” scenario based upon the resources available. What is “doable” is a matter of present-day resources and whether one wants to spend money now, as you won’t have that money to do anything else in the future.

Lastly, the final calculation: is the money spent on doing something in the climate arena more effective than just doing nothing? One might spend the money one has today on something else deemed worthwhile, like health care, homelessness, cheap abundant energy, food production, or maybe even better fishing rods.

As most members of society already do the calculations on the cost of what they want vs what they can afford, albeit some people do reach further than is allowable by law or common sense, the concept of delayed gratification still has meaning. That’s part of growing up. Framing issues for society’s activity on moral grounds rarely competes effectively with pocket book realities.

I am persuadable to act in a particular way in the immediate future, and I do. It’s just that when I am dead and gone, my grandchildren will remember little except our “making memories” moments: a boat on a lake and the first fish caught. Stringing lights on this year’s Christmas tree when you are 10 years old. Life is moments of being together.

You asked some questions (but, I think, missed the most important questions). Here are answers to some of your questions.

I start my thinking by framing the boundaries: what has been in the past.

1. What has been the highest global temperature?
a. 36.3 C, 250 Ma ago. Currently 15 C, so global average temperature was 21 C higher than now. Average over the past 542 Ma was around 7 C warmer than now. [1]

1. What global average temperatures were optimum for life on Earth?
a. Approximately 18–28 C (‘Cambrian Explosion’ at 26–29 C, Eocene optimum, tropical rain forests from pole to pole, 26 C)

2. What is the relationship between the mass of biosphere carbon and global average temperature?
a. Mass of biosphere carbon increases as global average temperature increases towards optimum (decreases above optimum).

Conclusion: global warming up to optimum is beneficial for life on Earth

Thank you for both Scotese (2018) and Foster (2017). Informative. At least for the Phanerozoic, it appears that Earth has spent a lot of time in the hothouse temperature zones. It seems Earth is now descending into a cooling or icehouse phase.

Bummer that the boundaries for sea levels are not known; that is the subject of JC’s current paper discussion.

It is not surprising to me that I have missed asking important questions. I do question whether “optimum” is for me or for someone/something else. I look at the people I know and love and I am glad we have had this time together during earth’s Greenhouse.

My wife said that we have some 25 years or so remaining and the global temperature may get a bit warmer or cooler and we will do just fine.

1. Earth is currently in an Icehouse period, not a Greenhouse period. It is just the second time in the past 600 Ma that Earth has been this cold. The last time was about 300 Ma ago. This is due to a 300 Ma cycle caused by cosmic ray flux intensity, which is in turn due to the rotation of the Milky Way’s spiral arms, and also to Antarctica being positioned over the South Pole. So it seems unlikely Earth can get out of this Icehouse period for another 80 Ma or so. The last one lasted about 70 Ma.

2.

I do question whether “optimum” is for me or for someone/something else.

Warmer would be better. Most of the temperature increase would be in higher latitudes, in winter and at night. That’s all good. For a 3 C GMST increase, the equator would warm by about 1.3 C, average tropical temperatures by about 2 C, and the poles by much more (see Scotese 2018).

Other evidence shows that the biosphere would benefit greatly by warming. Also the global economy.

Lastly, the final calculation: is the money spent on doing something in the climate arena more effective than just doing nothing?

The answer to this question is clear. Any money spent on mitigation is a waste. Well targeted funding and policy for adaptation – i.e. making people and economies more resilient to climate changes and severe weather events – is beneficial.

Money spent on mitigation is a waste of money for two reasons:

1. it reduces the economic benefit that global warming provides, plus

2. we are wasting $1.5–$2 trillion per year on the global climate change industry.

So what can we do to prevent SLR over the next century, or for a thousand years into the future? Lomborg and his specialist team tell us there is little we can do to make any measurable change to temperature by 2100, even if all countries honoured all commitments made at Paris COP 21.
And the non-OECD countries’ CO2 emissions are soaring and will continue to do so for decades to come. Germany has tried the hardest and wasted hundreds of billions of dollars on clueless solar and wind, but is now extending its brown coal mines AGAIN to provide more base-load, reliable energy to its grid once more.

If you believe the so-called “mainstream scientists’” input through the Royal Society + NAS report, there is little chance of a temperature reduction in 1,000 years even “if all countries stopped all human CO2 emissions today”.

Nic Lewis doesn’t agree with their projection, but this is what the so-called mainstream scientists agree on, whether we agree or not. They then tell us that CO2 concentrations would not fall back to pre-industrial levels (280 ppm) for many thousands of years into the future.

I’m not saying this is correct or not, but this is what the so-called scientists actually believe. Don’t forget the above scenario assumes “ALL human CO2 emissions STOPPED today”. SFA hope of that happening anytime soon. Here’s a list of some of the scientists provided at the RS site. A lot of familiar names there.

The following individuals served as the primary writing team for this document: ■ Eric Wolff FRS (UK lead), University of Cambridge ■ Inez Fung (NAS, US lead), University of California, Berkeley ■ Brian Hoskins FRS, Imperial College London and University of Reading ■ John Mitchell FRS, UK Met Office ■ Tim Palmer FRS, University of Oxford ■ Benjamin Santer (NAS), Lawrence Livermore National Laboratory ■ John Shepherd FRS, University of Southampton ■ Keith Shine FRS, University of Reading ■ Susan Solomon (NAS), Massachusetts Institute of Technology ■ Kevin Trenberth, National Center for Atmospheric Research ■ John Walsh, University of Alaska, Fairbanks ■ Don Wuebbles, University of Illinois This document was reviewed in draft form by individuals chosen for their diverse perspectives and technical expertise, in accordance with procedures approved by the Royal Society and the National Academy of Sciences. The reviewers provided comments and suggestions, but were not asked to endorse the views of the writing team, nor did they see the final draft before its release. We wish to thank the following individuals for their review of this report: ■ Richard Alley (NAS), Department of Geosciences, Pennsylvania State University ■ Alec Broers FRS, Diamond Light Source and Bio Nano Consulting (Former President of the Royal Academy of Engineering ) ■ Harry Elderfield FRS, Department of Earth Sciences, University of Cambridge ■ Joanna Haigh FRS, Imperial College London ■ Isaac Held (NAS), NOAA Geophysical Fluid Dynamics Laboratory ■ John Kutzbach (NAS), Center for Climatic Research, University of Wisconsin ■ Jerry Meehl, National Center for Atmospheric Research ■ John Pendry FRS, Imperial College London ■ John Pyle FRS, Department of Chemistry, University of Cambridge ■ Gavin Schmidt, NASA Goddard Space Flight Center ■ Emily Shuckburgh, British Antarctic Survey

I don’t know the context of this series of posts about Pacifica, but I have read the link that Mosh gave at 7.51, which references a report dated November 2018.

It’s a pretty good document, but as far as I can see it’s very much still a work in progress, with a final comments date of December 10th. This has caused some complaints that the window is too short.

Several of the comments also complain that the source of the sea level references has not been given. I didn’t see any direct references to 10 feet but I can’t claim to have read it all in full detail.

On page 114 there is a reference to a SLR of up to 5.7 feet by 2100, a period which seems to be outside the scope of this report and has caused concern, as that scenario was coupled with a 1-in-100-year storm.

This meant that commenters were worried that decisions made about the protection of their communities might be based on information that is outside the scope of the plan, which goes only to 2040.

This seems a very short-term overview. 70 to 100 years would be more common elsewhere, but that time scale is dictated by the Kyoto Protocol, which I don’t think America ever signed?

So presumably we need to wait until some time in the New Year, when all comments are in, all references furnished and the final report drafted.

In addition we have an interesting and perhaps pertinent comment, presumably from an individual, on page 113, point 13. (I use an iPad.)

“All city maps, hazard zones and all external references must have public links to their sources. This document is intentionally vague and confusing. It sends the public and city staff on wild goose chases trying to find the source data.”

So as yet this document, clearly labelled ‘draft’, should not be taken as any sort of definitive statement, and it will need to be sharpened up considerably before it can be presented to what appears to be a sceptical public.

The link Mosh originally sent is version 7 of the Pacifica document, from this month. It started life some time ago. Only in September did the relevant overall commission ask that the 40 cities involved (including Pacifica) should also allow for the more drastic scenario of 10 feet, which exceeds the ‘medium scenario’ the Pacifica document had originally planned for as a worst case.

As a result of this recent advice, the November Pacifica report appears to have rapidly revised things, and its authors are now expecting people to make final comments on a work in progress by December 10th, while so much information and so many references are still missing.

There seem to be lots of people lobbying, as the unreferenced draft guidance now appears to classify as susceptible to SLR areas well beyond what can be justified by the material that has been properly referenced, and people are concerned their properties will be devalued or not protected when the time comes.

I have never heard of people being asked to make final comments on what is still a very immature draft, and I have no idea what the hurry is, but clearly the commission report I just referenced seems to want the Pacifica authority to take into account more alarming scenarios than were originally intended.

Good for a laugh sometimes. Managed retreat is running like hell while infrastructure crashes into the sea, because sea defenses are simply too expensive or merely export the problem to another location.

“Rather than being an inevitable consequence of global sea-level rise, our findings indicate that large-scale loss of coastal wetlands might be avoidable, if sufficient additional accommodation space can be created through careful nature-based adaptation solutions to coastal management.” https://www.nature.com/articles/s41586-018-0476-5

The alarmist-two-step always reminds me of the old Monty Python “I want to buy an argument” sketch:

a. 10 feet of sea level rise!
b. You said 10 feet, you know that’s not true.
a. I didn’t say that.
b. of course you did, it’s right there on the paper.
a. that’s just the media, but it is 10 feet!
b. there you said it again!
a. no I didn’t.
b. look, this isn’t an argument, you’re just saying the opposite of everything you just said. That’s not an argument!
a. yes it is…..

Tonyb
I was looking forward to you showing up here. I knew the subject would arouse your sea level instincts. Pacifica is very close to SF, and I watched the houses and apartments fall off the sand cliffs as sea waves undercut them.

On the other hand, SF sea level rise is slow according to the NOAA chart Dr Curry included in the report. So what gives? Location, location.

Those near-shore locations are losing value due to new setbacks from the ocean side, and insurance for cliffside homes is either impossible to get or extremely expensive.

I will look for the film of the houses and apartments falling into the sea.

By the way, my daughter got married in the SF Cliff House a month ago, with lovely views over the rocky ocean cliffs that are not falling into the sea.

Location, location (and probably geology, geology as well) needs to be the mantra.

We have soft red sandstone cliffs on our nearest bit of coast 5 minutes walk away. It then becomes granite. Several houses have fallen into the sea in recent years on the red sandstone and another one is boarded up and ready to go.

It’s a continual process, but in the past it usually claimed just fields and trees; then people decided they wanted to live as close to the sea as possible and built in places they were warned not to. You can clearly see the line of the eroded cliffs stretching out into the water, evidence this has been going on for thousands of years.

You shouldn’t build in places called ‘Flood Lane’, ‘Cliff Edge’, ‘Water Street’ or ‘Tides Reach’, but people do, and then expect protection.

I was on the Pacifica Community Working Group on sea level rise. Via the Coastal Commission we were advised to consider a 10 foot rise by 2100 as the H++ scenario. Even though there has been no sea level rise since 1980, a scenario of just 2 mm/year, the 20th century average, was never considered.

The 10 foot scenario was certainly a driving force for the Pacificans advocating managed retreat. They believed adapting to such sea level rise, or using sea walls to protect homes and businesses, was impossible, and that we should plan on how to move Pacifica’s infrastructure inland.

The 10 foot scenario is all driven by DeConto and Pollard’s hypothesis that Antarctica will slide into the sea.

Sorry, I am still not seeing any evidence whatsoever that 10 ft has driven the policy in any way. The documentation is pretty clear. I see a lot of verbal claims but zero documentation backing those up.
And clearly NO documentation in the CITATION that Curry used in the article.

“In general, the Coastal Commission recommends using best available science (currently the 2018 OPC SLR Guidance) to identify a range of sea level rise scenarios, including the low, medium-high, and, as appropriate, extreme risk aversion scenario.”
That’s 10.2 feet by 2100 for the extreme risk scenario. I’m not going to drill in to finding notes, probably not on the web anyway, of Pacifica planners citing this for any specific project. But the above quote specifically says that the California Coastal Commission recommends planning for 10.2 feet of sea level rise by 2100 where appropriate. https://documents.coastal.ca.gov/assets/slr/guidance/2018/3_Ch3_2018AdoptedSLRGuidanceUpdate.pdf
Here is the Coastal Commission’s advice for local governments: “Choose multiple sea level rise scenarios based on range of sea level rise projections. The Coastal Commission recommends that all communities evaluate the impacts from the “medium-high risk aversion” scenario. Local governments should also include the “extreme risk aversion” scenario to evaluate the vulnerability of planned or existing assets that have little to no adaptive capacity, that would be irreversibly destroyed or significantly costly to repair, and/or would have considerable public health, public safety, or environmental impacts should that level of sea level rise occur.” https://documents.coastal.ca.gov/assets/slr/guidance/2018/5_Ch5_2018AdoptedSLRGuidanceUpdate.pdf
The California Coastal Commission is therefore explicitly telling local governments to use the 10.2 feet scenario in planning. QED.

The Swedish government agency SMHI tells us that sea level maximums of 210 to 250 cm above today’s can be expected in year 2100.
They also state that RCP8.5 is the BAU trend.
So, Prof. Curry has a good case.

The mission of the California Coastal Commission is to protect beaches and public access to them. They are inherently anti-armoring, based on their belief that armoring vulnerable bluffs will ultimately cause the loss of beaches.

That anti-armoring belief assumes that local erosion of sea bluffs supplies the sediment that replenishes the beaches. Thus they demand managed retreat be included in all Local Coastal Plans. However, their assumptions are highly flawed.

Over 90% of the sediments supplying Pacifica beaches would have come from sediment outflow through the Golden Gate. But dredging, sediment mining and upstream dams have put Pacifica in a sediment deficit. I have repeatedly argued the Commission should first ensure that Pacifica gets its normal share of the sediments being diverted, before arguing that a sea wall would cause our beaches to disappear.

Second, the sea wall constructed in front of Golden Gate Park was built in 1929 and remains structurally sound. The beach in front of that sea wall has actually grown despite 20th century sea level rise. Likewise there is beach expansion in front of the berm constructed north of Pacifica’s Mori Point, built to protect the Sharp Park golf course.

Governor Brown is a climate change nut, blaming every wildfire on climate change, and his Coastal Commission likewise expects a disastrous sea level rise due to climate change. Thus they push highly unlikely projections of 10 feet and demand all local coastal plans include managed retreat.

Pacifica’s city council removed managed retreat from its adaptation plan, due to a local uproar, but we all knew it had to be removed; otherwise it would be political suicide in this November’s election. And we all knew the Coastal Commission would reject the local plan and demand that Pacifica add managed retreat back into its adaptation plan.

In an effort to minimize Type II errors regarding projections of future sea level rise, there has been a recent focus on the possible worst-case scenario. The primary concern is related to the potential collapse of the West Antarctic Ice Sheet, which could cause 21st century global mean sea level rise to be substantially above the IPCC AR5 (2013) likely range of 0.26 to 0.82 m. Recent estimates of the maximum possible global sea level rise by the end of the 21st century range from 1.5 to 6 meters (as summarized by Le Cozannet et al., 2017; Horton et al., 2014).

It would be interesting to know what impact reducing CO2 emissions would have on the time until the collapse might occur. If the world reduces CO2 emissions by a defined amount, by how much would this delay the collapse?

tasfaymartinov, can you please give us a reference for your claim that SLs are not rising?
Certainly they are not rising as much as some of the activist scientists are claiming, but I’m sure there has been some SLR since the end of the LIA.

The FAA uses the term “maximum credible losses” when evaluating insurance requirements for space launch operations. Some variant of that would be better than “worst case”. The transformation of the everything-goes-wrong RCP8.5 into the business-as-usual case is probably a red flag.

Geological data indicate that global mean sea level has fluctuated on 10³ to 10⁶ yr time scales during the last ∼25 million years, at times reaching 20 m or more above modern. If correct, this implies substantial variations in the size of the East Antarctic Ice Sheet (EAIS). However, most climate and ice sheet models have not been able to simulate significant EAIS retreat from continental size, given that atmospheric CO2 levels were relatively low throughout this period. Here, we use a continental ice sheet model to show that mechanisms based on recent observations and analysis have the potential to resolve this model–data conflict. In response to atmospheric and ocean temperatures typical of past warm periods, floating ice shelves may be drastically reduced or removed completely by increased oceanic melting, and by hydrofracturing due to surface melt draining into crevasses. Ice at deep grounding lines may be weakened by hydrofracturing and reduced buttressing, and may fail structurally if stresses exceed the ice yield strength, producing rapid retreat. Incorporating these mechanisms in our ice-sheet model accelerates the expected collapse of the West Antarctic Ice Sheet to decadal time scales, and also causes retreat into major East Antarctic subglacial basins, producing ∼17 m global sea-level rise within a few thousand years. The mechanisms are highly parameterized and should be tested by further process studies. But if accurate, they offer one explanation for past sea-level high stands, and suggest that Antarctica may be more vulnerable to warm climates than in most previous studies.

Polar temperatures over the last several million years have, at times, been slightly warmer than today, yet global mean sea level has been 6–9 metres higher as recently as the Last Interglacial (130,000 to 115,000 years ago) and possibly higher during the Pliocene epoch (about three million years ago). In both cases the Antarctic ice sheet has been implicated as the primary contributor, hinting at its future vulnerability. Here we use a model coupling ice sheet and climate dynamics— including previously underappreciated processes linking atmospheric warming with hydrofracturing of buttressing ice shelves and structural collapse of marine-terminating ice cliffs—that is calibrated against Pliocene and Last Interglacial sea-level estimates and applied to future greenhouse gas emission scenarios. Antarctica has the potential to contribute more than a metre of sea-level rise by 2100 and more than 13 metres by 2500, if emissions continue unabated. In this case atmospheric warming will soon become the dominant driver of ice loss, but prolonged ocean warming will delay its recovery for thousands of years.

Recent Antarctic ice-sheet modeling that includes the effects of surface meltwater on ice-sheet dynamics (through hydrofracturing and ice-cliff collapse) has demonstrated the previously underappreciated sensitivity of the ice sheet to atmospheric warming in addition to sub-ice oceanic warming. Here, we improve on our modeling of future ice-sheet retreat by using time-evolving atmospheric climatologies from a high-resolution regional climate model, synchronized with SSTs, subsurface ocean temperatures, and sub-ice melt rates from the NCAR CCSM4 GCM. Ongoing improvements in ice-sheet model physics are tested and calibrated relative to observations of recent and ancient (Pliocene, Last Interglacial, and Last Deglaciation) ice-sheet responses to warming. The model is applied to a range of future greenhouse-gas emissions scenarios, including modified RCP scenarios corresponding to the 1.5° and 2.0° targets of the Paris Agreement and higher emissions scenarios including RCP8.5. The results imply that a threshold in the stability of the West Antarctic Ice Sheet and outlet glaciers in East Antarctica might be exceeded in the absence of aggressive mitigation policies like those discussed in Paris. We also explore the maximum potential for Antarctica to contribute to future sea-level rise in high greenhouse gas emissions scenarios, by testing a range of model physical parameters within the bounds of observations.

Hi Frank, thanks for this link. In my report I didn’t really tackle adaptation to SLR, but I have investigated this somewhat. Some locations, like Seattle and Rhode Island, are well prepared, given large tidal variations and/or a history with hurricanes. Others, like San Francisco and Baltimore, are pathetically vulnerable due to stupid land use decisions.

Judy, yes there are some remarkable spatial differences; however, there is still some time to close some of the gaps, IMO. The cited paper looks at long-term possibilities in many regions of the world and is in almost every case somewhat optimistic. Worth a read…

Is “impossible” too strong a term, given the importance you rightly give to possible black swans? It has that hyperbole feel that seems a bit excessive these days. Was global cooling deemed impossible in the days of, say, the Maunder or Dalton Minimum? [I’m not a scientist, so take this with large sea salt grains. :)]

It should be realized that even 5 feet in the 21st century represents a roughly 10-fold increase over the 20th century average, so it is a prediction of explosive growth in the sea-level rise rate. In such unstable conditions, who can distinguish with confidence between a ten-fold and a twenty-fold increase in the rise rate?
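The ten-fold figure checks out on the back of an envelope; a sketch, assuming a 20th-century average of roughly 1.7 mm/yr (~17 cm per century), a commonly cited approximate value not stated in the comment:

```python
# Compare 5 feet of rise in the 21st century with the 20th-century average.
rise_5ft_cm = 5 * 30.48                  # 152.4 cm
rate_20th_cm_per_century = 17.0          # assumed ~1.7 mm/yr average
factor = rise_5ft_cm / rate_20th_cm_per_century
print(f"{factor:.0f}x the 20th-century rate")  # ~9x, i.e. roughly ten-fold
```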

One serious problem for the worst possible case is that we have already been there and we sort of know how it was. It is called the Holocene Climatic Optimum (or Holocene Thermal Maximum). We know for a fact that temperature then was at least 1°C warmer than now. In many places several degrees. And we know that, based on two incontrovertible pieces of evidence:

1. Glaciers during the HCO were significantly less extensive than now. Glaciers have probably undone several thousand years of inconstant growth in the past century and a half, but they were much shorter 6-8 ka ago in most places. Lots of literature on this.

2. The treeline in altitude, despite the elevated CO2 levels that are a big bonus to trees, is still lower by 300-400 m in most places. This is equivalent to several degrees in summer temperature. Trees grow back in a few years if conditions allow. Again, lots of literature on this, even from tropical ranges.

Since the non-polar ice-sheets had melted by 7 ka, the planet had one to two thousand years of significantly warmer temperatures than now. The result is known as the Holocene highstand. The problem is that it varies from place to place due to the many factors that affect local sea levels including a stronger isostatic rebound and no global reconstruction of Holocene sea levels has been seriously attempted recently to my knowledge.

In any case it appears, based on the literature, that the Holocene highstand might have been 1-2 m higher than now, not much more. And that was after a thousand or more years of warmer temperatures. Our worst-case scenario for just a century can only be a fraction of that (0.3 m?), even if temperature continues increasing at the same rate as in the 20th century.
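The per-century scaling implicit in that estimate can be made explicit; a sketch, assuming the comment’s numbers (a 1-2 m highstand accrued roughly evenly over one to two thousand years of warmth):

```python
# If a 1-2 m highstand accrued roughly evenly over 1,000-2,000 years,
# the implied per-century contribution is small.
low_m_per_century = 1.0 / 20    # 1 m over 20 centuries -> 0.05 m/century
high_m_per_century = 2.0 / 10   # 2 m over 10 centuries -> 0.2 m/century
print(low_m_per_century, high_m_per_century)  # 0.05 0.2
```

On these assumptions, the per-century rate during the warm millennia was 0.05-0.2 m, which is consistent with treating ~0.3 m as a generous single-century analogue.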

I think climate scientists need to ponder why the predictions are so dire if after 42 years of global warming (since 1976), we have seen so little effect.

Yes, and for this reason we also know the permafrost and clathrate methane and CO2 bombs aren’t plausible. If anything, permafrost lands are likely to become a carbon sink.

The original methane bomb hypothesis is also due to isostatic rebound. It is possible regardless of anthropogenic warming, but it is often wrongly presented as something that warming is likely to trigger.

I mean, there is a reason why climatologists are so fond of anthropogenic climate change. You can build a career with little effort and even less talent. The entire scientific system is geared to promote this issue.

Scientific publishing is becoming more and more of a scam. Taxpayers pay for academics’ research and the time to write it up. Taxpayers pay for publishing the results. Taxpayers pay for university and institute libraries to buy the journals and books that essentially nobody reads. And taxpayers have to pay yet again if they want to read what they have already paid for three times. A huge business that, instead of promoting science, is actually an impediment to its dissemination. The volume published is so high that nobody can keep up and still do some work, so everybody reads and cites the same fashionable articles. Scientific publishing has become parasitic, draining resources and converting peer review into a joke and an obstacle to making even more money.

Thanks Javier, very helpful. My main concern is finding a journal that will consider this given that I am ‘conflicted’ since my company receives funds from energy companies. For example I suspect Global and Planetary Change would not consider a manuscript from me for those reasons.

I suggest you contact the editor that would handle your manuscript and explain both its main findings and your disclosure situation and see if they are willing to evaluate it and publish it if the reviews are positive. That might save you time by discarding journals that might end up objecting to the article after having it for weeks. Energy-related journals or engineering-related journals are less likely to object to energy-related funding, but you don’t know unless you ask.

"In 1963, Lorenz published his seminal paper on 'Deterministic non-periodic flow', which was to change the course of weather and climate prediction profoundly over the following decades and to embed the theory of chaos at the heart of meteorology. Indeed, it could be said that his view of the atmosphere (and subsequently also the oceans) as a chaotic system has coloured our thinking of the predictability of weather and subsequently climate from thereon…" (Slingo and Palmer 2011)

“The global coupled atmosphere-ocean-land-cryosphere system exhibits a wide range of physical and dynamical phenomena with associated physical, biological, and chemical feedbacks that collectively result in a continuum of temporal and spatial variability. The traditional boundaries between weather and climate are, therefore, somewhat artificial. The large-scale climate, for instance, determines the environment for microscale (1 km or less) and mesoscale (from several kilometers to several hundred kilometers) processes that govern weather and local climate, and these small-scale processes likely have significant impacts on the evolution of the large-scale circulation (Fig. 1; derived from Meehl et al. 2001). The accurate representation of this continuum of variability in numerical models is, consequently, a challenging but essential goal. Fundamental barriers to advancing weather and climate prediction on time scales from days to years, as well as longstanding systematic errors in weather and climate models, are partly attributable to our limited understanding of and capability for simulating the complex, multiscale interactions intrinsic to atmospheric, oceanic, and cryospheric fluid motions.”

“It is very likely that the mean rate of global averaged sea level rise was 1.7 [1.5 to 1.9] mm/year between 1901 and 2010 . . . and 3.2 [2.8 to 3.6] mm/year between 1993 and 2010. It is likely that similarly high rates occurred between 1920 and 1950.”

The problem here is that subsequent studies have demonstrated a lower rate of sea level rise from 1920 to 1950, while the 1993 to 2010 rate of 3.2 [2.8 to 3.6] mm/yr has now extended to the present and shows no sign of abating. So the post-1993 rate is higher and has lasted much longer.

Lastly, the study that produced the highest rates for 1920 to 1950 has been significantly disputed by other sea level scientists:

“So the post-1993 rate… has lasted much longer…” than the 1920 to 1950 rate? Hmmmm…

But the problem, as always, is the internal component: half of surface warming and half of Arctic ice loss in the past 40 years. And the slow-motion but terminal collapse of the IPCC forcing paradigm continues unabated.

The resonant trigger for these internal modes of variability is high solar activity over the last half of the 20th century in the modern grand max.

“The past few decades have been characterized by a period of relatively high solar activity. However, the recent prolonged solar minimum and subsequent weak solar cycle 24 have led to suggestions that the grand solar maximum may be at an end1. Using past variations of solar activity measured by cosmogenic isotope abundance changes, analogue forecasts for possible future solar output have been calculated. An 8% chance of a return to Maunder Minimum-like conditions within the next 40 years was estimated in 2010 (ref. 2). The decline in solar activity has continued, to the time of writing, and is faster than any other such decline in the 9,300 years covered by the cosmogenic isotope data1. If this recent rate of decline is added to the analysis, the 8% probability estimate is now raised to between 15 and 20%.” https://www.nature.com/articles/ncomms8535

Although I – and a few others – have posited the Hale cycle butterfly of solar magnetic reversal as the origin of the 20 to 30 year periodicity.

Oh for God’s sake – the point is still multi-decadal variability and not absurd sub-millimeter precision in a parameter with intrinsic variability. The question as always is not how many millimeters oceans or fractions of a degree surface temperatures have risen but how much is natural. Do I need to repeat myself in the face of such unresponsiveness to comments made?

“The global-mean temperature trends associated with GSW are as large as 0.3 °C per 40 years, and so are capable of doubling, nullifying or even reversing the forced global warming trends on that timescale.” https://www.nature.com/articles/s41612-018-0044-6

This is not even a bloody black swan – it has been a foundation of geophysics for decades. And as for the BS of a positive PDO – that cannot be defined on such a superficial basis and yet he continues to try.

The current 10-year rate is 4.32 mm/yr, which would require adding another section to the height of the above graph.

The current positive phase of the PDO has not ended, though it is currently in neutral.

Into the teeth of the pause, the rate of SLR remained at or above the rate seen from 1920 to 1950. So the slight deflections hinted at in the above graph were not sustained. It's back to a robustly upward trend, which is why studies have been finding an acceleration in the rate of sea level rise:

“Claims of the IPCC and related persons with respect to sea level changes is deeply biased and not based on actual observation. A total revision is necessary. Its purpose appears to simply be to scare people.” (Nils-Axel Mörner)

What realistically would be the impact on the level of the seas even if humans were causing global warming? To answer we need only look at the impact of natural global warming.

There are physical limits for how fast ice can melt. The maximum rates recorded were those related to the melting of the glaciers of the Last Ice Age. The corresponding sea level rise amounted to 10 mm/year or 1 m per century. This figure sets the ultimate frame for possible sea level changes… (Professor Nils-Axel Mörner)

Judith’s avian metaphor does not fit the case, because while black and white swans belong to the set of real birds, some named climatic contingencies, e.g. “sharknados”, are what philosophers of language like Saul Kripke term “rigid designators in the null set.”

We all recognize the term “rhinoceros” though few of us have seen one, for rhinos exist in the set of real mammals and may constitute an existential threat if provoked.

But while unicorns are just as broadly represented, each such representation is a rigid designator in the null set.

Black swans are of course a known known that led to the demise of the “all swans are white” paradigm. Just as climate known knowns have led to the slow-motion, dinosaur-like demise of the IPCC forcing paradigm. Russel is a dinosaur who is legless but hanging in there. And dragon-kings are much more fun than dinosaurs.

“We emphasize the importance of understanding dragon-kings as being often associated with a neighborhood of what can be called equivalently a phase transition, a bifurcation, a catastrophe (in the sense of Rene Thom), or a tipping point. The presence of a phase transition is crucial to learn how to diagnose in advance the symptoms associated with a coming dragon-king.” https://arxiv.org/abs/0907.4290

There is one coming soon to get Russel, #jiminy, JCH, Jim Hansen and the IPCC.

As this is a sea level thread, I am sure that any unicorns involved would be those relating to the sea. As the ‘unicorn of the sea’, the narwhal, most definitely exists, modelling its behaviour would be perfectly possible.

Well, while looking through NOAA Tides and Currents, I saw they had trend tables, so I selected global trends and got a list of 369 gauges, and I downloaded the data (export to CSV) and plotted the average.
I was quite shocked to find it was only 1.362 mm per year.
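The averaging step described above can be sketched in a few lines of Python. This is a minimal illustration, not NOAA's own tooling: the column name "Trend" (mm/yr) is an assumption, and the real CSV export's header may differ.

```python
# Average the per-gauge sea level trends in a NOAA-style CSV export.
# Assumes a header row with a "Trend" column in mm/yr (hypothetical name).
import csv

def mean_trend(path, column="Trend"):
    """Return the unweighted mean of the trend column across all gauges."""
    trends = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                trends.append(float(row[column]))
            except (KeyError, ValueError):
                continue  # skip gauges with missing or non-numeric trends
    return sum(trends) / len(trends)
```

Note that a simple unweighted mean over gauges is not an area-weighted global mean, and individual gauge trends include local vertical land motion, which is one reason such an average can differ from satellite-era global estimates.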

Judith, great that you have a prominent article based on your sea-level rise article in today’s The Australian newspaper, a counter to the more common warmist doomsday stories. Perhaps sanity will prevail.

Prof Curry: I note you used the graph “post-glacial sea level rise” in your sea level report. It is also currently part of a paper I am seeking peer review for. Did you need or receive copyright clearance for it? If so, where do I go to get it, please? I do follow you on Twitter as catandman but your PM facility doesn’t work. Thanks for any help. Just need an accurate pointer. Have brain, will follow up. Brian

Thanks. Missed the link. One more to get now. Something in return: to meet a challenge, I just worked out how much heat it takes to melt all the glacial ice as well as warm the oceans an average of 8 degrees to reach the interglacial state over 7 ka: 1.2×10^25 joules to melt the ice (assuming a 100 metre sea level rise over today’s ocean area), which is about 1/4 of the heat needed to raise the oceans by 8 degrees. Macro science is fun. E&OE.
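The back-of-envelope arithmetic above can be checked with standard round values. The constants below (ocean area, ocean mass, latent heat of fusion, seawater specific heat) are textbook approximations chosen for this sketch, not values from the comment itself.

```python
# Rough check of the glacial-to-interglacial heat budget described above.
OCEAN_AREA = 3.6e14     # m^2, today's ocean surface area (approx.)
OCEAN_MASS = 1.4e21     # kg, total mass of the oceans (approx.)
L_FUSION   = 3.34e5     # J/kg, latent heat of fusion of ice
C_SEAWATER = 3990.0     # J/(kg*K), specific heat of seawater

SEA_RISE = 100.0        # m, assumed glacial-to-interglacial sea level rise
DELTA_T  = 8.0          # K, assumed mean ocean warming

melt_mass = 1000.0 * OCEAN_AREA * SEA_RISE     # kg of meltwater (rho = 1000)
melt_heat = melt_mass * L_FUSION               # J to melt the ice, ~1.2e25
warm_heat = OCEAN_MASS * C_SEAWATER * DELTA_T  # J to warm the oceans, ~4.5e25

print(f"melt: {melt_heat:.2e} J, warm: {warm_heat:.2e} J, "
      f"ratio: {melt_heat / warm_heat:.2f}")
```

With these round numbers the melt energy comes out near 1.2×10^25 J and the ratio near 0.27, consistent with the "about 1/4" figure in the comment.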