Mitigating Risk for Risk Mitigation: What L’Aquila Means for Water Monitoring

On October 22, 2012, six Italian scientists and a government official were sentenced to six years in jail, given lifetime bans on holding public office, and ordered to pay compensation of €7.8m in connection with the L’Aquila earthquake. The magnitude-6.3 earthquake injured over 1,000 people and resulted in 300 deaths. The judgment was based on statements made at a press conference following an emergency meeting convened to evaluate whether an episode of 400 tremors over a four-month period should be cause for major concern in the region.

Nature provides more details on the specifics of the case but I am more interested in what can be learned from the generalities of the circumstances. The AGU raises the concern that this verdict could “ultimately be harmful to international efforts to understand natural disasters and mitigate associated risk”.

I think this case is illustrative of a cultural shift that is not limited to Italy. In the past, natural disasters were attributed to ‘acts of god’. Natural disasters are increasingly seen as evidence of incompetence for which someone (other than god) needs to be held accountable. The problem is that there is lots of blame to share. Who is to blame for inadequate building codes? Who is to blame for real estate development on unprotected floodplains? Who is to blame for insufficient budgets for maintenance of dams, levees and dikes?

The blamestorming that follows any natural disaster is exposing everyone involved in the delivery of any component in the data-to-decisions chain to elevated risk. We (i.e. the community of stream hydrographers) need to be prepared to manage this risk and this means paying attention to how we communicate with the end-user.

The best defense is to be able to demonstrate that our data publication processes meet the highest standards in the industry for quality, accuracy and timeliness. There should be no systematic (i.e. preventable) reason for our data to be either inaccessible or untrustworthy.

Communication of uncertainty is important. A generic disclaimer on provisional data is not sufficient to fully communicate the increased probability of significant error during unusual events. While methods for explicit quantification of uncertainty are not yet sufficiently robust for operational use, categorical grading of data based on subjectively determined confidence ranges can be managed in a real-time data publication process.
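As one illustration of what such categorical grading might look like in practice, here is a minimal sketch. The grade labels, thresholds, and function name are hypothetical, invented for this example; any operational grading scheme would set its own categories and criteria.

```python
# Hypothetical example: map a subjectively estimated relative
# uncertainty (e.g. 0.08 = +/-8%) onto a categorical grade that can
# be published alongside a provisional real-time value.
# Labels and thresholds are illustrative only.

def grade_observation(relative_uncertainty: float) -> str:
    """Return a qualitative data grade for real-time publication."""
    if relative_uncertainty <= 0.05:
        return "A"  # high confidence: within typical rating-curve error
    elif relative_uncertainty <= 0.10:
        return "B"  # moderate confidence
    elif relative_uncertainty <= 0.20:
        return "C"  # low confidence, e.g. extrapolated beyond gaugings
    else:
        return "E"  # estimate only: flag prominently to end users

# During an extreme event, an extrapolated stage-discharge conversion
# carries far more uncertainty than the same gauge in normal flow.
print(grade_observation(0.04))  # A
print(grade_observation(0.30))  # E
```

The point is not the specific thresholds but that a grade travels with each published value, so an end user sees immediately when confidence drops during an unusual event rather than relying on a blanket disclaimer.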

All aspects of a monitoring system are particularly stressed during extreme events. Data gaps and identifiable problems require special attention. Timely, proactive communication is usually the best way to manage expectations of end users.

Communicating with the communicators is important. From the reports I have read it appears that the experts convicted in the L’Aquila case were not blamed for what they said per se but for their failure to correct misreporting of what they said.

We should not be afraid to do our job. We must not hesitate to provide timely data and information needed to mitigate disasters. We do need to be able to demonstrate that we use best practices for all aspects of the data production process from gauge to page. We do need to make communication of uncertainty an integral part of our products and services. We do need to pay attention to how our products and services are reported on and used. We can, and we should, take great pride in what we do, why we do it, and in the very tangible benefits to society that result from our effort.


This blog post is a fantastic insight into the minds of the most influential hydrologists in the world today. The question “WHAT BOOK OR PAPER HAS BEEN MOST INFLUENTIAL TO YOUR CAREER AND WHY?” was posed to senior hydrologists all over the world, and the answers run from the predictable (Groundwater by Freeze and Cherry) and the unexpected (The House at Pooh Corner) to the classic (The Method of Multiple Working Hypotheses), the eclectic (Slowness), and the difficult (Scale of Fluctuation of Rainfall Models). Out of this broad range of influential reading, it is the explanation of why a particular body of work made a difference in a career path that is most illuminating. I particularly like the quote provided by Gregory Pasternack: “Show me a person who has read a thousand books and I’ll show you my best friend; show me a person who has read but one and I will show you my worst enemy.” Happy reading. – Stu Hamilton

The intent of this paper is to develop a system of systems from diverse observation networks for the purpose of ground-truthing satellite observations. The challenge is that while it is possible to develop methods to identify incorrect data, removing those data does not guarantee that everything remaining is correct. Confidence in the data has to come from attributes such as whether the data source is well documented, well understood, representative, regularly updated, publicly available, and supported by rich metadata. If broadly adopted, the system-of-systems approach will have potential benefits in guiding users to the most appropriate set of observations for their needs and in highlighting to network owners and operators areas for potential improvement. – Stu Hamilton

Murky Waters: Taking a Snapshot of Freshwater Sustainability in BC

This statement is from Jack Wong, the CEO of the Real Estate Foundation of BC: “There’s a huge opportunity here to improve data collection, monitoring, and reporting. Reliable data would help governments, funders, and non-profits to track progress, make better decisions, and coordinate their efforts.” The recommendations of this study include: “Regular public opinion surveys on freshwater attitudes (…) conducted by a cross-section of water partners to ensure long-term availability of the data”; “A multi-faceted solution (…) involving diverse groups that gather water data to increase the quantity and quality of data and improve data accessibility”; “This report shows a huge opportunity to convene relevant players and discuss solutions for freshwater sustainability data collection, monitoring, and reporting. If successfully implemented, communities across the province will be more informed and better stewards of BC’s most precious resource.” – Stu Hamilton

Attribution of cause to effect in natural environments is a difficult problem. It is one thing to be able to use monitoring data to say what is happening. It is much more challenging to say why it is happening. While difficult, attribution is important. Without compelling attribution, there are deeply entrenched reasons to stay the course and not make the changes necessary to achieve better outcomes. I am interested to know if the relatively simple method used in this paper for attribution of cause to extreme temperatures could be applied for other types of data. For example, wouldn’t it be good to have compelling attribution of cause for harmful algal blooms? – Stu Hamilton

This report makes a compelling argument about how looking after water is in the best strategic interests of the United States. It is better to anticipate predictable problems and take relatively inexpensive actions (e.g. wise use of data to influence proactive measures) to avoid or mitigate challenges to human health and economic development, both of which must be managed to ensure peace and security. – Stu Hamilton

Print this article and put it on the desk of the senior administrators in your water monitoring agency. The arguments made here that climate observation networks offer a magnified return on investment all hold true for water monitoring as well. “Climate change is but one example of the need to make decisions under deep uncertainty. Developing new approaches to decision making that go beyond traditional point and probabilistic predictions is the focus of a new scientific undertaking. Developing adaptation pathways that will be robust under many possible futures will in part require observing systems that are designed with these needs in mind.” “The economic value of such a system at ~ $10 trillion dollars to the world economy in today’s value (known as “net present value” in economics using a 3% discount rate). In the simplest sense, this is the economic value of moving climate scientific understanding forward 15 to 20 years by using better observations, analysis, and modeling capabilities. The studies further estimated that if the world tripled its current economic investments in climate research (observations, analysis, modeling) to achieve such an advanced observing system, the return on investment would be ~ $50 for every $1 invested by society.”
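The “net present value” figure quoted above is a standard discounting calculation. As a minimal sketch of the arithmetic (the cash-flow numbers below are invented purely for illustration and are not taken from the cited studies):

```python
# Net present value: discount a stream of future annual benefits back
# to today's dollars at a fixed discount rate (3% in the quoted study).
# The benefit figures here are made up to show the arithmetic only.

def net_present_value(annual_benefits, rate=0.03):
    """Sum benefits[t] / (1 + rate)**(t + 1) over all years t."""
    return sum(b / (1.0 + rate) ** (t + 1)
               for t, b in enumerate(annual_benefits))

# A constant benefit of 100 per year for 30 years, discounted at 3%,
# is worth considerably less than the undiscounted total of 3000.
npv = net_present_value([100.0] * 30, rate=0.03)
print(round(npv, 1))  # 1960.0
```

Discounting is why the quoted $10 trillion figure is framed as today’s value: benefits realized decades from now count for less than the same benefits realized immediately.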

The average global temperature from January to September 2017 was approximately 1.1°C above the pre-industrial era. The years 2013-2017 are set to be the warmest five-year period on record, and the past three years are the three warmest years on record. The WMO statement is based on five independently maintained global temperature data sets. The rate of increase in CO2 from 2015 to 2016 was the highest on record, 3.3 parts per million per year, reaching 403.3 parts per million. – Stu Hamilton

This paper affirms the central theme of my whitepaper “Improving Outcomes for Freshwater Availability, Security and Sustainability: Water Data Asset Management as a Strategic Investment.” The key to good governance is well-informed stakeholders. Water monitoring best serves public interests when data is managed as a strategic asset. – Stu Hamilton

I like this blog post for how it explains catchment processes as a lead-in to explaining the value of isotope hydrology. I think this approach is a good one for anyone in monitoring to remember when explaining what we do and why we do it. Start with the why and end with the what. – Stu Hamilton

An integrated database of data from 51,101 lakes in the northeast United States has been developed to address the problems arising from too many dispersed and limited data sources. Three decades of data can now be discovered in their geographic context (i.e. land use, geologic, climatic, and hydrologic settings). The database contains 150,000 measures of total phosphorus, 200,000 measures of chlorophyll, and 900,000 measures of Secchi depth. – Stu Hamilton