Hydro-Power – What Is the Future of Mega-Hydro?

The growth of hydro-power projects in the United States has been flat since the 1970s.

That may change with the recent announcement that the Alaska Energy Authority intends to develop a 600 MW project on the Susitna River, a project that has been under consideration for over 35 years. In the summer of 1979 I considered paddling Devil’s Canyon in what would have been only the fourth descent of the rapids. In the end I didn’t go, but my paddling buddies enjoyed the experience of a lifetime. We were very concerned at the time that there were not many years left before the canyon would be dammed. As it turns out, it was common sense, not mega-development, that kept me from ever going back.

The Susitna project is far from a done deal even now.

The environmental review process is only just being scoped out. What gives me more confidence about the eventual design of the project is the data record: in the 1970s, designers would have had only a couple of decades of discharge data from Susitna River at Gold Creek to work with. They now have another 35 years of data from Gold Creek, plus 30 years of data from Susitna River at Sunshine Creek. Having long periods of high-quality data from both upstream and downstream of the project site is hugely informative.

I do not know much about the data available to inform the design of mega-projects in Eastern Europe, Asia, Africa, India, and Latin America but I suspect that there will be a much greater reliance on ‘engineering judgment’ in those regions.

Data collection needs to commence long before the serious political and financial discussions do for development of projects on this scale.

Once the political process starts and funding sources are found, there is little enthusiasm for sitting back and waiting 30 or 40 years to collect the needed data. Monitoring agencies need adequate funding so that they can anticipate, and respond to, future need (as USGS has done in this case).

Most mega-hydro projects have turned out to be very effective long-term investments, often providing flood control and a reliable source of irrigation water as well as hydroelectric power. The flip side of huge benefits is huge risks if the assumptions implicit in the design are not completely valid. In addition to the risk of catastrophic dam failure, there is also the risk of silent failures such as insufficient water supply to fill the reservoir to design capacity, sedimentation of the reservoir, and loss of habitat or fish passage resulting in a cascade of environmental degradation. The difference between success and failure may be as simple as the foresight to correctly anticipate future needs for data.

Water News

This blog post is a fantastic insight into the minds of some of the most influential hydrologists in the world today. The question “What book or paper has been most influential to your career, and why?” was posed to senior hydrologists all over the world, and the answers run from the predictable (Groundwater by Freeze and Cherry) to the unexpected (The House at Pooh Corner), the classic (The Method of Multiple Working Hypotheses), the eclectic (Slowness), and the difficult (Scale of Fluctuation of Rainfall Models). Out of this broad range of influential reading, it is the explanation of why a particular body of work made a difference in a career path that is most illuminating. I particularly like the quote provided by Gregory Pasternack: “Show me a person who has read a thousand books and I’ll show you my best friend; show me a person who has read but one and I will show you my worst enemy.” Happy reading. – Stu Hamilton

The intent of this paper is to develop a system of diverse observation systems for the purpose of ground-truthing satellite observations. The challenge is that while it is possible to develop methods to identify incorrect data, the remainder is not necessarily all correct data. Confidence in the data has to come from attributes such as whether the data source is well documented, well understood, representative, regularly updated, publicly available, and supported by rich metadata. If broadly adopted, the system-of-systems approach will have potential benefits in guiding users to the most appropriate set of observations for their needs and in highlighting to network owners and operators areas for potential improvement. – Stu Hamilton

Murky Waters: Taking a Snapshot of Freshwater Sustainability in BC

This statement: “There’s a huge opportunity here to improve data collection, monitoring, and reporting. Reliable data would help governments, funders, and non-profits to track progress, make better decisions, and coordinate their efforts” is from Jack Wong, the CEO of the Real Estate Foundation of BC. The recommendations of this study include: “Regular public opinion surveys on freshwater attitudes (…) conducted by a cross-section of water partners to ensure long-term availability of the data”; “A multi-faceted solution (…) involving diverse groups that gather water data to increase the quantity and quality of data and improve data accessibility”; “This report shows a huge opportunity to convene relevant players and discuss solutions for freshwater sustainability data collection, monitoring, and reporting. If successfully implemented, communities across the province will be more informed and better stewards of BC’s most precious resource.” – Stu Hamilton

Attribution of cause to effect in natural environments is a difficult problem. It is one thing to be able to use monitoring data to say what is happening. It is much more challenging to say why it is happening. While difficult, attribution is important. Without compelling attribution, there are deeply entrenched reasons to stay the course and not make the changes necessary to achieve better outcomes. I am interested to know if the relatively simple method used in this paper for attribution of cause to extreme temperatures could be applied for other types of data. For example, wouldn’t it be good to have compelling attribution of cause for harmful algal blooms? – Stu Hamilton

This report makes a compelling argument about how looking after water is in the best strategic interests of the United States. It is better to anticipate predictable problems and take relatively inexpensive actions (e.g. wise use of data to influence proactive measures) to avoid or mitigate challenges to human health and economic development, both of which must be managed to ensure peace and security. – Stu Hamilton

Print this article and put it on the desk of the senior administrators in your water monitoring agency. The arguments made here that climate observation networks offer a magnified return on investment all hold true for water monitoring as well. “Climate change is but one example of the need to make decisions under deep uncertainty. Developing new approaches to decision making that go beyond traditional point and probabilistic predictions is the focus of a new scientific undertaking. Developing adaptation pathways that will be robust under many possible futures will in part require observing systems that are designed with these needs in mind.” “The economic value of such a system is ~$10 trillion to the world economy in today’s value (known as “net present value” in economics, using a 3% discount rate). In the simplest sense, this is the economic value of moving climate scientific understanding forward 15 to 20 years by using better observations, analysis, and modeling capabilities. The studies further estimated that if the world tripled its current economic investments in climate research (observations, analysis, modeling) to achieve such an advanced observing system, the return on investment would be ~$50 for every $1 invested by society.”
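For readers unfamiliar with the “net present value” term in the quote above, here is a minimal sketch of that calculation. The cash-flow numbers below are purely illustrative and are not taken from the cited studies; only the 3% discount rate comes from the quote.

```python
def net_present_value(cash_flows, rate=0.03):
    """Discount a stream of future annual benefits back to today's value.

    cash_flows[t] is the benefit received t years from now (t = 0 is today).
    A 3% discount rate means a benefit one year away is worth 1/1.03 of
    the same benefit received today.
    """
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Illustrative only: a hypothetical constant benefit of $500 billion per
# year for 50 years, discounted at 3% per year.
flows = [500e9] * 50
npv = net_present_value(flows)
```

The point of the discounting is that a dollar of avoided climate damage decades from now counts for less than a dollar of monitoring investment today, which is why the quoted $10 trillion figure is sensitive to the chosen discount rate.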

The average global temperature from January to September 2017 was approximately 1.1°C above the pre-industrial era. The years 2013–2017 are set to be the warmest five-year period on record, and the past three years have been the three warmest years on record. The WMO statement is based on five independently maintained global temperature data sets. The increase in CO2 from 2015 to 2016 was the largest on record, 3.3 parts per million/year, reaching 403.3 parts per million. – Stu Hamilton

This paper affirms the central theme of my whitepaper “Improving Outcomes for Freshwater Availability, Security and Sustainability: Water Data Asset Management as a Strategic Investment.” The key to good governance is well-informed stakeholders. Water monitoring best serves public interests when data is managed as a strategic asset. – Stu Hamilton

I like this blog post for how it explains catchment processes as a lead-in to explaining the value of isotope hydrology. I think this approach is a good one for anyone in monitoring to remember when explaining what we do and why we do it. Start with the why and end with the what. – Stu Hamilton

An integrated database of data from 51,101 lakes in the northeastern United States has been developed to address the problems arising from many dispersed and limited data sources. Three decades of data can now be discovered in their geographic and environmental context (i.e. land use, geologic, climatic, and hydrologic settings). The database contains 150,000 measurements of total phosphorus, 200,000 measurements of chlorophyll, and 900,000 measurements of Secchi depth. – Stu Hamilton