Abstract

The one-dimensional numerical simulation model of reservoir hydrodynamics, DYRESM, is used to examine the effects of retention time (RT) on reservoir stratification. The model is verified using data from three years of high flow (short RT) at two reservoirs in the Czech Republic. The verified model was then used to run a systematic series of simulations with varying RT, outlet location and inflow temperature in order to gauge the effect on heat budgets, temperature stratification, flows and mixing in deep valley reservoirs. The prototype examples corresponded to two temperate reservoirs and one subtropical reservoir.

The results show important effects of RT on reservoir hydrodynamics up to RT = 200, with a leveling off for RT > 400. Temperature stratification and hydrodynamic conditions are shown to be very sensitive to the inflow temperature and outlet depth. The results indicate that the energy released by large inflows is a major factor in determining the degree of mixing; hence, lower RT values generally lead to weaker stratification. However, the height of withdrawal can have a significant impact on the overall temperature and hence on stratification. For example, large withdrawals from a deep outlet can leave a reservoir with a warmer average temperature by removing large amounts of cooler bottom water; similarly, surface withdrawals may remove warmer water, leading to a cooler average temperature. These effects are moderated by inflow temperatures and surface heat exchanges. In this paper an attempt is made to quantify these effects and to derive general rules of thumb for how different management strategies might influence the reservoir's average thermal structure.