Tuesday, December 16, 2014

Summary: Dangerously high levels of air pollutants are being released in Mecca during the hajj, the annual holy pilgrimage in which millions of Muslims on foot and in vehicles converge on the Saudi Arabian city, according to new findings.

UC Irvine and other researchers are testing air pollution in the Middle East, including in Mecca during the annual hajj, at burning landfills and elsewhere. Dangerously high levels of smog-forming contaminants are being released, the scientists have found. Credit: Image courtesy of Dr. Azhar Siddique

Dangerously high levels of air pollutants are being released in Mecca during the hajj, the annual holy pilgrimage in which millions of Muslims on foot and in vehicles converge on the Saudi Arabian city, according to findings reported today at the American Geophysical Union meeting in San Francisco.

"Hajj is like nothing else on the planet. You have 3 to 4 million people -- a whole good-sized city -- coming into an already existing city," said Isobel Simpson, a UC Irvine research chemist in the Nobel Prize-winning Rowland-Blake atmospheric chemistry laboratory. "The problem is that this intensifies the pollution that already exists. We measured among the highest concentrations our group has ever measured in urban areas -- and we've studied 75 cities around the world in the past two decades."

Scientists from UCI, King Abdulaziz University in Saudi Arabia, the University of Karachi in Pakistan, the New York State Department of Health's Wadsworth Center, and the University at Albany in New York captured and analyzed air samples during the 2012 and 2013 hajj seasons on roadsides; near massive, air-conditioned tents; and in narrow tunnels that funnel people to the Grand Mosque, the world's largest, in the heart of Mecca.

The worst spot was inside the Al-Masjid Al-Haram tunnel, where pilgrims on foot, hotel workers and security personnel are exposed to fumes from idling vehicles, often for hours. The highest carbon monoxide level -- 57,000 parts per billion -- was recorded in this tunnel during October 2012. That's more than 300 times regional background levels.
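As a quick sanity check of the figures above (a sketch, not part of the study), the reported peak reading and the "300 times background" ratio imply a regional background on the order of 190 ppb:

```python
# Back-of-the-envelope check of the carbon monoxide figures reported above.
peak_co_ppb = 57_000      # highest CO reading in the Al-Masjid Al-Haram tunnel
ratio_reported = 300      # "more than 300 times regional background levels"

# The background value below is inferred from the article's own numbers,
# not a value measured in the study.
implied_background_ppb = peak_co_ppb / ratio_reported
print(f"Implied regional background: ~{implied_background_ppb:.0f} ppb")
```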

Heart attacks are a major concern linked to such exposure: The risk of heart failure hospitalization or death rises sharply as the amount of carbon monoxide in the air escalates, the researchers note in a paper published in the journal Environmental Science & Technology. Headaches, dizziness and nausea have also been associated with inhaling carbon monoxide.

"There's carbon monoxide that increases the risk of heart failure. There's benzene that causes narcosis and leukemia," Simpson said. "But the other way to look at it is that people are not just breathing in benzene or CO, they're breathing in hundreds of components of smog and soot."

The scientists detected a stew of unhealthy chemicals, many connected to serious illnesses by the World Health Organization and others.

"Air pollution is the cause of one in eight deaths and has now become the single biggest environmental health risk globally," said Haider Khwaja of the University at Albany. "There were 4.3 million deaths in 2012 due to indoor air pollution and 3.7 million deaths because of outdoor air pollution, according to WHO. And more than 90 percent of those deaths and lost life years occur in developing countries."

Khwaja experienced sooty air pollution firsthand as a child in Karachi, Pakistan, and saw his elderly father return from the hajj with a wracking cough that took weeks to clear. He and fellow researchers braved the tunnels and roads to take air samples and install continuous monitors in Mecca.

"Suffocating," he said of the air quality.

In addition to the high smog-forming measurements, the team in follow-up work found alarming levels of black carbon and fine particulates that sink deep into lungs. Once the hajj was over, concentrations of all contaminants fell but were still comparable to those in other large cities with poor air quality. Just as unhealthy "bad air" days once plagued Greater Los Angeles, research is now showing degraded air in the oil-rich, sunny Arabian Peninsula and elsewhere in the Middle East. Because the number of pilgrims and permanent residents is increasing, the scientists recommend reducing emissions by targeting fossil fuel sources.

Besides vehicle exhaust, other likely culprits include gasoline high in benzene, a lack of vapor locks around gas station fuel nozzles, and older cars with disintegrating brake liners and other parts. Coolants used for air-conditioned tents sleeping up to 40 people also contribute to greenhouse gas buildup. And the dearth of regulations exacerbates these problems.

The researchers said that Saudi officials are aware of the issues and taking steps to address them, such as working to reduce benzene in area gasoline supplies. Directing Mecca pedestrians and vehicles to separate tunnels would be optimal. In addition, clearing the region's air with time-tested technologies used elsewhere in the world could sharply reduce pollution and save lives.

"This is a major public health problem, and the positive news is that some of the answers are very much within reach, like putting rubber seals on nozzles at gas stations to reduce leaks," Simpson said. "It's a simple, doable solution."

Summary: Not all boreholes are the same. Scientists used mobile measurement equipment to analyze gaseous compounds emitted by the extraction of oil and natural gas in the US. For the first time, organic pollutants emitted during a fracking process were measured at a high temporal resolution using a vapor capture system. The highest values measured by this process exceeded typical mean values in urban air by a factor of about one thousand.

The KIT measurement instrument, on board a minivan, directly measures atmospheric emissions on site with high temporal resolution. Credit: Photo: F. Geiger/KIT.

Not all boreholes are the same. Scientists at the Karlsruhe Institute of Technology (KIT) used mobile measurement equipment to analyze gaseous compounds emitted by the extraction of oil and natural gas in the USA. For the first time, organic pollutants emitted during a fracking process were measured at high temporal resolution using a vapor capture system. The highest values measured exceeded typical mean values in urban air by a factor of about one thousand, as reported in the journal Atmospheric Chemistry and Physics (ACP).

The KIT researchers, together with US institutes, studied trace-gas emissions from oil and gas fields in the USA (Utah and Colorado). They analyzed background concentrations as well as the waste gas plumes of individual extraction plants and fracking facilities. The air-quality measurements, which lasted several weeks, took place under the "Uintah Basin Winter Ozone Study" coordinated by the National Oceanic and Atmospheric Administration (NOAA).

The KIT measurements focused on health-damaging aromatic hydrocarbons in air, such as carcinogenic benzene. Maximum concentrations were found in the waste gas plumes of boreholes. Some extraction plants emitted up to about a hundred times more benzene than others. The highest values, several milligrams of benzene per cubic meter of air, were measured downwind of an open fracking facility, where returning drilling fluid is stored in open tanks and basins. Oil and gas extraction plants with closed production processes performed far better. In Germany, benzene concentrations are subject to strict limits: for the protection of human health, the Federal Emission Control Ordinance sets an annual benzene limit of five micrograms per cubic meter, about one thousand times lower than the values now measured at the open fracking facility in the US. The researchers published their results in the journal Atmospheric Chemistry and Physics (ACP).
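A quick unit check ties the two figures together. This is a sketch only: the 5 mg/m³ value below is an assumed stand-in for the "some milligrams per cubic meter" reported, not an exact measurement from the study.

```python
# Compare the measured benzene level with the German annual limit.
limit_ug_per_m3 = 5.0        # Federal Emission Control Ordinance annual limit
measured_mg_per_m3 = 5.0     # assumed representative of "some milligrams" per m^3
measured_ug_per_m3 = measured_mg_per_m3 * 1000.0   # convert mg to micrograms

factor = measured_ug_per_m3 / limit_ug_per_m3
print(f"Measured level is about {factor:.0f}x the German limit")
```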

"Characteristic emissions of trace gases are encountered everywhere. These are symptomatic of oil and gas extraction. But the values measured for different technologies differ considerably," explains Felix Geiger of the Institute of Meteorology and Climate Research (IMK) at KIT, one of the first authors of the study. With closed collection tanks and so-called vapor capture systems, for instance, the gases released during operation can be captured, reducing emissions significantly.

"The gas fields in the sparsely populated areas of North America are a good showcase for estimating the range of impacts of different extraction and fracking technologies," explains Professor Johannes Orphal, head of IMK. "In densely populated Germany, framework conditions are much stricter, and much more attention is paid to reducing and monitoring emissions."

Fracking is increasingly discussed as a technology to extract fossil resources from unconventional deposits. Hydraulic fracturing of suitable shale layers opens up the fossil fuels stored there and makes them accessible for economically efficient use. For this purpose, boreholes are drilled into the rock formations, which are then subjected to high pressure using large amounts of water and auxiliary materials such as sand, cement, and chemicals. The oil or gas can flow to the surface through the opened microstructures in the rock. Typically, the return flow of the aqueous fracking liquid, carrying dissolved oil and gas constituents, lasts several days before the actual production phase of purer oil or natural gas begins. This return flow is collected and reused until it finally has to be disposed of. Air pollution mainly depends on how this return flow is handled at the extraction plant, and currently practiced fracking technologies differ considerably in this respect.

The resulting local atmospheric emissions have now been studied at high temporal resolution for the first time. Based on the results, emissions can be assigned directly to the different sections of an extraction plant. For the measurements, KIT's newly developed, compact, and highly sensitive instrument, a proton transfer reaction mass spectrometer (PTR-MS), was installed on board a minivan and driven to within a few tens of meters of the different extraction points. In this way, the waste gas plumes of individual extraction sources and fracking processes were studied in detail.

Thursday, December 4, 2014

Summary: Plastic is well-known for sticking around in the environment for years without breaking down, contributing significantly to litter and landfills. But scientists have now discovered that bacteria from the guts of a worm known to munch on food packaging can degrade polyethylene, the most common plastic. The finding could lead to new ways to help get rid of the otherwise persistent waste, the scientists say.

Some bacteria from the guts of waxworms could help us eliminate plastic trash.Credit: ACS

Plastic is well-known for sticking around in the environment for years without breaking down, contributing significantly to litter and landfills. But scientists have now discovered that bacteria from the guts of a worm known to munch on food packaging can degrade polyethylene, the most common plastic. Reported in the ACS journal Environmental Science & Technology, the finding could lead to new ways to help get rid of the otherwise persistent waste, the scientists say.

Jun Yang and colleagues point out that the global plastics industry churns out about 140 million tons of polyethylene every year. Much of it goes into the bags, bottles and boxes that many of us use regularly -- and then throw out. Scientists have been trying to figure out for years how to make this plastic trash go away. Some of the most recent studies have tried siccing bacteria on plastic to degrade it, but these required first exposing the plastic to light or heat. Yang's team wanted to find bacteria that could degrade polyethylene in one step.

The researchers turned to a plastic-eating moth larva, known as a waxworm. They found that at least two strains of the waxworm's gut microbes could degrade polyethylene without a pretreatment step. They say the results point toward a new, more direct way to biodegrade plastic.

The authors acknowledge funding from the National Natural Science Foundation of China, the National Basic Research Program of China and the Shenzhen Key Laboratory of Bioenergy.

Monday, November 17, 2014

Summary: The 'surfactant' chemicals found in samples of fracking fluid collected in five states were no more toxic than substances commonly found in homes, according to a first-of-its-kind analysis.

Fracking fluid is largely composed of water and sand, but oil and gas companies also add a variety of other chemicals, including anti-bacterial agents, corrosion inhibitors and surfactants. Surfactants reduce the surface tension between water and oil, allowing for more oil to be extracted from porous rock underground.

In a new study published in the journal Analytical Chemistry, the research team identified the surfactants found in fracking fluid samples from Colorado, Louisiana, Nevada, Pennsylvania and Texas. The results showed that the chemicals found in the fluid samples were also commonly found in everyday products, from toothpaste to laxatives to detergent to ice cream.

"This is the first published paper that identifies some of the organic fracking chemicals going down the well that companies use," said Michael Thurman, lead author of the paper and a co-founder of the Laboratory for Environmental Mass Spectrometry in CU-Boulder's College of Engineering and Applied Science. "We found chemicals in the samples we were running that most of us are putting down our drains at home."

Imma Ferrer, chief scientist at the mass spectrometry laboratory and co-author of the paper, said, "Our unique instrumentation with accurate mass and intimate knowledge of ion chemistry was used to identify these chemicals." The mass spectrometry laboratory is sponsored by Agilent Technologies, Inc., which provides state-of-the-art instrumentation and support.

The fluid samples analyzed for the study were provided through partnerships with Colorado State University and colleagues at CU-Boulder.

Hydraulic fracturing, or "fracking," is a technique used to increase the amount of oil and gas that can be extracted from the ground by forcing fluid down the well. Fracking has allowed for an explosion of oil and gas operations across the country. In the U.S. the number of natural gas wells has increased by 200,000 in the last two decades, according to the U.S. Energy Information Administration.

Among the concerns raised by the fracking boom is that the chemicals used in the fracking fluid might contaminate ground and surface water supplies. But determining the risk of contamination--or proving that any contamination has occurred in the past--has been difficult because oil and gas companies have been reluctant to share exactly what's in their proprietary fluid mixtures, citing stiff competition within the industry.

Recent state and federal regulations require companies to disclose what is being used in their fracking fluids, but the resulting lists typically use broad chemical categories to describe the actual ingredients.

The results of the new study are important not only because they give a picture of the possible toxicity of the fluid but because a detailed list of the ingredients can be used as a "fingerprint" to trace whether suspected contamination of water supplies actually originated from a fracking operation.

The authors caution that their results may not be applicable to all wells. Individual well operators use unique fracking fluid mixtures that may be modified depending on the underlying geology. Ferrer and Thurman are now working to analyze more water samples collected from other wells as part of a larger study at CU-Boulder exploring the impacts of natural gas development.

Thurman notes that there are other concerns about fracking--including air pollution, the antimicrobial biocides used in fracking fluids, wastewater disposal triggering earthquakes and the large amount of water used--that are important to investigate and ameliorate. But water pollution from surfactants in fracking fluid may not be as big a concern as previously thought.

"What we have learned in this piece of work is that the really toxic surfactants aren't being used in the wells we have tested," he said.

Monday, November 10, 2014

Summary: The 'food' sources that support Florida red tides are more diverse and complex than previously realized, according to five years' worth of research on red tide and nutrients. The microbiology, physiology, ecology and physical oceanography factors affecting red tides were documented in new detail and suggestions for resource managers addressing red tide in the coastal waters of southwest Florida were offered.

The rosette of Niskin bottles is submerged to collect water samples.

The multi-partner project was funded by the National Oceanic and Atmospheric Administration's ECOHAB program and included 14 research papers from seven institutions.

The research team studied four red tide blooms caused by the harmful algae species Karenia brevis in 2001, '07, '08 and '09, plus the non-bloom year 2010. Their goal was to understand which nutrients supported these red tides and the extent to which coastal pollution might contribute, helping reveal what drives red tide in southwest Florida.

Study partners documented 12 sources of nutrients in southwest Florida waters -- including some never before associated with K. brevis. Results supported the consensus that blooms start 10-40 miles offshore, away from the direct influence of land-based nutrient pollution, but once blooms move inshore, they can use both human-contributed and natural nutrients for growth.

The project documented the microbiology, physiology, ecology and physical oceanography factors affecting red tides in new detail, provided a synthesis of results and offered suggestions for resource managers addressing red tide in the coastal waters of southwest Florida.

Florida red tide blooms -- which occur naturally in the Gulf of Mexico and most frequently off southwest Florida -- are higher-than-normal concentrations of the microscopic algae species K. brevis, a plant-like organism whose toxins can kill fish and other marine species, make shellfish toxic to eat and cause respiratory irritation in humans. These blooms occurred centuries before the mid-to-late twentieth century population boom along Florida's coast. Now, with large numbers of coastal residents and visitors in Florida, blooms can significantly affect public health and the economy.

Public information and short-term forecasts help mitigate red tide impacts, but ongoing research is critical to inform resource managers working to understand and potentially reduce nutrients available to blooms.

"Data go a long way toward increasing our understanding," said Dr. Cynthia Heil, Senior Research Scientist at Bigelow Laboratory for Ocean Sciences in Maine, who co-edited the special issue of Harmful Algae and was formerly with FWC's Fish and Wildlife Research Institute. "This report, which includes data from four different red tides and numerous laboratory studies and modeling efforts by biological, chemical and physical oceanographers, shows the collaborative efforts needed to understand why Florida red tides are so frequent and harmful in this region."

Co-editor Dr. Judith O'Neil, Research Associate Professor at the University of Maryland Center for Environmental Science, added, "We learned that K. brevis is an adaptable and flexible organism. We identified 12 different sources of nutrients that it can take up and use. One of the most interesting things that hadn't previously been taken into account is this organism's ability to not just use sunlight, like plants, but to also consume other single-celled organisms as a nutrient source. Additionally, its migratory behavior and directed swimming allows K. brevis access to nutrient sources everywhere it finds them -- at the surface, bottom and throughout the water column."

According to the study, K. brevis can get the nutrients nitrogen and/or phosphorus from the following sources, several of which were newly linked to K. brevis blooms through the ECOHAB project:

Undersea sediments

Decaying fish

Water flowing out of estuaries

Deposits from the atmosphere

Nitrogen from the air transformed, or "fixed," into a more usable form by the naturally occurring bacteria Trichodesmium. (They are a type of cyanobacteria, which use energy from the sun to make food, like plants. They can multiply and form blooms.)

Waste from zooplankton -- small aquatic animals visible to the naked eye

The "grazing" of smaller zooplankton, dubbed "microzooplankton" because they can only be seen under a microscope. (Grazing includes their "sloppy eating" of other tiny life forms, along with their waste.)

Picoplankton -- tiny life forms that K. brevis consumes

Bacteria transforming nitrogen in the water into more useful forms

Light creating available nutrients from natural, dissolved compounds like tannins in the water

Nitrogen from the air "fixed" by other cyanobacteria that are NOT Trichodesmium

The researchers concluded that many of these nutrient sources are individually more than enough to support observed blooms, but no single nutrient source is solely responsible.

Naturally occurring Trichodesmium (defined above) provided the most nitrogen, but not all, for K. brevis blooms developing offshore. Nearer to shore and within estuaries, major nitrogen sources believed to support blooms included estuary water carrying land-based nutrients to sea, underwater sediments and dead fish decomposing, in addition to other sources.

A few coastal sources -- estuary water, deposits from the atmosphere and underwater sediments -- are known to carry natural nutrients as well as some enhanced levels due to human activity. With other nutrient sources -- such as microscopic life forms -- connections with human activities are less direct, so it is harder to predict how they might be influencing red tides.

"Nature is messy, but this project has put several new pieces in place," said Dr. Kellie Dixon, Senior Scientist at Mote Marine Laboratory and Co-Principal Investigator for the ECOHAB project. "Until now we had not looked at this many of the 12 sources and their specific quantities simultaneously. Some of the sources, like nutrients released from the sediments, had never been measured in southwest Florida's coastal waters until we studied them for ECOHAB."

The project blended nutrient studies with physical oceanography, shedding new light on how blooms are brought to shore.

"Until now, effective management of harmful algal blooms caused by K. brevis was complicated because we didn't know enough about how different nutrient sources and forms taken up by K. brevis interacted with the physical environment," said Matt Garrett of the Fish and Wildlife Research Institute, who managed the ECOHAB project. "This project provides data that can help inform management recommendations on how to control nutrient sources and possibly improve forecasting models."

The special issue of Harmful Algae also includes management recommendations for resource managers addressing red tide.

Water use across the country reached its lowest recorded level in nearly 45 years. According to a new USGS report, about 355 billion gallons of water per day (Bgal/d) were withdrawn for use in the entire United States during 2010.

This represents a 13 percent reduction from 2005, when about 410 Bgal/d were withdrawn, and the lowest level since before 1970.
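The percentage can be verified directly from the two withdrawal figures (a simple arithmetic sketch):

```python
# Check the reported 13 percent decline in national water withdrawals.
withdrawals_2005_bgal_d = 410.0   # billion gallons per day, 2005
withdrawals_2010_bgal_d = 355.0   # billion gallons per day, 2010

decline_pct = (withdrawals_2005_bgal_d - withdrawals_2010_bgal_d) / withdrawals_2005_bgal_d * 100
print(f"Decline: {decline_pct:.1f}%")   # about 13 percent, as reported
```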

“Reaching this 45-year low shows the positive trends in conservation that stem from improvements in water-use technologies and management,” said Mike Connor, deputy secretary of the Interior. “Even as the U.S. population continues to grow, people are learning to be more water conscious and do their part to help sustain the limited freshwater resources in the country.”

Total water withdrawals by State, with bar chart showing categories by State from west to east, 2010.

In 2010, 12 states accounted for more than 50 percent of total withdrawals in the United States. In order of withdrawal amounts, they were: California, Texas, Idaho, Florida, Illinois, North Carolina, Arkansas, Colorado, Michigan, New York, Alabama and Ohio.

California accounted for 11 percent of the total withdrawals for all categories and 10 percent of total freshwater withdrawals for all categories nationwide. Texas accounted for about 7 percent of total withdrawals for all categories, predominantly for thermoelectric power, irrigation and public supply.

Florida had the largest saline withdrawals, accounting for 18 percent of the total in the country, mostly saline surface-water withdrawals for thermoelectric power. Oklahoma and Texas accounted for about 70 percent of the total saline groundwater withdrawals in the United States, mostly for mining.

“Since 1950, the USGS has tracked the national water-use statistics,” said Suzette Kimball, acting USGS director. “By providing data down to the county level, we are able to ensure that water resource managers across the nation have the information necessary to make strong water-use and conservation decisions.”

Trends in total water withdrawals by water-use category, 1950–2010.

Water withdrawn for thermoelectric power was the largest use nationally, with the other leading uses being irrigation, public supply and self-supplied industrial water, respectively. Withdrawals declined in each of these categories. Collectively, all of these uses represented 94 percent of total withdrawals from 2005-2010.

Thermoelectric power declined 20 percent, the largest percent decline.

Irrigation withdrawals (all freshwater) declined 9 percent.

Public-supply withdrawals declined 5 percent.

Self-supplied industrial withdrawals declined 12 percent.

The 20 percent decline in thermoelectric-power withdrawals can be attributed to a number of factors, including an increase in the number of power plants built or converted since the 1970s to use more efficient cooling-system technologies, declines in withdrawals to protect aquatic habitats and environments, power plant closures and a decline in the use of coal to fuel power plants.

"Irrigation withdrawals in the United States continued to decline since 2005, and more croplands were reported as using higher-efficiency irrigation systems in 2010,” said Molly Maupin, USGS hydrologist. “Shifts toward more sprinkler and micro-irrigation systems nationally and declining withdrawals in the West have contributed to a drop in the national average application rate from 2.32 acre-feet per acre in 2005 to 2.07 acre-feet per acre in 2010."

For the first time, withdrawals for public water supply declined between 2005 and 2010, despite a 4 percent increase in the nation’s total population. The number of people served by public-supply systems continued to increase, while per capita public-supply use declined from 100 gallons per day in 2005 to 89 gallons per day in 2010.
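The per-capita figures imply an 11 percent drop in public-supply use per person (arithmetic sketch only):

```python
# Percent change in per-capita public-supply water use, 2005 to 2010.
per_capita_2005_gpd = 100.0   # gallons per person per day, 2005
per_capita_2010_gpd = 89.0    # gallons per person per day, 2010

drop_pct = (per_capita_2005_gpd - per_capita_2010_gpd) / per_capita_2005_gpd * 100
print(f"Per-capita public-supply use fell {drop_pct:.0f}%")
```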

Declines in industrial withdrawals can be attributed to factors such as greater efficiencies in industrial processes, more emphasis on water reuse and recycling, and the 2008 U.S. recession, resulting in lower industrial production in major water-using industries.

In a separate report, USGS estimated thermoelectric-power withdrawals and consumptive use for 2010, based on linked heat- and water-budget models that integrated power plant characteristics, cooling system types and data on heat flows into and out of 1,290 power plants in the United States. These data include the first national estimates of consumptive use for thermoelectric power since 1995, and the models offer a new approach for nationally consistent estimates.

In August, USGS released the 2010 water-use estimates for California in advance of the national report. The estimates showed that in 2010, Californians withdrew an estimated total of 38 Bgal/day, compared with 46 Bgal/day in 2005. Surface water withdrawals in the state were down whereas groundwater withdrawals and freshwater withdrawals were up. Most freshwater withdrawals in California are for irrigation.

The USGS is the world’s largest provider of water data and the premier water research agency in the federal government.

Tuesday, October 28, 2014

Fracking is a highly controversial and divisive issue. Proponents argue that it could be the biggest energy boom since the Arabian oil fields were opened almost 80 years ago, but it comes at a serious cost to the environment. Among the detrimental effects of the process is that the waste water it produces is over five times saltier than seawater, which is, to put it mildly, not good. A research team led by MIT has found an economical way of removing salt from fracking waste water that promises not only to reduce pollution, but to conserve water as well.

Hydraulic fracturing, or fracking, uses water pressure to shatter oil shale formations, releasing oil and natural gas from deposits that would otherwise be uneconomical to exploit. One of the major problems with this process is that as the water is pumped through the oil shale, it picks up salt, and by the time it’s pumped back to the surface, it’s extremely salty – on the order of 192,000 parts per million (ppm). In contrast, seawater is only 35,000 ppm. This makes the water not only too salty to be disposed of without reprocessing, but also too salty to be reused in fracking.
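The two salinity figures can be compared directly (a sketch): 192,000 ppm is roughly five and a half times the 35,000 ppm of seawater, consistent with "over five times saltier."

```python
# Ratio of fracking produced-water salinity to seawater salinity.
fracking_brine_ppm = 192_000   # produced-water salinity cited above
seawater_ppm = 35_000          # typical seawater salinity

ratio = fracking_brine_ppm / seawater_ppm
print(f"Fracking waste water is about {ratio:.1f}x saltier than seawater")
```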

The MIT research team sought to find the most cost-effective means of desalinating fracking water. They found that electrodialysis is not only a promising way of cleaning up fracking waste water, but could also provide oil explorers with a closed-loop system that places less demand on local water supplies.

Electrodialysis is not a new technology. It was developed half a century ago and is currently used to desalinate brackish water and seawater, for small-scale drinking water plants, in food processing, greenhouses, hydroponics, and desalinating various chemicals.

In electrodialysis, a stack of membranes divides streams of water of different salinity. An electric current applied across the stack draws the sodium and chloride ions of the salt across the membranes, leaving the water behind. The end result is a very salty stream of water, and a relatively pure stream.

According to MIT, electrodialysis had been overlooked as a way of treating fracking waste water because the process was thought to be effective only on water of lower salinity. However, the team’s research found that electrodialysis is not only practical, but economically viable – not least because water conducts electricity better as it gets saltier, so the electrodialysis process works better.

The team found that the key was to desalinate the water in stages. Rather than making the water potable, it only had to be cleaned up enough to be pumped back into a fracking well and used again. This not only has the potential to reduce costs, but also to alleviate pressure on local water supplies and minimize the need for disposal of contaminated water.
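The staged approach can be illustrated with a toy model. Everything here is a hypothetical assumption for illustration – the 50 percent salt removal per stage and the 80,000 ppm reuse target are not figures from the MIT work:

```python
# Toy model of staged desalination: each stage removes a fixed fraction
# of the remaining salt until the brine is dilute enough to reuse.
salinity_ppm = 192_000.0       # starting salinity of the produced water
reuse_target_ppm = 80_000.0    # hypothetical threshold for reuse in fracking
removal_per_stage = 0.5        # hypothetical salt removal per stage

stages = 0
while salinity_ppm > reuse_target_ppm:
    salinity_ppm *= (1.0 - removal_per_stage)
    stages += 1

print(f"{stages} stages -> {salinity_ppm:,.0f} ppm")
# Under these assumptions: 192,000 -> 96,000 -> 48,000 ppm after two stages
```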

In addition, the process described by MIT is extremely flexible, allowing engineers to "dial" the saline output. This is important, because reusing the water will mean finding the most effective level of salinity for fracking, which is a question still to be answered.

According to the team, there’s still a lot of work to be done before the process is practical. In addition to tweaking the electrodialysis design, laboratory work needs to be done on removing oil, gas, and mineral contaminants that may clog the membranes, and new equipment needs to be designed, built, and tested to apply the new technology.

Tuesday, October 14, 2014

A study of the removal of two dams in Oregon suggests that rivers can return surprisingly fast to a condition close to their natural state, both physically and biologically, and that the biological recovery might outpace the physical recovery.

The analysis, published by researchers from Oregon State University in the journal PLOS One, examined portions of two rivers -- the Calapooia River and Rogue River. It illustrated how rapidly rivers can recover, both from the long-term impact of the dam and from the short-term impact of releasing stored sediment when the dam is removed.

Most dams have decades of accumulated sediment behind them, and a primary concern has been whether the sudden release of all that sediment could cause significant damage to river ecology or infrastructure.

However, this study concluded that the continued presence of a dam on the river constituted more of a sustained and significant alteration of river status than did the sediment pulse caused by dam removal.

"The processes of ecological and physical recovery of river systems following dam removal are important, because thousands of dams are being removed all over the world," said Desirée Tullos, an associate professor in the OSU Department of Biological and Ecological Engineering.

"Dams are a significant element in our nation's aging infrastructure," she said. "In many cases, the dams haven't been adequately maintained and they are literally falling apart. Depending on the benefits provided by the dam, it's often cheaper to remove them than to repair them."

According to the American Society of Civil Engineers, the United States has 84,000 dams with an average age of 52 years. Almost 2,000 are now considered both deficient and "high hazard," and it would take $21 billion to repair them. Rehabilitating all dams would cost $57 billion. Thus, the removal of older dams that generate only modest benefits is happening at an increasing rate.

In this study, the scientists examined the two rivers both before and after removal of the Brownsville Dam on the Calapooia River and the Savage Rapids Dam on the Rogue River. Within about one year after dam removal, the river ecology at both sites, as assessed by aquatic insect populations, was similar to the conditions upstream where there had been no dam impact.

Recovery of the physical structure of the river took a little longer. Following dam removal, some river pools downstream weren't as deep as they used to be, some bars became thicker and larger, and the grain size of river beds changed. But those geomorphic changes diminished quickly as periodic floods flushed the river system, scientists said.

Within about two years, surveys indicated that the river was returning to the pre-removal structure, indicating that the impacts of the sediment released with dam removal were temporary and didn't appear to do any long-term damage.

Instead, it was the presence of the dam that appeared to have the most persistent impact on the river biology and structure -- what scientists call a "press" disturbance that will remain in place so long as the dam is there.

This press disturbance of dams can increase water temperatures, change sediment flow, and alter the types of fish, plants and insects that live in portions of rivers. But the river also recovered rapidly from those impacts once the dam was gone.

It's likely, the researchers said, that the rapid recovery found at these sites will mirror recovery on rivers with much larger dams, but more studies are needed.

For example, large scale and rapid changes are now taking place on the Elwha River in Washington state, following the largest dam removal project in the world. The ecological recovery there appears to be occurring rapidly as well. In 2014, Chinook salmon were observed in the area formerly occupied by one of the reservoirs, the first salmon to reach that spot in 102 years.

"Disturbance is a natural river process," Tullos said. "In the end, most of these large pulses of sediment aren't that big of a deal, and there's often no need to panic. The most surprising finding to us was that indicators of the biological recovery appeared to happen faster than our indicators of the physical recovery."

The rates of recovery will vary across sites, though. Rivers with steeper gradients, more energetic flow patterns, and non-cohesive sediments will recover more quickly than flatter rivers with cohesive sediments, researchers said.

This research was supported by the Oregon Watershed Enhancement Board, the National Oceanic and Atmospheric Administration and the National Marine Fisheries Service. It was a collaboration of researchers from the OSU College of Agricultural Sciences, College of Engineering, and College of Science.

Wednesday, October 8, 2014

Source: Johns Hopkins Bloomberg School of Public Health

Summary: A new study suggests that drops of fuel spilled at gas stations - which occur frequently with fill-ups - could cumulatively be causing long-term environmental damage to soil and groundwater in residential areas in close proximity to the stations.

Few studies have considered the potential environmental impact of routine gasoline spills and instead have focused on problems associated with large-scale leaks. Researchers with the Johns Hopkins Bloomberg School of Public Health, publishing online Sept. 19 in the Journal of Contaminant Hydrology, developed a mathematical model and conducted experiments suggesting these small spills may be a larger issue than previously thought.

"Gas station owners have worked very hard to prevent gasoline from leaking out of underground storage tanks," says study leader Markus Hilpert, PhD, a senior scientist in the Department of Environmental Health Sciences in the Johns Hopkins Bloomberg School of Public Health. "But our research shows we should also be paying attention to the small spills that routinely occur when you refill your vehicle's tank."

Over the lifespan of a gas station, Hilpert says, concrete pads underneath the pumps can accumulate significant amounts of gasoline, which can eventually penetrate the concrete and escape into underlying soil and groundwater, potentially impacting the health of those who use wells as a water source. Conservatively, the researchers estimate, roughly 1,500 liters of gasoline are spilled at a typical gas station each decade.

"Even if only a small percentage reaches the ground, this could be problematic because gasoline contains harmful chemicals including benzene, a known human carcinogen," Hilpert says. Hilpert and Patrick N. Breysse, PhD, a professor in the Department of Environmental Health Sciences, developed a mathematical model to measure the amount of gasoline that permeates through the concrete of the gas-dispensing stations and the amount of gasoline that vaporizes into the air.
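
The logic of the estimate is simple accumulation: tiny per-fill drips, multiplied over thousands of fill-ups, add up over a station's lifetime. The sketch below uses entirely hypothetical inputs (the study's own model is more detailed), chosen only to land in the vicinity of the ~1,500 liters-per-decade figure:

```python
# Back-of-envelope sketch of cumulative spillage at a gas station.
# All inputs are hypothetical illustrations, not the study's parameters.

def cumulative_spill_liters(ml_per_fill, fills_per_day, years):
    """Total spilled volume in liters over the given period."""
    return ml_per_fill * fills_per_day * 365 * years / 1000.0

# e.g. 0.4 mL lost per fill-up at a station serving 1,000 cars a day:
total = cumulative_spill_liters(ml_per_fill=0.4, fills_per_day=1000, years=10)
print(f"spilled over a decade: ~{total:.0f} L")
```

Even sub-milliliter drips per customer reach the order of a thousand liters per decade at a busy station, which is why the authors argue small spills deserve attention.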

The model demonstrates that spilled gasoline droplets remain on concrete surfaces for minutes or longer, and a significant fraction of spilled gasoline droplets infiltrate into the pavement, as concrete is not impervious.

"When gasoline spills onto concrete, the droplet will eventually disappear from the surface. If no stain is left behind, there has been a belief that no gasoline infiltrated the pavement, and all of it evaporated," Hilpert says. "According to our laboratory-based research and supported by our mathematical model, this assumption is incorrect. Our experiments suggest that even the smallest gasoline spills can have a lasting impact."

Since the health effects of living near gasoline stations have not been well studied, Breysse says there is an urgency to look more closely, especially since the new trend is to build larger filling stations with many more pumps. These stations continue to be located near residential areas where soil and groundwater could be affected.

"The environmental and public health impacts of chronic gasoline spills are poorly understood," says Breysse. "Chronic gasoline spills could well become significant public health issues since the gas station industry is currently trending away from small-scale service stations that typically dispense around 100,000 gallons per month to high-volume retailers that dispense more than 10 times this amount."

"In a perfect world, it would be ideal to avoid chronic spills," Hilpert says. "However, if these spills do occur, it is also important to prevent rainwater from flowing over the concrete pads underneath the pumps. Otherwise, storm runoff becomes contaminated with benzene and other harmful chemicals and can infiltrate adjacent soil or end up in natural bodies of water."

Monday, October 6, 2014

Source: Aarhus University

Summary: What happens to soap and detergent surfactants when they run down the drain? Do they seep into the groundwater, lakes and streams, where they could pose a risk to fish and frogs? Not likely. This is shown in a new and very comprehensive report of the potential impact on the environment of the enormous amounts of common surfactants used day in and day out by consumers all over the world.

Senior researcher Hans Sanderson keeps an eye on laboratory technician Pia Petersen as she pours soapsuds down the drain with a clear conscience. They are both employed at the Department of Environmental Science, AU Roskilde.Credit: Steen Voigt, Aarhus University

You can brush your teeth, and wash yourself and your clothes with a clear conscience. The most common soaps, shampoos and detergents actually pose a minimal risk to the environment. This is the conclusion of a comprehensive survey that covers more than 250 scientific studies over several decades.

When you take a shower and rinse the soap and shampoo off your body, the foam conveniently disappears between your toes and down the drain. Have you ever thought about what happens to the surfactants afterwards? Whether they seep into the groundwater, lakes and streams, where they could pose a risk to fish and frogs?

Not likely. This is shown in a new and very comprehensive report of the potential impact on the environment of the enormous amounts of common surfactants used day in and day out by consumers all over the world.

"We humans use several million tons of surfactants a year on a global scale. It amounts to billions of kilos, so these are substances that you really don't want to release into the environment unless you're thoroughly familiar with them," says senior researcher Hans Sanderson, Department of Environmental Science, Aarhus University, who is one of the authors of the report.

More Than 250 Studies

For the purpose of promoting the sustainable use of surfactants, the researchers analysed their findings regarding the use, disposal, treatment and risk to the aquatic environment of the most important surfactant ingredients in North America. Although the studies are based in North America, their conclusions nevertheless apply on a global scale, because the surfactant ingredients in use are more or less identical all over the world.

The result is a 100-page virtually encyclopaedic list that sums up more than 250 scientific studies spanning forty to fifty years, at an overall cost of approximately USD 30 million.

"It's the most comprehensive and definitive report to date regarding the environmental properties of detergent substances in soap products -- in other words, personal care and cleaning products," says Hans Sanderson.

Soap is Rapidly Degraded

The report shows that when the substances are used correctly and responsibly, and once they have been through a proper treatment plant, the risk to the surrounding environment is very low.

"The substances are made so that they degrade rapidly and thus don't pose a risk to the environment. I can't think of any other substances released into the environment in such large amounts via everyday use by all of us. It's the most commonly used substances of all that go directly into the wastewater, so it's important to keep track of them and ensure that there are no unpleasant surprises in the treatment plants or in the environment," says Hans Sanderson.

How Do Surfactants Work?

Surfactants have a special ability to dissolve fat while at the same time being water soluble. This is because they consist of components that have a hydrophilic head and a hydrophobic tail. The hydrophobic tails repel water but are fond of fat. This means that the tails are the ones that dissolve the fat when we wash ourselves with soap. The surfactant's hydrophilic heads ensure that the fat is carried away in the rinse water.

Monday, September 29, 2014

Source: University of Alberta

Summary: A civil engineering research team has developed a new way to clean oil sands process affected water and reclaim tailings ponds in Alberta's oil sands industry. Using sunlight as a renewable energy source instead of UV lamps, and adding chlorine to the tailings, oil sands process affected water is decontaminated and detoxified -- immediately.

Civil engineering graduate student Zengquan Shu simulates the solar UV/chlorine treatment process. Laboratory-scale tests found the solar UV/chlorine treatment process removed 75 to 84 per cent of the toxins found in tailings ponds.Credit: Image courtesy of University of Alberta

Cleaning up oil sands tailings has just gotten a lot greener thanks to a novel technique developed by University of Alberta civil engineering professors that uses solar energy to accelerate tailings pond reclamation efforts by industry.

Instead of using UV lamps as a light source to treat oil sands process affected water (OSPW) retained in tailings ponds, professors Mohamed Gamal El-Din and James Bolton have found that using sunlight as a renewable energy source treats the wastewater just as efficiently but at a much lower cost.

"We know it works, so now the challenge is to transfer it into the field," says Gamal El-Din, who also worked on the project with graduate students Zengquan Shu, Chao Li, postdoctoral fellow Arvinder Singh and biological sciences professor Miodrag Belosevic.

"This alternative process not only addresses the need for managing these tailings ponds, but it may further be applied to treat municipal wastewater as well. Being a solar-driven process, the cost would be minimal compared to what's being used in the field now."

Oil sands tailings ponds contain a mixture of suspended solids, salts, and other dissolved compounds such as benzene, acids, and hydrocarbons. Typically, these tailings ponds take 20-plus years before they can be reclaimed. Applied to the tailings ponds, the solar UV/chlorine treatment process would make OSPW decontamination and detoxification immediate.

Direct sunlight alone partially removes these organic contaminants. But when the sunlight reacts with the chlorine (or bleach) added to the wastewater, it produces hydroxyl radicals -- powerful oxidizing agents -- that remove the remaining toxins more efficiently. The chlorine leaves no residual, as the sunlight causes it to decompose.

In laboratory-scale tests the solar UV/chlorine treatment process was found to remove 75 to 84 per cent of these toxins.

"With this solar process, right now, the wastewater on the top of the tailings ponds is being treated. But because we have nothing in place at the moment to circulate the water, the process isn't being applied to the rest of the pond," says Gamal El-Din.

"Because we are limited by the sunlight's penetration of the water, we now must come up with an innovative design for a mixing system like rafts floating on the ponds that would circulate the water. Installing this would still be much more cost effective for companies. It is expected that the UV/chlorine process will treat the OSPW to the point that the effluent can be fed to a municipal wastewater treatment plant, which will then complete the purification process sufficiently so the water can be discharged safely into rivers.

"This process has been gaining a lot of attention from the oil sands industry. We're now seeking funds for a pilot-plant demonstration and are looking at commercializing the technology."

Their findings were published in the Environmental Science & Technology journal.

Tuesday, September 23, 2014

Source: Washington State University

Summary: A unique method has been developed to use microbes buried in pond sediment to power waste cleanup in rural areas. The first microbe-powered, self-sustaining wastewater treatment system could lead to an inexpensive and quick way to clean up waste from large farming operations and rural sewage treatment plants while reducing pollution.

Washington State University researchers have developed a unique method to use microbes buried in pond sediment to power waste cleanup in rural areas.

The first microbe-powered, self-sustaining wastewater treatment system could lead to an inexpensive and quick way to clean up waste from large farming operations and rural sewage treatment plants while reducing pollution.

Professor Haluk Beyenal and graduate student Timothy Ewing in the Voiland College of Engineering and Architecture discuss the system in the online edition of Journal of Power Sources and have filed for a patent.

Cutting Greenhouse Gases

Traditionally, waste from dairy farms in rural areas is placed in a series of ponds, where bacteria consume it -- generating carbon dioxide and methane pollution -- until the waste is safely treated. In urban areas with larger infrastructure, electrically powered aerators mix the water in the ponds, allowing the waste to be cleaned faster and with fewer harmful emissions.

As much as 5 percent of the energy used in the U.S. goes to wastewater treatment, said Beyenal. Most rural communities and farmers, meanwhile, can't afford the cleaner, electrically powered aerators.

Microbial fuel cells use biological reactions from microbes in water to create electricity. The WSU researchers developed a microbial fuel cell that does the work of the aerator, using only the power of microbes in the sewage lagoons to generate electricity.

The researchers created favorable conditions for growth of microbes that are able to naturally generate electrons as part of their metabolic processes. The microbes were able to successfully power aerators in the lab for more than a year, and the researchers are hoping to test a full-scale pilot for eventual commercialization.

Hope for Dairies

The researchers believe that the microbial fuel cell technology is on the cusp of providing useful power solutions for communities.

"Everyone is looking to improve dairies to keep them in business and to keep these family businesses going," said Ewing.

The technology could also be used in underdeveloped countries to more effectively clean polluted water: "This is the first step towards sustainable wastewater treatment," Ewing said.

Beyenal has been conducting research for several years on microbial fuel cells for low-power electronic devices, particularly for use in remote areas or underwater where using batteries is challenging. Last year, he and his graduate students used the microbes to power lights for a holiday tree.

Monday, September 15, 2014

Source: University of Manchester

Summary: Tiny single-cell organisms discovered living underground could help with the problem of nuclear waste disposal, say researchers. Although bacteria with waste-eating properties have been discovered in relatively pristine soils before, this is the first time that microbes that can survive in the very harsh conditions expected in radioactive waste disposal sites have been found.

The bacterium (inset) was found in soil samples in the Peak District.
Credit: Image courtesy of University of Manchester

Tiny single-cell organisms discovered living underground could help with the problem of nuclear waste disposal, say researchers involved in a study at The University of Manchester.

Although bacteria with waste-eating properties have been discovered in relatively pristine soils before, this is the first time that microbes that can survive in the very harsh conditions expected in radioactive waste disposal sites have been found. The findings are published in the ISME (Multidisciplinary Journal of Microbial Ecology) journal.

The disposal of our nuclear waste is very challenging, with very large volumes destined for burial deep underground. The largest volume of radioactive waste, termed 'intermediate level' and comprising 364,000 m3 (enough to fill four Albert Halls), will be encased in concrete prior to disposal in underground vaults. When groundwaters eventually reach these waste materials, they will react with the cement and become highly alkaline. This change drives a series of chemical reactions, triggering the breakdown of the various cellulose-based materials present in these complex wastes.

One such product linked to these activities, isosaccharinic acid (ISA), causes much concern as it can react with a wide range of radionuclides -- unstable and toxic elements that are formed during the production of nuclear power and make up the radioactive component of nuclear waste. If the ISA binds to radionuclides, such as uranium, then the radionuclides will become far more soluble and more likely to flow out of the underground vaults to surface environments, where they could enter drinking water or the food chain. However, the researchers' new findings indicate that microorganisms may prevent this becoming a problem.

Working on soil samples from a highly alkaline industrial site in the Peak District, which is not radioactive but does suffer from severe contamination with highly alkaline lime kiln wastes, they discovered specialist "extremophile" bacteria that thrive under the alkaline conditions expected in cement-based radioactive waste. The organisms are not only superbly adapted to live in the highly alkaline lime wastes, but they can use the ISA as a source of food and energy under conditions that mimic those expected in and around intermediate level radwaste disposal sites. For example, when there is no oxygen (a likely scenario in underground disposal vaults) to help these bacteria "breathe" and break down the ISA, these simple single-cell microorganisms are able to switch their metabolism to breathe using other chemicals in the water, such as nitrate or iron.

The fascinating biological processes that they use to support life under such extreme conditions are being studied by the Manchester group, as well as the stabilizing effects of these humble bacteria on radioactive waste. The ultimate aim of this work is to improve our understanding of the safe disposal of radioactive waste underground by studying the unusual diet of these hazardous waste eating microbes. One of the researchers, Professor Jonathan Lloyd, from the University's School of Earth, Atmospheric and Environmental Sciences, said: "We are very interested in these Peak District microorganisms. Given that they must have evolved to thrive at the highly alkaline lime-kiln site in only a few decades, it is highly likely that similar bacteria will behave in the same way and adapt to living off ISA in and around buried cement-based nuclear waste quite quickly.

"Nuclear waste will remain buried deep underground for many thousands of years so there is plenty of time for the bacteria to become adapted. Our next step will be to see what impact they have on radioactive materials. We expect them to help keep radioactive materials fixed underground through their unusual dietary habits, and their ability to naturally degrade ISA."

Thursday, September 11, 2014

Source: Institute for Integrated Cell-Material Sciences, Kyoto University

Summary: An advanced membrane has been developed for the purpose of cleaning up greenhouse gases. The membranes are cheaper, long-lasting, selective and highly permeable compared to commercially available ones.

PIM-1 is a highly permeable membrane compared with commercially available ones. The orange balloon on the left illustrates this point as a higher volume of nitrogen gas is able to pass through PIM-1 into the balloon compared with the membrane on the right, connected to the pink balloon.

Greenhouse gases, originating from industrial processes and the burning of fossil fuels, blanket the Earth and are the culprits behind current global warming woes. The most abundant among them is carbon dioxide, which made up 84% of the United States' greenhouse gases in 2012, and can linger in Earth's atmosphere for up to thousands of years.

Countries all over the world are looking to reduce their carbon dioxide footprint. However, carbon dioxide is essentially a waste product with little immediate commercial value and large treatment costs. Therefore, new low-cost technologies are sorely needed to incentivize greenhouse gas capture by industry.

Easan Sivaniah -- an associate professor at Kyoto University's Institute for Integrated Cell-Material Sciences (iCeMS) -- led an international team of researchers from iCeMS and the University of Cambridge to create an advanced membrane capable of rapidly separating gases.

The membrane they worked on, referred to as PIM-1, is "typically embedded with a network of channels and cavities less than 2 nm in diameter that can trap gases of interest once they enter," said Qilei Song, who was involved in the study. "The only problem is that their intrinsic properties make them rather flimsy and their starting selectivity is weak."

To overcome PIM-1's weaknesses, Sivaniah's team heated PIM-1 at temperatures ranging from 120 to 450 °C in the presence of oxygen, a process referred to as thermal oxidation. "Oxygen, under high temperatures, chemically reacts with PIM-1 to reinforce the strength of channels while controlling the size of so-called gate openings leading into the cavities, which allows for higher selectivity," said Song.

The resulting improved PIM-1 was found to be twice as selective for carbon dioxide while allowing air to pass through it 100 times faster compared with commercially available polymers. PIM-1 can also be used for other applications such as capturing carbon dioxide from the burning of fossil fuels, enriching the oxygen content in air for efficient combustion engines, hydrogen gas production, and processes to generate plastic.

"Basically, we developed a method for making a polymer that can truly contribute to a sustainable environment," said Sivaniah. "And because it is affordable and long-lasting, our polymer could potentially cut the cost of capturing carbon dioxide by a factor of as much as 1,000."

Thursday, September 4, 2014

Air pollution regulations over the last decade in Taiyuan, China, have substantially improved the health of people living there, accounting for a greater than 50% reduction in costs associated with loss of life and disability between 2001 and 2010, according to researchers at the Columbia Center for Children's Environmental Health (CCCEH) at the Mailman School of Public Health, the Shanxi Medical University, the Center of Diseases Control and Prevention of Taiyuan Municipality, and Shanghai Fudan University School of Public Health.

The study is the first to document the health and economic benefits of policies to reduce the burden of air pollution in a highly polluted area of China, and provides a model to measure how policies to improve air quality can protect human health. Results appear online in the journal Environment International.

Taiyuan, the capital of Shanxi Province, is a major center in China for energy production and metallurgical industries. To combat air pollution, the Shanxi Provincial Government implemented many new environmental policies and regulations. Between 2000 and 2012, these included mandating the closure of many polluting sources, auditing companies that produced large amounts of toxic and hazardous materials, setting pollutant emissions standards, and promoting energy efficiency and pollution reduction. As a result, concentrations of particulate matter (PM10) declined by more than half, from 196 µg/m3 in 2001 to 89 µg/m3 in 2010, as measured at eight sites throughout the city.

Reductions in particulate matter between 2001 and 2010 were associated with 2,810 fewer premature deaths, 31,810 fewer hospital admissions, 141,457 fewer outpatient visits, 969 fewer ER visits, and 951 fewer cases of bronchitis. The team estimated that there were more than 30,000 fewer DALYs -- disability-adjusted life years, a standard measure of the loss of healthy years -- attributed to air pollution in Taiyuan in 2010 compared to 2001. The cost of premature death due to air pollution decreased by 3.83 billion Yuan, or approximately $621 million.
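
The headline "more than half" claim can be checked directly from the two reported PM10 concentrations:

```python
# Verify the reported PM10 decline in Taiyuan (196 -> 89 ug/m3).
pm10_2001, pm10_2010 = 196.0, 89.0
reduction = (pm10_2001 - pm10_2010) / pm10_2001
print(f"PM10 reduction, 2001-2010: {reduction:.1%}")
```

The fraction works out to just under 55 percent, consistent with the article's "declined by more than half."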

Particulate matter is released by coal-burning plants and other sources. These small particles can lodge themselves deeply in human lungs, and are associated with heart and lung conditions and premature death.

"Our results suggest that the air quality improvement from 2001 to 2010 resulted in substantial health benefits. In fact, the health and financial impacts of air pollution could potentially be greater than those reported due to our selection of only a few health outcomes that could be quantitatively estimated and translated into monetary values," says lead investigator Deliang Tang, DrPH, associate professor of Environmental Health Sciences at the Mailman School of Public Health.

The study builds on similar research from CCCEH in China, showing improvements in air quality were linked with improved childhood developmental scores.

"Over the last ten years, our research in two Chinese cities has demonstrated that strong government policies to reduce air pollution can result in substantial health benefits for children and adults," says Frederica Perera, PhD, director of the Columbia Center for Children's Environmental Health at the Mailman School of Public Health. "These findings make the argument for stronger and broader regulations in Chinese cities where air pollution remains a serious health problem."

According to the Chinese Ministry of Environmental Protection, only three of 74 cities the government monitors meet minimum air standards. In March, Premier Li Keqiang announced that the country would "declare war against pollution" by reducing particulate matter and closing outdated industrial plants.

Friday, August 22, 2014

A whiff of chlorine is virtually synonymous with taking a dip in a swimming pool. While it helps to kill off bacteria, it also serves as a subtle reminder that you are wading around in chemically treated water (if tasting the odd mouthful just isn't enough). Switzerland's Naturbad Riehen swimming pool is entirely chemical-free, relying instead on a biological filter system to provide clean and natural water for thousands of patrons, no itchy red eyes in sight.

A town of around 20,000 people, Riehen sits just outside of Basel on the Swiss-German border. This section of the border wraps around Riehen in such a way that in 2006, the town's old swimming pool had to be demolished to make way for a tunnel connecting two German cities on either side. With the controversial roadway completed in 2013, the people of Riehen were quick to reclaim their territory.

"The citizens wanted their pool back and believed that a natural swimming pool would suit their interest in bathing, swimming and playing, just as well as a traditional pool," Christian Lupp, Recreation and Sports representative from the Municipality of Riehen tells Gizmag. "Furthermore, there was the understanding that the natural water is better for the skin and eyes and feels smoother and softer".

The first man-made natural swimming pools date back to the early 1980s in Austria, though these were largely for private use. The precise mechanics vary from pool to pool, but typically the swimming water is contained inside a membrane, with a separate regeneration zone to clean it. Aquatic plants kill off germs while absorbing nutrients from the water for growth. Often the water is pumped across the surface of rocks or gravel to which bacteria cling, functioning as a natural filter.

Since the 1980s, the concept has been commercialized and spread to different parts of the world. Natural pools for public use have popped up in Germany and the UK, and one is currently under construction at Webber Park in Minneapolis, set to be the first in the United States. Lupp says a point of difference for the Naturbad Riehen is that it's au naturel from the ground up, allowing for better integration of the pool's natural technology with its wooden infrastructure.

"Many other projects are conversions of traditional pools," he explains. "Our pool is absolutely built from zero, allowing the extraordinary possibility of a holistic design and a combination between the natural technology and its according architecture."


Swiss architects Herzog & de Meuron designed Riehen's new swimming pool to accommodate the town's families and blend in with the greenery that surrounds. Features include a wading pool for toddlers, a separate pool with a sloping gravel beach, a water-slide, a 25-meter (82 ft) lap pool and a diving board. An on-site cafe offers refreshments and snacks, with wooden decking and grass providing a place for some time out.

Officially opening for business in mid-June, the Naturbad Riehen is equipped to deal with 2,000 daily visitors. We're guessing the residents of Riehen are pretty happy with their new pool, with the possible exception of those in the business of selling swimming goggles.

Wednesday, August 20, 2014

A new building has been opened for Tel Aviv University's Porter School of Environmental Studies.

You'd hope that a school of environmental studies would practice what it preaches. Well, Tel Aviv University's Porter School of Environmental Studies does so emphatically. Its newly inaugurated building is, it says, the first LEED Platinum-certified building in Israel and the greenest in the Middle East.

Leadership in Energy & Environmental Design (LEED) certification has become a widely recognized mark of environmental good practice in the design, construction, maintenance and operation of buildings. Amongst recently featured LEED Platinum-certified buildings, BioCasa 82 in Italy was claimed to be Europe's first LEED Platinum home; the Munich-based NuOffice, the world's most sustainable office building; and Dubai's Change Initiative, the world's most sustainable building overall.

The PSES building was designed in collaboration by Geotectura Studio and Axelrod Grobman Architects with the aim of being a "living laboratory." As well as providing spaces for education and learning, it was decided that the building should be a demonstrative educational platform in itself, with users and visitors able to examine the environmental technologies installed therein.

The Capsule in the PSES building houses a workshop and meeting space
(Photo: Shai Epstein).

Amongst the public and education spaces in the building are an auditorium, a spacious atrium that can be used for meetings and exhibitions, classrooms, lecture halls, research offices, meeting rooms and offices. The temperature in the PSES building is regulated using a solar energy-powered air conditioning system, along with a structure design optimized for local conditions. Grey water, meanwhile, is recycled and reused elsewhere in the building.

In addition to a green roof, the building features an "EcoWall" which is described as an iconic element of the building's aesthetic, but is also a functional part of its environmental efforts. The EcoWall provides protection from the sun in the building's atrium, but also capitalizes on its south-facing orientation by hosting the array of solar panels used to power the building's air conditioning. Terraces along the EcoWall can also be used for experimental research.

The PSES building has a green roof (Photo: Shai Epstein).

The PSES building also features a striking Capsule element as part of its design. The Capsule is a 3D elliptical structure that's suspended in the building's atrium and that pokes out of the EcoWall. Housed in the Capsule is a workshop and meeting room with "state of the art multimedia technology." The external surface of the Capsule is covered in connected LEDs that are used to display environmental information, such as energy statistics of the PSES Building and pollution levels in Tel Aviv.

The PSES building was inaugurated in May and held its first graduation ceremony in June.

Monday, August 4, 2014

Date: July 30, 2014
Source: Oregon State University
Summary: The world's oceans are vast and deep, yet rapidly advancing technology and the quest for extracting resources from previously unreachable depths is beginning to put the deep seas on the cusp of peril, an international team of scientists has warned.

A new OSU study looks at how exploiting the ocean's vast resources has put
it in peril. Credit: Image courtesy of Oregon State University

The world's oceans are vast and deep, yet rapidly advancing technology and the quest for extracting resources from previously unreachable depths is beginning to put the deep seas on the cusp of peril, an international team of scientists warned this week.

In an analysis in Biogeosciences, which is published by the European Geosciences Union, the researchers outline "services" or benefits provided by the deep ocean to society. Yet using these services, now and in the future, is likely to make a significant impact on that habitat and what it ultimately does for society, they point out in their analysis.

"The deep sea is the largest habitat on Earth, it is incredibly important to humans and it is facing a variety of stressors from increased human exploitation to impacts from climate change," said Andrew Thurber, an Oregon State University marine scientist and lead author on the study. "As we embark upon greater exploitation of this vast environment and start thinking about conserving its resources, it is imperative to know what this habitat already does for us."

"Our analysis is an effort to begin to summarize what the deep sea provides to humans because we take it for granted or simply do not know that the deep sea does anything to shape our daily lives," he added. "The truth is that the deep sea affects us, whether we live on the coast or far from the ocean -- and its impact on the globe is pervasive."

The deep sea is important to many critical processes that affect Earth's climate, including acting as a "sink" for greenhouse gases -- helping offset the growing amounts of carbon dioxide emitted into the atmosphere. It also regenerates nutrients through upwelling that fuel the marine food web in productive coastal systems such as the Pacific Northwest of the United States, Chile and others. Increasingly, fishing and mining industries are going deeper and deeper into the oceans to extract natural resources.

"One concern is that many of these areas are in international waters and outside of any national jurisdiction," noted Thurber, an assistant professor (senior research) in Oregon State's College of Earth, Ocean, and Atmospheric Sciences. "Yet the impacts are global, so we need a global effort to begin protecting and managing these key, albeit vast, habitats."

Fishing is an obvious concern, the scientists say. Advances in technology have enabled commercial fisheries to harvest fish at increasing depths -- an average of 62.5 meters deeper every decade, according to fisheries scientists. This raises a variety of potential issues.

"The ability to fish deeper is shifting some fisheries to deeper stocks, and opening up harvests of new species," Thurber said. "In some local cases, individual fisheries are managed aggressively, but due to how slow the majority of the fish grow in the deep, some fish populations are still in decline -- even with the best management practices."

The orange roughy off New Zealand, for instance, is a model of effective, conservation-based management, yet its populations continue to decline, though at a slower rate than they would have without careful management, Thurber noted.

"We also have to be concerned about pollution that makes its way from our continental shelves into the deep sea," he added. "Before it was 'out of sight, out of mind.' However, some of the pollution can either make it into the fish that we harvest, or harm the fishers that collect the fish for us. It is one of the reasons need to identify how uses of the deep sea in the short term can have long-term consequences. Few things happen fast down there."

Mining is a major threat to the deep sea, the researchers point out in their analysis. In particular, the quest for rare earth and metal resources, which began decades ago, has skyrocketed in recent years because of their increased use in electronics, and because of dwindling or limited distribution of supplies on land. Mining the deep ocean for manganese nodules, for example -- which are rich in nickel -- requires machines that may directly impact large swaths of the seafloor and send up a sediment plume that could potentially affect an even larger area, the scientists note.

These mining resources are not limited to muddy habitats, Thurber pointed out. Massive sulfides present at hydrothermal vents are another resource targeted by mining interests.

"The deep sea has been an active area for oil and gas harvesting for many years," he said, "yet large reservoirs of methane and other potential energy sources remain unexploited. In addition to new energy sources, the potential for novel pharmaceuticals is also vast.

"There are additional threats to these unique habitats, including ocean acidification, warming temperatures and possible changes to ocean circulation through climate change."

The next step, the researchers say, is to attach an economic value both to the services provided by the deep sea and to the activities that may threaten those services.

"What became clear as we put together this synopsis is that there is vast potential for future resources but we already benefit greatly through this environment," Thurber said. ""What this means is that while the choices to harvest or mine will be decided over the coming decades, it is important to note that the stakeholders of this environment represent the entire world's population."

"The Bible, the Koran, the Torah, and early Greek texts all reference the deep sea," he added. "Maybe it's time for all of us to take a closer look at what it has to offer and decide if and how we protect it."

Wednesday, July 9, 2014

The dramatic increase in earthquakes in central Oklahoma since 2009 is likely attributable to subsurface wastewater injection at just a handful of disposal wells, finds a new study to be published in the journal Science on July 3, 2014.

The research team was led by Katie Keranen, professor of geophysics at Cornell University, who says Oklahoma earthquakes constitute nearly half of all central and eastern U.S. seismicity from 2008 to 2013, many occurring in areas of high-rate water disposal.

"Induced seismicity is one of the primary challenges for expanded shale gas and unconventional hydrocarbon development. Our results provide insight into the process by which the earthquakes are induced and suggest that adherence to standard best practices may substantially reduce the risk of inducing seismicity," said Keranen. "The best practices include avoiding wastewater disposal near major faults and the use of appropriate monitoring and mitigation strategies."

The study also concluded:

Four of the highest-volume disposal wells in Oklahoma (~0.05% of wells) are capable of triggering ~20% of recent central U.S. earthquakes in a swarm covering nearly 2,000 square kilometers, as shown by analysis of modeled pore pressure increase at relocated earthquake hypocenters.

Earthquakes are induced at distances over 30 km from the disposal wells. These distances are far beyond existing criteria of 5 km from the well for diagnosis of induced earthquakes.

The area of increased pressure related to these wells continually expands, increasing the probability of encountering a larger fault and thus increasing the risk of triggering a higher-magnitude earthquake.
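The continually expanding pressure zone described above behaves roughly like a diffusion process, so its reach grows with the square root of time. A minimal sketch of that scaling, using an illustrative hydraulic diffusivity (an assumed round number, not a value from the Keranen study):

```python
import math

def pressure_front_radius_km(years, diffusivity_m2_s=1.0):
    """Rough radius (km) of a pore-pressure front after `years` of
    injection, using the diffusion length scale r ~ sqrt(4 * D * t).
    `diffusivity_m2_s` is an illustrative assumption, not a
    measured value from the study."""
    t_seconds = years * 365.25 * 24 * 3600  # years -> seconds
    return math.sqrt(4 * diffusivity_m2_s * t_seconds) / 1000.0

# After about five years of injection the front reaches ~25 km,
# the same order as the >30 km induced-earthquake distances reported.
print(round(pressure_front_radius_km(5), 1))
```

The square-root growth is why the affected area keeps spreading long after injection begins, and why a fixed 5 km diagnostic radius can miss induced events.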

"Earthquake and subsurface pressure monitoring should be routinely conducted in regions of wastewater disposal and all data from those should be publicly accessible. This should also include detailed monitoring and reporting of pumping volumes and pressures," said Keranen. 'In many states the data are more difficult to obtain than for Oklahoma; databases should be standardized nationally. Independent quality assurance checks would increase confidence. "

Tuesday, July 1, 2014

Thanks to its extensive composting and recycling facilities, the city of Edmonton, Canada is already diverting approximately 60 percent of its municipal waste from the landfill. That figure is expected to rise to 90 percent, however, once the city's new Waste-to-Biofuels and Chemicals Facility starts converting garbage (that can't be composted or recycled) into methanol and ethanol. It's the world's first such plant to operate on an industrial scale, and we recently got a guided tour of the place.

The process begins with garbage trucks dumping their loads on the tipping floor at the Integrated Processing and Transfer Facility. The trash is manually and mechanically sorted, with things like appliances being set aside for electronic parts recycling and e-waste disposal, while organic matter heads off to the Composting Facility.

Recyclable materials are already pre-separated by citizens as part of the city's blue bag program. They avoid the garbage stream entirely, going straight to the Materials Recovery Facility for recycling.

Soon, though, high-carbon materials such as wood, fabric and discarded plastic will be getting shredded into Refuse Derived Fuel (RDF), also known as "garbage fluff." It will be transferred to the Waste-to-Biofuels and Chemicals Facility, which is owned and operated by Enerkem Alberta Biofuels.

There, it will be heated in a low-oxygen atmosphere. This will cause its chemical bonds to break (without the material actually burning), releasing carbon and hydrogen to form what's known as syngas. This will in turn be cleaned up and converted into chemical products and biofuels – such as methanol and ethanol.

The Waste-to-Biofuels and Chemicals Facility is scheduled to go online in the next several weeks. It is ultimately expected to convert 100,000 tonnes (110,231 tons) of municipal solid waste into 38 million liters (10 million gallons) of biofuels and chemicals annually.
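The throughput figures above imply a simple yield per tonne of garbage. A quick check of the arithmetic, using only the numbers from the article:

```python
# Annual figures for the Enerkem Alberta Biofuels facility, as reported.
waste_tonnes = 100_000        # municipal solid waste processed per year
biofuel_liters = 38_000_000   # methanol/ethanol output per year

liters_per_tonne = biofuel_liters / waste_tonnes
print(liters_per_tonne)  # 380.0 liters of biofuel per tonne of waste
```

So each tonne of otherwise-landfilled waste is expected to yield roughly 380 liters of biofuels and chemicals.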

You can see our video tour of the facility below, conducted by the Edmonton Waste Management Centre's Education Programs Co-ordinator, Garry Spotowski. There are also photos of the process in the gallery.

Wednesday, June 18, 2014

Coming soon to a kitchen near you - magnets in your refrigerator. And we're not talking about slapping your kid's artwork inside the fridge next to the milk and butter.

It's the next generation of residential food and drink cooling, and it's powered by magnets. Gone will be the almost century-old unit in your kitchen that uses a heat-transfer process based on liquid refrigerants called vapor compression refrigeration. Condensers and refrigerants will be replaced with magnets and special alloys that get hot and cold based on their proximity to magnetic fields. The technology could also be used for air-conditioning.

Magnetic refrigeration, proponents say, is a rapidly approaching technology that will amount to a revolution in domestic energy use.

"It's the equivalent to a gas-powered car moving to electric - that's the kind of leap we're making in refrigeration," said Ed Vineyard, a senior researcher at the U.S. Department of Energy's Oak Ridge National Laboratory. Vineyard's Building Technologies Program has teamed up with GE to bring magnetic refrigeration to the public in around five years.

The idea behind refrigerators and air conditioners is the same. In the broadest sense, they are heat pumps - devices that take heat energy from inside your refrigerator box or room and move it outside. Removing this energy makes the temperature go down.

In most contemporary home and commercial refrigeration systems, mechanical work compresses and expands a liquid refrigerant. The pressure drop associated with expansion lowers the temperature of the refrigerant, which then cools air blown over it by a fan into the refrigerator box or the cooled room. In magnetic refrigeration systems, the compressor is replaced with magnetic fields that interact with solid refrigerants and the water-based cooling fluid. Changing the strength of magnetic fields alters how much heat is pulled away from the refrigerator box.
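Whatever the mechanism - compressor or magnetic field - a heat pump's efficiency is bounded by the temperatures it moves heat between. A minimal sketch of the ideal (Carnot) coefficient of performance for a refrigerator, with illustrative temperatures chosen for the example (not figures from ORNL or GE):

```python
def carnot_cop_cooling(t_cold_c, t_hot_c):
    """Ideal coefficient of performance (heat removed per unit of work)
    for a refrigerator moving heat from t_cold_c to t_hot_c, in Celsius.
    Real systems, vapor-compression or magnetocaloric, achieve only a
    fraction of this thermodynamic limit."""
    t_cold_k = t_cold_c + 273.15  # convert to absolute temperature
    t_hot_k = t_hot_c + 273.15
    return t_cold_k / (t_hot_k - t_cold_k)

# Fridge interior at 4 degrees C rejecting heat into a 22 degree C kitchen:
print(round(carnot_cop_cooling(4.0, 22.0), 1))
```

The gap between this ideal limit and what real compressors deliver is the headroom that magnetocaloric designs aim to claw back.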

Along with this refrigerator revolution comes a dramatic drop in the amount of energy you need to cool your cucumbers and cantaloupes. ORNL says magnetic refrigeration "is a promising alternative to the vapor compression systems used in today's appliances" that could theoretically drop energy consumption by 25 percent compared to current technology. Those liquid refrigerant chemicals that can be damaging to the environment and hard to recycle at the end of a refrigerator's life are also being replaced by cheaper water-based fluid.

Oak Ridge National Laboratory's Ayyoub Momen works on the team's "breadboard" prototype refrigerator-freezer: a flexible platform used to evaluate material compatibility and to analyze components including the magnet, generators, motor, pump, heat exchangers, plumbing and leakless rotating valve. Courtesy ORNL.

Developers expect the new refrigerators to cost a bit more than vapor compression models, but buyers should see savings through spending less on electricity over the long term. If the technology is adopted broadly, it could mean major electricity savings on the national scale. Besides savings from more efficient refrigerators, magnetic cooling would lower electricity use in heating, ventilation and air-conditioning equipment, which accounts for around 60 percent of the average household's energy use.
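The projected 25 percent efficiency gain can be put in rough household terms. A sketch of the annual saving per appliance, using an assumed typical refrigerator consumption (the kWh figure is illustrative, not from the article):

```python
fridge_kwh_per_year = 600   # assumed annual use of a typical fridge (illustrative)
savings_fraction = 0.25     # ORNL's theoretical reduction vs. vapor compression

kwh_saved = fridge_kwh_per_year * savings_fraction
print(kwh_saved)  # kWh saved per refrigerator per year under these assumptions
```

Multiplied across millions of always-on refrigerators, and extended to HVAC equipment, this is the national-scale saving the paragraph above describes.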


About Me

The American Academy of Environmental Engineering and Scientists is a not-for-profit 501(c)(6) organization serving the Environmental Engineering and Environmental Science professions by providing Board Certification to those who qualify through experience and testing. The Academy also provides training through workshops and seminars, participates in accrediting universities, publishes a periodical and other reference material, interacts with students and young professionals, sponsors a university lecture series, and rewards outstanding achievements through its international awards program.