Blog Archive

28 Jun 2015

A swarm of small satellites could give critical infrastructure an Internet connection that never goes down.

WHY IT MATTERS

Being able to collect data in emergency situations where conventional networks are cut off could be widely useful.

A rendering showing compact communications satellites in relatively low orbit; they could help provide data connections to critical infrastructure.

Anthony Previte, CEO of the space company Terran Orbital, was set on the path to his company’s latest project by a nurse he encountered amid the chaos of 9/11, one block north of Ground Zero.

She was running frantically down the street because the nearby hospital had run out of fuel oil. With most cell-phone batteries depleted and landlines knocked out, the only way to call for more was on foot. Previte got to thinking that important equipment like generators should have ways to communicate anytime, even after a disaster. Today he’s working to make that possible by launching multiple constellations of “nano satellites” designed to provide small, battery-powered sensors with a cheap data connection that never goes down.

Previte says his system will have many civilian and commercial applications and save lives in the wake of natural disasters or terrorist attacks. “If every generator has a sensor on it that reports back to a satellite, then whoever is in charge—FEMA, the government, the military—can move fuel around, with intelligent decisions,” he says.

More and more commercial and industrial equipment is being connected to data networks so it can be managed more efficiently, forming what’s known as the Internet of things. Terran’s always-on connections might make that approach more dependable.

Satellite Internet connections available today are mostly targeted at people, not machines, and they’re expensive. They use large satellites parked in geostationary orbits roughly 36,000 kilometers over the equator, meaning that significant energy is required to reach them with a signal from the ground.

Terran is launching small satellites that orbit at only 600 kilometers. That lower altitude makes it practical for low-powered, even disposable, sensors to use a satellite data link, says Previte.
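The power advantage of that lower orbit follows from free-space path loss, which grows with the square of the distance. A rough sketch of the comparison (the 1.6 GHz L-band frequency here is an assumption for illustration, not something Terran has disclosed, and it cancels out of the ratio anyway):

```python
import math

def fspl_db(distance_km: float, freq_ghz: float) -> float:
    """Free-space path loss in dB, for distance in km and frequency in GHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

# Assumed 1.6 GHz carrier; any frequency gives the same GEO/LEO difference.
geo_loss = fspl_db(36_000, 1.6)  # geostationary orbit
leo_loss = fspl_db(600, 1.6)     # Terran's 600 km orbit

advantage_db = geo_loss - leo_loss       # ~35.6 dB
power_ratio = 10 ** (advantage_db / 10)  # ~3,600x less transmit power needed
```

Because path loss scales with distance squared, the ratio is simply (36,000 / 600)² = 3,600, which is why a small battery-powered sensor that could never close a link to a geostationary satellite can plausibly reach one at 600 kilometers.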

The connection is designed to be more reliable than it is fast. The U.S. Army plans to use Terran sensors to track vehicles and troops, with transmissions at tens of kilobytes per second. But Terran expects lower-powered sensors to send up data at about a tenth that speed.

In addition to aiding in disaster relief and tracking shipping containers, planes, and boats, Previte envisions the sensors being used for environmental monitoring. For example, they could be dropped out of a helicopter or drone into a growing oil spill, or onto an active volcano to track lava flows. Terran anticipates significant interest from farmers, who could place sensors in fields or even around the necks of cows.

Previte estimates that Terran can build the disposable, low-powered sensors in bulk for roughly $80 each. Customers will pay a subscription on top of that for their connections. Terran will also sell complete satellites to customers who want the exclusive use of one. “It used to be $400 million for a satellite,” says Previte. Terran will be able to offer them for figures in the low millions of dollars, he says.

In addition to its deal with the U.S. Army, the company says it already has commercial clients, but none that are prepared to disclose their relationship publicly. Deploying a constellation of nano satellites requires 18 to 36 months of lead time, and these companies want to surprise the competition, says Previte.

Terran, which serves as a consultant to integrate satellite payloads and helps with mission control operations, will say that it supported the launch of nine satellites in 2014 and 10 so far this year. The company has also built six small satellites from scratch in 2015.

Jordi Puig-Suari, Terran’s chief science officer, is one of two inventors of the CubeSat, a generic blueprint for miniaturized satellites that are typically about one liter in volume. Different payloads can be installed in a CubeSat using off-the-shelf electronic components (satellites traditionally have custom-built electronics). Further cost savings come from the way the small satellites can be fitted into unused space inside rockets launching larger satellites or space vehicles.

Kerri Cahoy, an assistant professor of aeronautics and astronautics at MIT, says Terran’s model makes it distinct from most satellite companies.

The development of smaller, cheaper satellite technologies in recent years has led many companies to explore new ways of using low Earth orbit (LEO) satellites.

Many focus on remote imaging—for example, to gather regular photos or infrared imagery. But Cahoy says LEO satellites should make a good low-cost communications network. “It sure beats trying to figure out how to connect a large number of distributed ground sensors to a cable or wire-based ground network,” she says. And satellites can more easily cover large swaths of territory than cellular or Wi-Fi networks, which need many base stations.

25 Jun 2015

Branched titanium dioxide nanomaterials enhance electrical connectivity in solar cells for better performance.

Electrical contacts made of branched titanium dioxide nanowires developed by researchers from the King Abdullah University of Science and Technology (KAUST) could improve the efficiency of solar cells.

Titanium dioxide is a white pigment commonly used in paint and a versatile electronic material used in solar cells. It is a cheap and abundant compound and has electronic states similar to those of the light-absorbing compounds used in photovoltaics.

This match ensures that electrical charges are efficiently funnelled away from the active region of a solar cell into the titanium dioxide and from there towards the electrical contacts of the device. “It is the most successful electron transporting material in hybrid organic/inorganic photovoltaics,” explains research leader Aram Amassian.

Schematic of the low-cost fabrication of titanium dioxide nanomaterials by electrospinning.

To ensure good contact between the titanium dioxide and the solar cell material, the titanium dioxide needs a very large surface area for maximum capture of electrical charges. The surface area can be expanded through the use of nanostructured materials such as meshes made from nanowires. However, these meshes have not been able to transport electrical charges effectively across the nanowires.

Different architectures could help improve charge transport; branched structures, for instance, would have much stronger electrical connections. But established techniques for growing such branched structures are inefficient and produce nanostructures with many impurities and defects.

KAUST researchers have now established a two-stage process for an efficient fabrication of branched titanium dioxide materials. The first step is to deposit nanofibers using an electrospinning technique, where a narrow jet of a titanium dioxide solution is ejected from a needle using electrostatic charges (see image), resulting in a network of electron highways. “Electrospinning of metal oxide nanofibers has emerged as a potentially low cost, rapid and useful technique to grow one-dimensional nanostructures on a variety of substrates,” says Amassian.

In the second step, these structures are heated further, which results in the hydrothermal growth of branched nanostructures. The hyperbranched materials perform better than conventional nanofibers in solar cells, which indicates their potential viability in other devices, such as batteries, and in catalytic applications.

The team reduced the processing temperature substantially, down to 300 degrees Celsius, but the demand for a high processing temperature remains a practical barrier to the technology, says Amassian. “Future work to halve this temperature could help implement hyperbranched electron transport materials for solar cell fabrication on flexible and stretchable plastic or textile substrates.”

25 Jun 2015

Funding from KAUST is helping to bring innovative solar power technology to fruition with startup company QD Solar.

Enabling researchers to take their ideas from the earliest research stages right through to the development and commercialization of a final product is a foundational goal at King Abdullah University of Science and Technology (KAUST). One route is for KAUST to provide active support to startup companies that have strong connections with the university’s research activities.

Nicola Bettio, manager of the funding scheme, says seed funding and early-stage capital are made available to startup companies through the KAUST Innovation Fund. “We invest in companies launched by our faculty, researchers and students, alongside international startups that are willing to move their operations to KAUST and Saudi Arabia,” Bettio explains. “By encouraging the transition of science into innovation and new ventures we can help to establish a fertile ecosystem for early-stage technology-based companies in Saudi Arabia.”

Efficient and cost-effective solar power generation will bring significant economic benefits and drastically reduce the carbon footprint of Saudi Arabia and the rest of the world. For the past several years, KAUST has been supporting research into new technologies using ‘colloidal quantum dots’ – a potential alternative to the established design of solar cells.

Paint made from quantum dots is highly flexible, an ideal property for solar panel design.

Colloidal quantum dots are tiny semiconductor particles with the capacity to harvest energy from both the visible spectrum and the previously-untapped near-infrared portion of the sun’s light. Edward Sargent, a senior researcher at the University of Toronto, has been working on the development and application of colloidal quantum dots for the past ten years, and helped found QD Solar, a Canadian startup company aiming to commercialize the new technology.

“When KAUST was founded back in 2009, there was a very strong emphasis on advancing solar technology that led to the creation of KAUST’s Solar and Photovoltaics Engineering Research Center (SPERC),” explains Sargent. “The university employed top professors in the field who have been a key asset during the development of the new quantum dot technique I have been working on. I was fortunate enough to be awarded a KAUST Investigators grant around that time, enabling me to pursue my research in collaboration with these top scientists.”

In 2009, Sargent’s team at the University of Toronto received a grant from KAUST to advance their research into colloidal quantum dots specifically for solar power applications. Since then, the team’s advances have evolved in leaps and bounds. “Our latest design incorporates the dots into a solution, or ‘paint’, which can be applied to flexible surfaces very easily and cheaply,” Sargent explains. “We have also vastly improved the power conversion efficiency of the dots over the past six years, thanks in part to the funding from KAUST.”

Sargent and his team, in collaboration with KAUST researchers, are now investigating ways of combining the dots with existing silicon-based solar cells to create new hybrid structure solar panels. This will allow them to harness a far greater fraction of the sun’s energy than ever before, improving the panels’ efficiency and power output.

Marc Vermeersch, the SPERC managing director, points out that current solar cell technologies are reaching their physical limits. He explains that a key advantage of creating QD Solar’s hybrid system is that existing solar panel producers will be able to modify their manufacturing practices relatively easily to incorporate the new design, rather than starting from scratch. Manufacturers will be able to achieve significant economic and efficiency improvements using QD Solar’s quantum dot-based solar products, Bettio adds.

KAUST anticipates welcoming a QD Solar presence to Saudi Arabia in the near future, as Bettio explains: “QD Solar is working with the KAUST Innovation Fund to raise capital to establish a significant development facility in our Research & Technology Park. Ultimately, this may lead to the establishment of a Saudi player in the photovoltaic industry and the creation of a hybrid solar panel manufacturing facility in the Kingdom.”

A team of researchers has created a new implantable drug-delivery system using nanowires that can be wirelessly controlled.

The nanowires respond to an electromagnetic field generated by a separate device, which can be used to control the release of a preloaded drug. The system eliminates tubes and wires required by other implantable devices that can lead to infection and other complications, said team leader Richard Borgens, Purdue University’s Mari Hulman George Professor of Applied Neuroscience and director of Purdue’s Center for Paralysis Research.

“This tool allows us to apply drugs as needed directly to the site of injury, which could have broad medical applications,” Borgens said. “The technology is in the early stages of testing, but it is our hope that this could one day be used to deliver drugs directly to spinal cord injuries, ulcerations, deep bone injuries or tumors, and avoid the terrible side effects of systemic treatment with steroids or chemotherapy.”

The team tested the drug-delivery system in mice with compression injuries to their spinal cords and administered the corticosteroid dexamethasone. The study measured a molecular marker of inflammation and scar formation in the central nervous system and found that it was reduced after one week of treatment. A paper detailing the results will be published in an upcoming issue of the Journal of Controlled Release and is currently available online.

IMAGE: A field of polypyrrole nanowires captured by a scanning electron microscope.

Credit: (Purdue University image/courtesy of Richard Borgens)

The nanowires are made of polypyrrole, a conductive polymer material that responds to electromagnetic fields. Wen Gao, a postdoctoral researcher in the Center for Paralysis Research who worked on the project with Borgens, grew the nanowires vertically over a thin gold base, like tiny fibers making up a piece of shag carpet hundreds of times smaller than a human cell. The nanowires can be loaded with a drug and, when the correct electromagnetic field is applied, the nanowires release small amounts of the payload. This process can be started and stopped at will, like flipping a switch, by using the corresponding electromagnetic field stimulating device, Borgens said.

The researchers captured and transported a patch of the nanowire carpet on water droplets that were used to deliver it to the site of injury. The nanowire patches adhere to the site of injury through surface tension, Gao said.

The magnitude and wave form of the electromagnetic field must be tuned to obtain the optimum release of the drug, and the precise mechanisms that release the drug are not yet well understood, she said. The team is investigating the release process.

The electromagnetic field is likely affecting the interaction between the nanomaterial and the drug molecules, Borgens said.

“We think it is a combination of charge effects and the shape change of the polymer that allows it to store and release drugs,” he said. “It is a reversible process. Once the electromagnetic field is removed, the polymer snaps back to the initial architecture and retains the remaining drug molecules.”

For each different drug the team would need to find the corresponding optimal electromagnetic field for its release, Gao said.

This study builds on previous work by Borgens and Gao. Gao first had to figure out how to grow polypyrrole in a long vertical architecture, which allows it to hold larger amounts of a drug and extends the potential treatment period. The team then demonstrated it could be manipulated to release dexamethasone on demand. A paper detailing the work, titled “Action at a Distance: Functional Drug Delivery Using Electromagnetic-Field-Responsive Polypyrrole Nanowires,” was published in the journal Langmuir.

Other team members involved in the research include John Cirillo, who designed and constructed the electromagnetic field stimulating system; Youngnam Cho, a former faculty member at Purdue’s Center for Paralysis Research; and Jianming Li, a research assistant professor at the center.

For the most recent study, the team used mice that had been genetically modified such that the protein glial fibrillary acidic protein, or GFAP, is luminescent. GFAP is expressed in cells called astrocytes that gather in high numbers at central nervous system injuries. Astrocytes are a part of the inflammatory process and form scar tissue, Borgens said.

A 1-2 millimeter patch of the nanowires doped with dexamethasone was placed onto spinal cord lesions that had been surgically exposed, Borgens said. The lesions were then closed and an electromagnetic field was applied for two hours a day for one week. By the end of the week the treated mice had a weaker GFAP signal than the control groups, which included mice that were not treated and those that received a nanowire patch but were not exposed to the electromagnetic field. In some cases, treated mice had no detectable GFAP signal.

Whether the reduction in astrocytes had any significant impact on spinal cord healing or functional outcomes was not studied. In addition, the concentration of drug maintained during treatment is not known because it is below the limits of systemic detection, Borgens said.

“This method allows a very, very small dose of a drug to effectively serve as a big dose right where you need it,” Borgens said. “By the time the drug diffuses from the site out into the rest of the body it is in amounts that are undetectable in the usual tests to monitor the concentration of drugs in the bloodstream.”

Polypyrrole is an inert and biocompatible material, but the team is working to create a biodegradable form that would dissolve after the treatment period ends, he said.

The team also is trying to increase the depth at which the drug delivery device will work. The current system appears to be limited to a depth in tissue of less than 3 centimeters, Gao said.

Story Source:

The above post is reprinted from materials provided by Purdue University. The original item was written by Elizabeth K. Gardner. Note: Materials may be edited for content and length.

24 Jun 2015

A team of Lehigh University engineers has demonstrated a bacterial method for the low-cost, environmentally friendly synthesis of water-soluble quantum dot (QD) nanocrystals at room temperature.

Principal researchers Steven McIntosh, Bryan Berger and Christopher Kiely, along with a team of chemical engineering, bioengineering, and material science students, present this novel approach for the reproducible biosynthesis of extracellular, water-soluble QDs in the July 1 issue of the journal Green Chemistry (“Biomanufacturing of CdS quantum dots”). This is the first example of engineers harnessing nature’s unique ability to achieve cost-effective and scalable manufacturing of QDs using a bacterial process.

Using an engineered strain of Stenotrophomonas maltophilia to control particle size, Lehigh researchers biosynthesized quantum dots using bacteria and cadmium sulfide to provide a route to low-cost, scalable and green synthesis of CdS nanocrystals with extrinsic crystallite size control in the quantum confinement range. The result is CdS semiconductor nanocrystals with associated size-dependent band gap and photoluminescent properties. (Image: Linda Nye for Lehigh University)

The team used an engineered strain of Stenotrophomonas maltophilia to control particle size, providing a route to low-cost, scalable, and green synthesis of CdS nanocrystals with extrinsic crystallite size control in the quantum confinement range. The process yields extracellular, water-soluble quantum dots from low-cost precursors at ambient temperature and pressure. The result is CdS semiconductor nanocrystals with associated size-dependent band gap and photoluminescent properties.

This biosynthetic approach provides a viable pathway to realize the promise of green biomanufacturing of these materials. The Lehigh team presented this process recently to a national showcase of investors and industrial partners at the TechConnect 2015 World Innovation Conference and National Innovation Showcase in Washington, D.C. June 14-17.

“Biosynthetic QDs will enable the development of an environmentally-friendly, bio-inspired process unlike current approaches that rely on high temperatures, pressures, toxic solvents and expensive precursors,” Berger says. “We have developed a unique, ‘green’ approach that substantially reduces both cost and environmental impact.”

Quantum dots, which have use in diverse applications such as medical imaging, lighting, display technologies, solar cells, photocatalysts, renewable energy and optoelectronics, are typically expensive and complicated to manufacture. In particular, current chemical synthesis methods use high temperatures and toxic solvents, which make environmental remediation expensive and challenging.

This newly described process allows for the manufacturing of quantum dots using an environmentally benign process and at a fraction of the cost. Whereas in conventional production techniques QDs currently cost $1,000-$10,000 per gram, the biomanufacturing technique cuts that cost to about $1-$10 per gram. The substantial reduction in cost potentially enables large-scale production of QDs viable for use in commercial applications.

“We estimate yields on the order of grams per liter from batch cultures under optimized conditions, and are able to reproduce a wide size range of CdS QDs,” said Steven McIntosh.

The research is funded by the National Science Foundation’s Division of Emerging Frontiers in Research and Innovation (EFRI Grant No. 1332349) and builds on the success of the initial funding, supplied by Lehigh’s Faculty Innovation Grant (FIG) and Collaborative Research Opportunity Grant (CORE) programs.

The Lehigh research group is also investigating, through the NSF’s EFRI division, the expansion of this work to a wide range of other functional materials: materials whose composition, size, and structure are controlled to facilitate desired interactions with light, electric or magnetic fields, or chemical environments, providing unique functionality in applications ranging from energy to medicine.

McIntosh said, “While biosynthesis of structural materials is relatively well established, harnessing nature to create functional inorganic materials will provide a pathway to a future environmentally friendly biomanufacturing based economy. We believe that this work is the first step on this path.”

Source: Lehigh University

23 Jun 2015

*** Team GNT™ – Noteworthy as a preface to this informative article by Mr. Frolich: immediate solutions are going to be patchwork at best. It could be suggested that the public policy of the last three-plus decades has failed the citizens of California horribly in the direction and outcomes of water resource policy. We have zero interest in debating that point.

‘And the Good News Is’? With focus being brought to bear, there are solutions for the mid and long term, and we at Team GNT™ believe that nanotechnologies, along with a directional shift in water resource public policy, will be at the forefront of solving the looming crisis. ***

The nine cities with the worst drought conditions in the country are all located in California, which is now entering its fourth consecutive year of drought as demand for water is at an all-time high.

The long-term drought has already had dire consequences for the state’s agriculture sector, municipal water systems, the environment, and all other water consumers.

Based on data provided by the U.S. Drought Monitor, a collaboration between academic and government organizations, 24/7 Wall St. identified nine large U.S. urban areas that have been under persistent, serious drought conditions over the first six months of this year.

The Drought Monitor classifies drought by five levels of intensity: from D0, described as abnormally dry, to D4, described as exceptional drought. Last year, 100% of California was under at least severe drought conditions, or D2, for the first time since Drought Monitor began collecting data. It was also the first time that exceptional drought — the highest level — had been recorded in the state. This year, 100% of three urban areas in the state are in a state of exceptional drought. And 100% of all nine areas reviewed are in at least extreme drought, or D3.
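The Monitor's five-level scale can be captured as a simple ordered mapping; statements like "100% of California at least severe drought" are queries against it. A minimal sketch (the D1 label follows the Monitor's standard naming; D0, D2, D3, and D4 are named above):

```python
# U.S. Drought Monitor intensity scale, least to most severe.
DROUGHT_LEVELS = {
    "D0": "abnormally dry",
    "D1": "moderate drought",
    "D2": "severe drought",
    "D3": "extreme drought",
    "D4": "exceptional drought",
}

def at_least(level: str, threshold: str) -> bool:
    """True when `level` is at or above `threshold` severity (e.g. D3 meets D2)."""
    return int(level[1]) >= int(threshold[1])
```

For example, `at_least("D4", "D2")` is true, so land in exceptional drought counts toward a "severe or worse" total.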

According to Brad Rippey, a meteorologist with the United States Department of Agriculture (USDA), California has a Mediterranean climate in which the vast majority of precipitation falls during the six month period from October through March. In fact, more than 80% of California’s rainfall is during the cold months. As a result, “it’s very difficult to get significant changes in the drought picture during the warm season,” Rippey said. He added that even when it rains during the summer, evaporation due to high temperatures largely offsets any accumulation.

A considerable portion of California’s environmental, agricultural, and municipal water needs depends on 161 reservoirs, which are typically replenished during the winter months. As of May 31, the state’s reservoirs added less than 6.5 million acre-feet of water over the winter, 78% of the typical recharge of about 8.2 million acre-feet. A single acre-foot contains more than 325,000 gallons of water. This was the fourth consecutive year that reservoir recharge failed to reach the historical average.
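Using the article's rounded figures, the winter's recharge shortfall works out to roughly half a trillion gallons. A quick sketch of the arithmetic (325,851 gallons per acre-foot is the standard U.S. conversion the article rounds to "more than 325,000"; the 6.5 million figure is an upper bound, which is why the article's 78% comes out slightly below the ratio of the rounded numbers):

```python
GALLONS_PER_ACRE_FOOT = 325_851  # standard U.S. conversion

typical_recharge_af = 8.2e6  # acre-feet, historical winter average
this_winter_af = 6.5e6       # acre-feet added as of May 31 (upper bound)

shortfall_af = typical_recharge_af - this_winter_af        # 1.7 million acre-feet
shortfall_gallons = shortfall_af * GALLONS_PER_ACRE_FOOT   # ~554 billion gallons
fraction_of_typical = this_winter_af / typical_recharge_af # ~0.79
```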

The U.S. Drought Monitor is produced by the National Drought Mitigation Center at the University of Nebraska-Lincoln, the USDA and the National Oceanic and Atmospheric Administration (NOAA). 24/7 Wall St. identified the nine urban areas with populations of 75,000 or more where the highest percentages of the land area was in a state of exceptional drought in the first six months of 2015. All data are as of the week ending June 2.

These are the nine cities running out of water.

Bakersfield, CA

Exceptional drought coverage (first half of 2015): 72.8%

Extreme drought coverage (first half of 2015): 100%

Population: 523,994

Over the first half of this year, nearly 73% of Bakersfield was in a state of exceptional drought, the ninth largest percentage compared with all large U.S. urban areas. The possible impacts of exceptional drought include widespread crop failures and reservoir and stream depletions, which can result in water emergencies. The drought in Bakersfield has improved somewhat from the same period last year, when nearly 90% of the area was in a state of exceptional drought — the highest in the nation at that time. Like many other areas in California, however, Bakersfield has suffered through more than four years of drought, and any improvement is likely negligible. The Isabella Reservoir on the Kern River is one of the larger reservoirs in the state with a capacity of 568,000 acre-feet. The reservoir has supplied water to Bakersfield since 1953. Today, Isabella’s water level is at less than 8% of its full capacity after falling dramatically each summer since 2011.

Sacramento, CA

Exceptional drought coverage (first half of 2015): 78.3%

Extreme drought coverage (first half of 2015): 100%

Population: 1,723,634

Sacramento is the most populous city running out of water, with 1.72 million residents. The city is located just north of the Sacramento-San Joaquin River Delta, a major source of water not just for Sacramento residents but for a great deal of California. The delta also helps provide water to millions of acres of California farmland. The Sacramento and San Joaquin rivers supply nearly 80 California reservoirs. With the ongoing drought, current storage levels are well below historical averages. On average over the first half of this year, exceptional drought covered more than 78% of Sacramento. The remaining area is far from drought-free, as 100% of Sacramento was in a state of extreme drought over that period — like every other city on this list.

Chico, CA

Exceptional drought coverage (first half of 2015): 85.3%

Extreme drought coverage (first half of 2015): 100%

Population: 98,176

Starting in June this year, new state legislation requires Chico residents to consume 32% less water than they did in 2013. Water bills now include water budgeting information and penalize residents with higher fees based on how much their consumption exceeds the recommended amount. The new rule may be a challenge for some residents, as Chico had among the highest per capita daily water consumption in the state in 2013, according to the ChicoER, a local news outlet. According to The Weather Channel, in April of this year a jet stream shift brought rain and snow to parts of Northern California where Chico is located, a welcome relief to the area’s long-running dry spell. Despite the short-term relief, Chico still suffers from drought — an average of more than 85% of the city was in a state of exceptional drought over the first half of this year.

Lancaster-Palmdale, CA

Exceptional drought coverage (first half of 2015): 87.9%

Extreme drought coverage (first half of 2015): 100%

Population: 341,219

Compared to the first half of last year, drought conditions in Lancaster-Palmdale are worse this year. Last year, nearly 80% of the city was in extreme drought and just 10% in exceptional drought. This year, 100% of the city was classified as being in a state of extreme drought and nearly 88% in exceptional drought. Many Lancaster-Palmdale residents, particularly those in the Palmdale Water District, receive their water from the district’s water wells, the Littlerock Dam, or — like many Californians — the California Aqueduct.

The Colorado River Basin is also a major water source for the region, including Las Vegas to the northeast of Lancaster-Palmdale and Los Angeles to the southwest. Rippey explained that, with only three or four wet years in over a decade, the Colorado River Basin region has endured a staggering drought of nearly 15 years. The river, which once flowed all the way to the sea, now typically runs dry in Mexico before reaching it. Like every other city suffering the most from drought, Lancaster-Palmdale residents are subject to various water restrictions.

Yuba City, CA

Exceptional drought coverage (first half of 2015): 95.4%

Extreme drought coverage (first half of 2015): 100%

Population: 116,719

Yuba City is located on the Feather River, which runs south through Sacramento. The river begins at Lake Oroville, the site of the Oroville Dam and the source of the California Aqueduct — also known as the State Water Project (SWP). The dam’s water levels reached a record low in November 2014. While water levels have increased considerably since then, they remain at a fraction of the reservoir’s capacity. More than 95% of Yuba City was in a state of exceptional drought over the first six months of the year, making it one of only five urban areas to have exceptional drought covering more than 90% of their land area. Like other areas suffering the most from drought, the proportion of Yuba City’s workforce employed in agricultural jobs is several times greater than the national proportion. The drought has had considerable economic consequences in the region. Agricultural employment dropped 30.3% from 2012 through 2013, versus nearly 2% growth nationwide.

Fresno, CA

Exceptional drought coverage (first half of 2015): 100%

Extreme drought coverage (first half of 2015): 100%

Population: 654,628

All of Fresno has endured at least moderate drought conditions during the first half of each year since 2012. For the first time this year, 100% of the city was in a state of exceptional drought, up from 75% in the same period in 2014, making it one of only four urban areas experiencing maximum drought conditions across their entire area. As in many parts of California and several other cities suffering the most from drought, Fresno’s economy relies heavily on agriculture. A major source of water in Fresno is groundwater pumped from aquifers, or natural underground basins. In addition, water is delivered directly from the Sierra Nevada mountains to replenish dwindling surface water levels. Precipitation over the winter was yet again disappointing, and snowpack in the Sierra Nevada was measured at a record low this past April.

Modesto, CA

Exceptional drought coverage (first half of 2015): 100%

Extreme drought coverage (first half of 2015): 100%

Population: 358,172

Like several other drought-stricken cities, Modesto is located in California’s Central Valley between the Sierra Nevada mountains and the San Joaquin River, which are both essential sources of water for the region. Lack of precipitation during the area’s multi-year drought and particularly over this past winter has resulted in record-low snowmelt levels in the mountains. In addition, the San Joaquin River supplies 34 reservoirs, which together are at 39% of their capacity as of the end of May. One of the city’s major sources of water is the Modesto Reservoir, which draws water from the Tuolumne River. The reservoir is smaller than most in California. Over the past four years, the reservoir’s water levels reached their lowest point in September 2012 and are currently just below the historical average.

Merced, CA

Exceptional drought coverage (first half of 2015): 100%

Extreme drought coverage (first half of 2015): 100%

Population: 136,969

Merced is in the Central Valley, an agricultural hub that not only accounts for a considerable portion of California’s economic output but also supplies a substantial share of the nation’s agricultural production. The agricultural sector in the Merced metro area accounted for 13.1% of area employment, far higher than the comparable nationwide proportion of 2%.

Agricultural businesses suffer more than perhaps any other industry during severe drought conditions. Agricultural employment shrank by 12.5% in Merced from 2012 through 2013, and the drought has only worsened since then. Over the first half of 2014, exceptional drought covered 78% of Merced, one of the highest percentages in the nation at that time. Over the same period this year, 100% of the city was at the maximum drought level.

Hanford, CA

Exceptional drought coverage (first half of 2015): 100%

Extreme drought coverage (first half of 2015): 100%

Population: 87,941

With 100% of Hanford covered by exceptional drought conditions, the city is tied with Merced, Modesto, and Fresno for the worst drought conditions in the nation. Like the other three cities, Hanford is located in the Central Valley. On top of statewide restrictions and city emergency regulations already in place, city officials adopted additional water restrictions this June, such as barring restaurants from serving water except on request and banning vehicle and driveway washing. In addition to water restrictions and crop and environmental damage, the drought has impacted the region’s air quality. According to a recent report from the American Lung Association, Hanford had nearly the worst air pollution of any U.S. city. The report identified the region’s dry, hot summers and stagnant air as key contributing factors to high concentrations of particulate matter and smog.

23 Jun 2015

Solar energy is the world’s most plentiful and ubiquitous energy source, and researchers around the world are pursuing ways to convert sunlight into a useful form.

Most people are aware of solar photovoltaics that generate electricity and solar panels that produce hot water. But there is another thrust of solar research: turning sunlight into liquid fuels.

Research in solar-derived liquid fuels, or solar fuels, aims to make a range of products that are compatible with our energy infrastructure today, such as gasoline, jet fuel and hydrogen. The goal is to store sunlight in liquid form, conveniently overcoming the transient nature of sunlight. I am among the growing number of researchers focused on this field.

How can this be done? And what scientific challenges remain before one can fill up a car on solar-generated fuel?

Packing heat: concentrating sunlight into a reactor to split H2O and CO2 – a step toward making liquid fuels. (Courtesy of Professor David Hahn, University of Florida, Author provided)

Speeding up nature

The production of solar fuels is particularly attractive because it addresses both the conversion and storage problem endemic to sunlight; namely, the sun is available for only one-third of the day. This is a distinct advantage compared to other solar conversion technologies. Solar photovoltaic panels, for instance, must be coupled to a complex distribution and storage network, such as batteries, when production of electric power doesn’t equal demand.

The term “solar fuel” is a bit of a misnomer. In fact, all fossil fuels are technically solar-derived. Solar energy drives photosynthesis to form plant matter through the reaction of carbon dioxide (CO2) and water (H2O) and, over millions of years, the decay of plant matter creates the hydrocarbons we use to power our society.

One thermochemical approach strips oxygen from steam and carbon dioxide gas using the sun’s heat. Then, the resulting gases are combined chemically in a separate process to make liquid fuels. (Image: Jonathan Scheffe, Author provided)

The downside of this process is that nature’s efficiency at producing hydrocarbons is excruciatingly low, and humankind’s hunger for energy has never been greater. As a result, fossil fuels are being consumed far faster than nature alone can produce them, which provides the motivation to improve on nature’s efficiencies and speed up solar fuel production through artificial means.

This is the true meaning of “solar fuel” as it is used today, but the ultimate goal is the same: namely, the conversion of solar energy, CO2 and H2O to chemical forms such as gasoline.

To the lab

The first step in creating manmade solar fuels is to break down CO2 and/or H2O molecules, often into carbon monoxide (CO) and hydrogen (H2). This is no easy feat, as both of these molecules are very stable (H2 does not form spontaneously from H2O!), so this step requires a substantial amount of energy supplied from sunlight, either directly in the form of photons or indirectly as electricity or heat.
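The energy required for this splitting step can be estimated from basic thermodynamics. A minimal sketch, using standard textbook values rather than figures from the article:

```python
# Back-of-the-envelope thermodynamics for the splitting step (illustrative
# textbook values, not data from the article): the minimum work to split
# one mole of a molecule electrochemically is its Gibbs free energy change,
# which also fixes the minimum cell voltage.

F = 96485.0          # Faraday constant, C per mole of electrons
DG_H2O = 237.1e3     # Gibbs energy to split liquid H2O into H2 + 1/2 O2, J/mol
DG_CO2 = 257.2e3     # Gibbs energy to split CO2 into CO + 1/2 O2, J/mol

def min_voltage(delta_g, n_electrons):
    """Minimum voltage = Gibbs energy / (electrons transferred * Faraday constant)."""
    return delta_g / (n_electrons * F)

print(f"H2O -> H2 + 1/2 O2: {min_voltage(DG_H2O, 2):.2f} V minimum")
print(f"CO2 -> CO + 1/2 O2: {min_voltage(DG_CO2, 2):.2f} V minimum")
```

Both reactions transfer two electrons per molecule, giving minimum voltages of roughly 1.23 V and 1.33 V; any real device must supply more than this, whether the energy arrives as photons, electricity, or heat.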

This step is often the most crucial component of the process and represents the greatest roadblock to commercialization of solar fuel technologies today, as it largely defines the efficiency of the overall fuel production process and therefore the cost.

Downstream of this step, the resulting molecules – in this case, a mixture of CO and hydrogen called synthesis gas – may be converted through a variety of existing technologies, depending on the final product desired. This step of converting hydrocarbon gases to liquid form is already performed at industrial scale by large corporations such as Shell Global Solutions and Sasol, which use these technologies to leverage the low cost of today’s natural gas to make more valuable liquid fuels.
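The stoichiometry behind this gas-to-liquids step can be sketched with the textbook Fischer-Tropsch reaction for straight-chain alkanes — an illustration only, not the chemistry of any particular plant:

```python
# Textbook Fischer-Tropsch alkane reaction (illustrative, not specific to
# any facility): n CO + (2n+1) H2 -> CnH(2n+2) + n H2O.
# The H2:CO molar ratio the syngas must supply follows directly.

def h2_to_co_ratio(n_carbons):
    """H2:CO molar ratio needed to make a straight-chain alkane CnH(2n+2)."""
    return (2 * n_carbons + 1) / n_carbons

for n, name in [(1, "methane"), (8, "octane"), (12, "kerosene-range C12")]:
    print(f"{name:20s} H2:CO = {h2_to_co_ratio(n):.2f}")
```

The ratio falls from 3.0 for methane toward about 2.1 for the longer chains found in jet fuel, which is why the upstream splitting step must deliver syngas with roughly two parts hydrogen to one part CO.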

Recently, a European Union-sponsored project called SOLAR-JET (Solar chemical reactor demonstration and Optimization for Long-term Availability of Renewable JET fuel) demonstrated the first-ever conversion of solar energy to jet fuel, or kerosene. Researchers coupled the solar-driven production of synthesis gas, also called syngas, from CO2 and H2O with a downstream gas-to-liquids reactor – in this case, a Fischer-Tropsch reactor at Shell’s headquarters in Amsterdam.

The production of liquid fuels is especially important for the aviation industry, which relies on energy-dense fuels, and it represents another important advantage of solar fuel production compared to solar electricity.

The SOLAR-JET project, which I worked on with several other researchers, utilized a process called solar thermochemical fuel production, in which solar energy is concentrated using optics – mirrors and lenses – much the way a magnifying glass can start a fire. The resulting heat is then absorbed in a chamber that acts as a chemical reactor. The absorbed heat is then used to dissociate H2O and/or CO2 through a catalytic-type process – one of the most technically challenging steps for all solar fuel conversion processes. The resulting products (hydrogen or synthesis gas) can then be captured and further converted to liquid fuels downstream.
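To get a feel for the temperatures such concentrating optics can reach, here is a rough ideal-receiver estimate; the irradiance and concentration ratios are assumed values of mine, not figures from the project:

```python
# Rough ideal-receiver estimate (assumed values, not SOLAR-JET data):
# concentrating direct sunlight by a factor C onto a blackbody cavity lets
# it reach, at best, the stagnation temperature where re-radiated power
# balances the concentrated input: sigma * T^4 = C * I.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
I_SUN = 1000.0     # direct normal irradiance, W/m^2 (typical clear-sky value)

def stagnation_temp(concentration):
    """Maximum blackbody receiver temperature for a given concentration ratio."""
    return (concentration * I_SUN / SIGMA) ** 0.25

for c in (1000, 3000):
    print(f"C = {c}: ~{stagnation_temp(c):.0f} K")
```

Even a concentration ratio of 1,000 suns allows receiver temperatures around 2,000 K in the ideal limit, which is why mirror-and-lens optics can drive the high-temperature dissociation chemistry described above.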

Artificial photosynthesis

There are numerous other strategies to drive these reactions needed for the first step of solar fuel production, including those that utilize light – photons – directly or indirectly in the form of electricity.

For example, so-called artificial photosynthesis utilizes photons directly in a catalytic process, rather than absorbing them as heat, to break down H2O and CO2 molecules.

Former Energy Secretary Steven Chu visiting the Joint Center for Artificial Photosynthesis, which received an additional US$75 million in funding earlier this year. The lab is pursuing converting light (not heat) directly into fuels. (Image: Lawrence Berkeley National Laboratory)

Electrochemical approaches utilize electricity that could be generated from a photovoltaic cell to drive the separation of H2O and CO2 through a process known as electrolysis.

To date, the key barriers to commercialization of all of these technologies are their low efficiencies – that is, the amount of energy required to produce a given amount of liquid fuel – and their overall robustness. For example, the efficiency of the SOLAR-JET thermochemical conversion project discussed above is still less than 2%; for the technology to become commercially viable, efficiencies greater than 10% will need to be achieved.
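As a sketch of how such an efficiency figure is defined, the following uses one common convention — heating value of the fuel out divided by solar energy in — with hypothetical numbers rather than SOLAR-JET data:

```python
# One common definition of solar-to-fuel efficiency (a sketch with
# hypothetical numbers, not measurements from SOLAR-JET): the heating
# value of the fuel produced divided by the solar energy delivered.

def solar_to_fuel_efficiency(fuel_mol, heating_value_j_per_mol, solar_energy_j):
    """Fraction of incident solar energy stored chemically in the fuel."""
    return fuel_mol * heating_value_j_per_mol / solar_energy_j

# Example: 1 mol of CO (heating value ~283 kJ/mol) produced from
# 20 MJ of concentrated sunlight delivered to the reactor.
eta = solar_to_fuel_efficiency(1.0, 283e3, 20e6)
print(f"efficiency = {eta:.1%}")
```

With these assumed inputs the result is about 1.4%, in the under-2% range quoted above; reaching 10% means storing several times more fuel energy per joule of sunlight collected.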

A team at the University of Florida, funded by the research agency ARPA-E, is working toward these efficiency goals using another thermochemical process that uses optics to generate heat. Yet robustness at extreme temperatures (greater than 1,200 degrees Celsius, or nearly 2,200 degrees Fahrenheit) remains a major concern being addressed.

Furthermore, for solar fuel production to truly reduce greenhouse gas levels, it must be coupled with methods to capture CO2 from the air. This is still a relatively immature technology, but companies such as Climeworks are working to make this a reality.

Add in the complexity of integrating a temporally varying energy input (the sun) with a chemical reactor and the overall scope of the challenge can appear large. Nevertheless, advances are being made daily that give hope that solar fuels at higher efficiencies will soon be a reality.

Source: By Jonathan Scheffe, Associate Professor, Department of Mechanical and Aerospace Engineering at the University of Florida, via The Conversation

Stanford scientists have invented a device that produces clean-burning hydrogen from water 24 hours a day, seven days a week. Unlike conventional water splitters, the Stanford device uses a single low-cost catalyst to generate hydrogen bubbles on one electrode and oxygen bubbles on the other. (Image: L.A. Cicero/Stanford University)

‘We have developed a low-voltage, single-catalyst water splitter that continuously generates hydrogen and oxygen for more than 200 hours, an exciting world-record performance,’ said study co-author Yi Cui, an associate professor of materials science and engineering at Stanford and of photon science at the SLAC National Accelerator Laboratory.

In an engineering first, Cui and his colleagues used lithium-ion battery technology to create one low-cost catalyst that is capable of driving the entire water-splitting reaction.

‘Our group has pioneered the idea of using lithium-ion batteries to search for catalysts,’ Cui said. ‘Our hope is that this technique will lead to the discovery of new catalysts for other reactions beyond water splitting.’

Clean hydrogen

Hydrogen has long been promoted as an emissions-free alternative to gasoline. Despite its sustainable reputation, most commercial-grade hydrogen is made from natural gas, a fossil fuel that contributes to global warming. As an alternative, scientists have been trying to develop a cheap and efficient way to extract pure hydrogen from water.

A conventional water-splitting device consists of two electrodes submerged in a water-based electrolyte. A low-voltage current applied to the electrodes drives a catalytic reaction that separates molecules of H2O, releasing bubbles of hydrogen on one electrode and oxygen on the other.

Each electrode is embedded with a different catalyst, typically platinum and iridium, two rare and costly metals. But in 2014, Stanford chemist Hongjie Dai developed a water splitter made of inexpensive nickel and iron that runs on an ordinary 1.5-volt battery.

Single catalyst

In the new study, Cui and his colleagues advanced that technology further.

‘Our water splitter is unique, because we only use one catalyst, nickel-iron oxide, for both electrodes,’ said graduate student Haotian Wang, lead author of the study. ‘This bifunctional catalyst can split water continuously for more than a week with a steady input of just 1.5 volts of electricity. That’s an unprecedented water-splitting efficiency of 82 percent at room temperature.’
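The 82 percent figure is consistent with one standard way of quoting electrolysis efficiency — the thermodynamic minimum voltage divided by the voltage actually applied. The article does not state the exact definition, so this arithmetic is an interpretation:

```python
# Voltage-efficiency arithmetic (interpretation mine, not stated in the
# article): compare the thermodynamic minimum voltage for water splitting
# to the voltage actually applied to the cell.

E_MIN = 1.23       # thermodynamic minimum cell voltage for water splitting, V
E_APPLIED = 1.50   # voltage reported for the nickel-iron oxide splitter, V

efficiency = E_MIN / E_APPLIED
print(f"voltage efficiency = {efficiency:.0%}")  # 82%
```

Any voltage above the 1.23 V minimum is dissipated as heat rather than stored in the hydrogen, so running closer to the minimum is what makes the device efficient.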

In conventional water splitters, the hydrogen and oxygen catalysts often require different electrolytes with different pH — one acidic, one alkaline — to remain stable and active. ‘For practical water splitting, an expensive barrier is needed to separate the two electrolytes, adding to the cost of the device,’ Wang said. ‘But our single-catalyst water splitter operates efficiently in one electrolyte with a uniform pH.’

Wang and his colleagues discovered that nickel-iron oxide, which is cheap and easy to produce, is actually more stable than some commercial catalysts made of precious metals.

‘We built a conventional water splitter with two benchmark catalysts, one platinum and one iridium,’ Wang said. ‘At first the device only needed 1.56 volts of electricity to split water, but within 30 hours we had to increase the voltage nearly 40 percent. That’s a significant loss of efficiency.’
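A quick back-of-the-envelope on the degradation described above, using the same voltage-efficiency convention (interpretation mine):

```python
# Rough arithmetic on the quoted degradation (interpretation mine): a
# "nearly 40 percent" rise from the initial 1.56 V implies a large drop
# in voltage efficiency relative to the 1.23 V thermodynamic minimum.

E_MIN = 1.23                 # thermodynamic minimum voltage, V
v_start = 1.56               # initial operating voltage, V
v_end = v_start * 1.40       # after the ~40 percent increase over 30 hours

print(f"voltage after 30 h: ~{v_end:.2f} V")
print(f"efficiency: {E_MIN / v_start:.0%} -> {E_MIN / v_end:.0%}")
```

By this reading, the precious-metal benchmark cell slid from roughly 79 percent to the mid-50s within 30 hours, which is the "significant loss of efficiency" Wang describes.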

Marriage of batteries and catalysis

To find catalytic material suitable for both electrodes, the Stanford team borrowed a technique used in battery research called lithium-induced electrochemical tuning. The idea is to use lithium ions to chemically break the metal oxide catalyst into smaller and smaller pieces.

‘Breaking down metal oxide into tiny particles increases its surface area and exposes lots of ultra-small, interconnected grain boundaries that become active sites for the water-splitting catalytic reaction,’ Cui said. ‘This process creates tiny particles that are strongly connected, so the catalyst has very good electrical conductivity and stability.’
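The surface-area argument can be made concrete with simple geometry — a sketch only, using arbitrary particle sizes:

```python
# Geometric sketch of why breaking a catalyst into smaller particles helps
# (illustration only, arbitrary sizes): for spheres of radius r, surface
# area per unit volume is (4*pi*r^2) / (4/3*pi*r^3) = 3/r, so smaller
# particles expose proportionally more surface for the same material.

def area_per_volume(radius):
    """Surface-area-to-volume ratio of a sphere, 3/r."""
    return 3.0 / radius

r_big, r_small = 1e-6, 1e-8   # 1-micron vs 10-nanometer particles, meters
gain = area_per_volume(r_small) / area_per_volume(r_big)
print(f"surface area gain: {gain:.0f}x")
```

Shrinking particles a hundredfold exposes a hundred times more surface — and, as Cui notes, many more grain boundaries — for the same amount of catalyst material.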

‘Haotian eventually discovered that nickel-iron oxide is a world-record performing material that can catalyze both the hydrogen and the oxygen reaction,’ Cui said. ‘No other catalyst can do this with such great performance.’

Using one catalyst made of nickel and iron has significant implications in terms of cost, he added.

‘Not only are the materials cheaper, but having a single catalyst also reduces two sets of capital investment to one,’ Cui said. ‘We believe that electrochemical tuning can be used to find new catalysts for other chemical fuels beyond hydrogen. The technique has been used in battery research for many years, but it’s a new approach for catalysis. The marriage of these two fields is very powerful.’

The company uses a novel battery composition based on a semi-solid material that eliminates much of the bulk of conventional lithium-ion batteries—which are typically made up mostly of inactive, non-energy-storing materials—while dramatically increasing the energy density. Chiang and 24M CEO Throop Wilder also say that they can reduce the time needed to make a battery by 80 percent and the cost by 30 to 50 percent.

After five years of research and development, 24M has raised $50 million in funding from Charles River Ventures, North Bridge Venture Partners, and its strategic partners, along with a $4.5 million grant from the U.S. Department of Energy. It has strategic partnerships with the Japanese heavy-industry giant IHI and with PTT, the formerly state-owned Thai oil and gas company, which is increasingly moving into alternative energy.

Since a 2011 paper in the journal Advanced Energy Materials previewed 24M’s technology, the company has received a large amount of press coverage for a stealth-mode startup, including articles in this publication as well as a long, adulatory profile on the website Quartz. The company calls its new battery “the most significant advancement in lithium-ion technology since its debut more than 20 years ago.”

That would be a remarkable accomplishment, with the potential to drive the electric-vehicle market to a new level and accelerate the spread of renewable energy. But 24M faces a challenge that many previous companies with promising technology have failed to solve: how to revolutionize a manufacturing industry with huge amounts of capital sunk into extensive existing capacity.

Yet-Ming Chiang is personally familiar with this quandary: he was one of the founders of A123, the lithium-ion startup that received nearly a quarter of a billion dollars in funding from the U.S. government, went public in the largest IPO of 2009, and filed for bankruptcy in 2012. A123 was done in by an EV market that grew slower than expected and by its close relationship with EV maker Fisker, which itself failed in 2013 and was purchased by the Chinese auto parts company Wanxiang Group. But it also stumbled in trying to compete with more established battery makers such as LG and Panasonic.

Chiang acknowledges the dilemma: “In the last decade, there have been a lot of new lithium-ion plants built, and the EV market has not materialized to fill these factories.”

Now, energy storage demand is soaring—for vehicles, for power grids, and for residences with distributed renewable generation, such as rooftop solar arrays. Capacity in the United States is expected to more than triple this year, and rapid growth will continue through the end of the decade, according to GTM Research.

That means that demand for a less expensive, more efficient technology should be robust. But the building boom of the previous decade means there’s also excess manufacturing capacity available to supply that market. LG Chem’s plant in Holland, Michigan, which makes batteries for the Chevrolet Volt and the Cadillac ELR, currently operates at about 30 percent of capacity, according to CEO Prabhakar Patil.

And established players are already increasing their output. Mercedes-Benz parent company Daimler, for instance, announced late last year it will invest 100 million euros ($113 million) to expand its lithium-ion manufacturing capacity through its subsidiary Deutsche ACCUmotive.

Meanwhile, improvements in the manufacturing of conventional lithium-ion batteries are reducing the cost per kilowatt-hour of existing systems—even as research into next-generation chemistries, such as lithium-sulfur and lithium-air, continues at institutions around the world.

In short, 24M is attempting to transform a worldwide manufacturing industry in which established players with deep pockets are investing hundreds of millions in the expansion of existing processes. That’s a tough road even for a startup with a novel and exciting technology.

Chiang and CEO Wilder are undeterred. “This is the next great mega-market,” says Wilder. “To quote Elon Musk, the world is going to need multiple gigafactories.”

22 Sep 2018

MIT: New battery technology gobbles up carbon dioxide – ultimately, it may help reduce emissions of the greenhouse gas to the atmosphere. Could carbon dioxide capture batteries replace phone and EV batteries?