Nanotechnology Today


Wednesday, February 28, 2007

Nanotech promises big things for the poor -- but will the promises be kept? Health care in developing countries could be greatly improved by nanotechnology

WASHINGTON, D.C. -- "Nanotechnology has the potential to generate enormous health benefits for the more than five billion people living in the developing world," according to Dr. Peter A. Singer, senior scientist at the McLaughlin-Rotman Centre for Global Health and Professor of Medicine at the University of Toronto. "Nanotechnology might provide less-industrialized countries with powerful new tools for diagnosing and treating disease, and might increase the availability of clean water."

"But it remains to be seen whether novel applications of nanotechnology will deliver on their promise. A fundamental problem is that people are not engaged and are not talking to each other. Business has little incentive—as shown by the lack of new drugs for malaria, dengue fever and other diseases that disproportionately affect people in developing countries—to invest in the appropriate nanotechnology research targeted at the developing world. Government foreign assistance agencies do not often focus, or focus adequately, on science and technology. With scant public awareness of nanotechnology in any country, there are few efforts by nongovernmental organizations (NGOs) and community groups to examine how nanotechnology could be directed toward, for example, improving public health in the developing world."

Dr. Singer’s group in Toronto published a study in 2005 identifying and ranking the ten nanotechnologies most likely to benefit the developing world in the near future. Nanotechnology applications related to energy storage, production, and conversion; agricultural productivity enhancement; water treatment and remediation; and diagnosis and treatment of diseases topped the list. Dr. Singer’s group has also shown that there is a surprising amount of nanotechnology R&D activity in several developing countries, and that these nations are directing their nanotechnology innovation systems to address their more pressing needs.

"Countries like Brazil, India, China and South Africa have significant nanotechnology research initiatives that could be directed toward the particular needs of the poor. But there is still a danger—if market forces are the only dynamic—that small minorities of people in wealthy nations will benefit from nanotechnology breakthroughs in the health sector, while large majorities, mainly in the developing world, will not," noted Dr. Andrew Maynard, chief science advisor for the Woodrow Wilson Center’s Project on Emerging Nanotechnologies. "Responsible development of nanotechnology must include benefits for people in both rich and poor nations and at relatively low cost. This also requires that careful attention be paid to possible risks nanotechnology poses for human health and the environment."

Dr. Piotr Grodzinski, director of the Nanotechnology Alliance for Cancer at the National Cancer Institute, National Institutes of Health (NIH), discussed the impact of nanotechnology on diagnostics and therapies for cancer. He said, "It is my belief that nanomaterials and nanomedical devices will play increasingly critical and beneficial roles in improving the way we diagnose, treat, and ultimately prevent cancer and other diseases. But we face challenges; the complexity of clinical implementation and the treatment cost may cause gradual, rather than immediate, distribution of these novel yet effective approaches."

"For example, in the future, it may be possible for citizens in Bangladesh to place contaminated water in inexpensive transparent bottles that will disinfect the water when placed in direct sunlight, or for doctors in Mexico to give patients inhalable vaccines that do not need refrigeration," Dr. Maynard noted. "Nanotechnologies could revolutionize health care in developing countries and make treatments more readily available for diseases that claim millions of lives around the world each year."

The discussion took place at a program entitled "Using Nanotechnology to Improve Health Care in Developing Countries," held at the Woodrow Wilson International Center for Scholars. The event was organized by the Wilson Center’s Project on Emerging Nanotechnologies and Global Health Initiative. It was moderated by Dr. Jeff Spieler, chief of research, technology and utilization for the Office of Population and Reproductive Health at the U.S. Agency for International Development (USAID).

Dr. Piotr Grodzinski is director of the National Cancer Institute (NCI) Nanotechnology Alliance for Cancer in Bethesda, Maryland. He is an internationally recognized authority in the areas of bio- and nano-chip assays and microfluidics, and has built and led research efforts in private industry and national labs. Dr. Grodzinski received a Ph.D. in Materials Science from the University of Southern California in Los Angeles in 1992.

Dr. Andrew Maynard is chief science advisor for the Project on Emerging Nanotechnologies. His research interest in nanomaterials started in the late 1980s, while working on his doctorate in ultrafine aerosol analysis at the University of Cambridge in the UK. Through his work with the UK Health and Safety Executive, the U.S. National Institute of Occupational Safety and Health and the U.S. National Nanotechnology Initiative, he has led current science-based efforts to understand and manage the potential risks of nanotechnology. Dr. Maynard is considered one of the foremost international experts on addressing possible nanotechnology risks and developing safe nanotechnologies, and brings a much-needed science perspective to a field fraught with uncertainty.

Dr. Peter A. Singer is senior scientist and co-director of the Program on Life Sciences, Ethics and Policy at the McLaughlin-Rotman Centre for Global Health, University Health Network; professor of Medicine, University of Toronto; and a distinguished investigator of the Canadian Institutes of Health Research. He studied internal medicine at the University of Toronto, medical ethics at the University of Chicago, public health at Yale University, and management at Harvard Business School. His contributions have included improvements in quality end-of-life care, fair priority setting in healthcare organizations, teaching bioethics, pandemic influenza planning, global biosecurity, and harnessing life sciences and emerging technologies to improve health in developing countries.

Dr. Jeff Spieler is chief of research, technology and utilization for the Office of Population and Reproductive Health at the U.S. Agency for International Development (USAID). He joined USAID in 1983 as the senior biomedical research advisor in population after spending 11 years at the Special Programme of Research, Development and Research Training in Human Reproduction of the World Health Organization (WHO). Before joining the WHO, he worked for 5 years as a bench scientist at Lederle Laboratories Pharmaceutical Company in Pearl River, NY. Dr. Spieler received a B.S. in Zoology from the University of Florida in 1967, an M.S. in Zoological Sciences and Reproductive Biology from Rutgers University in 1971, and an Honorary Doctorate in Public Service from the University of Florida in 2002. ###

Nanotechnology is the ability to measure, see, manipulate and manufacture things usually between 1 and 100 nanometers. A nanometer is one billionth of a meter; a human hair is roughly 100,000 nanometers wide.
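The scale comparison above can be verified with a line of arithmetic; a quick sketch in Python (the numbers are the approximate figures from the text):

```python
# A nanometer is one billionth of a meter; a human hair is ~100,000 nm wide.
hair_width_nm = 100_000
nanoscale_upper_nm = 100           # upper end of the 1-100 nm nanotech range
meters_per_nm = 1e-9

hair_width_m = hair_width_nm * meters_per_nm
print(f"hair width: {hair_width_m:.4f} m")    # 0.0001 m, i.e. 0.1 mm

# Number of 100 nm objects that fit across one hair width:
print(hair_width_nm // nanoscale_upper_nm)    # 1000
```

So even the largest nanoscale objects are a thousand times narrower than a hair.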

The Project on Emerging Nanotechnologies is an initiative launched by the Woodrow Wilson International Center for Scholars and The Pew Charitable Trusts in 2005. It is dedicated to helping business, government and the public anticipate and manage possible health and environmental implications of nanotechnology. For more information about the project, log on to nanotechproject.org.

The Global Health Initiative at the Woodrow Wilson International Center for Scholars brings practitioners, scientists, scholars, and policymakers together in a neutral forum to discuss the most pressing health issues of the 21st century in the hope of ultimately increasing understanding of health issues and inspiring policy decisions that will improve the lives of citizens around the world. For more information about the Initiative, log on to wilsoncenter.org/globalhealth.

Tuesday, February 27, 2007

Common Ingredient in Big Macs and Sodas Can Stabilize Gold Nanoparticles for Medical Use, Researchers Find. Gold nanoparticles could be used to detect and treat cancer and other diseases.

COLUMBIA, Mo. - The future of cancer detection and treatment may be in gold nanoparticles - tiny pieces of gold so small they cannot be seen by the naked eye. The potential of gold nanoparticles has been hindered by the difficulty of making them in a stable, nontoxic form that can be injected into a patient. New research at the University of Missouri-Columbia has found that a plant extract can be used to overcome this problem, creating a new type of gold nanoparticle that is stable and nontoxic and can be administered orally or injected.

Because gold nanoparticles have a high surface reactivity and biocompatible properties, they can be used for in vivo (inside the body) molecular imaging and therapeutic applications, including cancer detection and therapy. The promise of nanomedicine comes from the high surface area and size relationships of nanoparticles to cells, making it possible to target individual cells for diagnostic imaging or therapy. Gold nanoparticles could function as in vivo sensors, photoactive agents for optical imaging, drug carriers, contrast enhancers in computed tomography and X-ray absorbers in cancer therapy. Despite their promise, however, scientists have been plagued with problems making nontoxic gold nanoparticle constructs.
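The "high surface area and size relationships" mentioned above come from simple geometry: a sphere's surface-to-volume ratio simplifies to 3/r, so it grows as the particle shrinks. A minimal sketch (the radii are arbitrary example values, not figures from the study):

```python
import math

def surface_to_volume_ratio(radius_nm: float) -> float:
    """Surface-area-to-volume ratio of a sphere, in 1/nm (equals 3/r)."""
    area = 4 * math.pi * radius_nm ** 2
    volume = (4 / 3) * math.pi * radius_nm ** 3
    return area / volume

# A nanoparticle, a larger colloid, and a micro-scale bead:
for r in (5, 50, 500):
    print(f"radius {r:>4} nm -> S/V = {surface_to_volume_ratio(r):.3f} per nm")
```

A 5 nm particle exposes a hundred times more surface per unit volume than a 500 nm one, which is why so much of its material is available to react.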

Kattesh Katti, professor of radiology and physics in MU's School of Medicine and College of Arts and Science, and director of the University of Missouri Cancer Nanotechnology Platform, worked with other MU scientists in the fields of physics, radiology, chemistry and veterinary medicine. The team tested plant extracts for their suitability as nontoxic vehicles to stabilize and deliver nanoparticles for in vivo nanomedicinal applications. The researchers became interested in gum arabic, a substance taken from species of the acacia tree, because it is already used to stabilize everyday foods such as yogurt, Big Macs and soda. Gum arabic has unique structural features, including a highly branched polysaccharide structure consisting of a complex mixture of potassium, calcium and magnesium salts derived from arabic acid. The scientists found that gum arabic could be used to absorb and assimilate metals and create a "coating" that makes gold nanoparticles stable and nontoxic.

Katti and Raghuraman Kannan, assistant professor of radiology, have been collaborating on the development of biocompatible gold and silver nanoparticles for medical applications.

"We found that gum arabic can effectively 'lock' gold nanoparticles to produce nontoxic, nanoparticulate constructs that can be used for potential applications in nanomedicine," Katti said. "We have developed a new class of hybrid gold nanoparticles that are stable and can be administered either orally or through intravenous injection within the biological system."

This finding could lead to the development of readily injectable gold nanoparticles that are nontoxic and stable. Mansoor Amiji, professor of pharmaceutical sciences in the Bouvé College of Health Sciences' School of Pharmacy and co-director of the Nanomedicine Education and Research Consortium at Northeastern University in Boston, said this represents a major scientific discovery that will initiate a new generation of biocompatible gold nanoparticles.

"The excellent in vivo stability profiles of such gold nanoconstructs will open up new pathways for the intratumoral delivery of gold nanoparticles in diagnostic imaging and therapeutic applications for cancer," Amiji said.

The new generation of trimeric amino acid peptides discovered by Katti in 1999 (referred to by Amiji as 'Katti Peptides') has provided a solid chemical platform and has become the source of a number of other discoveries. Their applications in the development of drugs for Wilson's disease, their utility for the generation of a wide spectrum of metallic nanoparticles, including gold and silver, and their role as amphiphilic building blocks in a variety of drug designs were demonstrated by Katti, in collaboration with Kannan and MU's Stan Casteel.

A paper describing the team's recent findings, "Gum arabic as a Phytochemical Construct for the Stabilization of Gold Nanoparticles: In Vivo Pharmacokinetics and X-ray Contrast-Imaging Studies," was recently published in the February edition of the journal Small. Katti's collaborators on this paper include Casteel, Kannan, David Robertson, Evan Boote, Genevieve M. Fent, Kavita Katti, Vijaya Kattumauri and Meera Chandrasekhar.

This work has been supported with a grant from the National Institutes of Health/National Cancer Institute under the Cancer Nanotechnology Platform program. -30-

Monday, February 26, 2007

A team of researchers at the University of Colorado at Boulder has developed a new technique to generate laser-like X-ray beams, removing a major obstacle in the decades-long quest to build a tabletop X-ray laser that could be used for biological and medical imaging.

For nearly half a century, scientists have been trying to figure out how to build a cost-effective and reasonably sized X-ray laser to provide super-high imaging resolution, according to CU-Boulder physics professors Henry Kapteyn and Margaret Murnane, who led the team at JILA, a joint institute of CU-Boulder and the National Institute of Standards and Technology. Most of today's X-ray lasers require so much power that they rely on fusion laser facilities the size of football stadiums, making their use impractical.

"We've come up with a good end run around the requirement for a monstrous power source," Kapteyn said.

A paper on the subject by Murnane and Kapteyn, CU-Boulder graduate students Xiaoshi Zhang, Amy Lytle, Tenio Popmintchev, Xibin Zhou and Senior Research Associate Oren Cohen of JILA was published in the online version of the journal Nature Physics on Feb. 25.

If they can extend the new technique all the way into the hard X-ray region of the electromagnetic spectrum, which they think is just a matter of time because there are no physical principles blocking the way, the ramifications would be felt in numerous fields.

"If we can do this, it might make it possible to improve X-ray imaging resolution by a thousand times, with impacts in medicine, biology and nanotechnology," Murnane said. "For example, the X-rays we get in the hospital are limited by spatial resolution. They can't detect really small cancers because the X-ray source in your doctor's office is like a light bulb, not like a laser. If you had a bright, laser-like X-ray beam, you could image with far higher resolution."

To generate laser-like X-ray beams, the team used a powerful laser to pluck an electron from an atom of argon, a highly stable chemical element, and then slam it back into the same atom. The boomerang action generates a weak, but directed beam of X-rays.

The obstacle they needed to hurdle was combining different X-ray waves emitted from a large number of atoms to generate an X-ray beam bright enough to be useful, according to Kapteyn. In other words, they needed to generate big enough waves flowing together to make a strong X-ray.

The biggest problem was the waves of X-rays do not all come out "marching in step" because visible laser light and X-ray beams travel at different speeds in the argon gas, Murnane said. This meant that while some X-ray waves combined with other waves from similar regions to become stronger, waves from different regions would cancel each other out, making the X-ray output weaker.
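The "marching in step" problem can be illustrated with a toy phasor model: summing waves whose phases slip linearly from emitter to emitter, a crude stand-in for the speed mismatch described above (this is an illustration of coherent addition only, not the actual high-harmonic physics):

```python
import math

def beam_intensity(num_emitters: int, phase_slip: float) -> float:
    """Intensity of the coherent sum of waves from num_emitters sources,
    where each successive source's phase slips by phase_slip radians."""
    re = sum(math.cos(n * phase_slip) for n in range(num_emitters))
    im = sum(math.sin(n * phase_slip) for n in range(num_emitters))
    return re ** 2 + im ** 2  # intensity ~ |sum of phasors|^2

n = 1000
print(beam_intensity(n, 0.0))  # perfectly in step: intensity = n^2 = 1,000,000
print(beam_intensity(n, 0.5))  # out of step: contributions largely cancel
```

When the waves stay in phase the intensity grows as the square of the number of emitters; with even a modest phase slip per emitter, the contributions nearly cancel, which is why correcting the mismatch matters so much.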

To correct this, the researchers sent some weak pulses of visible laser light into the gas in the opposite direction of the laser beam generating the X-rays. The weak laser beam manipulates the electrons plucked from the argon atoms, whose emissions are out of sync with the main beam, and then slams them back into the atoms to generate X-rays at just the right time, intensifying the strength of the beam by over a hundred times.

"Think of a kid on a swing," Kapteyn said. "If you keep pushing at the right time the swing goes higher and higher, but if you don't push it at the right time, you'll eventually stop it.

"What we found is essentially another beam of light to control exactly when the swing is getting pushed. By putting the light in the right place, we don't allow the swing to be pushed at the wrong time." ###

The team plans to continue the research through the Engineering Research Center for Extreme Ultraviolet Science and Technology, a National Science Foundation-supported center comprised of researchers from CU-Boulder, Colorado State University and the University of California at Berkeley. The current research was supported with NSF grants.

Sunday, February 25, 2007

SAN FRANCISCO, Calif. — The safest possible future for advancing nanotechnology in a sustainable world can be reached by using green chemistry, says James E. Hutchison, a professor of chemistry at the University of Oregon.

"Around the world, there is a growing urgency about nanotechnology and its possible health and environmental impacts," Hutchison said in his talk Feb. 18 during a workshop at the annual meeting of the American Association for the Advancement of Science. "There is a concern that these issues will hinder commercialization of this industry."

Nanotechnology refers to research on materials measured in nanometers - about one billionth of a meter - and is applicable to virtually every technology and to medicine. The field of nanoscience, Hutchison said, is still in the discovery phase, in which new materials are being synthesized and tested for very specific physical properties. During such work, there often are unintended properties of a material that potentially can be hazardous to the environment or human health but are, for now, an acceptable risk in secured research environments, he said.

In an earlier session, Vicki Colvin, professor of chemistry and chemical engineering at Rice University in Houston, had made similar comments but concluded that nanotechnology can be made safe by proceeding with an approach she called "safety by design." Barriers to engineering safe nanoparticles, she said in her topical lecture, include testing already-produced materials for toxicity within a policy built to manage risks, and doing fundamental basic research on such things as the purification of nanomaterials and the surfaces where nanoparticles and proteins come together. She said extensive libraries of nanoparticles are needed to help assure safety.

As nanoparticles decrease in size, Colvin said, they take on unique properties. "At what point does size become a barrier?" she asked. Another safety issue is exposure to nanoparticles through inhalation and skin contact.

In his talk, Hutchison focused on what he called a proactive approach to advancing from the current discovery phase in the creation of nanomaterials into a production phase that is efficient and reduces waste. He suggested a green framework for moving the industry forward.

Now is the time, Hutchison said, for scientists to "seriously consider the design of materials, processes and applications that minimize hazard and waste, and this will be essential as nanoscience discoveries transition to the products of nanotechnology."

Green chemistry, he argues, can sharply reduce the use of toxic solvents and produce safer products with reduced chances for unintended consequences. It also can provide opportunity for new innovations. "Green chemistry allows us to think about new space and new parameters," Hutchison said. "We have the opportunity to develop the technology correctly from the beginning, rather than trying to rework an entrenched technology."

The absence of such an approach, he said, contributed to public opposition to genetically modified crops. Colvin, in her talk, also referred to the backlash against genetically modified organisms, as well as the unintended impacts of DDT (toxicity to animals), some pesticides (cancer in humans) and refrigerants (ozone destruction).

Hutchison, who is director of the UO's Materials Science Institute, is developing diverse libraries of nanoparticles, such as those called for by Colvin. In his UO lab, Hutchison said, "We systematically vary the structural parameters and use in vivo and in vitro assays to determine the relationship between biological response and structural parameters."

One such library covers gold nanoparticles for use in basic research. By studying these nanoparticles, he said, researchers can get an idea of what kinds of new electronic, optical and pharmaceutical products eventually may come to market. Hutchison received a patent in 2005 for his synthesis of gold nanoparticles using green chemistry.

Hutchison told the AAAS gathering that he recently published a technique for purifying nanoparticles that uses membranes with nanopores so small that only impurities pass through - a green approach that allows the rapid purification of particles without using organic solvents. "Before this accomplishment, purifying the material used up 15 or so liters of solvent per gram of particles," he said. "If the solvent has the density of water, that's 15,000 times more mass used to purify it than we get out of it."
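The mass ratio Hutchison quotes follows directly from the stated figures; a quick check, assuming as he does that the solvent has roughly the density of water:

```python
# Check the solvent-to-product mass ratio quoted above.
solvent_liters_per_gram = 15        # ~15 L of solvent per gram of particles
solvent_density_g_per_l = 1000      # water-like solvent: 1 kg per liter

solvent_mass_g = solvent_liters_per_gram * solvent_density_g_per_l
product_mass_g = 1                  # per gram of purified particles

print(solvent_mass_g / product_mass_g)  # 15000.0 -> the "15,000 times" figure
```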

The nanotechnology industry, Hutchison said, has reached an important moment in time. "There is an opportunity to stay ahead of the curve," he said. "We should commit ourselves to design these materials and processes to be green from the beginning, and this will provide a lot of freedom from layers of regulation to researchers and companies, allowing for more innovation."

Saturday, February 24, 2007

ASU embarks on innovative fuel cell project: Fuel cells designed to meet large power needs

Roller coaster gas prices and rising energy costs for the home have created uneasiness about the future of our fossil-fuel based economy. One near-term solution being pursued by researchers at the Biodesign Institute at ASU is a new fuel cell technology for renewable energy and the fledgling hydrogen economy.

Don Gervasio, associate research professor at the institute's Center for Applied NanoBioscience, is overseeing an ASU team that was awarded a $1.5 million grant by the Department of Energy (DOE) to develop new fuel cell components to more efficiently generate electrical power. The technology is designed for use in large fuel cells that can generate 100 kilowatts of power, which is enough for a car, a home or a remote power station.

Most power generators come with an unwanted side effect: heat. For a car, excess heat is handled by running coolant from the radiator through the engine block. To date, attempts to use a similar cooling technology with most fuel cells have been unsuccessful.

"Even though a fuel cell operates at a higher efficiency than a car engine, it still puts out a considerable amount of heat," said Gervasio. "For a long time people thought that a room temperature fuel cell would be ideal for automobiles, but it turns out that if you power an automobile using a fuel cell operating at room temperature, you need a radiator as big as the car."

Fuel Cell Sandwich

Fuel cells are generating significant interest because they offer a more efficient alternative to heat engines and avoid nasty pollutants like carbon monoxide, nitrogen oxides and ozone.

A typical fuel cell is an elaborate assembly of membranes sandwiched by electrodes and plates that send gases into the fuel cell to generate electricity. According to the DOE, the membrane and electrode parts of the fuel cell account for more than half of the costs of the fuel cell stacks. A reduction in those costs would make the fuel cell system more competitive with standard gasoline engines.

By designing a membrane that operates at high temperatures (a medium oven setting of 250 F, or 120 C), Gervasio and colleagues want to reduce both the amount of heat management needed to operate the fuel cell and its overall size, weight and costs.

A hydrogen-powered fuel cell has positive and negative ends just like a battery. It works by splitting hydrogen gas into its component protons and electrons at the negative electrode; these then recombine with oxygen from the air at the positive electrode. This produces electricity while leaving only water as a byproduct.

The 'cheese' of the fuel cell stack, the fuel cell membrane, completes the electrical circuit by funneling protons through the membrane from one electrode to the other. Just as importantly, it also forces energized electrons to move across a circuit outside the membrane, producing an electron current to power devices such as a light bulb or electric motor.

Some commercial membrane designs reduce the operating voltage generated from the fuel cell by as much as 50 percent of its theoretical value, and most don't operate at temperatures much above room temperature, Gervasio said. This makes the development of a new membrane essential if fuel cells are to be used to reduce consumption of fossil fuels and have a central role in the hydrogen economy.
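The "theoretical value" Gervasio refers to can be estimated from the standard thermodynamics of the hydrogen-oxygen reaction, E = -ΔG/(nF). A sketch using textbook standard-state values (an illustration, not figures from the ASU project):

```python
# Theoretical open-circuit voltage of a hydrogen fuel cell: E = -dG / (n * F).
delta_g = -237.1e3   # standard Gibbs energy of H2 + 1/2 O2 -> H2O(l), J/mol
n_electrons = 2      # electrons transferred per H2 molecule
faraday = 96485      # Faraday constant, C/mol

e_theoretical = -delta_g / (n_electrons * faraday)
print(f"theoretical cell voltage: {e_theoretical:.2f} V")      # ~1.23 V

# A membrane that loses 50 percent of the theoretical value, as quoted above:
print(f"degraded operating voltage: {0.5 * e_theoretical:.2f} V")  # ~0.61 V
```

Halving roughly 1.23 V leaves only about 0.6 V per cell, which is why membrane losses dominate stack design.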

The Secret Sauce

Another aspect of the fuel cell membrane that ASU scientists hope to improve is its source of electrolytes, the salts that carry charge through the inside of the fuel cell. ASU Regents' Professor Austen Angell, a co-leader of the group, has been using protic ionic liquids to accelerate the movement of protons, which are essential for completing the circuit and generating electric power.

Currently, high temperature fuel cell systems use phosphoric acid in a polymer matrix as the membrane electrolyte, but the voltage generated is about half what could theoretically be achieved. One of the protic ionic liquids that Angell and his fellow researchers have experimented with has generated electric potentials approaching the theoretical limit, but has not yet been able to maintain the voltage at higher currents.

Another advantage of protic ionic liquids compared to the phosphoric acid system is that they don't contain any water. Water, the only byproduct of generating energy from hydrogen in a fuel cell, can often clog up the system and prevent the hydrogen and oxygen gases from flowing into the cell.

One protic ionic mixture being tested uses the combination of two ammonium salts, ammonium nitrate and ammonium bisulfate. "These are some of the cheapest chemicals on the market, and they work like a charm," said Angell.

Fashioning a water-free electrolyte system is not without its difficulties. Individually, the ammonium salts are solid at room temperature, like table salt. However, when the salts are combined at just the right ratio, they can melt into a liquid at the operating temperatures of the fuel cell, allowing incorporation into the membrane.

A Real Pickle

Another co-leader of the project, Jeff Yarger, professor of chemistry and biochemistry, is building an analytical system using a powerful tool, NMR spectroscopy, to both troubleshoot fuel cell development and help uncover the mechanisms of proton conduction across the fuel cell membrane.

"The whole point of the membrane is to get protons across as fast as possible," said Yarger. "If they are getting stuck in the membrane we want to be able to see where they are getting stuck and find a way to fix it."

Yarger and his team can measure how quickly the protons move across the membrane, which will aid in membrane design. "From a practical engineering perspective you'd want the membrane to be as solid as possible, but from a proton diffusion perspective you want the membrane to be as liquid as possible," said Yarger.

An assortment of different polymers will be used to absorb the protic ionic liquids to find the best combination of stability and conductivity. With continued optimization and a better understanding of how the fuel cells work, the researchers hope to break through the barriers that have limited widespread adoption of fuel cells. Funding for the fuel cell project will continue until 2011. ###

Friday, February 23, 2007

Fluid dynamics works on the nanoscale in the real world: Scientists show the theory works outside of a vacuum.

In 2000, Georgia Tech researchers showed that fluid dynamics theory could be modified to work on the nanoscale, albeit in a vacuum.

Now, seven years later they've shown that it can be modified to work in the real world, too – that is, outside of a vacuum. The results appear in the February 9 issue of Physical Review Letters (PRL).

Understanding the motion of fluids is the basis for a tremendous amount of engineering and technology in contemporary life. Planes fly and ships sail because scientists understand the rules of how fluids like water and air behave under varying conditions.

The mathematical principles that describe these rules were put forth more than 100 years ago and are known as the Navier-Stokes equations. They are well known and understood by any scientist or student in the field. But now that researchers are delving into the realm of the small, an important question has arisen: namely, how do these rules work when fluids and flows are measured on the nanoscale? Do the same rules apply or, given that the behavior of materials in this size regime often has little to do with that of their macro-sized cousins, are there new rules to be discovered?

It's well known that small systems are influenced by randomness and noise more than large systems. Because of this, Georgia Tech physicist Uzi Landman reasoned that modifying the Navier-Stokes equations to include stochastic elements – that is, terms that account for the probability that an event will occur – would allow them to accurately describe the behavior of liquids in the nanoscale regime.
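Landman's point about system size can be illustrated with a toy model: the relative fluctuation of a sum of N independent molecular contributions falls off as 1/√N, so small systems are proportionally much noisier. This sketch illustrates only that scaling argument, not the stochastic Navier-Stokes formulation itself:

```python
import math
import random

def relative_fluctuation(num_molecules: int, samples: int = 2000) -> float:
    """Relative spread (std/mean) of the total of num_molecules independent
    random contributions; theory predicts it scales as 1/sqrt(N)."""
    rng = random.Random(0)  # fixed seed so the sketch is reproducible
    totals = [sum(rng.random() for _ in range(num_molecules))
              for _ in range(samples)]
    mean = sum(totals) / samples
    var = sum((t - mean) ** 2 for t in totals) / samples
    return math.sqrt(var) / mean

print(relative_fluctuation(10))    # ~0.18: a 10-molecule system is very noisy
print(relative_fluctuation(1000))  # ~0.018: 100x more molecules, 10x less noise
```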

Writing in the August 18, 2000, issue of Science, Landman and postdoctoral fellow Michael Moseler used computer simulation experiments to show that the stochastic Navier-Stokes formulation does work for fluid nanojets and nanobridges in a vacuum. The theoretical predictions of this early work have been confirmed experimentally by a team of European scientists (see the December 13, 2006, issue of Physical Review Letters). Now, Landman and graduate student Wei Kang have discovered that by further modifying the Moseler-Landman stochastic Navier-Stokes equations, they can accurately describe this behavior in a realistic non-vacuous environment.

"There was a strong opinion that fluid dynamics theory would stop being valid for small systems," said Landman, director of the Center for Computational Materials Science, Regents' and Institute professor, and Callaway chair of physics at the Georgia Institute of Technology. "It was thought that all you could do was perform extensive, as well as expensive, molecular dynamic simulations or experiments, and that continuum fluid dynamics theory could not be applied to explain the behavior of such small systems."

The benefit of the new formulations is that these equations can be solved with relative ease in minutes, in comparison to the days and weeks that it takes to simulate fluid nanostructures, which can contain as many as several million molecules. Equally difficult, and sometimes even harder, are laboratory experiments on fluids in this regime of reduced dimensions.

In this study, Landman and Wei simulated a liquid propane bridge, which is a slender fluid structure connecting two larger bodies of liquid, much like a liquid channel connecting two rain puddles. The bridge was six nanometers in diameter and 24 nanometers long. The objective was to study how the bridge collapses.

In the study performed in 2000, Landman simulated a bridge in a vacuum. The bridge broke in a symmetrical fashion, pinching in the middle, with two cones on each side. This time, the simulation focused on a model with a nitrogen gas environment surrounding the bridge at different gas pressures.

When the gas pressure was low (under 2 atmospheres of nitrogen), the breakup occurred in much the same way that it did in the previous vacuum computer experiment. But when the pressure was sufficiently high (above 3.5 atmospheres), 50 percent of the time the bridge broke in a different way. Under high pressure, the bridge tended to form a long thread and break asymmetrically on one side or the other of the thread instead of in the middle. Until now, such an asymmetric long-thread collapse configuration had been discussed only for macroscopically large liquid bridges and jets.

Analyzing the data showed that the asymmetric breakup of the nanobridge in a gaseous environment relates to molecular evaporation and condensation processes and their dependence on the curvature of the shape profile of the nanobridge.

"If the bridge is in a vacuum, molecules evaporating from the bridge are sucked away and do not come back," said Landman. "But if there are gas molecules surrounding the bridge, some of the molecules that evaporate will collide with the gas, and due to these collisions the scattered molecules may change direction and come back to the nanobridge and condense on it."

As they return, they may fill in spaces where other molecules have evaporated. In other words, the evaporation-condensation processes serve to redistribute the liquid propane along the nanobridge, resulting in an asymmetrical shape of the breakage. The higher the pressure surrounding the bridge, the higher the probability that evaporating molecules will collide with the gas and condense back onto the nanobridge. Landman and Wei have shown that these microscopic processes can be included in the stochastic hydrodynamic Navier-Stokes equations, and that the newly modified equations faithfully reproduce the results of their atomistic molecular dynamics experiments.
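The pressure dependence of this recondensation can be illustrated with a simple Monte Carlo estimate. The script below is a hypothetical toy model, not the kinetic theory used in the study: it assumes the distance an evaporated molecule travels before its first collision is exponentially distributed with a rate proportional to gas pressure, and that an isotropically scattered molecule heads back toward the bridge half the time.

```python
import numpy as np

rng = np.random.default_rng(1)

def return_fraction(pressure, n=20000, mfp_at_1atm=1.0, max_flight=5.0):
    """Fraction of evaporated molecules scattered back to the surface.

    Assumed toy scaling: distance to the first collision is exponential
    with rate proportional to pressure, and after an isotropic collision
    half of all directions point back toward the bridge. This is an
    illustration, not the paper's kinetic model.
    """
    rate = pressure / mfp_at_1atm
    flight = rng.exponential(1.0 / rate, size=n)   # free-flight distances
    collided = flight < max_flight                 # hit a gas molecule nearby
    returned = collided & (rng.random(n) < 0.5)    # scattered back toward bridge
    return returned.mean()

for p in (0.5, 2.0, 3.5):    # pressures in atmospheres
    print(p, return_fraction(p))
```

The estimated return fraction rises with pressure and saturates once essentially every evaporated molecule collides before escaping, mirroring the qualitative trend described above.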

"Knowing that hydrodynamic theory, which is the basis of venerable technologies around us, can be extended to the nanoscale is fundamentally significant, and a big relief," said Landman. "Particularly so, now that we have been able to use it to describe the behavior of nanofluids in a non-vacuous environment – since we expect that this is where most future applications will occur." ###

Thursday, February 22, 2007

Biologically inspired sensors can augment sonar, vision system in submarines

CHAMPAIGN, Ill. — To find prey and avoid being preyed upon, fish rely on a row of specialized sensory organs along the sides of their bodies, called the lateral line. Now, a research team led by Chang Liu at the University of Illinois at Urbana-Champaign has built an artificial lateral line that can provide the same functions in underwater vehicles.

“Our development of an artificial lateral line is aimed at enhancing human ability to detect, navigate and survive in the underwater environment,” said Liu, a Willett Scholar and a professor of electrical and computer engineering at Illinois. “Our goal is to develop an artificial device that mimics the functions and capabilities of the biological system.”

In fish, the lateral line provides guidance for synchronized swimming, predator and obstacle avoidance, and prey detection and tracking. Equipped with an artificial lateral line, a submarine or underwater robot could similarly detect and track moving underwater targets, and avoid collisions with moving or stationary objects.

The artificial lateral line consists of an integrated linear array of microfabricated flow sensors, with the sizes of individual sensors and the spacings between them matching those of their biological counterparts.

Liu and colleagues at Illinois and at Bowling Green State University described their work in the Dec. 12, 2006, issue of the Proceedings of the National Academy of Sciences.

To fabricate the tiny, three-dimensional structures, individual components are first cast in place on sacrificial layers using photolithography and planar deposition. A small amount of magnetic material is electroplated onto each of the parts, which are then freed from the substrate by an etchant. When a magnetic field is applied, the induced torque causes the pieces to rotate out of the plane on tiny hinges and lock into place.

Each sensor is integrated with metal-oxide-semiconductor circuitry for on-chip signal processing, noise reduction and data acquisition. The largest array the researchers have built consists of 16 flow sensors with 1 millimeter spacing. Each sensor is 400 microns wide and 600 microns tall.

In tests, the researchers’ artificial lateral line was able to localize a nearby underwater vibrating source, and could detect the hydrodynamic wake (such as the wake formed behind a propeller-driven submarine) for long-distance tracking. With further advances in engineering, man-made underwater vehicles should be able to autonomously image hydrodynamic events from their surroundings, Liu said.
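As a rough illustration of how a linear sensor array can localize a nearby source, the sketch below runs a grid search over candidate source positions for a 16-element, 1 mm pitch array like the one described above. The dipole-like amplitude falloff and the grid-search method are assumptions chosen for illustration, not the team's actual algorithm.

```python
import numpy as np

# Assumed geometry: 16 sensors, 1 mm apart, along the x-axis
sensors = np.arange(16) * 1e-3                       # positions in metres

def measured(src_x, src_y, strength=1.0):
    """Toy amplitude model: signal falls off as 1/distance**3."""
    d = np.hypot(sensors - src_x, src_y)
    return strength / d**3

def localize(readings):
    """Grid-search the source position that best explains the readings."""
    xs = np.linspace(-5e-3, 20e-3, 201)
    ys = np.linspace(1e-3, 10e-3, 100)
    best, best_err = None, np.inf
    for x in xs:
        for y in ys:
            model = measured(x, y)
            # fit the unknown source strength in closed form (least squares)
            s = (model @ readings) / (model @ model)
            err = np.sum((s * model - readings) ** 2)
            if err < best_err:
                best, best_err = (x, y), err
    return best

x, y = localize(measured(7e-3, 3e-3))
print(round(x * 1e3, 2), round(y * 1e3, 2))   # recovered position, in mm
```

Because each sensor sees a different amplitude depending on its distance to the source, the pattern across the array is enough to pin down the source position, which is the same information a fish extracts from its lateral line.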

“Although biology remains far superior to human engineering, having a man-made parallel of the biological system allows us to learn much about both basic science and engineering,” Liu said. “To actively learn from biology at the molecular, cellular, tissue and organism level is still the bigger picture.”

Chang Liu, a Willett Scholar and a professor of electrical and computer engineering at Illinois, holds one of the models (also below) that he and his postdoctoral research associate, Yingchen Yang, are using to test their artificial lateral line. Their research could assist autonomous underwater robots. Photo by L. Brian Stauffer.

The work was funded by the U.S. Air Force Office of Scientific Research and by the Defense Advanced Research Projects Agency.

Wednesday, February 21, 2007

CHAMPAIGN, Ill. — Creating high-resolution metallic interconnects is an essential part of the fabrication of microchips and other nanoscale devices. Researchers at the University of Illinois at Urbana-Champaign have developed a simple and robust electrochemical process for the direct patterning of metallic interconnects and other nanostructures.

“Solid-state superionic stamping offers a new approach, both as a stand-alone process and as a complement to other nanofabrication techniques, for creating chemical sensors, photonic structures and electrical interconnects,” said Nicholas X. Fang, a professor of mechanical science and engineering, and corresponding author of a paper published in the Feb. 14 issue of the journal Nano Letters.

The S4 process uses a patterned superionic material as a stamp and etches a metallic film by an electrochemical reaction. In superionic materials, metal ions can move almost freely through the crystal lattice. Such ion-conducting materials are also used in batteries and fuel cells.

Unlike conventional processing – in which patterns are first placed on photoresist, followed by metal deposition and subsequent etching – the S4 process creates high-resolution metallic nanopatterns in a single step, potentially reducing manufacturing costs and increasing yields.

The S4 process begins by carving the desired pattern into a stamp made of superionic material, such as silver sulfide, using focused ion beam milling. The stamp is then placed on the substrate and a voltage is applied. This produces an electrochemical reaction at the contact points of the interface.

The reaction generates metal ions, which migrate across the interface into the stamp. As the reaction continues, the stamp progresses into the substrate, generating features complementary to the pattern on the stamp.

“The most difficult step in the S4 process is making the stamp extremely flat and smooth,” said graduate student Keng H. Hsu, the paper’s lead author. “Currently, our resolution for patterning details is 50 nanometers. As better tools for engraving the stamps are developed, we will achieve finer resolution.”

Ultimately, the resolution will be limited by the mechanical properties of the stamp, Hsu said.

With Fang and Hsu, co-authors of the paper are Placid M. Ferreira, a U. of I. professor of mechanical science and engineering, and director of NanoCEMMS; and graduate student Peter L. Schultz.

The work was funded by the U.S. Department of Energy and the National Science Foundation.

Tuesday, February 20, 2007

A new technology that could drastically reduce the amount of pollution emitted by a range of industrial processes has received a prestigious award from the Royal Society.

The technology traps carbon dioxide (CO2) and other pollutants so they can be removed and, where possible, recycled back into the production process.

Although its first applications are most likely to be in the beverage industry, the technology could find uses in other areas, such as removing benzene from petrol vapour at filling stations.

The technology is made up of nano-porous fibres that have tiny pores less than a 1,000th of the width of a human hair and contain materials that trap volatile hydrocarbons and other gases so they can be removed from the air flow.

Early trials of the technology have shown that it uses less than five per cent of the energy needed by the cleaning processes currently used in industry.

The £185,000 Brian Mercer Award for Innovation from the Royal Society will be used to help develop the technology to a stage where it has proven its commercial viability.

“It is great to have won such a prestigious award that will help us take this technology forward,” said Dr Semali Perera, who developed the technology with research officer Chin Chih Tai in the University’s Department of Chemical Engineering.

“The technologies currently used to clean process waste are usually energy hungry themselves, so our technology offers a great saving and could help to reduce the environmental impact even further.”

Devices using the technology could be tailored to remove or recycle a diverse range of gases by varying the composition of fibres employed. Because the fibres can be ’spun’ with a high surface area to volume ratio, the devices have superior efficiency and can be constructed in compact configurations making them suitable for applications in which space is a particular constraint.

“I have been very impressed by the support that Research & Innovation Services has provided both in helping to secure this award and more generally in advising on the commercial direction for our research,” said Dr Perera.

“Commercialisation of this technology would not have been possible without RIS’s assistance in the filing the patent applications that cover our developments.”

According to figures recently released by the Department for Environment, Food & Rural Affairs, net emissions of carbon dioxide in the UK were over 500 million tonnes in 2005. This represents about two per cent of the global man-made emissions.

“The technology could make an important contribution to cutting emissions of a range of different pollutants,” said David Coleman, Technology Transfer Manager at the University of Bath who has been working on the commercial strategy for the technology.

“Although our initial thoughts have been geared towards the beverage industry, where recovery and reuse of CO2 could lead to significant operational savings, there are clear opportunities in a wide range of other different areas as well.

“The House of Commons Science & Technology Committee in 2005 stressed the importance of having zero-emission processing plants and encouraged greater deployment of CO2 capture, reuse and storage.

“This technology goes some way to achieving this, and we therefore believe that it could have a very exciting future.”

The technology has also received significant industrial backing, in particular from Colin Billiet, the former CEO of domnick hunter group Plc. Colin Billiet is now working with the University on a new spin-out venture, ‘nano-porous solutions Limited’, in which it is intended that the novel technology will be developed further and commercialised.

“The technology developed by Dr Perera at the University of Bath offers fantastic potential, especially in environmental applications where CO2 and Volatile Organic Compound recovery are important areas for business development,” said Mr Billiet, CEO for the new company.

“The novel nano-porous fibre technology provides for much lower energy consumption as well as providing solutions for which current technology is unsuitable.

“I am excited to be working with the University of Bath and with this new and innovative technology which has the potential to have a significant positive impact on the environment world-wide.”

Dr Perera commented: “We’re delighted to be working with Colin Billiet. He has a vast amount of experience in this industry and has already contributed significantly to our commercial thinking.”

The Brian Mercer Awards for Innovation were established by the Royal Society in 2001 as the result of a bequest received from the late Dr Brian Mercer. Dr Mercer was an enthusiastic inventor and entrepreneur and these awards aim to encourage these qualities in the next generation of scientists.

In 2005, Professor Julian Vincent from the Department of Mechanical Engineering at the University of Bath won a Brian Mercer Feasibility Award to help develop novel dehumidifier technology inspired by the desert cockroach.

Notes: The SETsquared Partnership, the research and enterprise collaboration of the universities of Bath, Bristol, Southampton and Surrey, has grown into a comprehensive and strategic approach for advancing enterprise and maximising the universities' impact on the UK economy.

The Partnership's universities:
* Generate quality "spin-out" companies from university research discoveries.
* Support high-growth science and technology companies from the universities' local communities, providing entrepreneurs with business mentoring, routes to funding, access to industry specialists, and affordable office space.
* Collaborate with established companies, including large corporations such as Rolls-Royce, providing them with university experts and access to top-rated facilities to further their business.
* Prepare students and staff with the business skills to become the UK's next entrepreneurs.

Spin-out Companies:

The stock market flotation of four spin-out companies since the start of 2002 created a combined market capitalisation of over £160 million and the Partnership has raised over £45 million of follow-on funding for various ventures in difficult markets and has succeeded with a number of trade sales.

The SETsquared Business Acceleration Centres:

The business-support incubators at the Universities of Bath, Bath in Swindon, Bristol, Southampton and Surrey, have helped new ventures to raise more than £25 million of early stage funding. Over 50 companies a year are supported with a network of more than 450 seasoned technology entrepreneurs, investors and support professionals.

The University of Bath is one of the UK's leading universities, with an international reputation for quality research and teaching. In 16 subject areas the University of Bath is rated in the top ten in the country. View a full list of the University's press releases: bath.ac.uk/news/releases

Monday, February 19, 2007

Advances in digital electronic circuits have prompted the boost in functions and ever-smaller size of such popular consumer goods as digital cameras, MP3 players and digital televisions.

But the same cannot be said of the older analog circuits in the same devices, which process natural sights and sounds in the real world. Because analog circuits haven't enjoyed a similar rate of progress, they are draining power and causing other bottlenecks in improved consumer electronic devices.

Now MIT engineers have devised new analog circuits they hope will change that. Their work was discussed at the International Solid State Circuits Conference (ISSCC) in San Francisco Feb. 11-15.

"During the past several decades engineers have focused on allowing signals to be processed and stored in digital forms," said Hae-Seung Lee, a professor in MIT's Microsystems Technology Laboratories (MTL) and the Department of Electrical Engineering and Computer Science (EECS). "But most real-world signals are analog signals, so analog circuits are an essential part of most electronic systems."

Analog circuits are used to amplify, process and filter analog signals and to convert them to digital signals, or vice versa, so the real world and electronic devices can talk to each other. Analog signals are continuous and vary smoothly in size, whereas digital signals take only specific, discrete values.
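The conversion step can be sketched in a few lines: a continuous waveform is sampled at discrete times and rounded to one of 2**bits levels, which is the basic job of an analog-to-digital converter. This is a generic illustration of quantization, not MIT's CBSC design.

```python
import numpy as np

def quantize(signal, bits, full_scale=1.0):
    """Round a continuous-valued signal to one of 2**bits discrete levels."""
    levels = 2 ** bits
    step = 2 * full_scale / levels
    codes = np.clip(np.round(signal / step), -levels // 2, levels // 2 - 1)
    return codes * step

t = np.linspace(0, 1e-3, 64, endpoint=False)     # 64 samples over 1 ms
analog = 0.9 * np.sin(2 * np.pi * 1e3 * t)       # a 1 kHz analog tone
digital = quantize(analog, bits=8)               # 8-bit conversion
print(np.max(np.abs(analog - digital)))          # error stays below half a step
```

With 8 bits the rounding error never exceeds half a quantization step; adding a bit halves that error, which is why converter resolution (like the 8-bit, 200 MHz part mentioned below) is a headline specification.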

The reason the two different types of electronic signal circuits did not advance at the same pace, Lee said, is because they are very different. Digital circuits can be decreased in size more easily, for example, by using the popular complementary metal oxide semiconductor (CMOS) technology. And much of the design and performance enhancement can actually be done by computer software rather than by a human. That's not the case with analog circuits, which Lee said require clever designs by humans to be improved because of their variable nature.

"There is a lot of room for innovation in the human design," he said. "The importance of analog circuits is growing in light of the digital improvements, so engineers can make a difference in products by working on them." Currently, analog circuits are rather expensive and they consume a disproportionate amount of power compared with digital circuits.

Another blow to analog circuits is that the advances in fabrication technology made to improve digital circuits have had a negative impact on them. Many conventional analog circuits rely on devices known as operational amplifiers. In advanced fabrication processes, operational amplifier-based circuits suffer two side effects: a reduced analog signal range and decreased gain. To compensate for these shortcomings, analog circuits must consume much more power, draining precious energy from batteries.

In addition, it still is not clear whether traditional operational amplifier-based circuits can be applied to emerging technologies such as carbon nanotube/nanowire devices and molecular devices.

Lee's research group, in collaboration with Professor Charles Sodini's group in MIT's MTL and EECS, recently demonstrated a new class of analog circuits that Lee said eliminates operational amplifiers while maintaining virtually all benefits of operational amplifier-based circuits. These new comparator-based switched capacitor (CBSC) circuits handle voltage differently than conventional analog ones, resulting in greater power efficiency.

"The new work coming out of MIT offers the intriguing possibility of eliminating operational amplifiers by proposing an architecture that relies on circuit blocks that are much more readily implemented on supply voltages of 1 volt or less," said Dave Robertson, high-speed converter product line director at Analog Devices Inc. in Norwood, Mass., and data converter subcommittee chair at ISSCC.

Lee said CBSC may enable high-performance analog circuits in emerging technologies because it would be easier to implement comparators than operational amplifiers in these technologies.

The first MIT CBSC prototype was demonstrated in an analog-to-digital converter and presented at the 2006 ISSCC. The second prototype, an 8-bit, 200 MHz analog-to-digital converter, will be presented at the conference this month.

Other key members of the research team are EECS graduate students John Fiorenza and Todd Sepke, who were involved in the work presented in 2006; EECS graduate student Lane Brooks, who worked on the current prototype; and Peter Holloway of National Semiconductor Corp.

The research leading to the 2006 ISSCC paper was funded by Microelectronics Advanced Research Corp. The research leading to the paper presented this month was funded by the MIT Center for Integrated Circuits and Systems and a National Defense Science and Engineering Graduate Fellowship.

From the fields to the grocery store shelves, nanotechnology – technology that allows the control of unique, sub-molecular properties of matter – is revolutionizing the way food is produced, packaged and distributed, leaving many in the industry grappling with nanotechnology's numerous implications.

Michigan State University professors Sue Selke and John Stone are among a group of experts who will address questions surrounding the union of agriculture and nanotech during today's symposium, "What is Agrifood Technology?: Technical, Ethical, Legal and Social Questions," at the American Association for the Advancement of Science annual meeting.

Selke and Stone are from the Institute for Food and Agricultural Standards.

"It's not just food. Everything from food-processing equipment to packaging and distribution systems are being affected by nanotechnology," Stone said. "Applications are found throughout the supply chain."

Selke points out that nanotechnology plays an important role in the packaging of agrifood products. For instance, the interiors of snack food packages are often coated with a shiny, nano-thin layer of aluminum.

"This aluminum layer is much thinner than a piece of tissue paper and is an effective and economically beneficial way for keeping oxygen from getting in and keeping moisture out," Selke said.

Nanotechnology also can be helpful in selecting ripe produce. Special sensors with nanotech components capable of detecting the ripeness and freshness of packaged produce are used in stores today.

The sensors work by measuring the concentrations of oxygen within the package. A marker on the exterior of the package turns color, indicating to buyers that the produce has ripened to perfection.

Similar sensors able to detect microbial concentrations growing in food, drugs and medical devices have the potential to improve safety.

Despite the potential benefits to agrifood producers, retailers and consumers, nanotechnology's applications in the food industry are a reason for concern for many.

Stone points out that privacy and control issues associated with agrifood and nanotechnology are likely to be among several hot-button issues.

Many companies store sensitive shipping and distribution information on chips that can be scanned, loaded onto computers, and potentially rendered insecure.

Also, there is the potential for the development of small environmental testing devices containing nanocomponents that may offer ordinary citizens the chance to monitor chemicals being emitted from a nearby factory or those being used on a local farm. Such advances likely would result in changing the power relationship in food and environmental politics.

"There are some people who just don't want it, because nanotechnology is associated with risk and big companies, and some just don't like new technology," said Paul Thompson, MSU Kellogg Chair in Agricultural, Food and Community Ethics. "People like to think of food as a warm, old-fashioned kind of thing."

Thompson organized the symposium with Larry Busch, University Distinguished Professor of sociology.

During the symposium, Stone will present a model for public collaboration with government and industry to lay the groundwork for more socially responsive agrifood nanotechnology.

He calls for an ethnographic approach to public engagement that builds on the collective experience of extension agents interacting with community members.

In this model, extension agents receive training on potential nanotechnology applications in food and agriculture and work at a grass-roots level to link public perceptions of risk and opportunity to agrifood policy makers and other stakeholder groups, Stone explained. ###

Saturday, February 17, 2007

TALLAHASSEE, Fla. – Using the highest magnetic fields in the world, an international team of researchers has observed the quantum Hall effect – a much studied phenomenon of the quantum world – at room temperature.

The quantum Hall effect was previously believed to be observable only at temperatures close to absolute zero (minus 459 degrees Fahrenheit, or minus 273 degrees Celsius). But when scientists at the National High Magnetic Field Laboratory in the U.S. and at the High Field Magnet Laboratory in The Netherlands put a recently developed form of carbon called graphene in very high magnetic fields, they were surprised by what they saw.

“At room temperature, these electron waves are usually destroyed by the jiggling atoms and the quantum effects are destroyed,” said Nobel Prize winner Horst Stormer, physics professor at Columbia University and one of the paper’s authors. “Only on rare occasions does this shimmering quantum world survive to the temperature scale of us humans.”

The quantum Hall effect is the basis for the international electrical resistance standard used to characterize even everyday materials that conduct electricity, such as the copper wires in a home. It was first discovered in 1980 by the German physicist Klaus von Klitzing, who was awarded a Nobel Prize in 1985 for his discovery. Until recently the quantum Hall effect was considered to belong to the realm of very low temperatures.

That opinion began to change, however, with the ability to create very high magnetic fields and with the discovery of graphene, a single-atom-thick sheet of carbon about as strong as diamond. Together, these two advances have allowed scientists to push this fragile quantum effect all the way to room temperature. Now there is a way to see curious and often surprising quantum effects, such as frictionless current flow and resistances accurate to a few parts per billion, even at room temperature.
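The resistance standard rests on the fact that on a quantum Hall plateau the Hall resistance is pinned at h/(nu * e**2) for a filling factor nu. The plateau values follow directly from physical constants; in graphene the observed sequence is nu = 4(n + 1/2) = 2, 6, 10, and so on.

```python
# Exact SI-2019 values of the Planck constant and elementary charge
h = 6.62607015e-34    # J*s
e = 1.602176634e-19   # C

r_k = h / e**2        # von Klitzing constant, in ohms
print(round(r_k, 1))  # ~25812.8 ohms

# Graphene's half-integer-shifted sequence of filling factors
for nu in (2, 6, 10):
    print(nu, round(r_k / nu, 1))
```

Because these plateau resistances depend only on fundamental constants, not on the sample, a room-temperature quantum Hall device could serve as the compact resistance standard the researchers envision.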

The research was carried out by scientists from the University of Manchester in England, Columbia University in New York, the National High Magnetic Field Laboratory in Tallahassee, Florida, the High Field Magnet Laboratory in Nijmegen, Netherlands, and the Foundation for Fundamental Research on Matter, also in The Netherlands. Their article appears in Science Express, the advanced online publication of Science magazine, a top American journal with international stature.

The scientists believe that these findings may one day lead to a compact resistance standard working at elevated temperatures and magnetic fields that are easily attainable at the National High Magnetic Field Laboratory.

“The more we understand the strange world of quantum physics, the better we can design the next generation of ultra-small electrical devices, which already are pushing into the quantum regime,” said Gregory S. Boebinger, director of the U.S. magnet lab.

“This is a really amazing discovery for a quantum Hall physicist,” said Uli Zeitler, senior scientist at the High Field Magnet Laboratory. “For more than two decades, we’ve focused our research on exploring new frontiers such as very low temperatures and extremely sophisticated materials, and now it appears that we can just measure a quantum Hall effect in a pencil-trace and at room temperature.”

The room temperature quantum Hall effect was discovered independently in the two high field labs, in the 45-tesla Hybrid magnet in Tallahassee and in a 33-tesla resistive magnet in Nijmegen. Both research groups agreed that a common announcement on both sides of the Atlantic was the right thing to do.

“Because so many scientists are exploring this exciting new material, we are all on this roller coaster together,” said Boebinger. “Sometimes it makes sense to put competitiveness aside and write a better paper together.”

In addition to Stormer, Boebinger and Zeitler, authors on the paper include Andre Geim and Kostya Novoselov of the University of Manchester; Philip Kim, Zhigang Jiang and Y. Zhang at Columbia, and Jan Kees Maan, director of the High Field Magnet Lab.

This work is supported by the National Science Foundation, the U.S. Department of Energy, the Microsoft Corp., and the W.M. Keck Foundation.

The National High Magnetic Field Laboratory develops and operates state-of-the-art high-magnetic-field facilities that faculty and visiting scientists and engineers use for research. The laboratory is sponsored by the National Science Foundation and the state of Florida. To learn more, visit magnet.fsu.edu.