Tag Archives: US DoE

There’s an alphabet soup’s worth of agencies involved in research on lithium-ion battery ageing, which has resulted in two papers, as noted in a May 30, 2014 news item on Azonano,

Batteries do not age gracefully. The lithium ions that power portable electronics cause lingering structural damage with each cycle of charge and discharge, making devices from smartphones to tablets tick toward zero faster and faster over time. To stop or slow this steady degradation, scientists must track and tweak the imperfect chemistry of lithium-ion batteries with nanoscale precision.

In two recent Nature Communications papers, scientists from several U.S. Department of Energy national laboratories—Lawrence Berkeley, Brookhaven, SLAC, and the National Renewable Energy Laboratory—collaborated to map these crucial billionths-of-a-meter dynamics and lay the foundation for better batteries.

“We discovered surprising and never-before-seen evolution and degradation patterns in two key battery materials,” said Huolin Xin, a materials scientist at Brookhaven Lab’s Center for Functional Nanomaterials (CFN) and coauthor on both studies. “Contrary to large-scale observation, the lithium-ion reactions actually erode the materials non-uniformly, seizing upon intrinsic vulnerabilities in atomic structure in the same way that rust creeps unevenly across stainless steel.”

Xin used world-leading electron microscopy techniques in both studies to directly visualize the nanoscale chemical transformations of battery components during each step of the charge-discharge process. In an elegant and ingenious setup, the collaborations separately explored a nickel-oxide anode and a lithium-nickel-manganese-cobalt-oxide cathode—both notable for high capacity and cyclability—by placing samples inside common coin-cell batteries running under different voltages.

“Armed with a precise map of the materials’ erosion, we can plan new ways to break the patterns and improve performance,” Xin said.

In these experiments, lithium ions traveled through an electrolyte solution, moving into an anode when charging and a cathode when discharging. The processes were regulated by electrons in the electrical circuit, but the ions’ journeys—and the battery structures—subtly changed each time.

The news release first describes the research involving the nickel-oxide anode, one of the two areas of interest,

For the nickel-oxide anode, researchers submerged the batteries in a liquid organic electrolyte and closely controlled the charging rates. They stopped at predetermined intervals to extract and analyze the anode. Xin and his collaborators rotated 20-nanometer-thick sheets of the post-reaction material inside a carefully calibrated transmission electron microscope (TEM) grid at CFN to catch the contours from every angle—a process called electron tomography.

To see the way the lithium-ions reacted with the nickel oxide, the scientists used a suite of custom-written software to digitally reconstruct the three-dimensional nanostructures with single-nanometer resolution. Surprisingly, the reactions sprang up at isolated spatial points rather than sweeping evenly across the surface.
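Electron tomography recovers a 3-D structure by mathematically back-projecting many 2-D views taken at different tilt angles. The release doesn’t describe the team’s custom software, but the core back-projection idea can be sketched with synthetic data (the array sizes and the two-view setup below are purely illustrative; real tilt series use dozens of angles):

```python
import numpy as np

# Toy 2-D "specimen," standing in for a slice of the 20-nanometer sheet:
# mostly empty, with one dense inclusion. All values are illustrative.
specimen = np.zeros((64, 64))
specimen[40:44, 10:14] = 1.0  # the dense feature to be located

# Tilt-series projections: here just two orthogonal views (a real electron
# tomography series records many tilt angles, typically a degree or two apart).
proj_rows = specimen.sum(axis=1)  # viewing along the x direction
proj_cols = specimen.sum(axis=0)  # viewing along the y direction

# Unfiltered back-projection: smear each projection back across the plane
# and sum. Where the smeared views reinforce each other, the feature emerges.
recon = proj_rows[:, None] + proj_cols[None, :]

peak = np.unravel_index(np.argmax(recon), recon.shape)
print(peak[0], peak[1])  # lands inside the 40:44 x 10:14 inclusion
```

With only two views the reconstruction is crude; adding many more angles (and filtering, as in filtered back-projection) is what yields the single-nanometer resolution described above.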

“Consider the way snowflakes only form around tiny particles or bits of dirt in the air,” Xin said. “Without an irregularity to glom onto, the crystals cannot take shape. Our nickel oxide anode only transforms into metallic nickel through nanoscale inhomogeneities or defects in the surface structure, a bit like chinks in the anode’s armor.”

For the lithium-nickel-manganese-cobalt-oxide (NMC) cathode, the answers hinged on intrinsic material qualities and on the structural degradation caused by cycling at 4.7 volts and 4.3 volts, as measured against a lithium metal standard.

As revealed through another series of coin-cell battery tests, 4.7 volts caused rapid decomposition of the electrolytes and poor cycling—the higher power comes at a price. A 4.3-volt battery, however, offered a much longer cycling lifetime at the cost of lower storage and more frequent recharges.

In both cases, the chemical evolution exhibited sprawling surface asymmetries, though not without profound patterns.

“As the lithium ions race through the reaction layers, they cause clumping crystallization—a kind of rock-salt matrix builds up over time and begins limiting performance,” Xin said. “We found that these structures tended to form along the lithium-ion reaction channels, which we directly visualized under the TEM. The effect was even more pronounced at higher voltages, explaining the more rapid deterioration.”

Identifying these crystal-laden reaction pathways hints at a way forward in battery design.

“It may be possible to use atomic deposition to coat the NMC cathodes with elements that resist crystallization, creating nanoscale boundaries within the micron-sized powders needed at the cutting-edge of industry,” Xin said. “In fact, Berkeley Lab battery experts Marca Doeff and Feng Lin are working on that now.”

Shirley Meng, a professor at UC San Diego’s Department of NanoEngineering, added, “This beautiful study combines several complementary tools that probe both the bulk and surface of the NMC layered oxide—one of the most promising cathode materials for high-voltage operation that enables higher energy density in lithium-ion batteries. The meaningful insights provided by this study will significantly impact the optimization strategies for this type of cathode material.”

The TEM measurements revealed the atomic structures while electron energy loss spectroscopy helped pinpoint the chemical evolution—both carried out at the CFN….

The scientists next want to observe these changes in real time, which will necessitate the custom design of some new equipment (“electrochemical contacts and liquid flow holders”).

Typically, the process of corrosion has been studied from the metal side of the equation. – See more at: http://www.anl.gov/articles/core-corrosion. Courtesy of the Argonne National Laboratory

A Feb. 18, 2014 news item on Nanowerk expands on the theme of corrosion as destruction (Note: Links have been removed),

Anyone who has ever owned a car in a snowy town – or a boat in a salty sea – can tell you just how expensive corrosion can be.

One of the world’s most common and costly chemical reactions, corrosion happens frequently at the boundaries between water and metal surfaces. In the past, the process of corrosion has mostly been studied from the metal side of the equation.

However, in a new study (“Chloride ions induce order-disorder transition at water-oxide interfaces”), scientists at the Center for Nanoscale Materials at the U.S. Department of Energy’s Argonne National Laboratory investigated the problem from the other side, looking at the dynamics of water containing dissolved ions located in the regions near a metal surface.

A team of researchers led by Argonne materials scientist Subramanian Sankaranarayanan simulated the physical and chemical dynamics of dissolved ions in water at the atomic level as it corrodes metal oxide surfaces. “Water-based solutions behave quite differently near a metal or oxide surface than they do by themselves,” Sankaranarayanan said. “But just how the chemical ions in the water interact with a surface has been an area of intense debate.”

Under low-chlorine conditions, water tends to form two-dimensional ordered layers near solid interfaces because of the influence of its strong hydrogen bonds. However, the researchers found that increasing the proportion of chlorine ions above a certain threshold causes a change in which the solution loses its ordered nature near the surface and begins to act similar to water away from the surface. This transition, in turn, can increase the rate at which materials corrode as well as the freezing temperature of the solution.
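The ordered-versus-disordered distinction can be made quantitative with a layering order parameter. The sketch below is a generic illustration, not Argonne’s analysis: the layer spacing and the synthetic coordinates are assumptions, chosen only to contrast a layered interfacial region with a bulk-like one.

```python
import numpy as np

rng = np.random.default_rng(0)

def layering_order(z, spacing):
    """|<exp(2*pi*i*z/d)>|: near 1 when heights z stack into layers of
    period d (ordered interface), near 0 for a bulk-like distribution."""
    return abs(np.exp(2j * np.pi * z / spacing).mean())

d = 3.1  # assumed layer spacing (illustrative, not from the study)

# Low-chloride picture: molecule heights clustered into discrete layers
# above the surface, with a little thermal jitter.
layered = rng.integers(0, 3, 500) * d + rng.normal(0, 0.2, 500)

# High-chloride picture: the same region, but spread out bulk-like.
disordered = rng.uniform(0, 3 * d, 500)

print(round(layering_order(layered, d), 2))     # near 1: ordered
print(round(layering_order(disordered, d), 2))  # near 0: disordered
```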

This switch between an ordered and a disordered structure near the metal surface happens incredibly quickly, in just fractions of a nanosecond. The speed of the chemical reaction necessitates the use of high-performance computers like Argonne’s Blue Gene/Q supercomputer, Mira.

To further explore these electrochemical oxide interfaces with high-performance computers, Sankaranarayanan and his colleagues from Argonne, Harvard University and the University of Missouri have also been awarded 40 million processor-hours of time on Mira.

“Having the ability to look at these reactions in a more powerful simulation will give us the opportunity to make a more educated guess of the rates of corrosion for different scenarios,” Sankaranarayanan said. Such simulations will, for the first time, open up fundamental studies of corrosion behavior and will allow scientists to tailor material surfaces to improve the stability and lifetime of materials.

The circumstances around Nigel S. Lockyer’s departure as Director of Canada’s National Laboratory for Particle and Nuclear Physics, TRIUMF, are very interesting. Just weeks ago, TRIUMF announced a major innovation for producing medical isotopes (my June 9, 2013 posting), which should have an enormous impact on cities around the world and their access to medical isotopes. (Briefly, cities with cyclotrons could produce, using the technology developed by TRIUMF, their own medical isotopes without using material from nuclear reactors.)

Also in the recent past, Canada’s much storied McGill University joined the TRIUMF consortium (I’m surprised it took this long), from the May 10, 2013 news release,

At its recent Board of Management meeting, TRIUMF approved McGill University as an associate member of the consortium of universities that owns and operates Canada’s national laboratory for particle and nuclear physics. McGill joins 17 other Canadian universities in leading TRIUMF.

Paul Young, Chair of the Board and Vice President for Research at the University of Toronto, said, “The addition of McGill to the TRIUMF family is a great step forward. McGill brings world-class scientists and students to TRIUMF and TRIUMF brings world-leading research tools and partnerships to McGill.”

The university’s closer association with TRIUMF will allow it to participate in discussions about setting the direction of the laboratory as well as provide enhanced partnerships for new research infrastructure that strengthens efforts on McGill’s campuses. Dr. Rose Goldstein, McGill Vice-Principal (Research and International Relations), said, “We are delighted to formalize our long-standing involvement in TRIUMF. It is an important bridge to international research opportunities at CERN and elsewhere. Associate membership in TRIUMF will also help McGill advance its Strategic Research Plan, especially in the priority area of exploring the natural environment, space, and the universe.”

McGill University has been involved in TRIUMF-led activities for several decades, most notably as part of the Higgs-hunting efforts at CERN. TRIUMF constructed parts of the Large Hadron Collider that ultimately produced Higgs bosons. The co-discovery was made by the ATLAS experiment for which TRIUMF led Canadian construction of several major components, and McGill played a key role in the development of the experiment’s trigger system. McGill and TRIUMF have also worked together on particle-physics projects in Japan and the U.S.

Professor Charles Gale, chair of the Department of Physics, played a key role in formalizing the relationship between TRIUMF and McGill. He said, “Our department is one of the top in North America in research, teaching, and service. Undoubtedly our work with TRIUMF has helped contribute to that and I expect both institutions to blossom even further.” Professor of physics and Canada Research Chair in Particle Physics Brigitte Vachon added, “TRIUMF provides key resources to my students and me that make our research at CERN possible; the discovery of the Higgs boson is a perfect example of what such collaboration can achieve.”

Nigel S. Lockyer, director of TRIUMF, commented, “The addition of McGill to the TRIUMF team is welcome and long overdue. We have been working together for decades in subatomic physics and this acknowledgment of the partnership enhances both institutions and builds stronger ties in areas such as materials science and nuclear medicine.”

A scant month after McGill joins the consortium and weeks after a major announcement about medical isotopes, Lockyer announces his departure for Fermilab in the US, from the May 20, 2013 TRIUMF news release,

In his capacity as Chairman of the Board of Directors of Fermi Research Alliance, LLC, University of Chicago President Robert J. Zimmer today announced that TRIUMF’s director Nigel S. Lockyer has been selected to become the next director of the U.S. Department of Energy’s Fermi National Accelerator Laboratory, located outside Chicago. Lockyer is expected to complete his work at TRIUMF this summer and begin at Fermilab in the autumn.

Paul Young, Chair of TRIUMF’s Board of Management and Vice President of Research and Innovation at the University of Toronto said, “Nigel was selected from a truly outstanding set of international candidates for this challenging and important position. Although it will be a short-term loss, this development is a clear recognition of Nigel’s vision and passion for science and the international leadership taken by TRIUMF and Canada in subatomic physics. On behalf of the entire TRIUMF Board, we wish Nigel, TRIUMF, and Fermilab every success in the future.”

Lockyer set TRIUMF upon a new course when he arrived six years ago, focusing the team on “Advancing isotopes for science and medicine.” Based on TRIUMF’s existing infrastructure and talent, this initiative ranged from expanding the nuclear-medicine program so that it is now playing a leading role in resolving the medical-isotope crisis to the formulation and funding of a new flagship facility called ARIEL that will double TRIUMF’s capabilities for producing exotic isotopes used in science and for developing tomorrow’s medical isotopes. At the heart of ARIEL is a next-generation electron accelerator using modern superconducting radio-frequency technology.

Commenting on Nigel’s leadership of TRIUMF, Paul Young added, “One look at TRIUMF’s current trajectory and you can see that this is a man of great ambition and talent. Working with the Board and a great team at the lab, he propelled TRIUMF to new heights. We have all been fortunate at TRIUMF to have Nigel as a colleague and leader.”

Reflecting on his time at TRIUMF and the upcoming transition to Fermilab, Nigel Lockyer said, “Knowing that TRIUMF is in good hands with a superb leadership team and seeing its growing string of accomplishments has helped make this decision a tiny bit easier. The laboratory’s future is secure and TRIUMF knows exactly what it is doing. I am proud to have contributed to TRIUMF’s successes and it is my hope to ignite the same energy and enthusiasm in the U.S. by heading the team at Fermilab.” He added, “I also expect to foster a new level of partnership between the U.S. and Canada in these key areas of science and technology.”

“Nigel has had a profound impact on TRIUMF,” said David B. MacFarlane, chair of the National Research Council’s Advisory Committee on TRIUMF and Associate Laboratory Director at the U.S. SLAC National Accelerator Laboratory. “He articulated an ambitious new vision for the laboratory and energetically set it upon a path toward an exciting world-class program in rare-isotope beams and subatomic-physics research. When ARIEL comes online, the lab will be fulfilling the vision that Nigel and his team boldly initiated.” David MacFarlane added, “The TRIUMF community will certainly miss his warmth, his insatiable scientific curiosity, his creativity, and his faith in the laboratory and its entire staff. However, I fully expect these same characteristics will serve Nigel well in his new leadership role as Fermilab director.”

As per standard practice, the TRIUMF Board of Management will announce plans and timelines for the international search process and interim leadership within the next few weeks.

Before speculating on the search process and interim leadership appointment, I have a comment of sorts about Fermilab, which was last mentioned here in my Feb. 1, 2012 posting, where I excerpted this interesting comment from a news release,

In this month’s Physics World, reviews and careers editor, Margaret Harris, visits the Fermi National Accelerator Laboratory (Fermilab) to explore what future projects are in the pipeline now that the Tevatron particle accelerator has closed for good.

After 28 years of ground-breaking discoveries, the Tevatron accelerator has finally surrendered to the mighty Large Hadron Collider (LHC) at CERN [European Laboratory for Particle Physics], placing Fermilab, in some people’s mind, on the brink of disappearing into obscurity. [emphasis mine]

It seems Fermilab is in eclipse and Lockyer is going there to engineer a turnaround. It makes one wonder what the conditions were when he arrived at TRIUMF six years ago (2007?). Leading on from that thought, the forthcoming decisions as to who will be the interim Director and/or the next Director should be intriguing.

Usually an interim position is filled by a current staff member, which can lead to some fraught moments amongst internal competitors. That action, however fascinating, does not tend to become fodder for public consumption.

Frankly, I’m more interested in the board’s perspective. What happens if they pick an internal candidate while they prepare for the next stage when they’re conducting their international search? Based on absolutely no inside information whatsoever, I’m guessing that Tim Meyer, Head, Strategic Planning & Communications for TRIUMF, would be a viable internal candidate for interim director.

From a purely speculative position, let’s assume he makes a successful play to become the interim Director. At this point, the board will have to consider what direction is the right one for TRIUMF while weighing up the various candidates for the permanent position. Assuming the interim Director is ambitious and wants to become the permanent Director, the dynamics could get very interesting indeed.

From the board’s perspective, you want the best candidate and you want to keep your staff. In Canada, there’s one TRIUMF; there are no other comparable institutions in the country. Should an internal candidate such as Meyer get the interim position but not the permanent one (assuming he’d want to be the permanent Director) he would have very few options in Canada.

Based on this speculation, I can safely predict some very interesting times ahead for TRIUMF and its board. In the meantime, I wish Lockyer all the best as he moves back to the US to lead Fermilab.

Scientists at the (US Dept. of Energy) Brookhaven National Laboratory can turn gold nanoparticles into catalysts using a room temperature process. From the June 11, 2013 news item on ScienceDaily,

Gold bars may signify great wealth, but the precious metal packs a much more practical punch when shrunk down to just billionths of a meter. Unfortunately, unlocking gold’s potential often requires complex synthesis techniques that produce delicate structures with extreme sensitivity to heat.

Now, scientists at the U.S. Department of Energy’s Brookhaven National Laboratory have discovered a process of creating uniquely structured gold-indium nanoparticles that combine high stability, great catalytic potential, and a simple synthesis process. The new nanostructures — detailed online June 10 in the Proceedings of the National Academy of Sciences — might enhance many different commercial and industrial processes, including acting as an efficient material for catalytic converters in cars.

“We discovered a room-temperature process that transforms a simple alloy into a nanostructure with remarkable properties,” said physicist Eli Sutter, lead author on the study. “By exposing the gold-indium alloy nanoparticles to air, ambient oxygen was able to drive an oxidation reaction that converted them into an active core-shell structure.”

The Brookhaven Lab researchers were studying oxidation processes through which metals and alloys combine with oxygen when they made the discovery. For this study, they examined alloys of a noble metal and a non-noble metal through a remarkably simple reaction technique: giving gold-indium nanoparticles a little room to breathe. Once nanoparticles of the metal alloy were exposed to oxygen, highly reactive shells of gold-indium oxide formed across their surfaces.

“Conventional wisdom would say that oxidation should push the gold atoms into the center while pulling the less noble indium to the surface, creating a noble metal core that is surrounded by a shell of non-reactive indium-oxide,” Peter Sutter said. “Instead, the oxygen actually penetrated the alloy. After oxidation, the alloy core of the nanoparticles was encapsulated by a newly formed thin shell of mixed gold-indium oxide.”

Trapping gold in the amorphous oxide shell retains its catalytic properties and prevents the gold from sintering and becoming inert. The new nanostructures proved capable of converting oxygen and carbon monoxide into carbon dioxide, demonstrating their activity as a catalyst.

“The indium and gold in the shell are not mobile, but are frozen in the amorphous oxide,” Eli Sutter said. “Importantly, the structural integrity holds without sintering at temperatures of up to 300 degrees Celsius, making these remarkably resilient compared to other gold nanocatalysts.”

The research was conducted at Brookhaven Lab’s Center for Functional Nanomaterials (CFN), whose unique facilities for nanoscale synthesis and characterization proved central to the discovery of this new process.

“The CFN brings a wide range of state-of-the-art instruments and expertise together under one roof, accelerating research and facilitating collaboration,” Eli Sutter said. “We used transmission electron microscopy to characterize the structures and their composition, x-ray photoelectron spectroscopy to determine the chemical bonding at the surface, and ion-scattering spectroscopy to identify the outermost atoms of the nanoparticle shell.”

Further investigations will help determine the properties of the gold-indium oxide particles in different catalytic reactions, and the same oxidation process will be applied to other metal alloys to create an entire family of new functional materials.

You can find a citation and a link to the researchers’ paper if you click on the ScienceDaily news item link earlier in this posting.

Here are two (seemingly) contradictory pieces of information (1) the US Library of Congress takes over 24 hours to complete a single search of tweets archived from 2006 – 2010, according to my Jan. 16, 2013 posting, and (2) Court (Courtney) Corley, a data scientist at the US Dept. of Energy’s Pacific Northwest National Laboratory (PNNL), has a system (SALSA; SociAL Sensor Analytics) that analyzes billions of tweets in seconds. It’s a little hard to make sense out of these two very different perspectives on accessing data from tweets.

If you think keeping up with what’s happening via Twitter, Facebook and other social media is like drinking from a fire hose, multiply that by 7 billion – and you’ll have a sense of what Court Corley wakes up to every morning.

Corley, a data scientist at the Department of Energy’s Pacific Northwest National Laboratory, has created a powerful digital system capable of analyzing billions of tweets and other social media messages in just seconds, in an effort to discover patterns and make sense of all the information. His social media analysis tool, dubbed “SALSA” (SociAL Sensor Analytics), combined with extensive know-how – and a fair degree of chutzpah – allows someone like Corley to try to grasp it all.

“The world is equipped with human sensors – more than 7 billion and counting. It’s by far the most extensive sensor network on the planet. What can we learn by paying attention?” Corley said.

Among the payoffs Corley envisions are emergency responders who receive crucial early information about natural disasters such as tornadoes; a tool that public health advocates can use to better protect people’s health; and information about social unrest that could help nations protect their citizens. But finding those jewels amidst the effluent of digital minutiae is a challenge.

“The task we all face is separating out the trivia, the useless information we all are blasted with every day, from the really good stuff that helps us live better lives. There’s a lot of noise, but there’s some very valuable information too.”

I was getting a little worried when I saw the bit about separating useless information from the good stuff since that can be a very personal choice. Thankfully, this followed,

One person’s digital trash is another’s digital treasure. For example, people known in social media circles as “Beliebers,” named after entertainer Justin Bieber, covet inconsequential tidbits about Justin Bieber, while “non-Beliebers” send that data straight to the recycle bin.

The amount of data is mind-bending. In social media posted just in the single year ending Aug. 31, 2012, each hour on average witnessed:

30 million comments

25 million search queries

98,000 new tweets

3.8 million blog views

4.5 million event invites

7.1 million photos uploaded

5.5 million status updates

The equivalent of 453 years of video watched

Several firms routinely sift posts on LinkedIn, Facebook, Twitter, YouTube and other social media, then analyze the data to see what’s trending. These efforts usually require a great deal of software and a lot of person-hours devoted specifically to using that application. It’s what Corley terms a manual approach.

Corley is out to change that, by creating a systematic, science-based, and automated approach for understanding patterns around events found in social media.

It’s not so simple as scanning tweets. Indeed, if Corley were to sit down and read each of the more than 20 billion entries in his data set from just a two-year period, it would take him more than 3,500 years if he spent just 5 seconds on each entry. If he hired 1 million helpers, it would take more than a day.
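The release’s arithmetic is easy to verify. A quick back-of-envelope check (20 billion entries at 5 seconds each gives roughly 3,200 years, so the “more than 3,500 years” figure implies the data set runs somewhat past 20 billion entries):

```python
# Back-of-envelope check of the reading-time claims in the news release.
entries = 20_000_000_000   # "more than 20 billion entries"
seconds_per_entry = 5

total_seconds = entries * seconds_per_entry
years = total_seconds / (365.25 * 24 * 3600)
print(round(years))        # ~3,170 years for a lone reader

helpers = 1_000_000
days = total_seconds / helpers / (24 * 3600)
print(round(days, 2))      # ~1.16 days for a million readers
```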

But it takes less than 10 seconds when he relies on PNNL’s Institutional Computing resource, drawing on Olympus, a computer cluster with more than 600 nodes that ranks among the Top 500 fastest supercomputers in the world.

“We are using the institutional computing horsepower of PNNL to analyze one of the richest data sets ever available to researchers,” Corley said.

At the same time that his team is creating the computing resources to undertake the task, Corley is constructing a theory for how to analyze the data. He and his colleagues are determining baseline activity, culling the data to find routine patterns, and looking for patterns that indicate something out of the ordinary. Data might include how often a topic is the subject of social media, who is putting out the messages, and how often.
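As a sketch of what baseline-plus-deviation analysis can look like in practice: the snippet below flags hours whose activity sits far above a trailing baseline, using a rolling z-score on synthetic counts. This is a generic illustration, not Corley’s actual SALSA pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic hourly mention counts for one topic: steady background chatter
# plus one injected spike standing in for a breaking event.
counts = rng.poisson(100, size=200).astype(float)
counts[150] = 400.0  # the anomaly to detect

def flag_anomalies(x, window=24, z_cut=4.0):
    """Flag hours whose count sits far above the trailing-window baseline."""
    flags = []
    for t in range(window, len(x)):
        base = x[t - window:t]
        z = (x[t] - base.mean()) / (base.std() + 1e-9)
        if z > z_cut:
            flags.append(t)
    return flags

print(flag_anomalies(counts))  # the injected spike at hour 150 stands out
```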

Corley notes additional challenges posed by social media. His programs analyze data in more than 60 languages, for instance. And social media users have developed a lexicon of their own and often don’t use traditional language. A post such as “aw my avalanna wristband @Avalanna @justinbieber rip angel pic.twitter.com/yldGVV7GHk” poses a challenge to people and computers alike.

Nevertheless, Corley’s program is accurate much more often than not, catching the spirit of a social media comment accurately more than three out of every four instances, and accurately detecting patterns in social media more than 90 percent of the time.

Corley’s educational background may explain the interest in emergency responders and health crises mentioned in the early part of the news release (from Corley’s PNNL webpage),

B.S. Computer Science from University of North Texas; M.S. Computer Science from University of North Texas; Ph.D. Computer Science and Engineering from University of North Texas; M.P.H (expected 2013) Public Health from University of Washington.

The reference to public health and emergency response is further developed, from the news release,

Much of the work so far has been around public health. According to media reports in China, the current H7N9 flu situation in China was highlighted on Sina Weibo, a China-based social media platform, weeks before it was recognized by government officials. And Corley’s work with the social media working group of the International Society for Disease Surveillance focuses on the use of social media for effective public health interventions.

In collaboration with the Infectious Disease Society of America and Immunizations 4 Public Health, he has focused on the early identification of emerging immunization safety concerns.

“If you want to understand the concerns of parents about vaccines, you’re never going to have the time to go out there and read hundreds of thousands, perhaps millions of tweets about those questions or concerns,” Corley said. “By creating a system that can capture trends in just a few minutes, and observe shifts in opinion minute to minute, you can stay in front of the issue, for instance, by letting physicians in certain areas know how to customize the educational materials they provide to parents of young children.”

Corley has looked closely at reaction to the vaccine that protects against HPV, which causes cervical cancer. The first vaccine was approved in 2006, when he was a graduate student, and his doctoral thesis focused on an analysis of social media messages connected to HPV. He found that creators of messages that named a specific drug company were less likely to be positive about the vaccine than others who did not mention any company by name.

Other potential applications include helping emergency responders react more efficiently to disasters like tornadoes, or identifying patterns that might indicate coming social unrest or even something as specific as a riot after a soccer game. More than a dozen college students or recent graduates are working with Corley to look at questions like these and others.

As to why the US Library of Congress requires 24 hours to search one term in their archived tweets and Corley and the PNNL require seconds to sift through two years of tweets, only two possibilities come to my mind. (1) Corley is doing a stripped down version of an archival search so his searches are not comparable to the Library of Congress searches or (2) Corley and the PNNL have far superior technology.

In addition to being a news release, this is a really good piece of science writing by Paul Preuss for the Lawrence Berkeley National Laboratory (Berkeley Lab), from the Jan. 3, 2013 Berkeley Lab news release,

Because modern computers have to depict the real world with digital representations of numbers instead of physical analogues, to simulate the continuous passage of time they have to digitize time into small slices. This kind of simulation is essential in disciplines from medical and biological research, to new materials, to fundamental considerations of quantum mechanics, and the fact that it inevitably introduces errors is an ongoing problem for scientists.

Scientists at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have now identified and characterized the source of tenacious errors and come up with a way to separate the realistic aspects of a simulation from the artifacts of the computer method. …

Here’s more detail about the problem and solution,

How biological molecules move is hardly the only field where computer simulations of molecular-scale motion are essential. The need to use computers to test theories and model experiments that can’t be done on a lab bench is ubiquitous, and the problems that David Sivak and his colleagues encountered weren’t new.

“A simulation of a physical process on a computer cannot use the exact, continuous equations of motion; the calculations must use approximations over discrete intervals of time,” says Sivak. “It’s well known that standard algorithms that use discrete time steps don’t conserve energy exactly in these calculations.”
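Sivak’s point is easy to reproduce. Here is a minimal sketch (my own toy example, not one from the paper): integrating a frictionless harmonic oscillator with a naive forward-Euler scheme pumps a little spurious energy into the system at every discrete time step, so the total energy grows instead of being conserved.

```python
def euler_energy_drift(dt=0.05, steps=1000):
    """Integrate a unit-mass, unit-spring harmonic oscillator with naive
    forward Euler and report the final/initial energy ratio -- exactly 1.0
    for the true continuous dynamics, but (1 + dt**2)**steps here."""
    x, v = 1.0, 0.0                        # displaced, at rest
    e0 = 0.5 * v * v + 0.5 * x * x         # initial energy
    for _ in range(steps):
        x, v = x + dt * v, v - dt * x      # one discrete Euler step
    return (0.5 * v * v + 0.5 * x * x) / e0

print(euler_energy_drift())
```

With dt=0.05 over 1000 steps the energy grows roughly twelvefold; shrinking dt shrinks the drift but never eliminates it, which is the behavior the quote describes.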

One workhorse method for modeling molecular systems is Langevin dynamics, based on equations first developed by the French physicist Paul Langevin over a century ago to model Brownian motion. Brownian motion is the random movement of particles in a fluid (originally pollen grains on water) as they collide with the fluid’s molecules – particle paths resembling a “drunkard’s walk,” which Albert Einstein had used just a few years earlier to establish the reality of atoms and molecules. Instead of impractical-to-calculate velocity, momentum, and acceleration for every molecule in the fluid, Langevin’s method substituted an effective friction to damp the motion of the particle, plus a series of random jolts.
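Langevin’s substitution (friction plus random jolts) translates directly into a discrete update rule. The sketch below is an illustrative Euler-Maruyama step for a particle on a harmonic spring; the potential and parameter choices are my assumptions for demonstration, not details from the article.

```python
import random

def langevin_step(x, v, dt, gamma=1.0, kT=1.0, mass=1.0):
    """One Euler-Maruyama step of Langevin dynamics: a deterministic
    force, an effective friction -gamma*v, and a random thermal kick
    standing in for collisions with the fluid's molecules."""
    force = -x                                            # harmonic potential U(x) = x^2 / 2
    kick = random.gauss(0.0, (2.0 * gamma * kT * dt) ** 0.5)
    v += (force - gamma * v) * dt / mass + kick / mass    # damped, randomly jolted velocity
    x += v * dt                                           # semi-implicit position update
    return x, v
```

Run long enough, the sampled mean of x² should approach kT for this spring, but only up to a discretization bias of order dt, which is precisely the kind of finite-time-step error the paper tackles.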

When Sivak and his colleagues used Langevin dynamics to model the behavior of molecular machines, they saw significant differences between what their exact theories predicted and what their simulations produced. They tried to come up with a physical picture of what it would take to produce these wrong answers.

“It was as if extra work were being done to push our molecules around,” Sivak says. “In the real world, this would be a driven physical process, but it existed only in the simulation, so we called it ‘shadow work.’ It took exactly the form of a nonequilibrium driving force.”

They first tested this insight with “toy” models having only a single degree of freedom, and found that when they ignored the shadow work, the calculations were systematically biased. But when they accounted for the shadow work, accurate calculations could be recovered.
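A single-degree-of-freedom toy model of this kind can be sketched as follows. I assume a thermostat/Verlet/thermostat splitting purely for illustration (the paper’s exact scheme may differ): the stochastic substeps exchange heat with the bath, while the deterministic velocity-Verlet core would conserve energy exactly under continuous dynamics, so its residual energy change is tallied as the “shadow work.”

```python
import math
import random

def hamiltonian(x, v):
    return 0.5 * v * v + 0.5 * x * x       # unit mass, unit spring constant

def step_with_shadow(x, v, dt, gamma=1.0, kT=1.0):
    """One discrete Langevin step that also returns the shadow work:
    the energy injected by the finite-time-step deterministic substep,
    which vanishes for the exact continuous equations of motion."""
    a = math.exp(-gamma * dt / 2.0)
    v = a * v + math.sqrt((1.0 - a * a) * kT) * random.gauss(0.0, 1.0)  # half thermostat (heat)
    e0 = hamiltonian(x, v)
    v -= 0.5 * dt * x                      # velocity Verlet with force f(x) = -x
    x += dt * v
    v -= 0.5 * dt * x
    shadow = hamiltonian(x, v) - e0        # discretization's spurious "push"
    v = a * v + math.sqrt((1.0 - a * a) * kT) * random.gauss(0.0, 1.0)  # half thermostat (heat)
    return x, v, shadow
```

Accumulating `shadow` over a trajectory quantifies how hard the integrator, rather than any physical force, is pushing the system around.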

“Next we looked at systems with hundreds or thousands of simple molecules,” says Sivak. Using models of water molecules in a box, they simulated the state of the system over time, starting from a given thermal energy but with no “pushing” from outside. “We wanted to know how far the water simulation would be pushed by the shadow work alone.”

The result confirmed that even in the absence of an explicit driving force, the finite-time-step Langevin dynamics simulation acted by itself as a driving nonequilibrium process. Systematic errors resulted from failing to separate this shadow work from the actual “protocol work” that they explicitly modeled in their simulations. For the first time, Sivak and his colleagues were able to quantify the magnitude of the deviations in various test systems.

Such simulation errors can be reduced in several ways, for example by dividing the evolution of the system into ever-finer time steps, because the shadow work is larger when the discrete time steps are larger. But doing so increases the computational expense.

The better approach is to use a correction factor that isolates the shadow work from the physically meaningful work, says Sivak. “We can apply results from our calculation in a meaningful way to characterize the error and correct for it, separating the physically realistic aspects of the simulation from the artifacts of the computer method.”
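In the spirit of the correction Sivak describes, one schematic way to discount the spurious driving is to reweight each sample by exp(−W_shadow/kT), as nonequilibrium fluctuation theorems suggest. This is my simplified sketch of the idea, not the paper’s exact estimator.

```python
import math

def reweighted_average(samples, shadow_works, kT=1.0):
    """Estimate an equilibrium average from biased simulation samples by
    down-weighting samples in proportion to exp(-W_shadow / kT), so states
    reached only via the integrator's spurious driving count for less."""
    weights = [math.exp(-w / kT) for w in shadow_works]
    norm = sum(weights)
    return sum(s * w for s, w in zip(samples, weights)) / norm
```

When the recorded shadow work is zero for every sample, this reduces to the plain arithmetic mean; samples with large shadow work are effectively excluded.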

You can find out more in the Berkeley Lab news release, or (H/T) in the Jan. 3, 2013 news item on Nanowerk, or you can read the paper,

“Using nonequilibrium fluctuation theorems to understand and correct errors in equilibrium and nonequilibrium discrete Langevin dynamics simulations,” by David A. Sivak, John D. Chodera, and Gavin E. Crooks, will appear in Physical Review X (http://prx.aps.org/) and is now available as an arXiv preprint at http://arxiv.org/abs/1107.2967.

This casts a new light on the SPAUN (Semantic Pointer Architecture Unified Network) project from Chris Eliasmith’s team at the University of Waterloo, which announced the most successful attempt yet (my Nov. 29, 2012 posting) to simulate a brain using virtual neurons. Given that Eliasmith’s team was likely unaware of this Berkeley Lab work, one imagines that once it has been integrated, SPAUN will be capable of even more extraordinary feats.