NERSC in the News

NERSC director Kathy Yelick explains why supercomputers must change drastically in the next decade. Future technologies group lead John Shalf describes how low-power chips--such as the mobile phone chips used in his research--may be part of the solution.

Eighteen examples of the heaviest antiparticle ever found, the nucleus of antihelium-4, have been made in the STAR experiment at RHIC, the Relativistic Heavy Ion Collider at the U.S. Department of Energy’s Brookhaven National Laboratory. NERSC provides computing resources to the STAR experiment.

Researchers at the University of California, Santa Barbara, say they've figured out the cause of a problem that's made light-emitting diodes (LEDs) impractical for general lighting purposes. Their work will help engineers develop a new generation of high-performance, energy-efficient lighting that could replace incandescent and fluorescent bulbs.

Using bioinformatics techniques, researchers computing at NERSC have developed a pattern-matching algorithm that improves the accuracy of peptide identification by between 50 and 150 percent, compared to standard approaches.

Argonne National Laboratory and Lawrence Berkeley National Laboratory designed Magellan to test the feasibility of cloud computing for computational science — an experiment in experimentation, if you will, that could ultimately change how science is performed across the globe.

Using supercomputers at the National Energy Research Scientific Computing Center (NERSC), Sukanya Chakrabarti has developed a mathematical method to uncover these “dark” satellites. When she applied this method to our own Milky Way galaxy, Chakrabarti discovered that a faint satellite might be lurking on the opposite side of the galaxy from Earth, approximately 300,000 light-years from the galactic center.

Gil Compo describes the 20th Century Reanalysis Project that he leads for the National Oceanic and Atmospheric Administration. The project relies on 150 years of weather data and simulations run on Department of Energy supercomputers. http://ascr-discovery.science.doe.gov

A team of researchers led by Jean-Luc Vay of Berkeley Lab’s Accelerator and Fusion Research Division (AFRD) has borrowed a page from Einstein to perfect a new method for calculating what happens when a laser pulse plows through a plasma in an accelerator like BELLA. Using their boosted-frame method, Vay’s team has achieved full 3D simulations of a BELLA stage in just a few hours of supercomputer time, calculations that would have been beyond the state of the art just two years ago.

At NERSC, there is over 13 PB of data on tape: 30-40% of its tape activity is reads, it has a measured and proven reliability of 99.945%, and its $/GB cost is around 5% of that of its disk storage. Jason Hick explains how and why.

The National Energy Research Scientific Computing Center (NERSC), which is supported by DOE’s Office of Science, awarded the computer time to Dr. Michael Gao, a materials scientist, and Dr. De Nyago Tafen, a senior research scientist, both in NETL’s Office of Research and Development.

Researchers at the Department of Energy's (DOE) Lawrence Berkeley National Laboratory have developed a fast and efficient way to determine the structure of proteins, shortening a process that often takes years into a matter of days.

Top500 List Released

The TOP500 Supercomputer Sites is compiled by Hans Meuer of the University of Mannheim, Germany; Erich Strohmaier and Horst Simon of NERSC/Lawrence Berkeley National Laboratory; and Jack Dongarra of the University of Tennessee, Knoxville.

Operating under the banner of the Software Studies Initiative and computing at NERSC, Lev Manovich and a handful of other scholars at UCSD’s Center for Research in Computing and the Arts are bringing the power of supercomputing to cultural analytics. “Part of what I am trying to do,” Manovich said, “is to find visual forms in datasets which do not simply reveal patterns, but reveal patterns with some kind of connotational or additional meanings that correspond to the data.”

The hard-charging scientists at Berkeley Lab's Computational Sciences and Engineering and the National Energy Research Scientific Computing Center (NERSC) have developed an industrial-strength simulation code to model CO2 injection into these underground saline reservoirs.

A white paper titled "Visualization From the Skinny Guys at Big Supercomputer Centers" takes a look at the important role of data analysis, or scientific visualization, in understanding the results that come from high performance computing systems.

A new white paper from Wes Bethel and colleagues at LBNL looks at the importance of visualization at big supercomputing centers and how viz specialists, the “Skinny Guys,” give scientists the power to see.

The 20th Century Reanalysis Project (20CR), a joint project between the National Oceanic and Atmospheric Administration and the University of Colorado, has brought together 27 international climatologists to create a comprehensive reanalysis of all global weather events from 1871 to present day, effectively creating an accessible time machine for climate scientists.

The 20th Century Reanalysis Project, outlined in the Quarterly Journal of the Royal Meteorological Society, not only allows researchers to understand the long-term impact of extreme weather, but provides key historical comparisons for our own changing climate.

GPU Computing Ushers in Progress

MIT researchers computing at NERSC have revealed the way that a molecule discovered in 1996, called fulvalene diruthenium, can store and release the sun's heat on demand, opening the way to a new approach of obtaining and storing heat and energy.

Chance to Spot Rare Supernova Fading Fast

Supernova 2011fe is bringing out the stargazers. The supernova will last for more than a decade, but it won't stay this bright. Within the next week, the light that took 21 million years to reach Earth will fade out of view for amateur astronomers.

Named Edison, NERSC's new machine is a Cray XC30 with 664 compute nodes and 10,624 cores. Each node has two eight-core Intel "Sandy Bridge" processors running at 2.6 GHz (16 cores per node), and has 64 GB of memory.
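The core count follows directly from the node configuration given above. A quick sanity check of the published figures (all numbers are from the article itself):

```python
# Edison's published specs: 664 compute nodes, each with two
# eight-core "Sandy Bridge" processors and 64 GB of memory.
nodes = 664
cores_per_node = 2 * 8            # two eight-core processors per node
memory_per_node_gb = 64

total_cores = nodes * cores_per_node
total_memory_gb = nodes * memory_per_node_gb

print(total_cores)       # 10624 -- matches the stated 10,624 cores
print(total_memory_gb)   # 42496 GB of aggregate memory
```

The 10,624-core figure is consistent: 664 nodes times 16 cores per node.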