Mira Supercomputer Shaping Fusion Plasma Research

This image shows trapped (left cross-section) and passing (right cross-section) electrons carried in the bootstrap current of a tokamak.

The IBM Blue Gene/Q supercomputer Mira, housed at the Argonne Leadership Computing Facility (ALCF) at Argonne National Laboratory, is delivering new insights into the physics behind nuclear fusion, helping researchers to develop a new understanding of electron behaviour in edge plasma – a critical step towards creating an efficient fusion reaction.

Principal investigator CS Chang, head of the US SciDAC-3 Partnership for Edge Physics Simulation, headquartered at Princeton Plasma Physics Laboratory (PPPL), and co-investigator Robert Hager of PPPL recently led the team that developed this new insight into the properties of the self-generating electrical current that boosts power in a tokamak fusion reactor.

To develop the best predictive tools for the International Thermonuclear Experimental Reactor (ITER) – and other experimental fusion reactors in the future – research teams are using HPC to better understand the behaviour of fusion plasma. “Right now, it is only through HPC that researchers can simulate plasma kinetics for large experiments like ITER with enough simulated electrons to resolve important physics,” said Tim Williams, Argonne Leadership Computing Facility (ALCF) computational scientist.

With the first plant-scale prototype fusion reactor, ITER, currently under construction in France and slated to begin experimental tests around 2025, it is a critical time for the international teams of scientists and engineers to understand the mechanics behind delivering a sustainable, efficient fusion reaction.

The team used the latest iteration, XGCa, of the XGC gyrokinetic code. XGC is the only code capable of simulating electron particle behaviour from first principles in realistic tokamak edge geometry, making it a computationally demanding code that requires petaflop-scale systems like Mira.

When researchers initially ran into problems running their large-scale simulations, Williams and ALCF systems specialist Adam Scovel identified where the code was hanging in a Blue Gene-specific library function.

‘They found a way to work around the problem that allowed us to perform the simulation. Otherwise, it would have been difficult,’ Chang said. ‘We could not have done this study without a system of Mira’s size.’

Running on more than 260,000 Mira processing cores, the latest XGCa plasma edge simulations revealed electron behaviour related to the edge bootstrap current that is not accurately predicted in present-day tokamak geometry by the well-known Sauter formula, which is used to calculate values for the bootstrap current.

‘Mira allows running simulations of larger tokamaks at ITER’s scale, and modelling at much higher particle counts more accurately represents the electron populations in the plasma,’ Williams said.

The team evaluated the accuracy of the Sauter formula against the XGCa simulation results and found a standard deviation of approximately 19 per cent. A modified formula derived from the simulations significantly reduces that uncertainty, to a standard deviation of 4 per cent.
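The comparison above can be illustrated with a short sketch of how one might quantify the agreement between a formula's predictions and simulation results: compute the relative deviation at each data point and take its standard deviation. The function and the sample values below are hypothetical, not the actual XGCa data or the Sauter formula.

```python
import math

def relative_std_percent(predicted, simulated):
    """Standard deviation of the relative deviation between a formula's
    predictions and reference simulation values, in per cent."""
    devs = [(p - s) / s for p, s in zip(predicted, simulated)]
    mean = sum(devs) / len(devs)
    var = sum((d - mean) ** 2 for d in devs) / len(devs)
    return 100.0 * math.sqrt(var)

# Illustrative made-up bootstrap-current values (arbitrary units),
# NOT real XGCa output:
simulated = [1.00, 0.95, 1.10, 1.05]
predicted = [1.12, 0.90, 1.30, 1.00]
print(f"spread: {relative_std_percent(predicted, simulated):.1f}%")
```

A lower spread, as reported for the modified formula, means the formula tracks the simulation more consistently across operating conditions.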

This more precise formula, derived using simulations at a scale that would be impossible without HPC, gives fusion researchers higher confidence in their predictions, allowing them to better project the fusion power efficiency of ITER and, in turn, to refine parameters for the magnetic field, temperature, and other controls during experimental operations.

‘ITER’s goal is to see 10 times energy output versus energy input or more, and the faster we can get there, the sooner we will see commercial fusion power,’ concluded Chang.

The full story can be found on the DOE website. This research is supported by the US Department of Energy (DOE) Office of Science. Computing time at the ALCF was allocated through DOE’s Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program. The ALCF is a DOE Office of Science User Facility.
