Probabilistic Modeling for Nuclear Fission and Heavy-Ion Collisions

Ron Soltz (17-ERD-022)

Executive Summary

This research project is developing new uncertainty quantification tools that will lead to advances in nuclear theory and the interpretation of heavy-ion data. Our new probabilistic modeling capability will result in a novel uncertainty-quantification modeling framework that enables a predictive theory of nuclear fission, which is important for stockpile stewardship, and a new method for probing the structure of the quark–gluon plasma formed one microsecond after the Big Bang.

Project Description

A complete microscopic theory of nuclear fission remains elusive because of the challenge of obtaining complete experimental data sets and the sheer complexity of the nuclear fission process. Mastering this complexity requires the development of new time-dependent methods in nuclear density functional theory, combined with many orders-of-magnitude improvements in computational capabilities and the application of Bayesian statistical methods to scientific modeling. Bayesian methods compare model predictions with experimental data to guide the statistical inference process. Initial success in applying these tools to fission calculations has set the stage for further advances that will enable accurate predictions of fission observables. However, the quality of these predictions will ultimately depend on our fundamental understanding of nuclear forces inside the nucleus. New statistical and computational tools are needed to address uncertainty quantification for understanding the fundamental mechanisms of nuclear fission. We intend to provide a fundamental advance in probabilistic modeling as it applies both to nuclear fission calculations that supply nuclear database inputs and to the distributions of particles produced in heavy-ion collisions. We will develop new probabilistic modeling methods to address two important problems in nuclear physics: the development of a density functional theory of nuclear fission and the extraction of physics from jet quenching in heavy-ion collisions.
The first is crucial for building a new theory of nuclear fission that is capable of predicting prompt neutron distributions with correlated uncertainties, and the second is essential for understanding the microscopic structure of the quark–gluon plasma, a state of matter formed one microsecond after the Big Bang and recreated in experiments at the Relativistic Heavy Ion Collider at Brookhaven National Laboratory in New York and the Large Hadron Collider at CERN near Geneva.
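The Bayesian inference step described above can be sketched in a few lines: a prior over a model parameter is combined with a Gaussian likelihood evaluated against data on a grid of candidate values. The toy observable, parameter grid, and noise level below are illustrative assumptions for this sketch, not quantities from the project's codes.

```python
# Minimal sketch of grid-based Bayesian inference: infer a single model
# parameter theta from noisy "observations" of a toy observable.
# All names and numbers here are illustrative, not from the project codes.
import numpy as np

def grid_posterior(theta_grid, prior, observations, sigma, model):
    """Combine a prior with a Gaussian likelihood over a parameter grid."""
    log_post = np.log(prior)
    for y in observations:
        log_post += -0.5 * ((y - model(theta_grid)) / sigma) ** 2
    log_post -= log_post.max()            # numerical stability
    post = np.exp(log_post)
    return post / post.sum()              # normalize to a probability mass

theta = np.linspace(0.0, 1.0, 101)        # candidate parameter values
prior = np.ones_like(theta) / theta.size  # flat prior
data = [0.52, 0.48, 0.50]                 # toy measurements of the observable
posterior = grid_posterior(theta, prior, data, sigma=0.05,
                           model=lambda t: t)

print(theta[np.argmax(posterior)])        # posterior mode at 0.5
```

As more observations are folded in, the posterior mass concentrates around the value most consistent with the data, which is the mechanism by which "probability calculations are compared with real information" in this framework.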

We expect to develop, for the first time, an estimate of fission-product yield uncertainties in a microscopic approach. We also expect to integrate, in a single software suite, the codes used for calculations of fission-product yields, as well as those used to calculate the neutron and photon spectra. The science applications we will address for nuclear fission and jet quenching in heavy-ion collisions have similar needs for developing new Bayesian methods, and both will benefit from coordinating work with a statistics group and developing a common code base. Rigorous treatment of errors through Bayesian methods is key to making progress in each application. For the density functional theory of nuclear fission, the accuracy of simulated quantities such as fission-product yields in actinide nuclei is limited by the low dimensionality of the calculations and the lack of control over uncertainties in density functional theory parameters. For heavy-ion collisions, a probabilistic approach to jet finding is needed to achieve an unbiased estimate of the jet-quenching physics parameters. The uncertainty quantification tools developed for these applications require methodological advances in the treatment of constrained outputs, and the results of this research will be of significant interest to the uncertainty quantification community. Furthermore, the advances in the science applications have strategic impact for Livermore. The covariant errors generated for density functional theory outputs will be propagated into fission-product yields and neutron distributions in the Generalized Nuclear Data format source code. Advances in jet quenching will benefit the construction of a new dedicated jet detector at the Relativistic Heavy Ion Collider, which is intended to provide a path to the Electron-Ion Collider, the top new construction priority within the U.S. nuclear physics community.
Our research will build on Livermore expertise and leadership in nuclear theory, heavy-ion physics, and uncertainty quantification to develop new uncertainty quantification tools that will lead to advances in nuclear theory and the interpretation of heavy-ion data. Our new probabilistic modeling capability will result in a novel uncertainty-quantification modeling framework applicable to developing a predictive theory of nuclear fission and a new method for probing the structure of the quark–gluon plasma.

Mission Relevance

This research project will strengthen our understanding of the fundamental mechanisms of nuclear fission, which is important for science-based stockpile stewardship and supports the Laboratory's core competency in nuclear, chemical, and isotopic science and technology. The research also supports the NNSA goal to understand the condition of the nuclear stockpile as well as the DOE goal to maintain the safety, security, and effectiveness of the nation's nuclear deterrent without nuclear testing. The new method for probing the microscopic properties of the quark–gluon plasma will be used to improve the design of a new jet detector for the Relativistic Heavy Ion Collider and to provide a path to the Electron-Ion Collider at Brookhaven, the top new construction priority within the U.S. nuclear physics community.

FY17 Accomplishments and Results

In FY17 we (1) constructed a Gaussian process continuous-domain statistical model to predict fission-product yields in plutonium-239(n,f) as a function of the initial state of the compound nucleus; (2) trained the Gaussian process on a series of model calculations run on Livermore's Quartz supercomputer platform with the FELIX finite-element solver code, which implements the time-dependent generator-coordinate method extension of nuclear density functional theory and is used to predict fission-product yields in actinides; and (3) integrated the PYTHIA jet model (which generates high-energy physics events, namely hard interactions between elementary particles such as electrons, positrons, protons, and antiprotons in various combinations) with the parametric initial condition model TRENTO within a Python scripting-language framework that will be used to tune jet-finding parameters in a heavy-ion environment.
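The Gaussian-process emulation strategy in items (1) and (2) can be sketched as follows: a GP is trained on a handful of runs of an expensive simulator and then predicts the output, with uncertainty, at untried inputs. The one-dimensional toy simulator and squared-exponential kernel below are illustrative stand-ins, not the FELIX model or the project's actual kernel and hyperparameter choices.

```python
# Hedged sketch of a Gaussian-process emulator: train on a few expensive
# "simulator" runs, then predict (with uncertainty) at new inputs.
import numpy as np

def rbf_kernel(a, b, length=0.3, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-6):
    """GP posterior mean and variance at x_test under a zero-mean prior."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_test, x_train)
    Kss = rbf_kernel(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

simulator = lambda x: np.sin(2 * np.pi * x)   # stand-in for a costly code run
x_train = np.linspace(0.0, 1.0, 8)            # few expensive evaluations
y_train = simulator(x_train)

x_test = np.array([0.37])                     # an input never simulated
mean, var = gp_predict(x_train, y_train, x_test)
```

Once trained, the emulator answers queries in microseconds, so the posterior over simulator inputs can be explored far more densely than the underlying supercomputer calculations would allow; the predictive variance flags regions where more training runs are needed.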

This figure shows the results from a study applying Bayesian methodology to test the utility and accuracy of applying background subtraction techniques to jet-finding algorithms in heavy-ion collisions, where jets are collimated streams of particles that are sensitive to the energy density and structure of the quark–gluon plasma created in the collisions. Here q is a quench factor by which all particle momenta in the quark jet are scaled to simulate the loss of energy as the jet traverses the plasma. We selected q = 0.5 in our simulation, implying that the jet loses half of its momentum. Left: Posterior probability mass function as a function of q for different centralities with (solid) and without (dashed) background subtraction, evaluated using calibration data with q = 0.5. Right: Posterior probability for q = 0.5 as a function of centrality with (blue) and without (red) subtraction. This initial study implies that background subtraction provides a significant advantage for the most central (0–20%) collisions with the highest multiplicities, but may not provide much benefit outside this regime. This conclusion will soon be tested with higher-fidelity models.
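The inference in the figure can be illustrated with a toy, one-measurement version: a quenched jet energy sits on top of an underlying-event background, and the posterior mass function over q is computed with and without subtracting the mean background. All numbers below (jet energy, background level, resolutions, the particular fluctuation draw) are invented for illustration and are not taken from the project's simulations.

```python
# Toy illustration of inferring the quench factor q with and without
# background subtraction. Numbers are invented for illustration only.
import numpy as np

q_grid = np.linspace(0.1, 1.0, 91)        # candidate quench factors
e_jet = 100.0                             # unquenched jet energy (GeV)
bkg_mean, bkg_sigma = 40.0, 8.0           # underlying-event background (GeV)
meas_sigma = 5.0                          # jet energy resolution (GeV)

# One illustrative "measurement": a quenched jet (true q = 0.5) plus one
# fixed draw of background fluctuation (+6 GeV) and detector noise (-3 GeV).
measured = 0.5 * e_jet + (bkg_mean + 6.0) - 3.0   # 93 GeV total

def posterior(obs, subtract_background):
    """Gaussian-likelihood posterior mass over q_grid with a flat prior."""
    if subtract_background:
        obs = obs - bkg_mean
        sigma = np.hypot(bkg_sigma, meas_sigma)   # subtraction adds variance
    else:
        sigma = meas_sigma
    logp = -0.5 * ((obs - q_grid * e_jet) / sigma) ** 2
    p = np.exp(logp - logp.max())
    return p / p.sum()

q_hat_sub = q_grid[np.argmax(posterior(measured, True))]    # 0.53, near truth
q_hat_raw = q_grid[np.argmax(posterior(measured, False))]   # 0.93, biased high
```

In this caricature, skipping subtraction leaves the background energy inside the jet and biases the inferred q upward, while subtraction recenters the posterior near the true value at the cost of a wider distribution; the figure's centrality dependence reflects how this trade-off shifts as the background multiplicity changes.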