Nuclear data activities for medium mass and heavy nuclei at Los Alamos
Nuclear Data (2022)
Invited presentation on 07/2022
Nuclear data are critical for many modern applications, from stockpile stewardship to cutting-edge scientific research. Central to these pursuits is a robust pipeline for nuclear modeling as well as data assimilation and dissemination. We summarize ongoing nuclear data efforts at Los Alamos for medium-mass to heavy nuclei. We begin with a discussion of a novel toolkit for model parameter optimization based on a Bayesian technique called hyperparameter optimization. This mathematical framework allows different measured datasets to be combined when determining model parameters and their associated correlations, and it can also quantify outliers in the data. We demonstrate the power of this procedure by highlighting evaluated cross sections along the Pu isotopic chain and emphasize the importance of model consistency in such evaluations. Finally, we highlight the success of our tools and pipeline by covering the insight gained from incorporating the latest nuclear modeling and data into astrophysical simulations as part of the Fission In R-process Elements (FIRE) collaboration.
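As a rough illustration of the Bayesian parameter-calibration idea described above, the sketch below fits a toy two-parameter model to two independent synthetic "measured" datasets with different uncertainties, then reports posterior means and the parameter correlation. The model form, datasets, and sampler settings are all hypothetical and are not the Los Alamos toolkit; this is a minimal Metropolis sampler standing in for the full framework.

```python
import math
import random

def model(E, a, b):
    # Hypothetical cross-section-like model: sigma(E) = a * exp(-b * E)
    return a * math.exp(-b * E)

def log_likelihood(params, datasets):
    # Gaussian likelihood combining every dataset; each point carries its
    # own uncertainty, so datasets of different quality are weighted naturally.
    a, b = params
    ll = 0.0
    for data in datasets:
        for E, y, sigma in data:
            r = (y - model(E, a, b)) / sigma
            ll += -0.5 * r * r
    return ll

def metropolis(datasets, start, step, n_samples, seed=0):
    # Plain Metropolis sampler over (a, b) with flat priors.
    rng = random.Random(seed)
    current = list(start)
    ll_cur = log_likelihood(current, datasets)
    samples = []
    for _ in range(n_samples):
        prop = [current[i] + rng.gauss(0.0, step[i]) for i in range(2)]
        ll_prop = log_likelihood(prop, datasets)
        if math.log(rng.random() + 1e-300) < ll_prop - ll_cur:
            current, ll_cur = prop, ll_prop
        samples.append(tuple(current))
    return samples

# Two synthetic "experiments" generated from a=2.0, b=0.5 with
# different noise levels (purely illustrative).
rng = random.Random(42)
true_a, true_b = 2.0, 0.5
dataset1 = [(E, model(E, true_a, true_b) + rng.gauss(0, 0.02), 0.02)
            for E in (0.5, 1.0, 1.5, 2.0)]
dataset2 = [(E, model(E, true_a, true_b) + rng.gauss(0, 0.05), 0.05)
            for E in (0.25, 0.75, 1.25, 1.75)]

samples = metropolis([dataset1, dataset2], start=[1.0, 1.0],
                     step=[0.05, 0.05], n_samples=20000)
burn = samples[5000:]  # discard burn-in
mean_a = sum(s[0] for s in burn) / len(burn)
mean_b = sum(s[1] for s in burn) / len(burn)

# Posterior correlation between a and b, analogous to the parameter
# correlations mentioned in the abstract.
var_a = sum((s[0] - mean_a) ** 2 for s in burn) / len(burn)
var_b = sum((s[1] - mean_b) ** 2 for s in burn) / len(burn)
cov = sum((s[0] - mean_a) * (s[1] - mean_b) for s in burn) / len(burn)
corr = cov / math.sqrt(var_a * var_b)

print(f"a = {mean_a:.2f}, b = {mean_b:.2f}, corr(a,b) = {corr:.2f}")
```

Because every data point enters the likelihood weighted by its own uncertainty, combining heterogeneous measurements falls out naturally; outlier treatment in the real framework would replace the simple Gaussian term with a heavier-tailed likelihood.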
| Year | Authors | Title | Journal |
|------|---------|-------|---------|
| 2022 | M. Mumpower, T. M. Sprouse, A. Lovell, A. T. Mohan | Physically interpretable machine learning for nuclear masses | PRC (Letters) 106, 021301 |
| 2022 | A. Lovell, A. T. Mohan, T. M. Sprouse, M. Mumpower | Nuclear masses learned from a probabilistic neural network | PRC 106, 014305 |