The University of Texas at Austin
Joint seminar with the MIT Laboratory for Nuclear Science (LNS). Please note seminar location in LNS (Building 26).
The US last detonated a nuclear weapon in 1992, in an underground test in Nevada. In 1994, the Department of Energy launched its science-based stockpile stewardship program (SSP), now managed by the National Nuclear Security Administration (NNSA), designed specifically to ensure the safety, security, and effectiveness of US nuclear weapons without underground nuclear testing. Today, a quarter century later, the scientists and engineers at NNSA's national laboratories and associated facilities have succeeded at this task through a thorough modernization of the tools, methods, and ideas used to steward nuclear weapons. Key enablers were the development and deployment of new experimental capabilities, the creation of modern, 3D weapon-simulation codes, and investment in high-end supercomputers, which together yielded a more detailed understanding of the relevant physical processes and enabled extraction of the full value of the US underground test archive. All of this was knitted together by the rigorous application of processes to quantify performance margins and uncertainties.
Looking ahead to the next generation of the SSP, concerns about the responsiveness, agility, and efficiency of the nuclear weapons enterprise led the Departments of Defense and Energy to seek a stockpile with fewer weapon types while maintaining current capabilities. The "3+2" strategy envisaged three sets of "interoperable" nuclear components serving both Air Force ICBMs and Navy SLBMs, plus two air-delivered weapons. The 2018 Nuclear Posture Review added a different mix of weapons to the near-term stockpile, and NNSA's FY2019 plans detail only the first round of interoperable systems.
However, nuclear threats worldwide are quite different today than in 1992-94, with more players and, potentially, greater dangers. What is known is that our SSP approach to stockpile stewardship, without nuclear testing, works. To be sure, there will be future technical challenges and opportunities: high-performance computing may face limits to growth, while new experimental approaches are coming into focus that could definitively address key performance questions through non-explosive nuclear experiments. Continuous improvement, that is, modernization, of the SSP will remain crucial to US deterrence, paving the way to a deeper understanding and, perhaps someday, control of these threats to humanity.
Technical issues related to these points will be addressed in the talk.