News

NTI analysis recommends immediate action to manage use of AI in nuclear weapons systems

According to a new paper from the nonprofit Nuclear Threat Initiative (NTI), artificial intelligence in nuclear weapons systems carries both significant benefits and serious risks, and its adoption also appears inevitable. As a result, the authors argue, immediate action is needed to balance those benefits and risks while the technology is still maturing.

The report, Assessing and Managing the Benefits and Risks of Artificial Intelligence in Nuclear-Weapon Systems, was co-authored by Jill Hruby, Under Secretary for Nuclear Security at the U.S. Department of Energy and a former NTI Distinguished Fellow, and M. Nina Miller, a PhD student at the Massachusetts Institute of Technology and former NTI intern. They determined that two application areas are most likely to take advantage of AI advances soon: Nuclear Command, Control, and Communications (NC3) and autonomous nuclear-weapon systems.

“In NC3, AI could be applied to enhance reliable communication and early warning systems, to supplement decision support, or to enable automated retaliatory launch,” the authors wrote. “The implications vary dramatically. Enhancing communication reliability and decision-support tools with AI has recognizable benefits, is relatively low risk, and is likely stabilizing, although it still requires additional technical research to lower risk as well as deeper policy exploration of stability implications to avoid provoking an arms race. AI application to automated retaliatory launch, however, is highly risky and should be avoided.”

They added, “For autonomous nuclear-weapon systems, AI along with sensors and other technologies are required for sophisticated capabilities, such as obstacle detection and maneuverability, automated target identification, and longer-range and loitering capability. Today’s technology and algorithms face challenges in reliably identifying objects, responding in real time, planning and controlling routes in the absence of GPS, and defending against cyberattacks. Given the lack of technology maturity, fully autonomous nuclear-weapon systems are highly risky.”

Given both the risks and the potential for instability, the pair argued for an outright ban on fully autonomous nuclear-weapon systems until the technology can be better understood and proven. In the meantime, they also urged prioritizing research on, and carefully but openly publishing details of, low-technical-risk approaches and fail-safe protocols for AI use in high-consequence applications; conducting cooperative research; establishing national policies on the role of human operators and the limits of AI in nuclear weapon systems; and increasing international discussion of the implications of AI use in nuclear weapons systems.

“Because of the high potential consequences, AI use in nuclear-weapon systems seems to be proceeding at a slower pace — or perhaps more covertly — than other military applications,” the authors wrote. “Nonetheless, the potential for AI application to nuclear-weapon systems is likely to grow as the military use of AI develops concurrently with nuclear-weapons system modernization and diversification,” they added.

Chris Galford
