Artificial Intelligence and Nuclear Weapons

Please use this identifier to cite or link to this item:
https://doi.org/10.48693/543
Title: Artificial Intelligence and Nuclear Weapons
Authors: Saalbach, Klaus
Abstract: Artificial Intelligence (AI) has great potential in military applications and is seen by nuclear powers as indispensable. AI and big data could be used to monitor the non-proliferation of nuclear weapons, could save time and costs in the research, design optimization, manufacturing, testing and certification, and maintenance and surveillance of nuclear warhead systems, and could manage resources more efficiently. Military AI could enhance early warning and Intelligence, Surveillance and Reconnaissance (ISR) capabilities, improve communication reliability, and accelerate decision-making. The integration of AI into nuclear command, control, and communications (NC3) systems could also synchronize information across nuclear and non-nuclear command and control. For autonomous nuclear-weapon systems, AI can support obstacle detection and maneuverability, automated target identification, and long-range and loitering capability. The debate on the use of AI for nuclear weapons covers three areas: autonomy, the stability of military AI systems, and strategic stability. The potential autonomy of nuclear weapons is part of the broader debate on lethal autonomous weapons systems (LAWS). Automated and semi-autonomous nuclear decision-making was already considered during the Cold War (SAGE; Perimeter). A specific military AI problem is mission stability, as AI systems lack context knowledge and may decide too quickly. The opacity of these systems leads to the explainability or interpretability issue; further problems may result from data poisoning, manipulated images, automation bias, and artificial escalation. Wargame simulations with current generative AI large language models (LLMs) from OpenAI, Anthropic, and Meta showed that the systems tend to escalate, up to nuclear strikes. As software systems, AI is vulnerable to cyber attacks; generative AI is also vulnerable to prompt injections. A main concern of all involved parties is strategic stability, i.e., avoiding everything that could give a reason for a nuclear first strike. Such a reason could be uncertainty about the capabilities of the adversary: if it is not known what the other side can do in a certain situation, the only chance in a nuclear conflict is to strike first. AI can also undermine strategic stability by compressing decision-making time, which may result in escalation or inadvertent use of nuclear weapons, by amplifying misunderstandings and misperceptions during a crisis, and by encouraging premature deployment of insufficiently tested AI. Moreover, AI systems facilitate the use of ‘dead hand’ systems and autonomous nuclear weapons. The rise of hypersonic weapons and the increasing speed of warfare likewise undermine strategic stability. Currently, AI is still perceived as too immature for use in high-risk strategic situations; there is a substantial risk of technical failures and of biased, incomplete, or inaccurate data. The nuclear powers agree that at the strategic level of command and control, the role of AI should remain supportive. The United States, China, and Russia have started dialogues on AI risks. This paper briefly presents the current state of the debate and its background.
URL: https://doi.org/10.48693/543
https://osnadocs.ub.uni-osnabrueck.de/handle/ds-2024052111187
Subject Keywords: Artificial Intelligence; AI; Nuclear Weapons; Deterrence; Autonomous Weapons
Issue Date: 21-May-2024
License name: Attribution 3.0 Germany
License url: http://creativecommons.org/licenses/by/3.0/de/
Type of publication: Working paper [WorkingPaper]
Appears in Collections: FB01 - Hochschulschriften

Files in This Item:
File: Artificial_Intelligence_Nuclear_Weapons_2024_Saalbach.pdf
Size: 857,75 kB
Format: Adobe PDF


This item is licensed under a Creative Commons License.