Designed for Death: Controlling Killer Robots

Umbrello, Steven
2022-01-01

Abstract

Autonomous weapons systems, often referred to as ‘killer robots’, have been a hallmark of the popular imagination for decades. With the inexorable advance of artificial intelligence (AI) and robotics, however, killer robots are quickly becoming a reality. These lethal technologies can learn, adapt, and potentially make life-and-death decisions on the battlefield with little to no human involvement. This raises not only legal but also ethical concerns about whether we can meaningfully control such machines and, if so, how. Such concerns are made all the more poignant by the ever-present fear that something may go wrong and the machine may carry out actions that violate the ethics or laws of war. Researchers, policymakers, and designers are caught in the quagmire of how to approach these highly controversial systems and of what exactly it means to have meaningful human control over them, if such control is possible at all. In Designed for Death, Dr Steven Umbrello offers a guide that is not only realistic but also optimistic for how, with human values in mind, we can begin to design killer robots. Drawing on the value sensitive design (VSD) approach to technology innovation, Umbrello argues that context is king and that a middle path for designing killer robots is possible if we treat ethics and design as fundamentally linked. Umbrello moves beyond the binary debate over whether to prohibit killer robots and instead offers a more nuanced perspective on which types of killer robots may be both legally and ethically acceptable, when they would be acceptable, and how to design for them.
Year: 2022
Publisher: Trivent Publishing
Series: Ethics and Robotics, vol. 1
Pages: 250
ISBN: 978-615-6405-36-4
ISBN: 978-615-6405-37-1
ISBN: 978-615-6405-38-8
URL: https://trivent-publishing.eu/home/139-182-designed-for-death-controlling-killer-robots.html#/26-cover-hardcover
Keywords: killer robots, value sensitive design, VSD, applied ethics, autonomous weapons, military ethics, drones, systems thinking, engineering, design
Files in this record:
Designed for Death_eBook_bookmarked.pdf — Publisher's PDF, Adobe PDF format, 4.01 MB, restricted access.
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2318/1871101