Computing experts from 37 countries call for ban on killer robots

More than 270 engineers, computing and artificial intelligence experts, roboticists, and professionals from related disciplines are calling for a ban on the development and deployment of weapon systems that make the decision to apply violent force autonomously, without any human control.

In a statement issued today, the experts from 37 countries say that, “given the limitations and unknown future risks of autonomous robot weapons technology, we call for a prohibition on their development and deployment. Decisions about the application of violent force must not be delegated to machines.”

The expert signatories to the statement question the notion that robot weapons could meet legal requirements for the use of force “given the absence of clear scientific evidence that robot weapons have, or are likely to have in the foreseeable future, the functionality required for accurate target identification, situational awareness or decisions regarding the proportional use of force.”

“Governments need to listen to the experts’ warnings and work with us to tackle this challenge together before it is too late,” said Professor Noel Sharkey, Chair of the International Committee for Robot Arms Control (ICRAC). “It is urgent that international talks get started now to prevent the further development of autonomous robot weapons before it is too late.”

More than 70 countries have acquired drones and the statement expresses concern about the comparable proliferation of autonomous robot weapons technology. The experts question how devices controlled by complex algorithms will interact, warning: “Such interactions could create unstable and unpredictable behavior, behavior that could initiate or escalate conflicts, or cause unjustifiable harm to civilian populations.”

The statement’s expert signatories (see also the quotes below) include, among others:

Geoffrey Hinton FRS, Raymond Reiter Distinguished Professor of Artificial Intelligence at the University of Toronto

Alan Bundy CBE, FRS, FREng, Professor of Automated Reasoning, University of Edinburgh, and a founding fellow of the American Association for Artificial Intelligence

Bruno Siciliano, Past President of the IEEE Robotics and Automation Society and Professor of Robotics at the University of Naples Federico II

Lucy Suchman, Professor at Lancaster University, UK, recipient of the Benjamin Franklin Medal in Computer and Cognitive Science, and the Lifetime Research Award from the ACM Special Interest Group on Computer-Human Interaction.

James Hendler, Tetherless World Senior Constellation Professor, Computer Science and Cognitive Science, Rensselaer Polytechnic Institute (RPI), former member of the US Air Force Science Advisory Board and a former Chief Scientist of the Information Systems Office at the US Defense Advanced Research Projects Agency (DARPA)

Bart Selman, Professor of Computer Science, Cornell University, Fellow of both the American Association for Artificial Intelligence and American Association for the Advancement of Science.

Tom Ziemke, Professor of Cognitive Science, University of Skövde, Sweden

The 272 signatures to the statement were collected by ICRAC, a not-for-profit organization composed of scientists, ethicists, lawyers, roboticists, and other experts, formed in 2009 to address the potential dangers involved in the development of armed military robots and autonomous weapons. Given the rapid pace of development of military robots and the pressing dangers their use poses to peace, international security, the rule of law, and to civilians, ICRAC calls for a ban on armed robots with autonomous targeting capability.

Governments are beginning to consider their policy on fully autonomous robot weapons, but as yet there is no international process on the topic. Nations debated a UN report on the challenges of fully autonomous robot weapons at the Human Rights Council on 30 May 2013, and France is expected to propose that the topic be discussed at the annual meeting of the Convention on Conventional Weapons in Geneva on 14-15 November 2013.

ICRAC is a founding member of the Campaign to Stop Killer Robots, a global coalition launched in London in April 2013 that calls for a pre-emptive ban on fully autonomous weapons. Sharkey and other members of the Campaign to Stop Killer Robots will be speaking at the United Nations in New York on Monday, 21 October and at the United Nations in Geneva on Wednesday, 13 November.

“Autonomous weapons will violate the Geneva Convention by their inability to distinguish between combatants and civilians. Their use should be as unthinkable as using chemical weapons.”

Alan Bundy CBE, FRS, FREng, Professor of Automated Reasoning, University of Edinburgh, and a founding fellow of the American Association for Artificial Intelligence

“Artificial Intelligence can improve people’s lives in so many ways, but researchers need to push for positive applications of technology by supporting a ban on autonomous weapons systems.”

Geoffrey Hinton FRS, a founding father of modern machine learning and Raymond Reiter Distinguished Professor of Artificial Intelligence at the University of Toronto

“Applied as tools of war, robotics raises the threat of ruthless dictators with unfeeling killing machines to use against civilian populace. Laws governing the development and proper use of these machines are needed now, before it is too late.”

James Hendler, Tetherless World Senior Constellation Professor, Computer, Web and Cognitive Sciences, Rensselaer Polytechnic Institute (RPI), former member of the US Air Force Science Advisory Board and a former Chief Scientist of the Information Systems Office at the US Defense Advanced Research Projects Agency (DARPA)

“This moment, when the accurate identification of enemy combatants is more challenging than ever, is also the crucial moment in which to take a stand against lethal autonomous weapons and to intensify our insistence on human judgment and accountability.”

Lucy Suchman, Lancaster University, UK, recipient of the Benjamin Franklin Medal in Computer and Cognitive Science, and the Lifetime Research Award from the ACM Special Interest Group on Computer-Human Interaction.

“There is something particularly repugnant in automating this most difficult of ethical decisions: killing people. There are numerous difficult and dangerous jobs in the world (e.g. mine clearance, decommissioning nuclear reactors, etc.) and it seems far better to use our skills as AI practitioners and roboticists to help supplement humans in these roles, rather than make misguided and dangerous attempts to automate the very act of execution.”

Mark Bishop, Professor of Cognitive Computing, Goldsmiths, University of London, and chairman of the Society for the Study of Artificial Intelligence and the Simulation of Behaviour (AISB), the world’s oldest AI society

“Technology development is creating robots that can kill far more quickly than our ethical and societal ability to understand just where that technology is taking us. We must slow down innovation of killer robots because they violate clear moral principles. Discourse must replace development so we can bend our future towards peace.”

Illah Nourbakhsh, Professor of Robotics, Director of the CREATE lab and member of the Robotics Institute at Carnegie Mellon University, USA
