Governing Lethal Autonomous Weapon Systems

An artificial intelligence arms race would erode commonly agreed global norms of international law, especially those restraining the use of military force and protecting civilians in both war and peace. LAWS will likely lower the threshold for the use of force, making war more likely. The institutions of peace and security are already starting to unravel: armed unmanned aerial vehicles have enabled significant violations of the essential global norm against the use of force, and cyber-attacks, which blur the lines between war and peace, will only grow in number and sophistication. The introduction of LAWS into countries' military planning will make this picture still more complicated.

For the negotiations to progress toward a more concrete and ambitious path, a few developments would be welcome. First, the International Committee of the Red Cross (ICRC) could endorse the call for a preventive prohibition of any weapon system that eliminates meaningful human control over the critical functions and the decisions to kill. The ICRC has taken transformative stances in the past, against landmines and cluster bombs and, more recently, toward a ban on nuclear weapons. Its position carries enormous moral clout and would make a difference at this critical juncture. Second, the forum for negotiations should move away from the Convention on Certain Conventional Weapons (CCW) to a body where decisions are taken by majority rather than unanimity. Finally, one of the permanent five (P5) members of the UN Security Council, such as France, could embrace the role of champion state for such a prohibition. Working with middle powers in Europe and Latin America, a P5 member could form a group of like-minded states that would work toward a legally binding instrument with commonly agreed global norms to protect the future of peace.