AI: Should LAWS be banned?

A European resolution calls for an international ban on lethal autonomous weapon systems (LAWS).

This European Parliament resolution of 12 September 2018 (1) urges the Member States and the Council to work towards the start of international negotiations on a legally binding instrument prohibiting lethal autonomous weapon systems (LAWS) without meaningful human control over the critical functions of selecting and attacking individual targets (2).

In this resolution, the European Parliament calls for two major actions:

adopt, as a matter of urgency, a common position on LAWS that ensures meaningful human control over the critical functions of weapon systems;

prevent the development and production of any LAWS; in other words, prevent any research programme aimed at developing such systems.

Why ban LAWS?

The European Parliament resolution raises three arguments:

(i) LAWS have the potential to “fundamentally change warfare by prompting an unprecedented and uncontrolled arms race”;

(ii) the use of LAWS raises “fundamental ethical and legal questions of human control, in particular with regard to critical functions such as target selection and engagement”; and

(iii) the use of LAWS raises key questions about “the implementation of international human rights law, international humanitarian law and European norms and values with regard to future military actions”.

The European Parliament (EP) resolution is based on a study entitled “Human Rights Implications of the Usage of Drones and Unmanned Robots in Warfare” of 3 May 2013. This study provided an overview of the current and likely future use of drones and other unmanned robots as early as 2013 and examined the relevant legal implications under human rights law, international humanitarian law and the UN Charter.

The study concluded that the present sense of uncertainty as to the applicable legal standards, the rapid development and proliferation of drone and robotic technology, and the perceived lack of transparency and accountability of current policies had the potential of polarizing the international community, undermining the rule of law and, ultimately, of destabilizing the international security environment as a whole.

Accordingly, the study developed three policy recommendations for European foreign policy:

1. First, the EU should make the promotion of the rule of law in relation to the development, proliferation and use of unmanned weapons systems a declared priority of European foreign policy.
2. In parallel, the EU should launch a broad inter-governmental policy dialogue aiming to achieve international consensus:
A. on the legal standards governing the use of currently operational unmanned weapon systems, and
B. on the legal constraints and/or ethical reservations which may apply with regard to the future development, proliferation and use of increasingly autonomous weapon systems.
3. Based on the resulting international consensus, the EU should work towards the adoption of a binding international agreement, or a non-binding code of conduct, aiming to restrict the development, proliferation or use of certain unmanned weapon systems in line with the legal consensus achieved.

What about the CCW Group of Governmental Experts?

The United Nations is also addressing the issue of LAWS. A Group of Governmental Experts (GGE) (3) was established in 2016 by the Fifth Review Conference of the High Contracting Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May be Deemed to be Excessively Injurious or to Have Indiscriminate Effects (CCW).

In its initial conclusions, the GGE affirmed that:

The CCW offers an appropriate framework for dealing with the issue of emerging technologies in the area of LAWS;

International humanitarian law continues to apply fully to all weapons systems, including the potential development and use of LAWS;

Responsibility for the deployment of any weapons system in armed conflict remains with States;

There would be merit in focusing the next stage of the GGE’s discussions on the characterization of the systems under consideration in order to promote a common understanding on concepts and characteristics relevant to the objectives and purposes of the CCW;

There is a need to further assess the aspects of human-machine interaction in the development, deployment and use of emerging technologies in the area of LAWS in the next stage of the GGE’s work.

The overarching issues in the area of LAWS were addressed in the 2018 meetings of the GGE. They include:

Characterization of the systems under consideration in order to promote a common understanding on concepts and characteristics relevant to the objectives and purposes of the CCW;

Further consideration of the human element in the use of lethal force; aspects of human-machine interaction in the development, deployment and use of emerging technologies in the area of LAWS;

Review of potential military applications of related technologies in the context of the GGE’s work.

What did the GGE decide at its last meeting, on 31 August 2018?

At that meeting, the GGE agreed on ten principles:

1. International humanitarian law continues to apply fully to all weapons systems, including the potential development and use of LAWS;
2. Human responsibility for decisions on the use of weapons systems must be retained since accountability cannot be transferred to machines. This should be considered across the entire life cycle of the weapon system;
3. Accountability for developing, deploying and using any emerging weapons system in the framework of the CCW must be ensured in accordance with applicable international law, including through the operation of such systems within a responsible chain of human command and control;
4. In accordance with States’ obligations under international law, in the study, development, acquisition, or adoption of a new weapon, means or method of warfare, determination must be made whether its employment would, in some or all circumstances, be prohibited by international law;
5. When developing or acquiring new weapons systems based on emerging technologies in the area of LAWS, physical security, appropriate non-physical safeguards (including cybersecurity against hacking or data spoofing), the risk of acquisition by terrorist groups and the risk of proliferation should be considered;
6. Risk assessments and mitigation measures should be part of the design, development, testing and deployment cycle of emerging technologies in any weapons systems;
7. Consideration should be given to the use of emerging technologies in the area of LAWS in upholding compliance with international humanitarian law and other applicable international legal obligations;
8. In crafting potential policy measures, emerging technologies in the area of LAWS should not be anthropomorphized;
9. Discussions and any potential policy measures taken within the context of the CCW should not hamper progress in or access to peaceful uses of intelligent autonomous technologies;
10. CCW offers an appropriate framework for dealing with the issue of emerging technologies in the area of LAWS within the context of the objectives and purposes of the CCW, which seeks to strike a balance between military necessity and humanitarian considerations.

What is the position of Germany and France on LAWS?

At the GGE meetings in November 2017, Germany and France made several proposals in a working document, including a working definition of LAWS.

Germany and France further clarified that the definition of “LAWS” will evolve over time along with technological advances. The exact definition adopted at a later stage will also depend on what kind of regulatory measures are being sought and on which political or legal status LAWS should have.

In their joint political declaration, Germany and France affirmed their shared conviction that humans should continue to be able to make ultimate decisions with regard to the use of lethal force and should continue to exert sufficient control over lethal weapons systems they use. Moreover, State parties should recall that rules of international law, in particular international humanitarian law, are fully applicable to the development and use of LAWS.

What is the United States’ position on LAWS?

Unlike Germany and France, the United States believes that it is unnecessary for the GGE to adopt a specific working definition of LAWS. Instead, the USA supports promoting a general understanding of the characteristics of LAWS. It believes that the absence of a specific working definition is no impediment to the GGE’s work in understanding the potential issues posed by LAWS. Given that the law of war provides a robust and coherent system of regulation for the use of weapons, the GGE can discuss the issues potentially posed by “LAWS” under the object and purpose of the CCW without needing to agree on a specific working definition of LAWS.

The law of war’s existing rules of general applicability apply with respect to the use of all weapons, including any weapons deemed to be “LAWS.”

The United States also submitted the following positions in two separate working documents (WP.6 and WP.7):

It is not the case that the law of war requires that a weapon, even a semi-autonomous or autonomous weapon, make legal determinations. For the United States, the law of war does not require that a weapon determine whether its target is a military objective, but rather that the weapon be capable of being employed consistent with the principle of distinction. Similarly, the law of war does not require that a weapon make proportionality determinations, such as whether an attack is expected to result in incidental harm to civilians or civilian objects that is excessive in relation to the concrete and direct military advantage expected to be gained;

Machines are not States or persons under the law. Questions of legal accountability are questions of how existing and well-established principles of State and individual responsibility apply to States and persons who use weapon systems with autonomous functions;

States are responsible for the acts of persons forming part of their armed forces. It follows that States are responsible for the uses of weapons with autonomous functions by persons forming part of their armed forces as well as other such acts that may be attributable to a State under the law of State responsibility. States, in ensuring accountability for such conduct, may use a variety of mechanisms, including investigations, individual criminal liability, civil liability, and internal disciplinary measures;

As with all decisions to employ weapon systems, persons are responsible for their individual decisions to use weapons with autonomous functions. For example, persons who use weapons with autonomous functions to violate the prohibition on targeting the civilian population may be held responsible for such violations;

The responsibilities of any particular individual belonging to a State or a party to the conflict may depend on that person’s role in the organization or military operations. As a general matter, the persons who are responsible for implementing a party to a conflict’s obligation are those persons with the authority to make the necessary decisions and judgments required by that international obligation. For example, a party to a conflict has the obligation to take feasible precautions to reduce the risk to civilians, such as providing warnings before attacks. The determination of whether it is feasible to provide such a warning would be made by the relevant commander in charge of the attack.

What is the Russian Federation’s position on LAWS?

The Russian Federation expressed its position in a working document (WP.8), in which it stated that:

The Russian Federation believes that the GGE could conduct a thorough review of existing provisions of international law, including international humanitarian law and human rights law that could potentially be applied to LAWS;

The Russian Federation proceeds from the premise that the examination of new issues within the CCW should be carried out taking into account both humanitarian concerns and the legitimate defense interests of States. That said, the need to address humanitarian concerns cannot be used as the sole and sufficient prerequisite for imposing restrictive or prohibitive regimes on certain weapons.

EP resolution: a hasty position that renders the GGE’s work meaningless

Apart from confirming that LAWS do not yet exist and that automated systems should not be considered LAWS, the European Parliament does not seem to have taken into account the work conducted by the GGE. Its desire to affirm the principle of a ban on LAWS just before the GGE meeting scheduled for November 2018 calls into question the very existence of the GGE.

Amid increasing tension between technology and ethics, should the resolution be seen as one guided more by moral convictions than by rational considerations?

The GGE was created to meet a specific objective that the European Parliament seems to disregard: to consider the reality of technological challenges in the light of ethical and legal considerations.

The desire to adopt a common European position and an internationally binding legal instrument banning LAWS may have only a symbolic effect. Indeed, it could run up against reality.

If such a European common position were to materialise one day, it would be totally out of step with the potential ambitions of countries such as China, the United States, the Russian Federation, India and Saudi Arabia.

How would Europe justify its decision if, in a peacekeeping operation, it sent an army of men and women to fight against LAWS developed by states that do not share its political and ethical considerations? The European Parliament should work together with the GGE to devise a solution that is as realistic and rational as possible while remaining in line with the principles and values on which Europe is founded.

Europe must weigh in on the debate on LAWS to maintain its influence at the international level. Moreover, for the sake of its peacekeeping objective, Europe should not categorically prevent the development of LAWS, so as to maintain its deterrent capability.

(1) European Parliament Resolution of 12 September 2018 on autonomous weapon systems (Alain Bensoussan Website).
(2) Definition of “lethal autonomous weapon systems” proposed by the European Parliament: weapon systems without meaningful human control over the critical functions of selecting and attacking individual targets.
(3) Group of Governmental Experts (GGE).

About the Lexing® Network

Lexing® is the first international network of lawyers dedicated to digital and technology law. It was created on the initiative of Alain Bensoussan, the founder and managing partner of Alain Bensoussan-Avocats, a law firm headquartered in Paris (France) specializing in IT and new technologies.