I should like to thank you, Mr Speaker, for allowing me this debate to bring to the House’s attention the issue of lethal autonomous robotics, or LARs, which are sometimes referred to as “killer robots”. I have the privilege of being the vice-chair of the all-party parliamentary group on weapons and the protection of civilians, and I wish to raise the issue of what plans the Government have to engage in international talks to try to limit, through means such as international regulation, the development and proliferation of such weapons. I believe that the UK has a key role to play in international talks and that without concerted international agreement and pressure it is unrealistic to anticipate that many individual states will pause in their drive for ever-increasing technological advantage.

This debate is timely because just over a fortnight ago, on 29 May, Christof Heyns, the UN special rapporteur on extrajudicial, summary or arbitrary executions, presented a report on lethal autonomous robotics to the UN Human Rights Council in Geneva. The following day, 24 states took part in discussions on the report, and with the exception of the UK, they all agreed on the need for further debate. Germany and the United States were among those who expressed a particular willingness for further international discussions. Brazil and France urged the need for an arms control forum, and suggested using the convention on certain conventional weapons. That is a framework convention with protocols on specific issues. It is the mechanism that was used to make an international legal agreement to ban the use of blinding lasers before they were ever deployed.

Before returning to the need for international dialogue, I want briefly to explain what we mean by lethal autonomous robotics and why we need to take action now. I also want to highlight some of the concerns raised in the UN special rapporteur’s report. The term LARs refers to

“robotic weapons systems that, once activated, can select and engage targets without further intervention by a human operator”.

The key element is that the robot has the power to “choose” a target independently and to “decide” to use lethal force against that target. That element of full autonomy means that LARs represent more than just a game-changing development in weapons technology. They represent a revolution.

LARs have sometimes been grouped with modern unstaffed weapons systems, such as remotely piloted aircraft systems, sometimes called unmanned aerial vehicles but more commonly known as drones. However, they go a considerable step further than drones. LARs are fully autonomous weapons systems which, once activated, can select and use lethal force against targets without further human intervention. The key departure from existing military technology—the factor that differentiates LARs from unmanned weapons systems such as drones—is the absence of human intervention once a fully autonomous weapons system has been activated. A robot would be able to make the decision to kill a human being, which has never been the case before. For that reason, LARs would constitute not an upgrade of the weapons that are currently in our arsenals, but a fundamental change in the nature of war. LARs explode our legal and moral codes, which assume that the decision-making power of life and death will be the responsibility of a human being, never a machine.

It is the natural horror of a scenario in which a robot could decide to kill a human that has led to the description of LARs as “killer robots”. They are also sometimes described as “fully autonomous weapons”. Whatever the label, no lethal, fully autonomous weapons system has yet been deployed, but we need urgent action now, before further technological development and investment make a race toward killer robots impossible to stop. Make no mistake: technological know-how is widespread, and it is estimated that more than 70 countries have military robotics programmes. The United Kingdom is a leader in the field of sophisticated high-tech military industries, and is therefore at the forefront of development of the types of technology that could be used in LARs.

Inevitably, much of the development of LARs worldwide is shrouded in secrecy, including development in the UK. What we do know is that weapons technology is developing at an ever-increasing pace, and it is therefore very difficult to determine how close we are to the production of LARs that are ready to be used. Weapons systems with various degrees of autonomy and lethality are already being developed. One is the UK’s Taranis system, a jet-propelled combat drone prototype that can search for, identify and locate enemies autonomously, and can defend itself against enemy aircraft without human intervention. It is clear that LARs are not a fantasy of science fiction, or a technology belonging to the distant future; they are a real possibility for our time.

The considered, comprehensive and balanced report by Christof Heyns, which was published on 9 April, raised a plethora of concerns about LARs. First, it drew attention to the moral dilemmas presented by them.

Does my hon. Friend agree that the gravest danger posed by these weapons is their perpetuation of the philosophy that might is right? Is it not the case that, while the use of sophisticated technology in certain countries against other, unsophisticated countries may secure victories in the short term, huge resentments will be built up because of that difference in technology, and will leave a legacy of continuing conflict?

My hon. Friend is right. There will be a huge imbalance between countries that have these technologies and the potential to use them, and countries that do not.

LARs increase the distance, physical and emotional, between weapons users and the lethal force that they inflict. Drones already offer the states which deploy them the military advantage of being able to carry out operations without endangering their own military personnel, and thus distance the operators from the action. LARs would take that a crucial step further by lessening the weight of responsibility felt by humans when they make the decision to kill. They could lead to a vacuum of moral responsibility for such decisions.

Secondly, LARs give rise to legal issues. Given that they would be activated by a human being but no human being would make the specific decision to deploy lethal force, it is fundamentally unclear who would bear legal responsibility for the actions performed by them. If the legal issues are not tackled, an accountability vacuum could be created, granting impunity for all LARs users. Furthermore, robots may never be able to meet the requirements of international humanitarian law, as its rules of distinction and proportionality require the distinctively human ability to understand context and to make subjective estimates of value. The open-endedness of the rule of proportionality in particular, combined with complex circumstances on a battlefield, could result in undesired and unexpected behaviour by LARs. It is clear that existing law was not written to deal with LARs.

Thirdly comes a multitude of terrifying practical concerns. The lowered human cost of war to states with LARs, as my hon. Friend Paul Flynn pointed out, could lead to the “normalisation” of armed conflict. A state with LARs could choose to pit deadly robots against human soldiers on foot, presenting the ultimate asymmetrical situation. States could be faced with the temptation of using LARs outside of armed conflict, finding themselves able to eliminate perceived “troublemakers” anywhere in the world at the touch of a button. LARs could be hacked or appropriated, possibly for use against the state, and they could malfunction, with deadly consequences.

This report corroborates the revolutionary difference between LARs and any previous weapons system, and proves the following: that our current understanding of the nature of war cannot support them; that our existing legislation cannot regulate them; and that we cannot predict the effects that they may have on our future world.

What is called for worldwide in response is both an urgent course of action, and a mutual commitment to inaction: immediate action to ensure transparency, accountability and the rule of law are maintained; and agreement to inaction in the form of a global moratorium on the testing, production, assembly, transfer, acquisition, deployment and use of LARs until an international consensus on appropriate regulations can be reached.

Will the Minister explain the Government’s position on the recommendations in the UN report? It calls on all states to do the following: put in place a national moratorium on lethal autonomous robotics; participate in international debate on lethal autonomous robotics, and in particular to co-operate with a proposed high-level panel to be convened by the UN High Commissioner for Human Rights; commit to being transparent about internal weapons review processes; and declare a commitment to abide by international humanitarian law and international human rights law in all activities surrounding robotic weapons.

At the UN Human Rights Council in Geneva, a large number of states expressed the need to ensure legal accountability for LARs and pledged support for a moratorium. The UK was the only state to oppose a moratorium. Did the UK really consider existing law to be sufficient to deal with fully autonomous weapons, and was it completely dismissing the idea of national moratoriums on the development and deployment of LARs? What evaluation of the recommendations for an international moratorium, for transparency over weapons review processes, for discussion of the limits of international humanitarian law and international human rights law, and for engagement in international dialogue did the Government carry out in advance of the debate in Geneva two weeks ago? I believe the UK should take a leading role in limiting the use of LARs, and use our considerable standing on the world stage to bring nations together to negotiate.

The UN report recommended a “collective pause”—time to reflect and examine the situation with open eyes, before the demands of an arms race, and of heavy investment in the technology, make such a pause impossible. Only with multilateral co-operation can an effective moratorium be achieved. As Christof Heyns observes, if nothing is done,

This could form the positive basis of a strong policy, but further clarification and explanation are urgently required, and there has been no mention of a moratorium.

In November 2012 the USA outlined its policy and committed itself to a five-year moratorium. In a Department of Defense directive, the United States embarked on an important process of self-regulation regarding LARs, recognising the need for domestic control of their production and deployment, and imposing a form of moratorium. The directive provides that autonomous weapons

“shall be designed to allow commanders and operators to exercise appropriate levels of human judgement over the use of force.”

Specific levels of official approval for the development and fielding of different forms of robots are identified. In particular, the directive bans the development and fielding of LARs unless certain procedures are followed. The UN report notes that this important initiative by a major potential LARs producer should be commended and that it may open up opportunities for mobilising international support for national moratoriums.

During a Westminster Hall debate on 11 December 2012, my hon. Friend Mr Jones, the Opposition Defence spokesman, expressed support for the move by the United States to codify the use of UAVs. He suggested that the UK examine whether it should, in addition to existing law, have a code covering: the contexts and limitations of usage; the process for internal Government oversight of deployments; command and control structures; and acceptable levels of automation. The Minister who responded to the debate rejected that suggestion on the grounds of operational security—this may be one of the big stumbling blocks.

However, now that we are talking about the development of LARs, we do need greater clarity, both in respect of UK policy and on the international stage. Existing international humanitarian law and international human rights law never envisaged weapons making autonomous decisions to kill. Deciding what to do about LARs is not like simply banning a chemical agent—it is far more complex than that. We are talking about technological know-how that can be used in so many different ways, so we need to sit down with other countries to look at the limitations of international humanitarian law and international human rights law.

I have listened to what the hon. Lady has said. Does she agree that the only way forward is an explicit ban, because international humanitarian law was, as she said, written before anyone could contemplate fully autonomous weapons? Does she agree that the most important thing is for human beings to make morally based decisions to stay within the law, and that the only way forward is a full ban?

The hon. Lady makes a very valid point, which shows why the negotiations are so crucial. We need to define exactly what is meant by LARs and examine that international law to see what we can do to regulate all the appropriate weapons. If we are to make progress on banning LARs, nations need to be clear about exactly what we mean.

We then need to look at what mechanisms could be used. One suggestion would be to use the convention on certain conventional weapons, the mechanism used to make an international legal agreement on a pre-emptive ban on blinding lasers. Another option would be to use the process that led to 107 states adopting the convention on cluster munitions, five years ago last month. That treaty was groundbreaking for three main reasons: first, it banned an entire category of weapons; secondly, it brought a ban into existence before the use of cluster munitions had become widespread; and, thirdly, the treaty process was multilateral, shaped through the initiative and sustained leadership of the Norwegian Government, with a strong partnership between states and organisations working together towards a clear common goal.

The UK needs to be at the forefront of the debate on LARs. Now is the time for further international discussion. Now is the time to encourage a wide range of states to adopt a moratorium on the development and deployment of LARs until a new international legal framework has been developed that takes account of the potential of LARs and lays the basis for discussion on their future regulation or prohibition. I very much hope that we will see the UK taking a lead on this matter.

I thank Nia Griffith, not only for bringing a very serious matter to the House and explaining it clearly, but for her immense courtesy this afternoon in sending us a copy of her speech, which enabled me to discuss it with officials and therefore answer the four key questions that she has raised.

I thank the hon. Lady for bringing the issue of lethal autonomous robotics before Parliament. It is clear from this debate and the one recently at the UN Human Rights Council in Geneva that this is an important subject which will inevitably become ever more so as technology develops. Let me clarify the scope of today’s debate. I agree with her that LARs are weapon systems which, once activated, can select and engage targets without any further human intervention. Her definition was correct and it is clearly one step on from drones, which have a human component—I will come back to discuss that in a moment.

Let me be very clear and back up the comments made by my noble Friend Lord Astor in the other place and quoted by the hon. Lady. He stated that

Let me reiterate that the Government of the United Kingdom do not possess fully autonomous weapon systems and have no intention of developing them. Such systems are not yet in existence and are not likely to be for many years, if at all. Although a limited number of defensive systems can currently operate in automatic mode, there is always a person involved in setting the parameters of any such mode. As a matter of policy, Her Majesty’s Government are clear that the operation of our weapons will always be under human control as an absolute guarantee of human oversight and authority and of accountability for weapons usage.

By putting that information on the record I hope to make it clear that we share the concern that the hon. Lady has brought before the House, which others share, about possible technological developments. My argument is that the UK believes that the basis of international law governing weapons systems would prevent the development of weapons in the way that she suggests, but whether or not that is the case, the UK’s position on wishing to develop such weapons is absolutely clear.

The United Kingdom always acts fully in accordance with international humanitarian law and international standards. We are committed to upholding the Geneva conventions and their additional protocols and encourage others to do the same. We always ensure that our military equipment is used appropriately and is subject to stringent rules of engagement. I shall discuss that in more detail later.

I thank the hon. Lady for her summary of the report presented by Christof Heyns, the special rapporteur on extrajudicial, summary or arbitrary executions, which was discussed in Geneva on 30 May. Let me summarise the report. Mr Heyns highlighted that the “possible” use of lethal autonomous robotics raises far-reaching concerns about the protection of life during war and peace. In his findings, he recommended that states establish national moratoriums on aspects of lethal autonomous robotics and called for the establishment of a high-level panel to produce a policy for the international community on the issue.

The hon. Lady asked whether the Government were willing to accept the four recommendations made in the report. I believe the point she particularly wanted to discuss was the question of why, as she said, the UK was the only state that did not support a moratorium. Let me make things a little more clear, if I may. The UK has unilaterally decided to put in place a restrictive policy whereby we have no plans at present to develop lethal autonomous robotics, but we do not intend to formalise that in a national moratorium. We believe that any system, regardless of its level of autonomy, should only ever be developed or used in accordance with international humanitarian law. We think the Geneva conventions and additional protocols provide a sufficiently robust framework to regulate the development and use of these weapon systems.

As I had the chance to read the hon. Lady’s speech before the debate, I noticed that she used the phrase “Furthermore, robots may never be able to meet the requirements of international humanitarian law”. She is absolutely correct; they will not. We cannot develop systems that would breach international humanitarian law, which is why we are not engaged in the development of such systems and why we believe that the existing systems of international law should prevent their development.

The basis of the Government’s argument, made by me and by my noble Friend in the other place, is that the system of law and conventions that govern the development of weapons would prevent anyone from developing the weapon in such a manner as the hon. Member for Llanelli has suggested. It would not fit export criteria, so I do not think that we are at odds on that. The issue is whether the legal framework is sufficiently robust to prevent that. The United Kingdom, having made its own decision that it is not developing these weapons, believes that the basis of the legal system on weaponry is such as to prevent that development.

Can the Minister explain the distinction that he makes? In a meeting held in this place, one of the noble Lords with great experience in the Navy gave an example of a weapon that is used now which, once the parameters have been set, would work entirely automatically without any human intervention. What is the difference between that and the prospect of fully autonomous weapons?

My understanding, having discussed this with officials, is that it is the setting of the parameters that is the human element. For example, once the parameters of an existing weapons system had been set so that it would seek to identify and defend itself against missiles coming at one of our ships in a situation of conflict, plainly an operator would not be needed to press the button each second to fire off the missiles; the system would do that automatically. That is an automatic system where the parameters have been set. What is envisaged through lethal autonomous robotics is a step beyond that, which no one has reached. To use the definition that the hon. Member for Llanelli gave right at the beginning and which I cited, that would be weapons systems which, once activated, could select and engage targets without any further human intervention. Those are not drones; it is a step beyond.

The hon. Lady has rightly observed that this is a complicated area, where further international discussion would help to clarify the legal and political implications of the possible future development of this technology.

Like others, we think that the Human Rights Council is not the right forum for the discussion, but we stand ready to participate in the international debate and we agree that the convention on certain conventional weapons seems the right place for this important issue.

The hon. Lady asked why the UK was the only country to resist the call for a moratorium. I have set out our willingness to adopt a more restrictive policy than the legal freedom afforded, and our commitment to uphold international humanitarian law and to encourage others to do the same. I do not believe that our approach is so different from that of the United States and our European allies.

We did not interpret the discussion in Geneva in quite the same way as the hon. Lady. We believe that French and US attitudes are very similar to our own. Although some states spoke in favour of some sort of regulation or control, many did not, and we should not take that as universal support for a moratorium, given the number of states that did not express a view. Our sense is that support for a moratorium is far less than indicated by the hon. Lady. That does not in any way negate the concerns, but we are not quite sure that people are where she suggests in relation to a moratorium.

The law of armed conflict already addresses the ethical and moral aspects of these weapons systems to ensure adherence to principles of discrimination, proportionality, military necessity and humanity to protect people from unnecessary suffering. The selection and prosecution of all targets is always based on rigorous scrutiny which complies with international humanitarian law, UK rules of engagement and targeting policy.

The hon. Lady also asked me to elaborate on what the Government mean by human control and what level of human control they believe is sufficient, which is also the point behind the question asked by Paul Flynn. Targets will always be positively identified as legitimate military objectives with an appropriate level of command authority and control in their selection and prosecution. The UK is legally obliged to ensure that all weapons and associated equipment that it obtains or plans to acquire or develop comply with the UK’s treaty and other obligations in accordance with international humanitarian law. We do this through legal weapons review. For equipment to be procured, it must satisfy those key legal principles. The policy on the necessity, responsibility and conduct of article 36 reviews will be placed in the Library of the House.

International humanitarian law was designed to withstand future changes in technology. Although we have been discussing matters that are still far beyond the present technology, we believe that the legal system has in mind such future developments. We encourage all states to meet their obligations under international humanitarian law. We believe that the development and use of weapons should always be fully compliant with international law, including the Geneva conventions. We are working closely with the Government of Switzerland and the International Committee of the Red Cross on an initiative to strengthen compliance with international humanitarian law, and one of our primary objectives for the arms trade treaty was that it should put compliance with international humanitarian law at the heart of Governments’ decisions about the legitimate arms trade. We have voiced, and will continue to voice, our concerns with those states that do not live up to their obligations.

As I mentioned earlier, the United Kingdom does not have fully autonomous weapon systems, and the Ministry of Defence’s science and technology programme does not fund research into fully autonomous weapons. No planned offensive systems are to have the capability to prosecute targets without involving a human in the decision-making process.

There are a number of areas where United Kingdom policy is currently more restrictive than the legal freedoms allowed. We consider that to be entirely prudent. However, we cannot predict the future; we cannot know now how this technology will develop. Given the challenging situations in which we expect our armed forces personnel to operate now and in the future, it would be wrong to deny them legitimate and effective capabilities that can help them to achieve their objectives as quickly and safely as possible. We have a responsibility to the people who protect us, and must therefore reserve the right to develop and use technology as it evolves in accordance with established international law. Our current position on the development of these weapons is very clear, and I thank the hon. Member for Llanelli for giving me this opportunity to explain that to the House.