KILLER robots pose the grave threat of violating the most basic of human rights, a law expert has warned in a dire message.

Bonnie Docherty, a lecturer in law at Harvard University, fears the proliferation of weaponised artificial intelligence (AI) could be at odds with the international Convention on Certain Conventional Weapons.

Together with the Human Rights Watch campaign group and the Harvard Law School International Human Rights Clinic, Ms Docherty has called on world governments to ban the development of so-called killer robots.

The call to stop autonomous weapons from being deployed on the battlefield comes days before representatives of more than 70 nations address the issue at a United Nations (UN) gathering in Geneva.

The topic of killer robots will be discussed next week between Monday, August 27, and Friday, August 31, by the signatories of the aforementioned convention.


She said: “We urge countries at this UN meeting to work toward a new treaty that would save people from lethal attacks made without human judgment or compassion.


“A clear ban on fully autonomous weapons would reinforce the longstanding moral and legal foundations of international humanitarian law articulated in the Martens Clause.”

As of April 2018, a total of 26 nations had backed calls to ban killer robots, according to the Campaign to Stop Killer Robots.

One of the nations backing this campaign is China, which has called for a ban on the use of fully autonomous weapons.

Killer robots: Autonomous weapons would not follow the rules and laws of combat (Image: GETTY)

But not everyone agrees with this view, and some support the development of lethal autonomous machines.

Proponents of killer robot technology claim banning such weaponry would stifle the potential benefits it could bring to the table.

The United States is one such nation, having published a Department of Defense directive in 2012 on the “responsibilities for the development of autonomous and semi-autonomous functions in weapon systems”.

The document established a set of guidelines to minimise the risk of smart weapons failing and engaging with unintended targets.

Killer robots: Some countries like the USA have published directives on the issue (Image: GETTY)

Autonomous weapons are described in the document as any “weapon system that, once activated, can select and engage targets without further intervention by a human operator.”

Under these rules, autonomous weapons also include any human-supervised weapon systems that can be overridden by human operators.

Peter Asaro, an associate professor of media studies at the New School, argued in April this year that reaching consensus on the issue could be tricky because it is unprecedented.

He said: “Positive and meaningful action on this issue is still within reach, and it is up to the diplomats at the Convention on Certain Conventional Weapons and their governments to prove that they can work together to address the full range of threats to humanity posed by autonomous weapons.”