Warfare Will Be Revolutionized: UN Debates Autonomous Weapons, Many Call for Ban

The United Nations has convened a week-long meeting on lethal autonomous weapons systems, as a number of groups and academics call for an international ban on such weapons.

Talks on lethal autonomous weapons systems began at the United Nations on November 13, amid calls for an international ban on independent "killer robots" that could revolutionize warfare. The discussions, held under the banner of the Convention on Certain Conventional Weapons, are scheduled to last all week.

The summit in Geneva comes after more than 100 prominent figures in technology and science co-signed a letter in July warning that such weapons systems could lead to a "third revolution in warfare."

"Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. The deadly consequence of this is machines — not people — will determine who lives and dies," the letter read.

While there is presently no international consensus on what constitutes a lethal autonomous weapon system, they are often defined as systems capable of targeting or firing without meaningful human control, functioning independently via artificial intelligence and machine learning.

"All weapon systems, including Lethal Autonomous Weapons Systems, must comply with the rules, norms and principles of international law." EU statement today at @UNGeneva https://t.co/0TOjqUlgLf

Many modern weapons, including drones, precision-guided munitions and defense batteries, already operate with varying levels of autonomy, but remain subject to some degree of human control, though the average citizen may underestimate how automated and computerized warfare has already become. The distance between humans and battlefields is already significant in some cases, and is growing constantly.

Nonetheless, lethal autonomous weapons systems do not literally exist today — or at least, have not been revealed to the public yet. Perhaps the closest any military has to such a device is Israel's Harpy anti-radar drone — after launch, it flies over an area to find radar fitting predetermined criteria, then strikes.

South Korea has also developed a sentry gun system to guard its border with North Korea; it combines surveillance sensors, target tracking and automatic firing, and can be configured to operate fully autonomously, although it currently engages targets only with human approval.

"If this trend towards autonomy continues, the fear is humans will start to fade out of the decision-making loop, first retaining only a limited oversight role, and then no role at all. The US and others state lethal autonomous weapon systems 'do not exist' and do not encompass remotely piloted drones, precision-guided munitions, or defensive systems," the Campaign to Stop Killer Robots said in a statement.

The NGO added that while the capabilities of future technology are uncertain, there are "strong reasons" to believe fully autonomous weapons could never replicate the full range of inherently human characteristics necessary to comply with international humanitarian law. Moreover, it claimed existing mechanisms for legal accountability are ill-suited and inadequate to address the unlawful harm fully autonomous weapons could cause.

Gone Awry

Several countries, including the US, Russia, China and Israel, are researching or developing lethal autonomous weapons systems, and the implications of their use are not entirely negative. Advanced sensors and guidance systems could improve targeting accuracy and produce fewer unintended casualties than traditional bombing.

Retired US Colonel Brian Hall, autonomy program analyst at the Joint Chiefs of Staff, praised the technology in July, arguing it would augment human decision-making rather than replace it.

"The role of humans will aggregate at the higher cognitive processes such as operational planning and analysis. Increased automation or autonomy can have many advantages, including increased safety and reliability, improved reaction time and performance, reduced personnel burden with associated cost savings, and the ability to continue operations in communications-degraded or -denied environments," he wrote.

However, Hall acknowledged that the rapid pace of scientific and technological advance makes the future capabilities of autonomous weapons difficult to predict, and that legal and policy changes will be required.
