UN Panel Agrees to Move Ahead With Debate on 'Killer Robots'

U.N. officials say fully autonomous, computer-controlled weapons do not yet exist, but defining exactly what killer robots are and how much human interaction they involve was a key focus of the meeting.

A U.N. panel agreed Friday to move ahead with talks to define and possibly set limits on weapons that can kill without human involvement, as human rights groups said governments are moving too slowly to keep up with advances in artificial intelligence that could put computers in control one day.

Advocacy groups warned about the threats posed by such "killer robots" and aired a chilling video illustrating their possible uses on the sidelines of the first formal U.N. meeting of government experts on Lethal Autonomous Weapons Systems this week. More than 80 countries took part.

Ambassador Amandeep Gill of India, who chaired the gathering, said participants plan to meet again in 2018. He said ideas discussed this week included the creation of a legally binding instrument, a code of conduct or a technology review process.

The Campaign to Stop Killer Robots, an umbrella group of advocacy organizations, says 22 countries support a ban on the weapons and the list is growing. Human Rights Watch, one of its members, called for an agreement to regulate them by the end of 2019 — admittedly a long shot.

The meeting falls under the U.N.'s Convention on Certain Conventional Weapons — also known as the Inhumane Weapons Convention — a 37-year-old agreement that has set limits over the years on the use of arms and explosives such as mines, blinding laser weapons and booby traps.

The group operates by consensus, so the least ambitious goals are likely to prevail, and countries including Russia and Israel have firmly staked out opposition to any formal ban. The United States has taken a go-slow approach, rights groups say.

Because fully autonomous, computer-controlled weapons do not yet exist, U.N. officials say, defining exactly what killer robots are and how much human interaction they involve was a key focus of the meeting. The United States argued that it was "premature" to establish a definition.

The concept alone stirs imaginations and fears, dramatized in Hollywood science-fiction films that depict uncontrolled robots deciding on their own to fire weapons and kill people.

Ambassador Gill played down such concerns.

"Ladies and gentlemen, I have news for you: The robots are not taking over the world. So that is good news, humans are still in charge ... We have to be careful in not emotionalizing or dramatizing this issue," he told reporters Friday.

The United States, in comments presented, said autonomous weapons could help improve guidance of missiles and bombs against military targets, thereby "reducing the likelihood of inadvertently striking civilians." Autonomous defensive systems could help intercept enemy projectiles, one U.S. text said.

Some top academics like Stephen Hawking, technology experts such as Tesla founder Elon Musk and human rights groups have warned about the threats posed by artificial intelligence, amid concerns that it might one day control such systems — and perhaps sooner rather than later.

"The bottom line is that governments are not moving fast enough" said Steven Goose, executive director of arms at Human Rights Watch. He said a treaty by the end of 2019 is "the kind of timeline we think this issue demands."