Are These 'Killer Robots' or a Smart Move?

Nov. 22, 2012


Why Human Rights Watch Opposes Fully Autonomous Weapons

"Human Rights Watch and Harvard Law School's International Human Rights Clinic believe that such revolutionary weapons would not be consistent with international humanitarian law and would increase the risk of death or injury to civilians during armed conflict. A preemptive prohibition of their development and use is needed."

The report notes that roboticists have suggested developing robots that would be able to use algorithms to "analyze combat situations," and the use of artificial intelligence that would attempt to mimic human thought, but points out that "these rules can be complex and entail subjective decision making, and their observance requires human judgment."

Human Rights Watch argues that robots would not have the restraint provided by human emotion or the capacity for compassion, and goes so far as to suggest that they could "serve as tools of repressive dictators seeking to crack down on their own people without fear their troops would turn on them."

All lethal weapons currently in use still involve some human interaction.

Why the U.S. Military Thinks There are Benefits

"Militaries value these weapons because they require less manpower, reduce the risks to their own soldiers, and can expedite response time," according to the report.

The lack of emotion that Human Rights Watch cited as a concern could also work in the military's favor. Proponents argue that automated weapons would not kill out of fear or rage, and would therefore be less likely to kill irrationally.

The report notes that the U.S. Department of Defense wrote in "Unmanned Systems Integrated Roadmap FY 2011-2036" that it "envisions unmanned systems seamlessly operating with manned systems while gradually reducing the degree of human control and decision making required for the unmanned portion of the force structure."

U.S. Army Lt. Col. James Gregory said during an interview with ABC/Univision News that the Department of Defense "is not currently reviewing any autonomous weapon systems."

He added in an email that "operators controlling [weapons such as the Predator drone] undergo extensive protocols whenever any lethal force is employed. Regardless of their physical location, weapon system operators must comply with the same standards for the use of force -- the law of war and applicable rules of engagement."

As the report acknowledges, both the U.S. Department of Defense and the U.K. Ministry of Defence have said that they do not plan, for the foreseeable future, to remove human control from the use of unmanned weapons.