Automaton army

Ronald Arkin doesn't like war, but the Georgia Tech professor and Department of Defense researcher thinks it could be more humane with the help of armed robots programmed to make ethical decisions. Sound far-fetched?

The U.S. military already uses robots to fire on enemy targets: Two examples are the missile-bearing Reaper and Predator drones. Ground-based robots at the military's disposal can disarm roadside bombs or, in the case of the Modular Advanced Armed Robotic System, launch grenades and fire on enemies with a machine gun that swivels 360 degrees.

In all these cases, the robot is remotely controlled by humans, who make the final decision whether to fire. But Arkin's team is developing a software architecture that would "embed ethical constraints into the decision-making process," allowing the robot to make such a decision within the bounds of international laws of war, such as the Geneva Conventions. Arkin told me robots could use image recognition technology to distinguish between combatants and noncombatants, and choose "the right weapon" for a particular situation.


Choosing the right weapon might mean using a sniper rifle instead of a grenade to avoid damaging a building or injuring more people than necessary. A soldier under heavy fire might opt for the grenade, but a robot has no motive for self-preservation. Arkin points to a 2006 Army Surgeon General survey of U.S. troops in Iraq that found 10 percent of soldiers and Marines admitted to mistreating noncombatants or their property, behavior that was more common among troops struggling with anger or those who had seen heavy combat.

Robots could behave better in certain situations, Arkin believes, because they aren't influenced by the emotions that can cloud soldiers' judgment. "We are not saying that we are intending to embed the full moral faculties of the human brain into a robotic system," he said, but for a mission with a narrow objective, an ethically constrained robot could protect both soldiers and civilians in a war zone.

No such "ethical" robots have been built yet. Critics of Arkin's approach say only humans can make complex moral decisions, and some concerned activists have called for restricting the development of all military robots. Arkin thinks the advent of lethal, autonomous weapons is inevitable, and as a 25-year veteran of the robotics field, he feels some personal responsibility to ensure people "proactively manage this technology, rather than reactively. ... If you're involved in creating something, you want to make sure that it doesn't end up poorly."

Megatakedown

The FBI shut down one of the world's most popular file-sharing websites, Megaupload, with 150 million registered users, on Jan. 19, while federal prosecutors charged the site's operators with criminal copyright infringement totaling half a billion dollars in damages. Police in New Zealand arrested Megaupload founder Kim Dotcom (aka Kim Schmitz) only after breaking into a safe room in Dotcom's mansion, where the 37-year-old was known to carry on a playboy lifestyle. Legal experts disagree over whether the government's case against Megaupload will prevail based solely on the site's business model, but some said emails from Megaupload employees show they encouraged the sharing of copyrighted songs and videos.

Daniel James Devine

Daniel is a reporter for WORLD who covers science, technology, and other topics in the Midwest from his home base in Indiana. Follow Daniel on Twitter @DanJamDevine.