Posted
by
samzenpus
on Monday May 12, 2014 @08:51AM
from the with-a-push-of-a-button dept.

concertina226 (2447056) writes "The United Nations will debate the use of killer robots for the first time at the UN Convention on Certain Conventional Weapons (CCW) this week, but human rights activists are calling for the robots to be banned. Human Rights Watch and Harvard Law School's International Human Rights Clinic have published a new report entitled 'Shaking the Foundations: The Human Rights Implications of Killer Robots', which calls for killer robots to be banned to prevent a potential arms race between countries. Killer robots, or fully autonomous weapons, do not yet exist but would be the next step after remote-controlled armed drones used by the US military today. Fully autonomous weapons would have the ability to identify and fire on targets without human intervention, putting compliance with international humanitarian laws in doubt. Among the problems with killer robots highlighted in the report is the risk of criminal liability for a military officer, programmer or weapons manufacturer who created or used an autonomous weapon with intent to kill. If a robot killed arbitrarily, it would be difficult to hold anyone accountable."

Auto-targeting weapons are only a matter of time. If a college student can build a gun that spits out paintballs with high accuracy, then the military's best and brightest likely have systems far superior.

Yes, the UN will debate it, but it will be like the debate on land mines. A lot of hand wringing, but nothing really getting done, and the belligerent parties will still make them.

Right now, it is only a matter of perfecting the manufacturing. I wouldn't be surprised if, in 5-10 years, sentry robots that shoot at anything lacking some form of friendly transponder become the norm at military posts, be they Russian, Chinese, Saudi Arabian, or anywhere else that needs area denial.

Let's be real here... a couple of independently operating robots with high-RPM machine guns are a lot more reliable than soldiers/guards, have no moral issues, have no morale issues, and will "just work". If someone takes one out with a rocket, another can easily return fire.
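For what it's worth, the "shoot anything without a friendly transponder" gate described above is trivially simple logic. Here's a minimal sketch of a challenge-response friend-or-foe check — the key, function names, and HMAC scheme are my own assumptions for illustration, not how any fielded IFF system actually works:

```python
import hmac
import hashlib
import os

SHARED_KEY = b"unit-issued-secret"  # hypothetical pre-shared key

def challenge() -> bytes:
    """Sentry broadcasts a random nonce to anything entering its zone."""
    return os.urandom(16)

def transponder_reply(key: bytes, nonce: bytes) -> bytes:
    """A friendly transponder answers with an HMAC of the nonce."""
    return hmac.new(key, nonce, hashlib.sha256).digest()

def is_friendly(nonce: bytes, reply: bytes) -> bool:
    """Sentry holds fire only on a matching reply; silence or a
    bad reply gets the contact classified hostile."""
    expected = hmac.new(SHARED_KEY, nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, reply)

nonce = challenge()
print(is_friendly(nonce, transponder_reply(SHARED_KEY, nonce)))  # True
print(is_friendly(nonce, b"\x00" * 32))                          # False
```

The point being: the decision logic fits in a dozen lines. Everything hard — and everything the human-rights report worries about — lives in what happens when the check fails.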

Add sentry UCAVs to the mix, and a rocket attack would be answered in kind.

I wouldn't be surprised to see even civilian sites (say, a data center in a rural area) protected by autonomous firing machines soon. Might makes right, and SCOTUS has shown that money is speech, so any casualties from these would carry no criminal/civil consequences ("there was a warning sign"). I would also not be surprised to see this on train tracks and other places where there isn't a real need for it, but where the fear of being gunned down by a robot will keep kids from putting pennies on the tracks.

The Holocaust was conducted by a clearly advanced state, a signatory to many treaties, international obligations, and "laws", none of which made any difference whatsoever once that state decided it didn't care what the rest of the world thought.

But why stop there? Rwanda, Stalin's purges, China's Cultural Revolution, Kashmir, Iran-Iraq (until the U.S. got actively involved), all the U.S. wars against brown people, etc., etc., etc. When has international law, regulation, or even opinion ever changed the conduct of an aggressor nation once it decided to go to war? The reason nukes haven't been used since Nagasaki is only that everyone who has them fears that using them in aggression would trigger a much greater escalation; it has nothing to do with any treaties, laws, or world opinion.