The article says "We have to manage the ethics of the scientists making the robots and the artificial ethics inside the robots."

I know most of you will be thinking 'oh, robots are nowhere near advanced enough to even bother with ethics at the moment.' You may be right about the 'artificial ethics,' but what about the ethics of you as a builder?

Almost two years ago I went about an hour north of Baltimore (in the middle of nowhere) to interview with GDRS (General Dynamics Robotics Systems) for a job (http://www.gdrs.com/). Seeing actual robot tanks driving around outside the offices, I ran straight into the ethical question of 'should I really build robots purely designed to kill people?!?' They also told me about a guy who got fired at GDRS because, paraphrased, 'he went psycho with a change in ethical thought.'

Since then, it's really made me think about military robots . . . The robots I am working on at my current job (for the Navy) are thankfully not for killing . . . but what if you were offered a job building robots designed to do something you were not comfortable with (sex, killing, spying, etc.)? Would you take it? Would you support giving a robot sentience? What about making robots that care for your kids while you are at work?

After all, robots will be built modeled on the ideals and values of their builders . . .

The majority of my work is defense-related, and has been for many years, so I had to settle the ethics questions in my head a long time ago. I have refused only a few projects over the years for ethical reasons. One was a non-military application – a transformer set and new controller for Florida’s electric chair, “Old Sparky”. Another was a fuze-tester for nuclear warheads. No thanks. There have been others – but mainly because of ethical conflicts other than the nature of the projects themselves. As far as robots as weapons – it’s coming, and fast. The DOD has issued a directive that within the next 14 years, approximately 30% of all U.S. combat vehicles will either be robotic or capable of being remotely controlled. Some of those will be ARVs – Armed Robotic Vehicles. I don’t have a problem with the idea of a weaponized robot per se – but I don’t think they should have the ability to make a target decision without direct human oversight. The automatic gun video posted on this site is one example of that type of weapon; the CROWS system – a remote-controlled, vehicle-mounted gun operated by a human – would IMHO be a more ethical weapon to field.

I’m curious about the Navy robots Admin mentioned – If you can say; are these land-based robots along the lines of the PackBot, or sea-based, like the Minehunter RMV?

I think a lot of people look on the downside of robotics in combat.... If an angry mob of civilians is attacking a robot, you don't endanger any human lives by choosing not to open fire.... But if you are a human being attacked by an angry mob of civilians, it's a you-or-them mentality....

See, it all depends who is controlling the robot and to what end. A country that can send hardware to do a "job" without risking its own personnel is far more likely to use them to resolve political situations.

It's taking the current difference in technological levels that extra step. At the moment a government has to make a moral decision on behalf of its military and civilian personnel. With robots in the field, that becomes a purely financial decision.

Yeah, how effective is a non-lethal tyranny? Where is the fear in that control? That just makes the populace more powerful, if a tyrant isn't willing to use lethal force against a populace that is willing to use lethal force against the tyrant...

I’m curious about the Navy robots Admin mentioned – If you can say; are these land-based robots along the lines of the PackBot, or sea-based, like the Minehunter RMV?

It's mostly basic science: developing a novel propulsion system for a sea-based vehicle.

I used to argue that a war between two armies of robots was much more moral than a war between two armies of humans, and hence that developing military robots was morally justified . . . But that's before modern warfare – technologically advanced nations vs. guerrilla groups – came to pass . . . Would we be so concerned with finding an Iraq solution if the US only had robot 'casualties'?

I don’t believe the robot will ever entirely replace the infantryman on the ground as a combat fighter. The direction I see the development going right now, is to set up better expendable tools for the most dangerous or mundane missions. I can see a fleet of robotic “mobile listening posts”, deployed to forward, vulnerable positions as a tactical early warning system. I can see robotic “Leader-Follower” convoys delivering bullets and beans to troops deployed in forward areas, without endangering other men just to provide drivers for the vehicles. I can see robotic rescue vehicles, sent on their own into high-risk areas, to pull out overwhelmed troops, without unduly risking the lives of their brothers. Any of these machines will need to be able to defend themselves though, and that means armor and weapons – hopefully with a human operating the trigger.

As far as governments being more willing to risk machinery than their own men – well, I guess that is as it should be. Maybe I’m a skeptic, but I really haven’t seen all that much evidence that there is any great hesitation by most countries to use force or to risk their personnel for what appear to be political reasons, as it is. Better to order up a couple of replacement robots, than to call 8 families back home with a “regret to inform you” letter.

True, but one could argue that despite human casualties or machine casualties, just leaving them to their own devices is still a morally weak point of view... The change of strategy is to prevent more human loss of life... It's hard to argue whether we are or are not doing the 'right thing' by staying in Iraq, and my argument isn't really directed at Iraq in particular... just the fact that maybe doing the right thing will always cost something. So why not make it cost fewer human lives and take the sting out of the cost...

I really don't want to get into the argument of Iraq, because then toes get stepped on and people get angry, and that concrete example leads to too much emotion being put into what is supposed to be a logical discussion.

My viewpoint is this: robots are a tool that can be used to keep humans safer. We should use them where appropriate. But if we never put robots to use in real situations and mess up, we can never evaluate their usefulness accurately... We need more data on how robots will be used and what will become standard practice before we condemn their use. It seems like a matter of science to me...

"Non-lethal weapons," while not always non-lethal, are usually designed to inflict pain. It's really a method of torture and harassment. Prison is actually a non-lethal weapon of deprivation. They're not all bad, but they could become that way. When they're used to silence the people, for instance.