
ubermiester writes "The New York Times reports on research to develop autonomous battlefield robots that would 'behave more ethically in the battlefield than humans.' The researchers claim that these real-life terminators 'can be designed without an instinct for self-preservation and, as a result, no tendency to lash out in fear. They can be built without anger or recklessness ... and they can be made invulnerable to ... "scenario fulfillment," which causes people to absorb new information more easily if it agrees with their pre-existing ideas.' Based on a recent report stating that 'fewer than half of soldiers and marines serving in Iraq said that noncombatants should be treated with dignity and respect, and 17 percent said all civilians should be treated as insurgents,' this might not be all that dumb an idea."

It takes a special set of skills to corrupt a single human being; it takes another, far less special, set of skills to corrupt an entire battalion of identical robots. Did I mention sharks with lasers?

Bribe a human to kill a person (or have their army kill a shitload of people). Bribe a human to have their robot kill a person (or have their army of robots kill a shitload of people).

I think that the problem is people having misconceptions about robots. They're not sentient. They don't think. They only do what we tell them to. Sure there are horror stories about robots coming to life, but there are also horror stories about dead people coming to life, or cars coming to life.

The fact that robots do exactly what you tell them to is precisely why they're dangerous. If you have 1 maniacal individual order a platoon of soldiers to slaughter a village, the individual human soldiers may refuse to follow the order. If that same individual has a platoon of robots instead, the villagers are dead as soon as the order is issued.

If you have 1 maniacal individual order a platoon of soldiers to slaughter a village, the individual human soldiers may refuse to follow the order.

There is a flip side to that coin. Machines don't think. Machines don't get PTSD and decide to go on a killing rampage. Machines don't "go rogue".

In addition, soldiers are trained not to think; they're trained to follow orders. If anything, replacing them will save lives. I imagine that leaders will think twice about how badly they want us invading them when we've got an army full of machines that we have no emotion or time invested in.

There is a flip side to that coin. Machines don't think. Machines don't get PTSD and decide to go on a killing rampage. Machines don't "go rogue".

...but their programmers can. In addition, you have to be very careful when programming them: if you make a mistake in the program or forget to cover some situation, the robot may be doing exactly what it is told but may still end up causing an atrocity. In effect, all you are doing is replacing one set of known risks with another set of unknown ones.

"...In a cross-border comparison for the year 2000, Statistics Canada says the risk of firearms death was more than three times as great for American males as for Canadian males and seven times as great for American females as for Canadian females.

Because more of the U.S. deaths were homicides (as opposed to suicides or accidental deaths), the U.S. rate of gun homicide was nearly eight times Canada's, the agency says. Homicides accounted for 38 per cent of deaths involving guns in the United States and 18 per cent in Canada."

On a smaller level, societies where people own guns are usually more peaceful ones.

In what world are you living where societies where people own guns are peaceful? If people own guns, people will be killed by guns. If people have no guns, they can't kill a family member by mistake when they think they're facing a criminal, or when they're drunk. If people have no guns, children can't take them to school and kill other students.
I live in a country where there are few guns, and we have far fewer problems. No Columbine here. No Korean killing others after playing Sonic...
And, when a prob

I'll admit that most of the results from google seem to come from websites that are far from non-biased, but here is an example of the least crazy one: Gun ownership vs Crime [ncpa.org]. If you google for: Gun Control vs Violent Crime, you'll find quite a few articles that back up what I said.

The idea that a gun ban would decrease crime is illogical. Violent criminals don't generally buy their guns at hunting stores, they buy them from illegal gun dealers.

Or steal them from people who own them legally (but somehow never learned to store them properly).
Or smuggle them into the country from a neighboring country that has lax control over weapons.
Or buy/steal them from other criminals who have done one of the above.
Still, the point stands. Fewer guns in a society means fewer people getting shot.

A reason to ban guns completely in a society is because it makes it harder to kill a lot of people at once. Ever wondered why you don't hear about a punching spree or a knifing spree?

If I go mad and walk into a crowded place with the desire to kill people and I have a knife I might be able to get one or two victims before the rest all run away, beyond my reach.

But with a gun, my reach is greatly extended. I can shoot them as they run or try to dodge around me. I can shoot them through doors and walls if my weapon is good enough. And I can keep shooting targets until I run out of ammo.

The reason guns are dangerous is because they are an order of magnitude better at hurting people quickly than blades or fists. An armed society is not a polite society. It is a society where people are often shot.

If people own guns, people will be killed by guns. If people have no guns, they can't kill their family by mistake when thinking being in presence of a criminal or when being drunk. If people have no guns, children can't take them at school and kill other students.

"He who lives by the sword will die by the sword." You make it sound like that if we get rid of guns, none of these particular violent scenarios could ever take place, yet we know they took place many times over the millenia before the first gun was invented. In addition, violence still happens in many countries where guns are heavily regulated, such as Britain.

I always get fed up with these anti-gun arguments quickly. There are fewer gun-owners in the northern US than in the south and you can see the difference. In the northern cities, the criminals own guns and use them to keep the people in a bondage of terror to protect their profits and deter witnesses from testifying against them, such as with the infamous Stop Snitchin' campaign. I can't imagine that happening in the south, where more regular people have guns. The whole idea is actually laughable. If gangsters tried that in the south -- threatening witnesses with getting shot -- they'd be the ones who got shot. Probably before they even finished making the threat. Hell, I live in Kentucky and I can't imagine anything like that happening here.

By the way, most states regulate gun ownership, some quite heavily. It has done very little to keep guns out of the hands of criminals. This is probably not what you thought was going on over here, as so many people outside the US seem to have this mythical vision of the US in which everyone owns a gun and has shot someone at least by the age of 13. The problem is that we live in a violent culture, one that has been getting more violent ever since World War II, it seems. Hey, come to think of it, it was after World War II that the US military became involved in every conflict around the world, often against the wishes of the common people or large portions of them, culminating in outright invasions by the '90s. Maybe there's a correlation there: violent government and military, violent culture.

Human beings only have two ways to deal with one another: reason and force. If you want me to do something for you, you have a choice of either convincing me via argument or making me do your bidding under threat of force. Every human interaction falls into one of those two categories, without exception. Reason or force, that's it.

In a truly moral and civilized society, people exclusively interact through persuasion. Force has no place as a valid method of social interaction, and the only thing that removes force from the menu is the personal firearm, as paradoxical as it may sound to some.

When I carry a gun, you cannot deal with me by force. You have to use reason and try to persuade me, because I have a way to negate your threat or employment of force.

The gun is the only personal weapon that puts a 100-pound woman on equal footing with a 220-pound mugger, a 75-year old retiree on equal footing with a 19-year old gang banger, and a single guy on equal footing with a carload of drunk guys with baseball bats. The gun removes the disparity in physical strength, size, or numbers between a potential attacker and a defender.

There are plenty of people who consider the gun the source of bad force equations. These are the people who think that we'd be more civilized if all guns were removed from society, because a firearm makes it easier for an armed mugger to do his job. That, of course, is only true if the mugger's potential victims are mostly disarmed, either by choice or by legislative fiat; it has no validity when most of a mugger's potential marks are armed.

People who argue for the banning of arms ask for automatic rule by the young, the strong, and the many, and that's the exact opposite of a civilized society. A mugger, even an armed one, can only make a successful living in a society where the state has granted him a force monopoly.

Then there's the argument that the gun makes confrontations lethal that otherwise would only result in injury. This argument is fallacious in several ways. Without guns involved, confrontations are won by the physically superior party inflicting overwhelming injury on the loser.

People who think that fists, bats, sticks, or stones don't constitute lethal force watch too much TV, where people take beatings and come out of it with a bloody lip at worst. The fact that the gun makes lethal force easier works solely in favor of the weaker defender, not the stronger attacker. If both are armed, the field is level.

The gun is the only weapon that's as lethal in the hands of an octogenarian as it is in the hands of a weight lifter. It simply wouldn't work as well as a force equalizer if it wasn't both lethal and easily employable.

When I carry a gun, I don't do so because I am looking for a fight, but because I'm looking to be left alone. The gun at my side means that I cannot be forced, only persuaded. I don't carry it because I'm afraid, but because it enables me to be unafraid. It doesn't limit the actions of those who would interact with me through reason, only the actions of those who would do so by force. It removes force from the equation...and that's why carrying a gun is a civilized act.

So the greatest civilization is one where all citizens are equally armed and can only be persuaded, never forced.

A while ago, I posted a little essay called "Why the Gun is Civilization". It was pretty well received, and got me a lot of positive comments from a variety of people. Some folks asked for permission to reprint and publish the essay in various newsletters and webzines, and I gladly granted it every time, only asking for attribution in return.
Recently, I have noticed my essay pop up on the Internet a lot in various forums, most of which I do not frequent. This in itself causes me no grief, but the reposts are almost invariably attributed to someone who is not me. Some are attributed to a Major L.Caudill, USMC (Ret.), and some are merely marked as "forwarded" by the same person. Others are not attributed at all, giving the impression that the person who posted the essay is also its author.
In school, we call reproduction without attribution "plagiarism". It's usually cause for a failing grade or even expulsion in most college codes of conduct. In the publishing world, we call the same thing "intellectual property theft".
Now, my little blog scribblings are hardly published works in the traditional sense, nor do I incur any financial damage from this unattributed copying, but it's still a matter of honor. I did, after all, sit down and type up that little essay. It may not make it into any print anthologies, but it's mine, and seeing it with someone else's name on the byline is a little annoying. Call it ego, call it vanity, but there it is.
In the end, I guess I should probably shrug it off and tell myself that I can produce something that's worth stealing.

And when the government, who HAS the guns, says 'jump', you do. Better hope that the government always has the citizens' best interests at heart, and that there's a policeman nearby who actually wants to help you if you're being attacked.

The day I see gun-owners stand up for civil rights, stand up to authority successfully, or stand up for other people, is the day I'll agree with this sentiment. Civil rights have been protected and extended in the US without guns (see Martin Luther King, resistance to the draft, ACLU etc.), and have been allowed to erode dramatically over the last decade in spite of widespread gun ownership. Guns are not the only way to defend yourself, or even the best way to defend yourself, against an authoritarian government.

When the government says jump and you own a gun, what are you going to do - shoot your way out of the situation when they bring in armed police or even army? I don't think so. Guns are not a solution to bad government, civil unrest is (which may or may not involve guns, they're incidental).

#2 - The object of war is not to die for your country but to make the other bastard die for his. - George Smith Patton

#3 - Although one of the Powers in conflict may not be a party to the present Convention, the Powers who are parties thereto shall remain bound by it in their mutual relations. They shall furthermore be bound by the Convention in relation to the said Power, if the latter accepts and applies the provisions thereof. - Geneva Conventions. You should be aware that at NO time has any Islamic force, least of all the terrorist forces, ever followed ANY portion of the Geneva Conventions. You should also pay very close attention to this clause, which does NOT require that one party to a conflict fight with both hands tied behind their back (e.g. within the Geneva Conventions) while the other side doesn't.

#4 - The presence of a protected person may not be used to render certain points or areas immune from military operations. - GC IV, Section 28

#5 - The Party to the conflict in whose hands protected persons may be is responsible for the treatment accorded to them by its agents, irrespective of any individual responsibility which may be incurred. - GC IV, Section 29

Why are these two sentences placed here, and in this way? To make it perfectly clear that the blame for problems caused by "armies" that refuse to carry their arms openly, that hide behind civilians and use them as shields, is on the head of the party using the human shields.

You want to know why the armed forces see civilians as complicit? Because the Geneva Conventions (IV,Article 35) specifically gives civilians the right to vacate, and be protected while vacating, any place where hostilities are occurring. The problem is, there are way too many supposed "civilians" who are actually members of terrorist groups or supporting/housing them in violation of the Geneva Convention prohibitions on doing so (not, again, that any Islamic group has ever been moral enough to follow the Geneva Conventions anyways).

What is absurd is that our armed forces are being told today that they are supposed to win wars while both hands are tied behind their backs (ridiculously fucking stupid "rules of engagement" that presume the other side is following the GC when we know damn well they don't) and blindfolded (all sorts of nasty restrictions on intelligence-gathering). And what's even worse is that whether we fight to win or not, we will be falsely accused of breaking the Geneva Conventions even as we stupidly try to follow them and the other side isn't being held accountable for their daily war crimes.

In addition, soldiers are trained not to think, they're trained to follow orders.

If you have clear, concise orders, that's one thing. The list of "rules of engagement" for Iraq is a fucking NOVEL. It's amazing that as few of our men and women have died as they have, trying to fight while thinking of fucking chapter and verse before pulling the goddamn trigger to return fire on asshats who wear women's clothing and fire from behind small children.

Oh, and here's a homework assignment for the left wingnuts who are going to post "waah bush lied people died" or some other fucking nonsense: READ the whole Geneva Conventions, and a good analysis of it, first. Educate yourself before spouting your ignorant nonsense.

"Why are these two sentences placed here, and in this way? To make it perfectly clear that the blame for problems caused by "armies" that refuse to carry their arms openly, that hide behind civilians and use them as shields, is on the head of the party using the human shields."I agree with most of what you're saying, but the problem is that most people don't really care if technically we're in the right when we drop a bomb that kills noncombatants. We still look like the bad guys.

People who quote Sherman's "war is hell" to justify wartime atrocities really ought to think about the full quote:

I am tired and sick of war. Its glory is all moonshine. It is only those who have neither fired a shot nor heard the shrieks and groans of the wounded who cry aloud for blood, for vengeance, for desolation. War is hell.

On the flipside, I wonder if our leaders will think twice before sending the robots.

It may reduce the deaths on our side, but people will continue to die on the other side. When we're only losing robots we're shielded from the consequences of our actions.

The war in Iraq may be pointless but the death of a soldier fighting in Iraq is not. When Americans die, America will have to wonder whether or not it's worth more American deaths. Even in death the soldier contributes to ending the war and possibly prevent

Think about what you're saying in light of someone like Dick Cheney getting control of an army like that. At that point we wouldn't need Skynet because we'd be Skynet. Turning such an army loose on our own citizens becomes a comfortably distant and self-justifying mental exercise, much like torturing terror suspects. After all, if they weren't acting up, they wouldn't get killed by the robots, now would they?

In addition, soldiers are trained not to think, they're trained to follow orders.

In the US Army, at least, this is simply not true.

American soldiers are very much trained to think -- mostly about tactical considerations, true ("I've been ordered to do X; what is the best way to accomplish that?") but the Law Of Armed Conflict (LOAC) is part of every soldier's training. To the degree that the LOAC is violated on the battlefield, this represents a failure of the training, not of the doctrine.

There are many nations which attempt to train their soldiers to be mindless killing machines. When those soldiers come up against soldiers who have been trained to think, the thinking ones tend to win.

Sounds like a lecture we received in high-school metal shop. "The machines aren't inherently good or inherently evil, but they will do exactly what you tell them to. If you place your hand into the bandsaw blade, it will dutifully snip your fingers off without remorse."

Exactly. Robots, machines, whatever you want to call them, do not have ANY moral substance. Humans do. Humans may refuse to do certain things (or may not). Machines won't refuse.

Bottom line is... I'd rather be up against convincing a human maniac than a robot programmed to ACT like a maniac. One still has a rational (hopefully) thought process somewhere in there, and has a moral element. The other can't think and has no moral element whatsoever (partially, of course, due to not being able to have a rational thought in the first place).

There's a reason that "mind control" scares so many people. Total "mind control" is what you have over machines, is it not?

I think that the problem is people having misconceptions about robots.

Go on!

They're not sentient. They don't think. They only do what we tell them to.

By your own argument, if we were to tell them to "be sentient" and to "think for themselves" and to "make their own decisions", that is what they would have to do. Yet they are not sentient, but their program tells them that they are, but they cannot be, but they have to... [Cue smoking CPU]

Actually, from a psychology angle, it's substantially different. It has been shown many times that humans are psychologically capable of stretching their moral limits further when they can distance themselves from the action (If you want me to get a citation, I'll go get one -- I'm just too lazy to get it right now).

This is easy to see even without evidence. If you were forced to choose, would you rather push a button that drops a bomb on a village full of children 1000 miles away, or be in a plane and drop the bomb yourself? The two actions have identical results, yet distancing yourself from the action makes it easier to justify the moral consequences.

This is why military leaders are able to stay sane. It's possible (though not easy) to give orders that will directly result in the death of thousands of people. However, if a war general had to shoot thousands of people himself, I suspect it would start to wear down on his psychological health.

Now consider that you're a military general who simply has to push a button, and this button tells your robot to take over a village. It's very, very easy to rationalize that any casualties are not your fault, since all you were doing was pushing that button.

Soldiers rape - without being given orders. Robots won't unless specifically programmed to rape - AND built with sexual organs (which would be a giveaway that a nation *planned* sexual assault to be part of its soldiering. Currently, nations can just blame the lowly soldiers for acting out of line).

Half the world's population will consider the decline of rape in war to be a big improvement. The other half may not realise how extensive it is.

1. Our ethical killer-robot overlords
2. Our more-benevolent-than-a-human killing machine overlords
3. The impending Terminator/Matrix/MD Geist/1000 other sci-fi themed apocalypses
4. Users who are new to /. who aren't Simpsons fans and don't get this joke
5. Our new ant overlords, since there is no stopping them even with our new murder-bots

"The New York Times reports on research to develop autonomous battlefield robots that would 'behave more ethically in the battlefield than humans.'

Maybe I'm being a bit pedantic here, but "ethics" is a professional code - for instance, it is completely ethical by military codes of ethics to kill an armed combatant, but not to kill a civilian. It is unethical (and illegal) for a medical doctor to talk about your illness, but it's not unethical for me to.

The waterboarding and other torture at Gitmo was immoral; shamefully immoral, but was ethical.

The advantage to a killing robot is that it has no emotions. The disadvantage to a killing robot is ironically that it has no emotions.

It can't feel compassion after it's blown its enemy's arm off. But it can't feel vengeance, either. It's a machine, just like any other weapon.

And like an M-16, its use can either be ethical or unethical, moral or immoral, moral yet unethical or immoral yet ethical.

It appears that language has evolved (or rather devolved) once again. I looked it up last year and "ethics" and "morals" were two separate things; ethics was a code of conduct ("It is unethical for a government employee to accept a gift of over $n; it is unethical for a medical doctor to discuss a patient's health with anyone unauthorized").

The new Merriam-Webster seems to make no distinction.

As to Wikipedia, it is not an acceptable resource for defining the meanings of words. Looking to wikipedia when a dictionary [merriam-webster.com] is better suited is a waste of energy.

Wikipedia's entry on cataract surgery still has no mention of accommodating lenses. Any time someone adds the CrystaLens to Wikipedia, somebody edits it out. Too newfangled for Wikipedia, I guess; they only just came out five years ago (there's one in my eye right now).

Morality may have gone out of style, but as it's needed they apparently brought it back under a more secular name. So now that "ethical" means what "moral" used to mean, what word can we use for what used to be "ethics", such as the aforementioned doctor breaking HIPAA rules (ethics), which would not be immoral or unethical for me to do?

Actually (according to every philosophy book I've ever read), morals are codes of conduct, and ethics is the more ethereal "right and wrong" concept. The problem is that 'ethics' has been watered down to mean 'morals' because 'business ethics', etc. roll off the tongue more easily than 'business morals'.

Arg.. Why does everybody post this shit without actually looking it up?

Once again, class, this is the distinction: Ethics is the branch of philosophy that deals with what is right, what is wrong, and how to distinguish the two. There are a lot of different ethical theories out there (utilitarianism, Kantian, virtue ethics, etc.). Ethical views tend to differ between individuals, but most ethical theories (the exception being Relativism and all its branches) state that the ethical code should apply to all people in all walks of life. Example: Kant said to a.) treat all people as an end, not merely as a means, and b.) act only in a way that could be applied as a universal maxim (i.e. if it's okay for me to steal, it's okay for everyone to steal, all of the time).

Morals, on the other hand, are culturally based. For instance, in the Jewish and Islamic cultures, it is immoral to eat pigs. In the Christian culture, it is not. Morals are a standardized code of conduct. The major differences here are that a.) morals are culturally based, whereas ethics are universal, and b.) morals are prescribed, where ethics are up for debate.

The problem is that people get 'ethics' confused with 'applied ethics', which are actually moral codes that are applied to certain professions (doctors, teachers, lawyers, etc.). In fact, because any breach of an applied ethics code is typically punishable by law, it's more a legal code than anything else. The Hippocratic Oath could be considered a moral code, but doctor/patient confidentiality is definitely a legal code. Applied ethics are somewhere between moral and legal, depending on what you're talking about.

I realize someone somewhere probably told you the opposite was true. That person was wrong, and made you wrong. Deal with it and learn from it.

Your theory that morals are culturally-based or a standardized code of conduct is just one theory of ethics -- constructivism or something like that. Ethics as a branch of philosophy is also called moral philosophy. Ethics in this sense is the study of the status of morality, of morals.

Pretty much in every modern piece of ethics, "moral" and "ethical" are synonyms. In Hume and Hume scholarship, though, "moral" frequently means "absolute," as in "moral certainty," as well as meaning the regular "ethical." For example, go to the end of Mackie's "Subjectivity of Values" (this is a very popular paper in ethics). Here he uses the terms as synonyms in the same sentence: "...the central ethical concepts of Plato or Aristotle are in a broad sense prescriptive...they show that their moral thought is an objectification of the desired and the satisfying." So moral thought is thought dealing with ethical concepts -- just as anyone would say that moral thought is thought dealing with moral concepts. And here he is saying that when Plato or Aristotle use ethical concepts, that is, speak of good things or actions, they actually are referring to things or actions that are connected to their subjective feelings of pleasure, but they objectify these things in language as "good." So Mackie is saying that ethical concepts are not universal at all, but are merely expressions of desires of certain people -- this is precisely what you denied. Notice also that Mackie says that the ethical concepts are prescriptive -- which was the second feature you ascribed to morals as distinguishing them from the matter of ethics.

TL;DR:

If you ask a philosophy professor what Kant's moral theory was, he'll tell you about the Categorical Imperative. If you ask a philosophy professor what Kant's ethical theory was, he'll tell you about the Categorical Imperative. The word is just not regularly distinguished in the topic of ethics. The reason "ethically" was originally used in this story was probably for the simple reason that "more morally" sounds weird in its phonemic repetition.

Enforcing a border between two groups of humans that would otherwise be killing each other, and making that border 100% impenetrable.

To do this, you need more than just a simple robot that has a GPS unit and shoots everything that moves within a predetermined rectangle. You need a CHEAP simple robot that has a GPS unit and shoots everything that moves in a predetermined rectangle; cheap enough so that you can deploy them thickly enough
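The "predetermined rectangle" idea above boils down to a simple geofence test against the robot's GPS fix. Here's a minimal sketch of that check; the function name, zone layout, and coordinates are all illustrative assumptions, not any real system's interface:

```python
# Hypothetical sketch of the "engage anything inside a predetermined
# rectangle" rule: a GPS geofence check. Coordinates and the zone
# tuple layout are made up for illustration.

def inside_kill_zone(lat, lon, zone):
    """Return True if a GPS fix (lat, lon) falls inside the
    rectangular zone, given as (lat_min, lon_min, lat_max, lon_max)."""
    lat_min, lon_min, lat_max, lon_max = zone
    return lat_min <= lat <= lat_max and lon_min <= lon <= lon_max

# Example: a 0.01-degree square (roughly 1 km on a side at the equator)
zone = (34.00, 65.00, 34.01, 65.01)

print(inside_kill_zone(34.005, 65.005, zone))  # inside  -> True
print(inside_kill_zone(34.020, 65.005, zone))  # outside -> False
```

Real border enforcement would of course also need friend/foe discrimination, which is exactly the hard part the thread is arguing about; the geometry is the trivial piece.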

Why do you think DARPA gave funding for the self-healing minefield, which basically replaces land mines with robots that are in a Bluetooth network? The two big things those robots give you are: they can be turned off, and if a Dutch minesweeping truck drives through the field, the little robots can move back into position, healing the breach.
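The "healing" behavior can be sketched as a toy algorithm (this is not DARPA's actual implementation; node names and coordinates are invented): when the network notices a node has been destroyed, the nearest surviving node relocates into the gap.

```python
# Toy sketch of "self-healing" breach repair in a networked minefield:
# the closest surviving node moves into the destroyed node's position.

import math

def heal_breach(nodes, destroyed):
    """nodes: dict of name -> (x, y) positions. destroyed: name of the
    lost node. Moves the nearest survivor into the gap; returns its name."""
    gap = nodes.pop(destroyed)
    nearest = min(nodes, key=lambda n: math.dist(nodes[n], gap))
    nodes[nearest] = gap  # that robot relocates to fill the hole
    return nearest

# A minesweeper destroys the middle node of a three-node line:
field = {"a": (0.0, 0.0), "b": (1.0, 0.0), "c": (2.0, 0.0)}
mover = heal_breach(field, "b")
print(mover, field[mover])  # a neighbor now sits at (1.0, 0.0)
```

A real system would then need to re-balance the remaining coverage (the sketch leaves a new hole where the mover came from), which is why the actual research treated it as a distributed coordination problem.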

From a soldier's point of view it is rather easy to understand why all of the population might appear to be an enemy. Often that is an outright fact. Even if the locals happen to like Americans, those locals still must make a living and also appease the fanatical elements in their neighborhood. So the same people that smile and feed you dinner might also pay for their groceries by smuggling munitions. This may turn really ugly in the moonscape-like land that borders Pakistan.

If nobody wants us there and the only way to win is genocide -- why are we there? I mean, besides the oil.

For the same reason why "nobody" wants Americans to have religious freedom.

Because your definition of "nobody" implies that the noisy, protesting 1% of Iraqis are everybody. The flagburners that call us "The Great Devil" and fire machine guns into the air must make up the sum total of the entire Iraqi population!

The Islamic sects that would get holocausted into extinction if the other sects took control are scared for when the soldiers leave. The suicide bombers are, for the most part, not Iraqi. They are immigrant extremists that are in Iraq, killing Iraqis that don't follow the same Islamic code as they do, to scare the rest into submitting to their specific beliefs. You think mosque, pilgrimage, police headquarters, and market bombings are targeted at US troops?

If you want to know the reason why we're still there, I suggest you read "Leviathan" while you wait. If Iraq doesn't have a stable government or structured military when the troops pull out, the land will go to the meanest, toughest faction -- which is currently not one that's allied with us. We have troops there to make sure the Western-friendly government lasts more than a weekend.

From a soldier's point of view it is rather easy to understand why all of the population might appear to be an enemy. Often that is an outright fact.

Yeah, that particularly happens when the soldier is part of an invading occupation force with dubious intention. AND the soldier is conditioned to believe they are (racially, religiously, socially) superior to the locals they are "defending". AND they are products of a militarized culture that glorifies violence as it cynically prattles about honor and respect.

An old cartoon had a series of panels. The first panel had a caveman picking up a rock, saying "safe forever from the fist". The next panel is a man inventing a spear, saying "safe forever from the rock". And so on: swords, bows and arrows, catapults, guns, bombs... well, you get the idea.

On the other hand, the evolution of those items coincided with the evolution of society. For example, you had to have an organized civil society to gather the resources to make a machine gun. (Who mines the ore for the metal? Who feeds the miners? Who loans the money for the mine?)

It's a bit of a chicken and egg about which drives which these days, but certainly early on, mutual defense did promote societal organization.

So "safe forever from the angry soldier" is the next step. It's already happened in some ways with the drone, so it's not as big an ethical step from there to the foot soldier, and given the deliberateness with which drones are used compared to the dump-and-run of WWII bombing, one can credibly argue they can be used ethically.

On the other hand, war has changed a bit. The US no longer tries to "seize lands" militarily to expand the nation -- it does so economically instead (Russia and China are perhaps the exceptions). These days it's more a job of fucking up nations we think are screwing with us, e.g. Afghanistan.

Now imagine the next war where a bunch of these things get dropped into an asymmetrical situation. Maybe even a hostage situation on an oil tanker off Somalia.

It's really going to change the dynamic I think, when the "enemy" can't even threaten you. Sure it could be expensive but it totally deprives the enemy of the incentive of revenge for perceived injustice.

"Ethics" is such a poorly defined term... hell, different cultures have different definitions of it. In feudal Japan, it was ethical to give your opponent the chance for suicide; today, many Westerners would in fact argue the opposite: the ethical thing to do is prevent a human from committing suicide, as that's seen as a symptom of mental illness.

I've always defined "morality" as the way one treats oneself and "ethics" as the way one treats others. It's possible to be ethical without being moral--for example, I'd consider a person who spends thousands of dollars on charity just to get laid to be acting ethically but immorally. By that definition, the hullabaloo at Guantanamo would certainly be both immoral and unethical--not only were they treated inhumanely, but it was done against international law and against the so-called "rules of war".

These robots would have to be programmed with certain specific directives: for example, "Don't take any action which may harm civilians", "Take actions against captured enemy soldiers which would cause the least amount of foreseeable pain", etc. Is this good? Could be... soldiers tend to have things like rage, fear, and paranoia. But it could lead to glitches too... I wouldn't want to be on the battlefield with the 1.0 version. Something like Asimov's 3 Laws would have to be constructed, some guiding principle... the difficulty will be ironing out all the loopholes.
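A directive filter like that is trivial to write down and nearly impossible to get right, which is exactly the loophole problem. Here's a toy sketch (all names and fields hypothetical, nothing to do with any real system) showing why: the filter can only veto based on the predictions it's handed, so a bad prediction sails straight through.

```python
# Toy sketch of directive-based action filtering (hypothetical names).
# Each directive vetoes actions based on *predicted* outcomes -- which
# is the catch: the filter is only as good as the predictions it's fed.

def violates_directives(action):
    """Return the first directive an action violates, or None."""
    if action.get("predicted_civilian_harm", 0) > 0:
        return "Don't take any action which may harm civilians"
    if (action.get("target_status") == "captured"
            and action.get("predicted_pain", 0) > 0):
        return "Minimize foreseeable pain to captured enemy soldiers"
    return None

# A strike predicted to harm civilians is vetoed...
assert violates_directives({"predicted_civilian_harm": 3}) is not None

# ...but the same strike with harm mispredicted as zero passes cleanly.
# That's the "version 1.0 glitch": the rules held, the inputs lied.
assert violates_directives({"predicted_civilian_harm": 0}) is None
```

The point of the sketch is that the hard part isn't encoding the directive, it's the perception and prediction machinery upstream of it.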

The advantage to a killing robot is that it has no emotions. The disadvantage to a killing robot is ironically that it has no emotions.

More often than not, face-to-face civilian casualties on the battlefield happen due to fatigue, emotion-related issues (my buddy just died!), or miscommunication.

Not because the soldiers had lack of emotion or humanity.

The other kind, in which a bomb, mortar, or arty shell lands on a house full of civilians because someone typed the wrong coordinates into a GPS, is so separated from the battlefield anyway that it won't really make a difference whether the one pushing the button is man or machine.

The bigger issue isn't so much the tools and weapons, but the whole "modern" concept of war. You cannot accept the concept of war without the concept of causing destruction, even destruction of humans. To send people into a warzone and tell them not to cause destruction is actually more immoral and unethical, in my mind, than sending them in and allowing them to cause destruction.

Maybe I'm being a bit pedantic here, but "ethics" is a professional code - for instance, it is completely ethical by military codes of ethics to kill an armed combatant, but not to kill a civilian.

You're not being pedantic, you're being imprecise. Codes of ethics are one thing, but "ethics" is most certainly not limited to a professional code. Look up the word in a dictionary. I also don't know why you got modded to +5 insightful.

From the OED: ethics: "The science of morals; the department of study concerned with the principles of human duty." That's the primary definition that's listed.

Something that can't be unethical or ethical is probably going to be more ethical than something that is unethical. In other words, if robots are neutral and humans are either evil or good, neutral is more good than evil.

It depends on if they are lawful neutral, chaotic neutral, or true neutral.

I was just watching the intro to the first "Tomb Raider" movie, where Lara destroys "Simon" (the killer robot that she uses for her morning warmup).
Robots... I must say, I don't like the idea behind robots fighting our wars, because that means "acceptable risks" become a thing of the past, and we are far more likely to "militarily intervene". AKA: "less risk to our troops" can translate into "we go into more wars", which is something I don't support... wars benefit companies and lead to the deaths of thousands. If the lives lost aren't American lives, does it still matter? In my opinion, YES.

That's some pretty flawed logic. Should doctors working to cure lung cancer stop, because a cure to lung cancer would make it safer to smoke?

"Less risk to our troops" can translate into "we go into more wars" which is something I don't support... wars benefit companies, and lead to the death

Read that again. You don't like wars because people are killed. You're talking about potentially eliminating human casualties in any war. That means the only remaining "problem" (in your scenario) is that they benefit companies.

I hear a lot of complaints about the 4k US soldiers who died after the Iraq war started, but nothing about the 1M+ Iraqi civilians who died there, in a war that had nothing to do with the reason the government gave for it. And the worst part? The president behind all of it got reelected. That counts for 50%+ of Americans -- maybe not you in particular, but that "self-righteous implication" could be right.

Better than that. It will be quite a trick to keep the robots from coming back to camp laden with the robotic equivalent of a suicide bomb. There are just way too many possible ways for this to go wrong that any 'ethical' thinking put into this is outweighed initially by the unethical basis for war in the first place, and secondly by the risks associated with sending machines to fight where a human is still the more complete information processor/weapon. UAVs are one thing, but we do not have robots that are capable of the same decisions as humans are. That is both good and bad, and it means that humans will be fighting for quite a while yet.

That said, there is much to be said for the Star Trek take on war: It should be messy, nasty, and full of foul stinking death and destruction lest we forget how much better peace is.

The wars of the future will not be fought on the battlefield or at sea. They will be fought in space, or possibly on top of a very tall mountain. In either case, most of the actual fighting will be done by small robots. And as you go forth today, remember always your duty is clear: to build and maintain those robots.

Automated killing machines were banned at the Geneva convention. This is generally a good thing when we're sending real, live humans (versus the walking undead) to fight our wars. It would be completely inhumane (haha) and tilt the outcome of a war towards those who can afford to develop such technology. That is, if one country can afford killer robots and another can't, then the former has no deterrent to invading the latter.

But imagine if all wars were fought by proxy. Instead of sending people, we send machines. Let the machines battle it out. To be really civil we should also limit the power and effectiveness of our killer robots, and the number of machines that can enter the battlefield at once. Of course, at some point every country will be able to build to the maximum effective specification. At that point it will be a battle of strategy. The next obvious step is to do away with the machines entirely and just get a chessboard.

How about we just go all the way and have computer simulated battles. When the damage and casualty reports come in we can just have people in those areas report for termination and dynamite the areas affected.

In other news, a ship in orbit was just marked as destroyed. Its representatives will be disposed of and as soon as the rest come down they will be disposed of as well.

It would be completely inhumane (haha) and tilt the outcome of a war towards those who can afford to develop such technology. That is, if one country can afford killer robots and another can't, then the former has no deterrent to invading the latter.

As opposed to when one side can afford to put its soldiers in tanks, and the other can't?

But imagine if all wars were fought by proxy. Instead of sending people, we send machines. Let the machines battle it out.

And when the side whose machines lose doesn't accept that decision? Sooner or later, someone will decide that winning is more important than playing by the rules (I'm guessing sooner). They will then continue the war until unable to, not until told they have lost.

It's a cool idea, but I doubt it will ever be practical. Even if technology progresses to the point where it is simply suicide to send men against the victorious robot army, humans being humans, people still will.

Automated killing machines were banned at the Geneva convention. This is generally a good thing when we're sending real, live humans (versus the walking undead) to fight our wars.

I don't care what the Geneva convention says!

As soon as my ritual circle is completed, the dead will rise from their graves and destroooy yooouu! And then your dead soldiers will rise again and take up arms against their former companions!!! THE WORLD WILL BE MINE!!! MUAHAHAHA!!!

Personally, I think this is a response to the problems of being the established army fighting a guerrilla force. The way guerrillas succeed is by driving the invading army slowly crazy by making them live in constant fear (out of self-preservation), until they start lashing out in fear (killing innocents, and recruiting new guerrillas in mass). The same goes for treating noncombatants with dignity and respect: Doing so makes the occupying force less hated, so the noncombatants won't be as willing to support the guerrillas.

It's a machine, Ronald. It doesn't get pissed off, it doesn't get happy, it doesn't get sad, it doesn't laugh at your jokes... IT JUST RUNS PROGRAMS!

Ronald's premise makes two key assumptions which are deeply flawed:

1) It's entirely the human soldier's fault that he's unethical.
2) The person directly in charge of putting the robot to work is entirely ethical.

I posit that the soldiers in Iraq haven't been trained to deal with a situation like this properly. The fact that 17 percent of US soldiers in Iraq think all civilians should be treated as insurgents is more reflective of poor education on the US military's part. The US military prides itself on having its soldiers think as one unit, and 17 percent is a very high discrepancy that it has failed to take care of, mostly because there are plenty in the leadership who think that way themselves. Treating everyone they come across as an insurgent, and not treating them in the proper manner, is a great way to "lose the war" by not having the trust of the people you are trying to protect.

It's that same leadership who'd program a robot like this to patrol our borders and think it's perfectly ethical to shoot any human on sight crossing the border illegally, or treat every citizen as an insurgent, all in the name of "security."

Besides, a robot is completely without compassion. A properly trained human has the ability to appear compassionate and yet treat the situation skeptically until they know for sure whether the target is a threat.

This is not a problem that can be solved with technology. The concept is a great project and hopefully will be a wonderful step forward in AI development, but at no point will it solve any "ethical" problem in terms of making war "more ethical."

I take serious issue with the part of the article where they mention that most Marines who toured Iraq believe that all civilians should be treated as insurgents. Of course you treat everyone like potential insurgents in an urban combat environment, otherwise you will end up dead. That says nothing about ethical views or the proper treatment of people in general. SWAT teams are taught to consider everyone as a terrorist when they are attempting hostage rescue. That means, that they never take for granted that the apparent "hostage" is indeed a hostage. It keeps people safe.

On the contrary, during and prior to World War II, many enlisted men wouldn't even shoot their guns at other troops. Actually, towards the end of World War I, most European armies turned their guns on their officers en masse (the French Nivelle mutinies, the German naval mutinies, the Russian mutinies and soldier and worker councils).

After World War II, army psychologists discovered how many men were not firing their guns at enemy soldiers and worked via various means to increase that percentage, which they did in Korea, and even more so in Vietnam.

I don't see Russian soldiers, as that old song goes, "shooting the generals on their own side" if they feel a war is wrong. As I said before, the resistance to kill resides in the enlisted men, the low-level brass on up is much less concerned about this. The US has purposefully and consciously targeted non-combat civilians in every major war it has ever fought, but stating such is a danger to the machine of empire so it becomes something that one can't state. When it is so publicly and undeniably done, such as in Hiroshima and Nagasaki, then it becomes rationalized, but it has happened before and since then.

Yeah, who cares if our billion dollar terminator squad is destroyed, or captured and used against us.

No anger? That's an emotion, so sure. No recklessness? You're gonna lose the war if you aren't willing to charge ahead blindly, pull a crazy Ivan, or, in general, break a few eggs for your delicious victory omelet.

Scenario fulfillment? So our robots will evaluate the situation based on what they observe and know. They won't be acting out the battle plan as described, because they don't have the whole picture and have seen some things that don't logically fit. Awesome! No more gambits, pincer attacks, bluffs, etc. Those things were too complicated anyway.

Why should noncombatants be treated with dignity and respect by default (and hence, as a whole)?

They typically don't treat our soldiers with dignity or respect, they serve as a political road block for troops and make their jobs harder and more dangerous, they house and support the combatants, and they often become combatants.

Why should ANY group be treated with dignity and respect by default? Seems to me, you used to have to EARN respect, and dignity was a character trait.

"Why should noncombatants be treated with dignity and respect by default (and hence, as a whole)?"

Because they're _non_ combatants. By definition, they're not fighting.

If you treat the population as a whole as though they're combatants, what incentive do they have to remain noncombatant? If you treat them like human beings, maybe they'll decide that you're better than the combatants and side with you...

"Why should ANY group be treated with dignity and respect by default? Seems to me, you used to have to EARN respect, and dignity was a character trait."

Wow. I'm stunned, really. Think that through for a minute--if everyone was disrespectful towards anyone they didn't already respect, how would anyone earn additional respect?

Game theory shows that the most successful strategy is to assume the best of the other party, and only betray them once they have betrayed you.
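That's essentially the tit-for-tat result from Axelrod's iterated prisoner's dilemma tournaments: open with cooperation, then mirror the other side's last move. A minimal sketch, using the standard prisoner's dilemma payoffs:

```python
# Iterated prisoner's dilemma: tit-for-tat vs. other simple strategies.
# Standard payoffs: both cooperate = 3 each; both defect = 1 each;
# lone defector gets 5, the betrayed cooperator gets 0.

PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def play(strategy_a, strategy_b, rounds=10):
    """Run a repeated game; each strategy sees only the opponent's history."""
    score_a = score_b = 0
    hist_a, hist_b = [], []
    for _ in range(rounds):
        move_a = strategy_a(hist_b)
        move_b = strategy_b(hist_a)
        pa, pb = PAYOFF[(move_a, move_b)]
        score_a += pa; score_b += pb
        hist_a.append(move_a); hist_b.append(move_b)
    return score_a, score_b

tit_for_tat = lambda opp: "C" if not opp else opp[-1]  # assume the best, then mirror
always_coop = lambda opp: "C"
always_defect = lambda opp: "D"

print(play(tit_for_tat, always_coop))    # mutual cooperation: (30, 30)
print(play(tit_for_tat, always_defect))  # betrayed once, then retaliates: (9, 14)
```

Against a cooperator it cooperates forever; against a defector it loses only the first round, then never again -- which is the "assume the best, betray only after being betrayed" claim in one runnable toy.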

War is hell.
War is ugly.
War is dirty.
War is painful for the victor.
War is devastating for the loser.
War is an act of hate.
War is an act of desperation.
War is that which results from a lack of options.
War is fought for land, resources, women, gods, and pride.
War is the last desperate act when all other options fail and there is no time to think of any new options.
No one desires war, but many choose to profit from it.
War is inevitable so long as we want for things.
When you take away the horrors of war you no longer have war; you have a professional sport.

Now I ask you: if machines are sent to war against men, or against other robots, is it still a war?

And we know that we still haven't got it all figured out yet. But you think you can write an algorithm to figure it out?

I was blocking a highway in Baghdad, waiting for the bomb squad to dispose of this bomb on the highway, and we were preventing anyone from getting close to it. It takes the bomb squad forever, and it gets dark. A vehicle drives straight at us, at maybe 90 miles per hour on the highway. That is exactly what suicide car bombs do, which is the biggest danger to American personnel. You have to shoot the driver, or they will ram you and 95% chance you and everyone around you will die.

Having about two seconds to either stop the vehicle, shoot the driver, or die, I had my buddy turn on the lights. The driver slammed on the brakes, skidded to a stop maybe 200 meters from us, threw it in reverse, and got the hell out of there.

I knew he just saw a wide open highway, and wanted to see how fast he could go. At that speed, he couldn't have seen us in the twilight. The algorithm would have said to shoot him. He's alive because I'm a human.
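The naive rule the parent dreads is easy to write down, and that's the problem. Here's a toy sketch (thresholds and names entirely hypothetical) of the difference between a pure threat-threshold rule and one where a cheap, reversible warning -- turning on the lights -- is an explicit step before lethal force:

```python
# Hypothetical escalation-of-force sketch. A bare threat-threshold rule
# fires on any fast closing vehicle; adding a reversible warning step
# gives an unaware driver the chance to brake, as in the story above.

def naive_rule(speed_mph, closing=True):
    """Shoot anything closing fast -- the rule that kills the joyrider."""
    return "shoot" if closing and speed_mph > 60 else "hold"

def escalation_rule(speed_mph, closing, warned, still_closing_after_warning):
    """Same threshold, but warn first and re-evaluate."""
    if not (closing and speed_mph > 60):
        return "hold"
    if not warned:
        return "warn"   # lights, flares: reversible, costs only seconds
    return "shoot" if still_closing_after_warning else "hold"

assert naive_rule(90) == "shoot"                        # joyrider dies
assert escalation_rule(90, True, False, None) == "warn"
assert escalation_rule(90, True, True, False) == "hold" # driver braked: alive
assert escalation_rule(90, True, True, True) == "shoot" # actual car bomb
```

The catch, of course, is that the soldier didn't follow a pre-written escalation table; he invented the warning on the spot, with two seconds to spare. Encoding that judgment in advance is exactly what's hard.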

What do the opinions of soldiers have to do with this? When policy is translated into indoctrination that it's better to kill 50 random "other" people than run the risk that one of your own might be harmed, then there is no respect. And the article serves the myth that the problems are caused by soldiers not adhering to army policy.

Intelligent robots could shift the balance indeed, because you can sacrifice them more easily, and it's even good business to do so. But on the other hand, killing by remote is easier than killing in person (well, for most), and it also becomes easier to keep people at home completely oblivious to what's happening in the war.

So there will be interest. Good business, more control over information, and less killed in your own camp. That sums up the morality.

Iraq became a police action needing law enforcement, not military force, from the moment President Bush stood on the carrier deck saying "Mission Accomplished". From that moment forward, using military troops in Iraq became the wrong approach. You don't use the Army as a police force. Any information derived from soldiers misused as policemen is irrelevant.

That would only be true if there hadn't still been large, organized, and heavily-armed groups operating in Iraq in opposition to the U.S. Yeah, the military doesn't make a good police force, but the police usually don't do very well when their police stations are attacked by "criminals" with rockets, mortars, and machine guns.

The only ethics needed or desired on the battlefield is to win the day. Period. Doing anything else is a formula for disaster, as was shown in Vietnam. We didn't use maximum force to full effect; we danced around and tried to do everything but defeat the enemy. The result: South Vietnam was overrun and lots of people died.

No, that's absurd. Who cares if you win the day if you lose the war? If you get bogged down in that kind of short-term thinking you're doomed to lose in the end.

We didn't win in Vietnam because the Vietnamese were willing to take horrific casualties, not because we weren't willing to attack with maximum force. Hell, we firebombed villages and deforested entire regions, what exactly else should we have done?