
Doofus writes "A recent article in the Washington Post, A future for drones: Automated killing, describes the steady progress the military is making toward fully autonomous networks of targeting and killing machines. Does this (concern|scare|disgust) any of you? Quoting: 'After 20 minutes, one of the aircraft, carrying a computer that processed images from an onboard camera, zeroed in on the tarp and contacted the second plane, which flew nearby and used its own sensors to examine the colorful object. Then one of the aircraft signaled to an unmanned car on the ground so it could take a final, close-up look. Target confirmed. This successful exercise in autonomous robotics could presage the future of the American way of war: a day when drones hunt, identify and kill the enemy based on calculations made by software, not decisions made by humans. Imagine aerial "Terminators," minus beefcake and time travel.' The article goes on to discuss the dangers of surrendering to fully autonomous killing, concerns about the potential for 'atrocities,' and the nature of what we call 'common sense.'"

I've always wanted to cream the Blue Team in paintball. From home. I wonder when this tech will be available to toy companies. Especially when the 2012 Geneva Convention on Laws of Armed Robots in Combat declares them unfit, thereby resulting in a black market for jailbroken drones.

I don't think that having good ol' fashioned humans die in our wars is morally required of a sovereign people; rather, I question how we can truly feel ownership of our society if we do not control it, protect it, assist it, and direct it.

I think there's another issue to consider before we even get near to asking questions about "societal ownership".

Automating front-line offensive and defensive forces makes it much easier for a government to use its military might against its own citizens, as there will be far less of a problem with human officers and front-line soldiers refusing to open fire on their fellow citizens, or refusing to issue orders to that effect.

Somebody in the White House, Pentagon, or some military installation just types a command and pushes the "Enter" key and people are automatically hunted down and killed. A tyrant's dream.

Robots and drones are already being utilized in domestic law enforcement, so how long would it be before these fully-automated weapons systems were used domestically? You know they will be eventually if we allow it. History shows us that human nature is all too predictable when it comes to governments having immense power over relatively defenseless citizens. Governments always seek more power & control, and it never ends well once they achieve a large amount of it.

That's all well and good, but I am more concerned about our robotic overlords commanded by the one or few who need killers without conscience and without any sense of right or wrong.

We already have a government in the US that felt it was necessary to use contractors to perform acts beyond what military service members are permitted to do. But that's not good enough. They want killers who will kill, ask no questions, and tell no one about it.

Which is why civilised countries [wikipedia.org] have already outlawed them. No decent human could encourage the spread of things that kill many civilians, animals, and, in most cases, mine clearers for every attacking soldier they kill.

N.B. the treaty still allows anti-tank mines and even remotely triggered claymore mines, so it's still possible to do area denial against serious military forces. I will give the Koreans a small out in this case, in that this was the way their border was divided long before the treaty

Just politically correct. The US already has policies in place that effectively meet and exceed the goals of the Ottawa treaty.

We stopped selling mines, and we destroyed old stockpiles. We have spent over a billion dollars clearing mines and helping victims (usually not of our mines). Our new mines are self-destructing or self-disarming, and policy is not to place one without its position being recorded, and to remove it from any battlefield after its need has passed.

Even with that, the only place we actually use them is in the Korean DMZ. The last time we used them in combat was the Gulf War, in limited numbers. These were scatterable mines, fired or dropped to a specific grid coordinate to deny use of that small area to the enemy. Since this was their first use, we did make mistakes, as apparently not every shot was recorded and reported for later easy cleanup. Rules for their use have since been changed, and by now they should all be converted to self-destructing or self-disarming anyway.

Bingo. The US has spent years phasing out land mines, and if it wasn't for the Korean DMZ, it would be a signatory to the Ottawa Treaty. It would be a backwards step if they built new weapons where humans do not make the targeting decision.

In the type of asymmetric warfare we are engaged in today, robots may not be very effective, especially at their current level of intelligence. Using a gun carries great responsibility, and attaching one to a robot and having it make intelligent life-and-death decisions is something that is far off. I hope that by the time we automate killing, we will have fixed most of the problems that cause us to go to war.

If you can have a discussion about war without talking about economics, then you're probably a child. If you can have a discussion about war without talking about ethics, then you're probably a politician.

I assume you're referring to Ron Paul here, but you're somewhat wrong about the idea that the only opposition to war comes from libertarians. Among others, you can point to Ron Paul's frequent Democratic ally on stopping wars, Dennis Kucinich - he's staunchly anti-war, and staunchly pro-welfare, and polls about as well nationally as Ron Paul.

The 'automated recognition' in this case was a large orange tarp. The difficulty of creating an automated recognition algorithm for an orange object in a natural background is extremely low. Wake us up when this thing can recognize camouflaged tanks in a forest.
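To the parent's point about how low the bar is here: a crude color-threshold detector is enough to find a bright orange tarp against grass or dirt. The sketch below is purely illustrative (the thresholds and the synthetic test frame are my own assumptions, not anything from the article), but it shows why "zeroed in on the tarp" is not an impressive feat of machine vision.

```python
import numpy as np

def detect_orange(rgb, min_fraction=0.01):
    """Flag frames containing a sizeable orange region.

    rgb: H x W x 3 uint8 array. Crude rule of thumb: orange pixels have
    a high red channel, a middling green channel, and a low blue channel.
    Returns (found, fraction_of_frame_that_is_orange).
    """
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    mask = (r > 180) & (g > 60) & (g < 160) & (b < 90)
    fraction = mask.mean()
    return fraction >= min_fraction, fraction

# Synthetic 100x100 frame: green "grass" with a 20x20 orange patch.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[...] = (40, 120, 40)            # background
frame[10:30, 10:30] = (255, 120, 30)  # the "tarp"
found, frac = detect_orange(frame)
```

A camouflaged tank in a forest defeats exactly this kind of trick, which is the parent's point: the hard part of autonomous targeting hasn't been demonstrated.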

... in Afghanistan, Iraq, Libya, etc. I don't think the US military gives a crap about civilians. Although they don't tend to shoot them on purpose, they don't give a shit about collateral damage when bombing their 'suspected targets'. Sorry for being trollish, but I suspect that filtering out civilians isn't at the top of their priority list.

Science fiction writer Cordwainer Smith called them "manshonyaggers" in a story published back in the 1950s. The word was supposed to be a far-future corruption of the German "Menschen" and "Jäger": roughly, "manhunter".

It looks like his future is going to get here a lot faster than he thought.

I'm excited about all the trickle-down technology that'll eventually become consumer grade fare, and I appreciate the advancement in various technology that war brings, but I would much prefer it if the US stopped economically destroying itself (while giving the Middle East a "Great Satan" to fight) and instead let them get back to killing each other over tiny differences in interpretation of fundamentalist Islam.

Not even Bob the Builder can fix the Middle East at the moment. Not when you have God handing out the real estate titles and commanding the thousands of various splinter cells to annihilate everything that's not exactly identical to themselves, as trillions of dollars of oil money pour into the region to feed and fund it all.

What's bad for one part of the economy may be good for another part. What's 'good' for the economy is a matter of debate. I know it's a tiredly overused example, but if you owned stock in Halliburton in 2000 and hung onto it I'm sure you'd think that these pointless wars are pretty good for the economy.

Overall, I agree with your comments, but I don't think the pointless wars were a major drag on our economy. If anything, they probably helped. Lowering taxes during wartime - now that's a classic economic no-

Not that anything past the first line of your response is relevant to the GP. I hate ACs that do this.

Defending against this sort of weapon is pretty much like attacking a minefield: utterly pointless.

I don't know what the compensation for a dead soldier is, but I don't think it's the millions that it costs when one of these goes down. It's going to be cheaper to have soldiers than drones for a long time yet; the financial cost will hurt as much as the human cost. I think EMP technology will become a high priority as well.

You are making stupid hypothetical assumptions; the US still wins its wars (which are being calle

Yep, autonomous machines are certain to make mistakes and kill people who aren't the target, who are innocent, don't really deserve to die, etc.

So what?

Humans make those mistakes now, in abundance: friendly fire, massacres, genocide, innocent bystanders... you name it. What difference does it make whether it's a human or some artificial pseudo-intelligence that makes the mistakes?

I'll tell you what the difference is: one less human on the battlefield, and thus at the least one less human that can die from

So... program the machines to "feel remorse". That one should be easy, since remorse is (a) recognizing a possible mistake, (b) analyzing the causal decisions and events, and (c) altering the decision process to minimize repeating the same pattern. Sounds pretty straightforward to me, unlike some other emotions.
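Tongue-in-cheek or not, steps (a)–(c) do describe a plain feedback loop. A toy sketch, with every feature name, weight, and threshold invented for illustration: flag a bad outcome, attribute it to the inputs that drove the decision, and down-weight those inputs so the same pattern fires less readily next time.

```python
# Toy "remorse" loop: (a) flag a possible mistake, (b) attribute it to
# the decision inputs, (c) adjust so the pattern is less likely to repeat.
# All names and numbers here are illustrative assumptions.

def decide(weights, features, threshold=1.0):
    """Fire when the summed evidence for the present features clears the bar."""
    return sum(weights.get(f, 0.0) for f in features) >= threshold

def update_weights(weights, features, was_mistake, lr=0.5):
    """(c) lower the weight of every feature that drove a bad decision."""
    if not was_mistake:  # (a) nothing to regret
        return weights
    return {f: w * (1 - lr) if f in features else w  # (b) + (c)
            for f, w in weights.items()}

weights = {"orange_object": 0.8, "in_hostile_zone": 0.4}
target = {"orange_object", "in_hostile_zone"}
fired_before = decide(weights, target)                       # 1.2 >= 1.0
weights = update_weights(weights, target, was_mistake=True)
fired_after = decide(weights, target)                        # 0.6 < 1.0
```

Of course, the hard part the comment waves away is deciding that a mistake happened at all; the adjustment step is the trivial bit.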

Looks good on paper but for now they are just expensive toys which may be more useful as recruiting tools (look war is just like a video game, come play with us!). Barely useful in an asymmetrical warfare conflict like the one in Afghanistan and useless in a war with a country that has a modern air force and an integrated air defense system. They'd be shot out of the sky immediately.

There is no way that the military is going to permit autonomous combatant units. At least, not without having a stake put through its brain.

For starters, the PR would be through the floor if even one of these things killed a civilian (though I guess with how callous the US has been towards civilian collateral casualties for the past ten years, that might not be a big deal.)

The other main reason is that there's no way a manly man is ever going to give up on the idea of manly soldiers charging manly into bat

The US doesn't need autonomous killing machines. Sure, the US will develop them, but so long as the Americans are busy busting on sheep herders armed with AK-47s, they won't use them. You might get to the point where drones are doing everything but pulling the trigger, but having a human in the loop approving all death and destruction is cheap and easy. You don't gain anything by having fully autonomous killing machines when you are fighting peasants with shitty guns.

"There is no way that the military is going to permit autonomous combatant units. At least, not without having a stake put through its brain."

You are implicitly assuming that the USA will be fighting inferior enemies in the future and thus will be more concerned about bad PR than coming out on top. A potential future conventional conflict with a heavily armed opponent capable of inflicting millions of casualties will change that (most likely China but there are also other potential candidates). And in such

IF a target is a unique type of vehicle that can be easily identified by target-recognition software that _already_ does this for human pilots, AND said target is within a set of coordinates known to contain only hostile vehicles of that type, THEN kill it; otherwise seek human double-check and weapons-release confirmation.

If a target is in an area known to not contain friendlies and is detected firing a missile or weapon (like an AA gun for example), then kill it.
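The two rules above amount to a conservative gate that defaults to a human in the loop. A sketch of that gate, where every field name, vehicle type, and confidence threshold is an assumption of mine for illustration, not anything from a real fire-control system:

```python
from dataclasses import dataclass

@dataclass
class Track:
    vehicle_type: str           # classifier label for the contact
    confidence: float           # classifier confidence, 0..1
    in_hostile_only_zone: bool  # inside coordinates known to hold only hostiles
    near_friendlies: bool       # any friendly units in the area
    firing_detected: bool       # sensors saw it launch a missile / fire AA

# Hypothetical "unique, easily identified hostile vehicle" types.
HOSTILE_TYPES = {"SA-6 launcher", "ZSU-23-4"}

def engagement_decision(t: Track) -> str:
    # Rule 1: unique hostile type, high confidence, hostile-only zone.
    if (t.vehicle_type in HOSTILE_TYPES and t.confidence >= 0.95
            and t.in_hostile_only_zone):
        return "ENGAGE"
    # Rule 2: actively firing, with no friendlies anywhere nearby.
    if t.firing_detected and not t.near_friendlies:
        return "ENGAGE"
    # Default: hand off for human double-check and release confirmation.
    return "REQUEST_HUMAN_CONFIRMATION"
```

The design choice worth noting is that both rules are conjunctions of independent conditions, so any sensor doubt falls through to the human default rather than to a kill.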

How soon can we send it into the FATA in Pakistan? Time to just target the high-level Taliban/AQ. I am fine with using automation to do this. In fact, I think that we should send these into Mexico as well once they're working decently. Lots of cartels there.

We're quite likely to see systems that kill anybody who is shooting at friendly troops. The U.S. Army has had artillery radar systems [fas.org] for years which detect incoming shells and accurately return fire. There have been attempts to scale that down to man-portable size, but so far the systems have been too heavy.

Sooner or later, probably sooner, someone will put something like that on a combat robot.

The most dangerous thing about this is that when a glitch, bug, or malware causes a plane to blow up a wedding, no one is responsible. No one ordered it, and no one can be punished for it.

When we take the risk out of war, it loses all meaning. You America haters think we're too quick to go to war now, wait until there is no risk to our own people. No bodies coming off planes at Dover AFB, no funerals, no news reports about another young widow with children to raise without a father (or the other way around). If we remove the risk, then war becomes merely a cost item on the budget, and much easier to jump into.
Take out the Sci-fi stories of the war robots turning on their Human masters (

What bothers me is these things make war easier to wage. When Americans aren't coming home in coffins, it's a lot easier for the public and politicians to accept war, therefore we're more likely to start wars.

If we're risking our own soldiers and pilots, at least we might think twice and look for other solutions before starting a war. However, once you've made war palatable to your own public, too often it becomes the first resort, especially amongst the hawkish (and the religious right versus non-Christian enemies).

War is supposed to be up close, personal, and horrific. Letting machines handle the dirty work removes a large amount of the deterrence that should be inherent in pursuing a war. Knowing the horrors of war should be a big motivator in seeking alternatives to it.

What's next? Just have computers simulate attacks, calculate damage and casualties, and then those on the casualty list report to a termination center?

I read somewhere recently a quote that, IIRC, was from Churchill. It was something about avoiding war, but if you must fight, fight with severity, for that is the most humane. I think that applies here. Though it sounds incredibly cruel, if people are not dying in your war, there will be no incentive for either side to stop.

Of course, Gadhafi, Hussein, Stalin, and similar madmen are somewhat of a counter example in that they don't give up no matter how many of their side are killed. Yet Japan in WWII is an example of the ruthless severity (nuclear bombs) causing an immediate and complete cessation of any attempts to create war.

Even in modern times with Gadhafi and Hussein, the invasion of Iraq was much more severe than the Libyan rebellion, hence the shorter time it took to make the government capitulate. (Getting the rest of the population to stop fighting is much harder... we'll see how Libya does without the outside intervention.)

Anyway, the point is that robot vs robot is war by proxy. Without the violence, the bloodshed, the impetus to end the war just won't be the same. They'll drag on for longer and longer, and resolution will be even less certain than it is today. I'm not sure that's necessarily such a good thing.

Anyway, the point is that robot vs robot is war by proxy. Without the violence, the bloodshed, the impetus to end the war just won't be the same. They'll drag on for longer and longer, and resolution will be even less certain than it is today. I'm not sure that's necessarily such a good thing.

There will always be conflict. War is just one method of resolving conflict. Legal fights are another. Negotiation is another. Robot wars are one potential future method. In my opinion, machines killing each other is vastly preferable to people killing each other, people who would be brothers in a different situation.

The idea of winning a war by killing so many of the opposition that the rest surrender or retreat is viable some of the time, but horrific. And truth be told, it doesn't work nearly as well in real life as it does on paper; people are unpredictable creatures at the best of times, and there are plenty of cases of soldiers or entire armies fighting to the very last, horrific fate be damned, rather than surrender. In particular, populations and politicians may fa

The approach that's had a better track record of making wars end is to destroy the enemy's ability to make war altogether.

Indeed, that's precisely what the Allies were doing with those firebombings of Germany and Japan back in WW2. Keep in mind that (civilian) workers manning the factories are a crucial resource for making war...

Anyway, the point is that robot vs robot is war by proxy. Without the violence, the bloodshed, the impetus to end the war just won't be the same. They'll drag on for longer and longer, and resolution will be even less certain than it is today. I'm not sure that's necessarily such a good thing.

Isn't the problem with this really that robot v robot doesn't actually resolve anything? I.e., one side will simply destroy the other side's robots eventually, but then what happens? Just because their robots are gone, doesn't mean the loser of that part of the war simply surrenders. Instead the humans then pick up guns and fight the remaining robots/other humans from the other side.

E.g. if China is invading your homeland, and their robots beat your robots, does your homeland just surrender after the rob

When I was a little kid, I read a sci-fi story (in an anthology, more than likely--I devoured them so fast I rarely remembered the authors' names) that was based on the premise that humans had spread throughout the stars, and in the process discovered a planet that had an indigenous race of diminutive humanoids. This race of humanoids was divided into clans and was in a multi-fronted, never-ending state of war--a total free-for-all. If I remember cor

Yet Japan in WWII is an example of the ruthless severity (nuclear bombs) causing an immediate and complete cessation of any attempts to create war.

It is worth mentioning that it was not immediate, and that a Russian force was preparing to invade. The Japanese had a very good idea of what a Russian occupation would be like, and that was a major influence on surrendering to the USA. History is too messy to be told as a simple fairy tale with no substance other than cheering for your home team.

Wouldn't politicians killing other politicians be even better? Less pollution, and you're not feeding the ravenous beast, "The Global Military Industrial Complex" (apparently they now collude on a multinational basis to keep mass-murdering, high-profit wars going). All you need is a bunch of clubs and some campaign contributions; let them go at it, and the winner is the general public.

Those chicken hawk politicians that want war, let them fight it themselves.

a future where war is limited to robots killing other robots, and not humans killing each other, is a GOOD THING.

That is true in a naive way; the question is which countries the US is going to attack that can afford drones. The reality is going to be drones killing brown people with AK-47s, and of course people with random objects that resemble AK-47s, and people who are standing in the wrong place at the wrong time. Like the status quo, I guess.

All power comes from the barrel of a gun. Aimed at you - to make you comply. Willingly, or otherwise.

All power comes from being able to make someone happy. Really, think about it. A gun is no guarantee that someone will comply. If they feel certain you will shoot, then it has almost no power at all. The power of a gun comes from the fact that you MIGHT make them happy by not killing them.

If your goal is to get people to do something, you'll do much better paying them than trying to threaten them. And if you can make them happy in other ways, you may be even more powerful than merely with money.

Obama didn't obtain the most powerful office in the world by threatening to kill people (King George tried that, and got a revolution). He got votes by giving people hope for change. How much change he delivered is a different thing (certainly he delivered some), but people were happy to believe that it might be true. So they voted for him.

"All power comes from being able to make someone happy. Really, think about it. A gun is no guarantee that someone will comply. "

I don't know if you've noticed, but you live in a world of millions of suffering people. You have billionaires and homeless people not for a lack of homes, but for a lack of guns on the part of those with the power to kill/subdue the rich. There is no rational reason to have as much suffering as we do in the developed world because of capitalism, but most people fear guns.

They are violent and do not allow anyone else to be. Criminal gangs usually only care about profit; if someone is violent in their territory but profit is not at risk, they are not concerned. Of course, powerful criminal gangs in a weak country can fill the void of power and begin working partially as a government (becoming more like warlords).

You could argue that, in order for that monopoly to be effective, the government needs some bac

You do realize, right, that most people agree with putting criminals in jail? Even that is evidence that we are governed by consent, which is why politicians who appear to be 'tough on crime' gain popularity. The majority even wants drug users to go to jail.

Most criminals don't agree with being put in jail. You have to use the threat of violence to put them there. That is government in its purest, most basic form. Take that away, and you no longer have government.

I haven't seen a lot of wars about liberty lately. Most were about economics or territory; some were about religion. To my knowledge, the last time the USA was attacked on its own territory was Pearl Harbor, and the last time the US mainland was invaded was well over 100 years ago. In the end, only the weapons manufacturers get a good deal out of war; the people just get another sock puppet ruling their countries.

There are a lot of treaties that try to limit the number of nukes, land mines and other non-discr

One of the most difficult problems we've encountered in prosecuting such wars (starting as far back as Vietnam) is trying to figure out who counts as "these malevolent folks" and who's a bystander. That's a tough decision in any case, and a very tough decision in the face of fire. But we've found that it's a decision that must be made, because we're not fighting the same kind of war that General Sherman fought. The concept of automating killing machines sets my teeth on edge specifically because of your attitude

Because at the end of the day you still have to break the will of your opponent and make them do something they wouldn't ordinarily do. Chances are "you lost at rock 'em sock 'em, so you now need to hand over your port" wouldn't be overly persuasive.

"our military has an unusually high percentage of people "with brown skin" both doing the killing and in positions of leadership".

Because soldiers are historically recruited from the lower classes.

When President Harry S. Truman desegregated the military in 1948, African-Americans saw the Army as a key avenue for advancement. Joining up became "a way out of a worse situation," said Gregory A. Black, a retired Navy dive commander and creator of blackmilitaryworld.com, a website devoted to the history of Afric