Posted by Zonk on Thursday October 18, 2007 @06:34PM
from the keep-quiet-on-the-terminator-jokes dept.

TJ_Phazerhacki writes "A new high tech weapon system demonstrated one of the prime concerns circling smarter and smarter methods of defense last week — an Oerlikon GDF-005 cannon went wildly out of control during live fire test exercises in South Africa, killing 9. Scarily enough, this is far from the first instance of a smart weapon 'turning' on its handlers. 'Electronics engineer and defence company CEO Richard Young says he can't believe the incident was purely a mechanical fault. He says his company, C2I2, in the mid 1990s, was involved in two air defence artillery upgrade programmes, dubbed Projects Catchy and Dart. During the shooting trials at Armscor's Alkantpan shooting range, "I personally saw a gun go out of control several times," Young says. "They made a temporary rig consisting of two steel poles on each side of the weapon, with a rope in between to keep the weapon from swinging. The weapon eventually knocked the pol[e]s down."' The biggest concern seems to be finding the glitches in the system instead of reconsidering automated arms altogether."

Why didn't they have some provision to cut power to the weapon? If they were testing it in a place where there were people exposed in its possible field of fire (effectively "downrange"), they should have taken precautions.
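The "provision to cut power" idea amounts to a fire interlock: a deadman switch plus a watchdog, so the weapon is inhibited the instant the operator lets go or the control loop stops responding. Here's a minimal, purely illustrative Python sketch of that logic (all names are hypothetical; a real weapon interlock would be done in hardware, not application code):

```python
class FireInterlock:
    """Illustrative sketch: fire commands are honored only while an
    operator deadman switch is held AND a recent heartbeat proves the
    control loop is alive. Fail-safe: any doubt means no fire."""

    def __init__(self, heartbeat_timeout=0.5):
        self.heartbeat_timeout = heartbeat_timeout  # seconds
        self.deadman_held = False
        self.last_heartbeat = 0.0

    def heartbeat(self, now):
        # Called periodically by the (hypothetical) control loop.
        self.last_heartbeat = now

    def set_deadman(self, held):
        self.deadman_held = held

    def may_fire(self, now):
        fresh = (now - self.last_heartbeat) <= self.heartbeat_timeout
        return self.deadman_held and fresh


interlock = FireInterlock()
interlock.set_deadman(True)
interlock.heartbeat(now=100.0)
print(interlock.may_fire(now=100.2))  # True: enabled, heartbeat fresh
print(interlock.may_fire(now=101.0))  # False: heartbeat stale, inhibit
```

The point is the failure mode: a stuck or crashed controller stops heartbeating and the gun drops to a safe state, rather than continuing to slew and fire.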

(It was, of course, as a result of the Great Ventilation and Telephone Riots of SrDt 3454, that all mechanical or electrical or quantum-mechanical or hydraulic or even wind, steam or piston-driven devices, are now required to have a certain legend emblazoned on them somewhere. It doesn't matter how small the object is, the designers of the object have got to find a way of squeezing the legend in somewhere, because it is their attention which is being drawn to it rather than necessarily that of the user's.

The legend is this:

"The major difference between a thing that might go wrong and a thing that cannot possibly go wrong is that when a thing that cannot possibly go wrong goes wrong it usually turns out to be impossible to get at or repair.")

Not the least of the problems is that, with current artificial intelligence, they're laughably unenforceable. In Asimov's books, you had this neat little "positronic brain" which was capable of resolving sensory inputs and determining things like "that is a human" (to say nothing of "I am harming it", especially through indirect causality). They were even capable of coming up with ideas to avoid the "through inaction" clauses.

Really, the stories weren't about robots, they were about people just like us, with a certain set of "must-follow" rules. Modern AI does not resemble this in the slightest.

Shouldn't it be constructed so it can only fire overhead, above some minimum elevation, so it cannot hit anything less than, say, a truck's height off the ground? Sure, that might not keep it from hitting targets on higher ground, but it would make the gun a lot safer for the firing crew and support troops around it. Even if it was tracking a legitimate target, it might shoot right through its own crew if, say, it were put on a hill and the incoming threat was coming in at 0 degrees.
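The minimum-elevation idea above can be sketched in a few lines (a hypothetical Python illustration of the concept, not how the GDF-005 or any real mount actually works):

```python
class FiringArcLimiter:
    """Hypothetical sketch of a minimum-elevation lockout: refuse any
    fire command that would point the barrel below a configured floor,
    so the gun cannot depress into friendly positions at ground level.
    Angles are in degrees above the horizon."""

    def __init__(self, min_elevation_deg=15.0):
        self.min_elevation_deg = min_elevation_deg

    def command_allowed(self, elevation_deg):
        # Inhibit fire whenever the commanded elevation is below the floor.
        return elevation_deg >= self.min_elevation_deg


limiter = FiringArcLimiter(min_elevation_deg=15.0)
print(limiter.command_allowed(40.0))  # True: well above the floor
print(limiter.command_allowed(2.0))   # False: near-horizontal shot blocked
```

As the comment notes, this trades away low-angle engagements (targets on higher ground, sea-skimming threats) for crew safety, which is why a real system would enforce such limits mechanically as well as in software.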

All of the stories in I, Robot are about pointing out the flaws in the laws, actually. From what several bigger fans of Asimov than myself have told me, he wasn't really trying to make grand philosophical statements with them though; they were just story hooks he used for the purpose of spinning a good yarn.

As I read the headline, "Robotic Cannon Loses Control," I immediately thought of the droids in Robocop. I was all set to make a funny post, if someone hadn't already. Then I got to the end: "Kills 9." And suddenly it wasn't funny anymore.

It's one thing to make jokes about things going wrong. It's another thing to make jokes about people dying. I'd like to think that the people who made those comments, or modded them up, only skimmed the headline and summary. But I can't quite convince myself.

Asimov's three laws were meant to be a thought experiment in hubris and unintended consequences. They were sold (in the context of the stories) as the perfect control system for robots, and then there were always "problems" that the USR management couldn't understand and which Susan Calvin needed to figure out and fix.

Young says he was also told at the time that the gun's original equipment manufacturer, Oerlikon, had warned that the GDF Mk V twin 35mm cannon system was not designed for fully automatic control. Yet the guns were automated. At the time, SA was still subject to an arms embargo and Oerlikon played no role in the upgrade.

It may just be me, but automating a machine that fires explosives that isn't designed to be automated just sounds like a Bad Idea(TM).

The three laws might be relevant when artificial intelligence is sufficiently advanced that a robot can understand abstract ideas, like what constitutes a human being and what it means to cause them harm. Until then, they are irrelevant, because it is the human programmers who are responsible for deciding how the gun will respond to external stimuli, not the will of a 'robot'. This was not a malevolent machine attacking people, just a malfunctioning computer-controlled gun (gone wild!).

> It may just be me, but automating a machine that fires explosives that isn't designed to be automated just sounds like a Bad Idea(TM).

It's just you. On Slashdot, we call that "pretty fuckin' cool", on Makezine.com [makezine.com], they call it "neat, but don't try this at home", and at Survival Research Labs [srl.org], they call it "another Thursday at work".

Kind of like my response to Slashdotters objecting to an automated weapon designed to shoot down cruise missiles, which leave too little reaction time for human-controlled defenses to counter, and which would otherwise inflict massive loss of life on soldiers, airmen, and sailors.

Honestly, from reading the article it isn't clear that a software problem was even the cause of this disaster. It could have been some kind of mechanical gun jam.

Any time you are dealing with big guns, fast motors, high-speed fire, large rounds, and explosive projectiles there is a risk of disaster if things go wrong. These things aren't toys. Even if the fire button was completely manual things could still go wrong.

I recall reading an article about a magazine detonation in a battleship which went into all kinds of detail about all the things that could go wrong - and this was a fairly manual operation. It did involve lots of machinery (how else do you move around shells that weigh hundreds of pounds?), but it was all human operated.

Assuming the system is well-designed the automation actually has great potential to LOWER risk. Humans make mistakes all the time. They're even more prone to making mistakes when a jet is incoming loaded with cluster bombs.

Another thing to keep in mind is that peacetime training disasters always make the news with the military. However, the military has a fine line to walk - on one hand they want to be safe in their exercises, but on the other hand they want to be able to handle combat operations. A 30 minute single-shot firing procedure that allows for all kinds of safety checks sounds great in theory, but in wartime you'd lose more people to incoming fire than you'd ever save from gun explosions. Sure, you don't want to kill yourself, but if you're so ineffective that the enemy overruns you it is all for nothing. As a result we tolerate some friendly fire, accidents, etc.

Like it or not robotic weapons WILL be the future of warfare. Sure, one country might elect not to develop them, but sooner or later somebody else will, and once they work out the bugs they'll be overrunning everybody else...

Nope, unlike what TV may have taught you, people rarely, if ever, joke about anything that affects and hurts them.

Let's see you cracking a joke about the robot at the funeral if it was *your* son in the casket.

Now, I don't see anything bad about us making jokes in this forum, since we aren't personally involved in the matter at all and can only feel sorry in an "abstract" kind of way (as in: accidents and human loss are sad, but oh well, I can't feel sad for *every* bad thing that happens in this world, right?), and this won't be read by the affected people. But let's not go around pretending that we are "dealing" or "coping" with anything here. That's just hypocrisy.

This thread happens every single time some tragedy with loss of life is posted here on Slashdot. Some people find the humor, then others are "sickened" and "can't believe the heartlessness".

The simple matter is, many, many people die every day. Many, many people are also born every day. You can't be personally upset over every life lost or you would spend all your time in overwhelming grief. And sometimes humor is the only alternative to what would otherwise be shock, anger, sadness, or fear.

My father is a paramedic, and some of the jokes that circle the station after a particularly gruesome scene would probably make you vomit. These men aren't deranged; dark humor is a very real way to deal with tragic events. These men are psychologically evaluated from time to time and the psychologists never seem to have any problem with dark humor. One has gone so far as to tell my father it is a COMMON coping mechanism, especially when one is trying to remain abstracted from the trauma.

I'm not saying they make these jokes at funerals (that's just called tact) or in the presence of civilians, but pull your head out of your ass and realize that laughter is a powerful healing tool.

Maybe that's what they tell the grunts. Congratulations, you managed to shoot down large mock targets that weren't shooting back.

Think you can shoot down supersonic missile flying below the horizon? No. They let the computer guided robots do that. You're not nearly good enough at it. Ok, maybe you get lucky and nail it. Now try thirty in five seconds all coming from different bearings. Didn't think so.

> Let's see you cracking a joke about the robot at the funeral if it was *your* son in the casket.

I did. It was the only way I could react to my father's death. It's who I am. I hurt fiercely, I was crying hard, and when my mom and I stepped into her kitchen I had to say something, so I cracked a quiet joke. It broke the tension, and made us feel just a tiny bit normal.

That's coping, using humor. It happens in real life.

In this forum, however, nine South Africans are truly remote. They're about as far outside my monkey sphere as humans can get. You wanna joke about them? Fine by me. You want to complain about the jokers because you don't think people really deal with tragedy that way? You're quite wrong.

He had claimed that he had been involved in writing code for some kind of automated anti-missile defense system, though he had always insisted that he wasn't allowed to give details.

If programmers like HIM are writing the code for these "smart" weapons, then I think we should just give the things to our enemies for free.

Defense contractors frequently end up with bad products, but it's usually due to mission creep and gross mismanagement. Based on my experience*, I'd almost guarantee that this guy was lying about his experience. Pretending to have worked on a "top secret" project that you conveniently can't talk about is pretty weak sauce. In reality, there are two kinds of classified projects: mundane ones, where the engineers working on 'em can talk about the "what" of the program in great general detail, but the specific "how" is classified; and REALLY secret ones, which you can't talk about at all, the most you can say is "I work for Lockheed" or whomever. This "I worked on a secret anti-missile program" shit is a load of crap. It falls into the big fat liar zone between mundane and really secret.

* I was an intelligence analyst in the Army. I dealt strictly with excruciatingly mundane secrets. Boring, boring, boring. My father was an engineer for Hughes (now Raytheon). He worked on things like the B-2 Spirit ground mapping radar system. For years he "worked at Hughes", and that was it. Later, he was able to say "I work on the B-2 radar system. You'd be amazed at some of the cool shit we do with it, but I can't say what it is."

I think it's more important to note that it doesn't make them more dead, or kill additional soldiers, either. And really, thousands of far more tragic deaths happen each day. There are children being molested all over the world as I write this. Sorry if I don't lose myself over some minor military casualties incurred while developing more efficient ways to kill people.

And like the parent said, laughing does make the world a better place. When unfamiliar people find something commonly humorous, it brings them together in a really strong way.

It really must be an Aussie thing. My mate is a cop; last month he had to tell a nun that her wheelchair-bound brother had lost control down a hill and drowned in a duck pond.

But when she asked how he died, he could barely hold a straight face so he told her to ask at the hospital.

Later she saw him and said, "No wonder you couldn't tell me how he died." Seems she nearly pissed herself laughing at the hospital. She also told him to practice more; he'd given himself away with a tiny lift at the corner of his mouth when she asked.

Personally, I don't get what a period of mourning achieves. Losing someone leaves an empty place, but I wouldn't want anyone to waste a moment of their life mourning the loss of mine. Why is it that the West treats death as some kind of divine punishment, while the East tends to celebrate it?

I teach robotics (on a VERY basic level) to high school kids. I explain that there are some really peculiar people out there who watch movies like Terminator and think "Hey, that's cool! I wanna build a killer robot," and who then spend their professional careers trying to build machines that will lower our position in the food chain. :( They just don't sense the danger. Just like those designing artificial brains, smart weapons, doomsday plagues, better nukes...

Yup. And I'm not even looking at it from a robot uprising perspective. Strong AI may or may not happen, but I think it's going to be far, far off, like practical fusion power. In the meantime, though, weak AI robotics is coming along nicely: Predator drones, SWORDS robots, etc.

Just look at the anti-democracy crackdown in Burma; that shows you the power of force when applied against the people. There were reports that some of the military units were wavering, having second thoughts about killing civilians and monks. An automated gun doesn't care.

We've already got that level of distance with aerial bombing. We killed what was it, twenty civilians trying to take out Saddam the opening night of the war? We've got Marines on trial for deliberately raping and murdering civilians up close and personal, but we gave medals to guys doing the indiscriminate killing from the air. We act like it's different, like accidentally killing dozens of people in an air attack is different from shooting them up close and personal. Wow, I'm sure their families will see that distinction exactly the same way we do. And when our cruise missiles go off-course and hit the wrong target, they're going to realize that's entirely different from when a suicide bomber does the same thing with two tons of explosive in a truck. "Sorry, my bad."

Automated weapons are going to make the blood cost of war (to us) too low. We need casualties in the millions before our dumb monkey brains can figure out it's a bad idea, sometimes not even then.

Picture my family sitting around the corpse of my grandfather. He didn't want a funeral or burial. He was going to be cremated. We were there to say good-bye. My father (his son) said, "Wouldn't you shit if he sat up and said 'April Fool!'" (it was April 1st). We all had a good laugh.

My wife, an optometrist, dreamed of having her own practice. She has had her own practice for ten years and it remains a dismal failure. We are scratching and crawling out from under the debt we incurred, and eventually we'll reach the point where we'll be able to more or less survive. I'll never be able to retire. Neither will she. We won't be able to send our kids to college the way our parents sent us. Nevertheless, it is a constant source of humor. If we didn't joke about it, I think we would lose our minds.

People *do* joke about the suffering and loss of their loved ones, they joke about having their own dreams crushed. So, when you say you don't think anyone does, you're wrong. Maybe not everyone. But people do, and it is valid. In fact, it is just as valid for someone not directly involved.

Huh? You mean on 9/11, 2.5% of fatalities worldwide were due to terrorism? And since then, terrorist deaths have practically flatlined, with rarely more than 0.01%, way behind pulmonary heart disease, the flu, starvation, war, crime, work accidents, motor vehicle crashes, and all sorts of other causes? You mean it doesn't make sense to throw terabucks into the War On Terror when relatively cheap nutrition programmes could save 27,000 lives per day?

To be fair to history, the "cruise missile off course" problem is a nice trade-off for "razing an entire city, raping, enslaving, or killing the entire population, stealing all the valuables, and toying with the captives by seeing who can skin one completely without letting a single drop of blood fall."

Warfare, as recently as the Second World War, was not limited by counting civilian casualties. And yet many of our refined and erudite citizens now take it as the norm, lamenting even one collateral kill. It is truly amazing, the indoctrinating effect of "civilization": sufficient even to erase the survival capabilities of hundreds of thousands of years of evolution in a few generations. Hopefully we never meet an enemy who has not learned to sublimate their instincts in the pursuit of some dubious higher morality.

As for automated weapons killing indiscriminately, I think they are just suffering from an acute self-actualization crisis.

Every army that wants to be good needs to be a well oiled machine. Otherwise accidents like this happen regularly.

The parent's "racist beliefs," broken down, were:

The post-apartheid government is black. True
Corruption is running rampant in SA, which has a black government. True
HIV is climbing faster than corruption. True
SA is now dangerous. True
The SA government (which again happens to be black) spends money on needless things rather than helping the people. True

The facts are that in the post-apartheid era, things in South Africa are in fact worse. I don't think it's a black thing vs. a white thing, but when anybody points out the above facts they are called racist.

Your issue shouldn't be with the parent being racist; it should be with your government being accountable for the above issues. Whether the government happens to be black or white doesn't matter.

Sadly, most of Africa seems to be following this trend which is a shame.

Out of curiosity, are you sure that the parent kept the entire population dumb? Your post seems very accusatory of the parent. And saying murder or rape is acceptable is beyond idiotic. As far as South Africa not thriving goes, are we to blame the parent for the utter failure of the whole continent? Give me a break.