
cpu6502 writes "Robert Silverberg wrote a recent editorial about the dangers of robots and the legal consequences for their programmers and engineers: 'Consider malicious kids hacking into a house that uses a robot cleaning system and reprogramming the robot to smash dishes and break furniture. If the hackers are caught and sued, but turn out not to have any assets, isn't it likely that the lawyers will go after the programmer who designed it or the manufacturer who built it? In our society, the liability concept is upwardly mobile, searching always for the deepest pocket.'"

Maybe... But last I saw, Ford Motor Company wasn't liable for drunk drivers who use its vehicles to drink and drive, resulting in death or destruction of property. This makes me think that engineering a product doesn't necessarily make you liable for someone who breaks it apart.

Now, if your product were security software, and you advertised that it supposedly prevents this...

A more apt car analogy would be suing Ford because someone broke into your car. Or suing Microsoft because somebody hacked your server. There's zero precedent for these types of lawsuits, and adding "with a robot" doesn't change that. This article is fucking retarded, even for Slashdot.

If this were a likely scenario, then where are all the people suing Microsoft because Windows let hackers install malicious code which deleted data? This happens frequently, and yet I'm not aware of MS getting sued.

I recall the MS EULA specifically disclaims any liability. It's near the bit that forbids the use of general-purpose Windows licences in the operation of nuclear facilities or other places where there is potential for epic fail. Maybe robots will have an EULA too, with a similar clause.

Why do people always assume that just because it's in the EULA, it's legally binding? It's a factor, certainly, but generally courts take a dim view of companies trying to weasel out of their legal obligations with this sort of thing. There's still an expectation that the software is fit for purpose.

Except there are a lot of lawsuit-happy morons out there who don't accept responsibility for their own actions and want to sue everyone who's even peripherally involved, in even the vaguest sense, so long as they have money. You've heard about the moron who ignored the warning signs, climbed over the fence (by the warning sign), and tried to pet a dolphin that bit him? (I think the article said it was really minor, maybe didn't even break the skin.) He still got 10% of the insane amount he was asking for.

The Microsoft EULA (Which you agreed to) specifically says that you can't sue them for anything whatsoever. So sorry nope.

Many jurisdictions (including the US, AFAIK) have the concept of unconscionability [wikipedia.org], which limits what you can "agree" to in a contract (assuming a EULA is even considered a contract).

They can put in there that you agree to give them your first-born child on their 5th birthday. Doesn't mean they'd be able to enforce that in court. I suspect that a *lot* of terms in EULAs and contracts, almost certainly including that one, would be thrown out as unconscionable.

There are no security holes within the car involved in drunk driving. However if the programmer did do something that made it easier for people to break into something then maybe the company should be held liable. Just as Ford would be held liable if they put shitty brakes in their car to cut costs.

Did something that made it easier to reprogram it? Like, say, making it programmable?

That's pretty much the point of having a programmable robot: to be able to reprogram it. If you insist on car analogies, how about saying that the maker of the navigation system should be held liable if someone breaks into your car and installs some bogus maps?

> However if the programmer did do something that made it easier for people to break into something then maybe the company should be held liable.

Okay, but define "easier" for me. I mean, do we have it run Linux, and need security patches on a week-to-week basis? Maybe OpenBSD, which really raises the bar for cracking the software. Or, go all the way, and have the whole software stack proven mathematically.

What I'm getting at is that, actually, the expected level of security is very vague and hard to determine.

If something works as intended and that's in the open, it's different from selling someone a product which is clearly broken, planning to patch it later while in the meantime people lose data through poor coding.

Maybe we should spend more time ensuring things are safe rather than just worrying about getting it out now. I think it's equally unfair to say that the company isn't responsible for anything because of an EULA as it is to say everything is the programmer's fault.

An even better example is the making of handguns. The manufacturers have rarely been held liable for the way the gun was misused unless there was some really unusual wrongdoing in the distribution of the weapon. In many ways our laws can be manipulated by language. For example, a gun is not a weapon. It is like a whiskey bottle. If you try to bash someone with a whiskey bottle, it becomes a weapon. If, instead of hunting, target shooting, or simply col

No, but you certainly can sue when, say, the hydraulics in the car aren't strong enough to apply the brakes when the temperature drops below -5 C.

Is the manufacturer liable for producing a product which has an inherent flaw? Yes. If a system is to be connected to the Internet, should security be a necessary design component? Yes. HIPAA, SOX, and FERPA all mandate that reasonable security measures be taken to prevent tampering or unauthorized access to health, large financial, and public school information, for example.

The argument that will be used is that the software engineer or programmer didn't do the use cases to foresee the potential for misuse and abuse of the product by criminals, and that they were therefore negligent. I don't buy this argument, but lawyers are a slimy lot!

Why don't you buy it? I'm amazed that Microsoft hasn't had the crap kicked out of them over how vulnerable their software has been to exploitation, exploitation that has harmed numerous unaffiliated companies that don't even use Microsoft operating systems. Users whose computers are easily compromised, and companies who have suffered the ill effects of attacks made by compromised or hijacked computers, should have one hell of a case.

As far as liability for physical devices goes, while the OP has a point that an owner misusing a product (like drunk operation of a motor vehicle) might not open the manufacturer to liability, I don't think that would even come close to the argument in a lawsuit. The argument would revolve around how 1) the manufacturer used consumer equipment and communications protocols, allowing other consumer equipment to contact the manufacturer's devices, and 2) the manufacturer neglected to provide sufficient safeguards to keep out those who aren't authorized to use the equipment, while the consumer protocols and equipment it selected made it unduly easy for even a technological layman to tamper with, given only minimal instruction or assistance.

IMHO, the state of commercial software development is atrocious. I don't expect every software product or operating system to be completely immune to exploitation, but commercial operating systems with large paid development teams and oversight by paid management still manage to have hundreds or thousands of weaknesses that lead to remote exploits or infections through applications like the friggin' web browser, despite many users' attempts to lock down ports, install antivirus and malware programs, and hide behind firewalling routers. Developers should be ashamed of themselves.

Having worked in quality assurance at a company that was developing a communications application, I can tell you that part of the problem is that most developers lack the devious streak to know how their software could be misused. They only see how they can make it function for X circumstances, not how Y and Z circumstances could lead to its compromise. As the QA tester, it was far, far too easy to break or exploit communications daemons, and when the programmers were confronted with the evidence, they got defensive and indignant instead of wanting to know more about the nature of the test or the fault. As long as that attitude prevails in programming, this sort of thing will continue to plague our software, and in my opinion, development companies should be responsible for the ramifications of their decisions.

The reason Windows has had so many exploits is that in the 90s people were demanding an OS compatible with their existing software, i.e. one where you run as root all the time with no firewall and all services on by default. All applications are trusted to make system-wide changes. If you changed any of that, they wouldn't install or run. XP made things a bit better, but it wasn't until Vista that they changed the default to untrusted, and look at how (un)popular that decision was.

Vista was the bitter medicine needed to force developers to start behaving. The much-hated UAC prompts were not for users; they were to make the experience of badly written software so shitty that the developer would be forced to change it. Now that 7 is out, more of the legacy crap has been dropped, and applications in general have become less stupid and more compatible with secure settings, Windows is actually fairly secure. Certainly the days of limitless remote exploits are over, with most hacks being targeted at applications instead of the OS. A lot of exploits rely on the user doing something stupid, such as installing random software they downloaded. Nothing is perfect, but companies running Vista/7 have a lot fewer problems with viruses than those still on XP.

Insurance companies, by definition, are paid to absorb the risk. That is the only reason they exist. And they are sued for the actions or negligence of the driver, not the actions of the car. When the car is at fault, the manufacturer is sued, like Toyota is being sued for the accelerator issue.

For example, if I put a robot into your house using a dog door/chimney/vent, my ass would still be responsible. Nobody is going to sue the company that made the robot.

What happens if I bought the robot to wash dishes, and then some script kiddie gets in and breaks a few thousand dollars' worth of stuff (water damage can be expensive to fix)? Liability is not a clear-cut line in that case.

What happens if you buy a refrigerator specifically for keeping food cold, but I sneak in and put 400lbs of rotting meat and vegetables inside of it? Then, when you open it, the overwhelming stench causes you to throw up, then your wife slips on the puke puddle and breaks her leg? Yeah, I could see how GE might be responsible for making the fridge. Call it an attractive nuisance, I guess.

The unlikeliness of such a scenario may well make GE liable under the preponderance of evidence if you also stole 400lbs of fresh meat and vegetables before putting the rotten stuff in it. All the evidence would point to GE's alleged negligence, and under a civil standard of "more likely than not" GE would almost certainly lose.

Now, if GE ever found out that you were the one that stuffed it with rotten food, they would probably sue you for whatever they wound up out as a result of your joe job under the th

The script kiddie would be primarily responsible because he was the one whose actions foreseeably caused the damage.

Whether the robot creator is liable depends on if they were negligent with their security systems. If it was foreseeable that a script kiddie could hack into it, they might be. They could still go after the script kiddie for indemnification.

That's the theory anyway.

In practice, a lawsuit is so oppressive in terms of legal expenses that you may be paid a settlement just to make it go away.

The same thing that would happen if someone hacked your robotic washing machine, and set it to not stop pumping water. That's right, there are millions of robot clothes washers in the world. A good many of them are computer controlled.

or for Christ's sake, wrote a virus that deletes files worth a few thousand dollars on your computer?

This item can result in liability, if my contract allows for it. That's already established.

The problem with robotic dishwashers or whatever is that they have implicit liability. That is, it's a machine that is intended to operate in a certain environment with certain expected risks. If a regular dishwasher breaks during routine use in a way that causes a lot of damage, especially if it's something that has happened repeatedly to other customers, then the manufacturer can indeed be liable for damages. Th

Don't think so? How about a Gun Manufacturer getting sued because their weapons were used by criminals to commit a crime? That has happened, and is still happening now. The 1-800-ASK-GARY scumbags of the world will go after whoever can pay.

That's not the fault of the ambulance chasers, that's a political matter. The idea isn't to get money for the attorneys, the idea is to make it so risky to make guns that gun manufacturers go out of business, thus banning guns through the back door. Seems unlikely som

What happens if kids break into your house and break your dishes? You sue their parents? You sue the school for not teaching them well? You sue the government for not putting enough money in education?

There is no logic to who gets sued. Suing is an interesting branch of physics: whenever there is a "lots of money" gradient and a "has worse lawyers" gradient, the suing target moves.

It depends. If I bought a lock that was advertised as safe and the kids picked it with a paper clip, then I may very well sue the company responsible for that advertisement. I think software is no different. You have to look at what was promised and what was delivered. The sophistication of the hack (actually, of the hole) also matters in determining if there was negligence. If the kids used a backdoor program that the devs forgot to take out, or if the devs forgot to do something as simple as defending against

Well, given the kids were said to have broken in, if the parents didn't volunteer to replace the dishes, and said dishes were sufficiently expensive to replace... I could well imagine using small claims court to repair that defect in the parents.

Some people like to eat using fancy dishes. Like $500+ per plate. Not me, but I'm just saying, this scenario could have been about a serious amount of money. Less so if you eat off of $0.25 IKEA plates.

This, in my opinion, is a major reason our society is so screwed up: why should we even consider it reasonable that lawyers can go after software engineers and programmers to "make someone pay" because the real criminals have no assets? Product liability insurance is a major reason why some things cost so much, and until we break the cycle and get the lawyers under control (most of them run our governments), these frivolous lawsuits will continue; in the end the only people who really win are the lawyers. This is the same argument as going after a Glock handgun designer because one of their weapons was used to shoot someone. It's absurdity to the max.

Why should a software company hold no responsibility for anything their software does or lacks? I don't think you can blame a programmer for a problem with their software if the problem originates outside of their code, but I do think companies should be held responsible for pumping out shit software.

I completely disagree with you on that. A product designed by an engineer is sold to a layman, who is by definition not able to assess the inherent dangers of its use. If he were, he wouldn't need the services of an engineer to design the product in the first place. A layman is not an engineer. An engineer thus has either to transfer all the knowledge necessary to operate the product safely to the customer, or to make its use not dangerous, even for uses the product was not originally intended for. And of cours

The need to point out dangers that are not obvious to a customer is, in principle, a sane approach. But it entirely depends on the level of common sense you can legally assume the consumer to possess. In Europe, you usually assume the consumer to be mostly sane, and not retarded. In that case, the system works fine. In the US, you sometimes seem to assume that the customer is the most retarded dickhead nature was able to create. In that case, you are stuck with insanely stupid warnings like "hot fluids mus

Advertising using blatant falsehoods is likely more of a damper than liability losses.

The level of lies about the functionality and usability of most software (and most other things) boggles the mind. The lost time and effort for folks trying to dig into the truth is immense, and if anything it should be easier to take these companies to task for wild claims that are only farcically supported by reality at best.

Most software licenses have waivers of liability and a limit on the monetary damages. The limit is usually the purchase cost of the software. So, you can get a refund, and that's it. The only places I see where that isn't waived are safety-critical applications, like medical devices, nuclear devices, vehicles, and factory floors. These are typically hard real-time systems. Besides, you can always blame the owner for not patching the system! The "unlock your car or home from your iphone" apps really worry me.

Most software licenses have waivers of liability, and have a limit on the monetary damages.

Liability clauses can work both ways.
If a robot company produces and sells robots that cause harm, then they can be sued under product liability laws, which apply strict liability standards (in the US, anyway), which means that absent post-sale changes to the product, the producer/seller is liable, period. (If the harm is caused by a hacker, that's a different story.)
However, if a software company provides a service to a robot company, then the software may fall under a different liability doctrine in wh

Hm. Do AV companies pull this same stunt? I mean, if there's any particular software that should produce liabilities (as in "in exchange for money, you protect my system" kinda deals, thus excluding free licenses or open source stuff), you'd think it'd be the ones hootin' and hollerin' that they're needed to protect you from internet boogeymen.

Because the legal system encourages profiteering over reparations, especially for the law firms. It is natural that these firms would use any excuse to have people sue other people for all kinds of bullshit.

1. Manufacturers will very likely isolate their product from its function, selling only unprogrammed tools with APIs to companies who resell the devices with an OS with strict functionality limitations and DRM-like lockouts, to isolate themselves from liability.

2. Companies will be careful in the beginning to set precedent that allows them to bypass such liability. Likely they'll create a set of manufactured "harm" scenarios, with honest but complicit victims with a vested interest in blocking most future lawsuits based on indirect liability.

Only once liability precedent has been set will the APIs open up on consumer tools from the major manufacturers. The court system may be insane in many ways, but it functions to the needs of large companies, mostly as a negotiation device and a filter for the amount of money owned ("You must be this rich to use the court system").

But programming has been around long enough that 1) I am sure there is an instance of this already, and 2) there have been plenty of instances of bad things occurring already, so it should have happened if it was going to.

Guns are a little different from your other cases because, having killed a person, you've only proved the gun's fitness for purpose. You've made any liability suit for poor design harder, not easier, in that case.

Was Microsoft ever sued for the damage caused by the thousands of viruses/botnets/trojans/intrusions enabled by the "security" of their software? They never even got hit for delaying patches for known or actively exploited vulnerabilities.

isn't it likely that the lawyers will go after the programmer who designed it or the manufacturer who built it? In our society, the liability concept is upwardly mobile, searching always for the deepest pocket.'"

If there's someone that will pay them for doing so, then sure, they may try. But why single out robots when there's already a device in most peoples' homes that is already being hacked for malevolent purposes? When is the last time anyone has brought a suit against Dell (and it went anywhere) beca

Simply have human operators responsible for "monitoring" the robots. They take all the liability if something goes wrong.

After all, that's why (largely autonomous) light rail / subway trains pay college students / poor people / etc. to sit in the cab and hold the "door open" button for the train.

Probably also why we'll never have fully automated cars and passenger aircraft as well. Easiest to just blame the driver / pilot / etc. for failing to handle the situation appropriately. Or at least they're their

Autopilots have a hard time dealing with situations that are not previously imagined, but don't underestimate how much the engineers can plan for, and the usefulness of an autopilot that can react far faster (and more calmly) than a human.

That said, having a person on-hand is a good idea anyway, out of the basic principle of having no single point of failure for a system.

You build a product that might possibly injure somebody, you buy insurance. But if the product does injure somebody, they can't sue your insurance company. Heavens to Betsy, some juror might find out that an insurance company is really the defendant! Can't have that, juries are dumb! We'll make them sue the programmers or engineers. We'll have some stooge geek sitting in the dock, and we'll pretend through the entire trial that the engineer is really going to be personally liable for paying the verdict.

Instead of locking the punks up, make them pay the victim 7 times the value of the damaged property. Deny them welfare until they've paid it back. If they commit another felony while they're still paying it off, double the sentence for that felony.

On the surface, it may sound harsh, but if they do $1000 of damage to their neighbor and the court makes them pay back $7,000 as restitution and punishment instead of booking them in the pokey for two years, which is less disruptive? Having to pay back $7,000 with no interest at 1-2x minimum wage or doing prison time and then trying to find a job?
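For a rough feel of the numbers in that comparison, here's a back-of-the-envelope sketch. The minimum wage, monthly hours, and garnishment fraction are assumed values for illustration only, not anything stated in the comment:

```python
# Hypothetical inputs: US federal minimum wage ($7.25/hr, circa 2011),
# a full-time 160-hour month, and a 25% garnishment fraction. Only the
# $7,000 restitution figure comes from the comment above.
MIN_WAGE = 7.25
HOURS_PER_MONTH = 160

def months_to_repay(debt, wage_multiple, garnish_fraction=0.25):
    """Months to pay off `debt` (no interest) if a court garnishes
    `garnish_fraction` of income earned at `wage_multiple` times
    minimum wage."""
    monthly_payment = MIN_WAGE * wage_multiple * HOURS_PER_MONTH * garnish_fraction
    return debt / monthly_payment

# Under these assumptions: roughly two years at 1x minimum wage,
# roughly one year at 2x.
print(round(months_to_repay(7000, 1)))
print(round(months_to_repay(7000, 2)))
```

So under these (assumed) terms, the $7,000 restitution is paid off well within the two years the prison sentence would have taken.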

So wealthy crooks can laugh off their sentences? It's hard enough to get a conviction against the rich with their teams of expensive lawyers, and you'd want to make it so that should they actually lose, it can all go away with some tiny check?

Also, what do you mean, "deny them welfare"? Are you one of those ancient conservatives who still rails against "Welfare Queens"? That system was eliminated in 96. Or do you mean welfare in a more general sense, like food stamps and disability insurance? In that c

First of all, no lawyer is going to accept 33% of minimum wage as the contingency fee. So any such punishment would be fought tooth and nail by the ambulance chasers to prevent a precedent. So the counsel to the plaintiff would not seek it.

The victim does not care about the justness of punishment meted out to the perps. They want compensation; only when compensation is impossible will they settle for revenge. So the plaintiff does not seek it either.

If the kids had found a Sawzall in the basement and used it to trash the house do you think the homeowner could hold Milwaukee Electric Tool liable?

You wouldn't think so, but then you see things like the Louisville Slugger case, where the manufacturers of a baseball bat were found liable for the death of someone hit by a batted ball, and you realize anything can happen in the court system.

Without getting into whether or not the verdict was correct, the argument was that aluminum bats are much less safe than wooden ones, and H&B should have known that when they started making and selling them. I think similar arguments will be made (and no doubt have been) for robotic devices as they start causing problems. Defective or negligent design is the responsibility of the manufacturer; of course, the definition of defective or negligent is open to interpretation.

People were already being killed by robots [wikipedia.org] 30 years ago, so it isn't exactly a new thing that robots can do harm. Also, why shouldn't the companies be liable? If you build something that is dangerous enough to do serious harm and sell it to laypersons, you'd better make sure that it has enough built-in safety mechanisms and doesn't just go crazy because some script kiddy came along and wanted to have some fun.

It is highly unlikely that the programmers or manufacturer of the original device would be liable. There are two main reasons. First, the wrongdoing of the hackers is almost certainly a superseding cause of the damage, which negates liability for negligence on the part of the programmers or manufacturer. Second, the product was not defective when it was sold and it was modified from its original condition, both of which negate products liability.

Industries that have failed, or may fail, facing the same problem as this post include Aviation (they gained some protection from Congress via the 1994 GARA act), Education (teachers have to dumb down their plans for everyone, cut field trips due to liability issues, etc.), and Medicine (the cost of medical care is high because of the liability costs for valid care that somebody may have gotten a different opinion on).

That's as it should be. If you're doing something dangerous, you need to take responsibility for it.

When I ran a DARPA Grand Challenge team, we took out a really good commercial liability policy. We had hardware stall timers, an electromagnet in the accelerator system that had to be energized to get out of idle, a separate battery and relay system which slammed on the brakes if the stall timer tripped, a backup anti-collision radar system, and a separate emergency stop radio link which had to send a si
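The failsafe chain described above (a stall timer that must keep being serviced, or power is cut to the accelerator magnet and the brakes slam on) can be sketched in software. This is only an illustrative watchdog; the class name, timeout, and callback actions are my own assumptions, not the team's actual design:

```python
import time

class StallWatchdog:
    """Sketch of a deadman/stall timer: the control loop must call
    feed() at least every `timeout_s` seconds, or an independent
    supervisor trips the failsafe (de-energize the accelerator
    electromagnet, apply the brakes)."""

    def __init__(self, timeout_s, on_trip):
        self.timeout_s = timeout_s
        self.on_trip = on_trip              # failsafe callback
        self.last_feed = time.monotonic()   # monotonic clock: immune to wall-clock jumps
        self.tripped = False

    def feed(self):
        # Called by the healthy control loop each cycle.
        if not self.tripped:
            self.last_feed = time.monotonic()

    def check(self):
        # Called from a separate supervisor loop (in hardware this would
        # be an independent relay/battery circuit, not the main computer).
        if not self.tripped and time.monotonic() - self.last_feed > self.timeout_s:
            self.tripped = True
            self.on_trip()
        return self.tripped

actions = []
wd = StallWatchdog(
    timeout_s=0.05,
    on_trip=lambda: actions.extend(["throttle_magnet_off", "brakes_on"]),
)
wd.feed()
assert wd.check() is False   # control loop is alive: no trip
time.sleep(0.06)             # simulate the control loop stalling...
assert wd.check() is True    # ...supervisor trips the failsafe
assert actions == ["throttle_magnet_off", "brakes_on"]
```

The key design point is that the watchdog defaults to the safe state: the magnet must be actively energized to keep the vehicle out of idle, so a crashed control program stops the vehicle rather than leaving it running.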

The company that I work for, which I shall not name, is a Fortune 500 company. There is a lot of money there. Because of this, everything they do, and I do mean everything, is vetted by one of the legal teams to minimize liability.

When I was going through one of the training classes, it was pretty much explained that way. If we have something delivered, and the company that we hire to do the delivery causes some damage to the client's site, we're the better target for a lawsuit because we have more money th

Until software companies and programmers get sued software will not significantly improve. That is why I say there is no such thing as software engineering. Engineers get sued, programmers don't. They won't be true engineers until they have to carry malpractice insurance.

As an attorney, reading this question invokes the same reactions that many of the /. crowd would have if I started trying to opine on the technical failings that would allow our mythical vandals to reprogram the hypothetical robot.

Not to get too technical, but just because you sue the company doesn't mean you win. The liability insurance that even the smallest companies carry would cover the legal costs of having such a suit dismissed. (For the technically inclined, look up comparative negligence and the proverbial "intervening bad actor").

The homeowner (the ones suing) would probably be found more responsible for not following basic security etc.

As others have pointed out, software companies have long been given practically a free ride in harm caused by poorly written software. First, they have been allowed to disclaim the standard warranties of fitness and function. This is akin to buying a car that the manufacturer won't promise to actually work or be safe. If Ford told you that they wouldn't guarantee that pressing the brake pedal actually engaged the brakes, would you drive that car? Yet every piece of commercial software we use specifically says that there is no promise that it will work at all, or do what the purchaser wants.

Here is a counter-hypothetical (more realistic, as it has actually happened). A relative dies in a plane crash. The FAA investigation conclusively shows that the accident was caused by a bug in one of the key computer systems. Should you sue: the airline? The manufacturer (Boeing/Airbus)? The subcontractor that wrote the software?

The answer is, you sue the airline, and the system is set up so that anything you win from them, they can then sue to recover from the party up the chain. Thus, everyone's liability is ultimately apportioned according to their degree of fault (note: yes, it is a gross simplification). This is why people writing software for critical systems (ones where a failure can cause property damage or injury) need a good lawyer to write their contracts/licenses. The law has allowed programmers to avoid their responsibilities for a long time, so if a sw company doesn't take advantage of that, it is their own fault.

Consider, there is no educational or professional certification required to write and sell software that controls an infant incubator used in an NICU, but you need a government license to drive to the store. Programmers and engineers have been getting a sweet deal in liability for years, so it's awesome to hear them still complaining.

Yes, it is the CEO or engineering director they should go after. We had a boss who was insistent that we not waste money getting our equipment tested by an external body for electrical safety, until he was informed that he was the person who could go to prison if someone died as a result.

Software will not become more reliable until businesses start to value reliability, and they will not until the risk of liability becomes large and widespread.

As more and more of our lives depend on software, 99% reliability is no longer enough. A time will come when there will be a backlash against unreliable programmed devices, and litigation will be a part of this. At that time, organizations will have to entirely revise the way that they build their software, adopting methods

Business will not value reliability until consumers start to. Not all, but most consumers will choose a lower price over slightly higher reliability. My stuff doesn't have to be 100% reliable: stuff breaks, I replace it; if it breaks in the first year or two, then I get a free replacement due to warranty laws in the EU; so why should I pay more for vague promises of higher reliability?

I am not talking about something physically breaking. I am talking about something not working right when you need it to. Software glitches, such as a device freezing and needing to be restarted, or losing all of your data.

There is, however, a huge difference, in terms of development costs, between reasonably reliable and 100% mathematically proven reliable. The latter would result in a PVR no normal person could afford, with little real benefit.

Yes, this is exactly the case. We have similar situations in aeronautical engineering: if I'm an engineer working for an aircraft company and I make a mistake and put a flaw into the design of an aircraft, and if that flaw ends up causing a crash and killing people, I am not liable. The company I work for is, because I don't sign my name to the design and, generally, I have no control over testing that might have been able to uncover the flaw. That is, unless I'm a licensed Professional Engineer. In tha

The problem is real for the manufacturers. But there may still be a significant question remaining whether programming provides a service or creates a product. The liability differences between the two are great.