If you're poisoned by a burger you can sue the restaurant that sold it - so why can't you take a software developer to court if their negligent coding lets hackers empty your bank account?

That's the question asked by University of Cambridge security researcher Dr Richard Clayton - who is calling for software makers to be made liable for damage resulting from avoidable security flaws in their apps.

Today software generally comes with End-User License Agreements (EULAs) that require the user to sign away their right to sue the developer if the app contains security flaws that leave the user's computer open to attack by malware.

Clayton is arguing for regulations that remove the developer's right to waive any responsibility for security flaws in their software. It's an argument that has already won support from officials across Europe, with a House of Lords committee recommending such a measure be implemented in 2007 and European Commissioners arguing for the requirement in 2009; however, no agreement to this effect has been passed.

"It's remarkable that of all the things that you could buy as a consumer, software is the one where you're expected to make up your mind whether it's dangerous," Clayton says.

"We've been saying for some years that what is required is to make people [developers] responsible for when they damage other people. If you went down to the corner of your street and started selling hamburgers to passers-by they can sue you [for any damage you cause]."

Clayton thinks that developers should be held accountable in cases where avoidable security holes in their software are exploited to infect a user with malware, and that user suffers some form of material loss - for instance the theft of money.

The source of the infection would be established in court with the help of subject experts, and the definition of which flaws are avoidable informed by legal precedent from earlier cases, Clayton says.

"The question is 'Are they being negligent?'. The usual test is 'Are they applying contemporary standards to the quality of their work?'," he says, adding that known flaws can be exposed by running code through commonly available security tools and validation suites.
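Clayton's point about commonly available security tools can be made concrete with a classic example of an avoidable flaw. The sketch below (hypothetical code, Python standard library only; not from the article) shows a SQL injection hole of the kind that contemporary static-analysis tools and code review routinely flag, alongside the standard fix:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, balance REAL)")
conn.execute("INSERT INTO users VALUES ('alice', 100.0)")

def get_balance_unsafe(name):
    # Avoidable flaw: user input is spliced into the SQL text, so an
    # input like "' OR '1'='1" rewrites the query (SQL injection).
    return conn.execute(
        "SELECT balance FROM users WHERE name = '%s'" % name).fetchall()

def get_balance_safe(name):
    # The "contemporary standard": a parameterised query treats the
    # input as data, never as SQL, closing the injection hole.
    return conn.execute(
        "SELECT balance FROM users WHERE name = ?", (name,)).fetchall()

print(len(get_balance_unsafe("' OR '1'='1")))  # → 1: injected input matches every row
print(len(get_balance_safe("' OR '1'='1")))    # → 0: treated as a literal name
```

Under a negligence test like the one Clayton describes, shipping the first version after such tools have flagged it is exactly the kind of "avoidable" flaw at issue.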

The argument for developer liability also goes beyond increased consumer protection, to providing developers with additional incentives to minimise security holes in their software.

Clayton argues that in order for any agreement on developer liability to succeed it will have to be internationally binding.

"It's not going to be easy. There's going to be a lot of moaning from everybody inside [the industry] and we're not going to do it as one country, we're going to have to do it on a global basis and over many years."

There are cases where end-users have launched legal action against IT vendors over alleged security flaws in their software, and Stewart James, partner with law firm DLA Piper, said that if several such cases were successful they could set a precedent, increasing the chance that subsequent actions would succeed. However, even if these actions were to succeed, it would not guarantee greater liability for software makers over security flaws, as developers could respond by changing the conditions of their EULAs.

James said he was sceptical about how successful any new regulations making software makers liable for damage resulting from coding flaws would be, given the number of ways that developers could shift blame to the end-user: for instance by claiming the end-user failed to follow accepted IT security practices.

"There are lots of get-outs that a software developer would look to use to defend against a claim, for example, 'Has the user updated to the latest version of software that may have closed off some of those vulnerabilities?'," he said.

Clayton and other supporters of developer liability are facing powerful opposition. Given the potential size of the liability - estimates of malware-related losses often run to billions of dollars annually - the software industry is likely to lobby hard against any such measure.

Perhaps unsurprisingly, the software lobby argues that its members already make their software as secure as they can, given the complexity of the code underlying applications. When the matter was debated in the House of Lords in 2007, software vendors argued against it by analogy: when a home is burgled, the victim doesn't usually ask the maker of the door or window to compensate them.

Another rebuttal of liability put forward by some developers is that it would stifle innovation and interoperability between apps, as software makers would stop their apps from interacting with third party code to guard against undesirable results.

There is also the question of who is liable for flaws in open source software where there is no clear individual or group responsible for its development. When the Lords debated the matter it was argued there should be exemption for individuals who voluntarily contribute to such projects.

About Nick Heath

Nick Heath is chief reporter for TechRepublic UK. He writes about the technology that IT-decision makers need to know about, and the latest happenings in the European tech scene.

This law will never see the light of day because it is not even close to possible to create "the perfect" software (security-wise); hackers will always find a way. And as a side note, this law would hit the OS companies like Microsoft hardest.... we all know the security gaps in Microsoft's products, so after every release they would get a few million lawsuits. It's a waste of time to even think of this law.

Dr Richard Clayton might be a whiz "in theory" but in practice he is mistaken. Consultants/Vendors rarely do work for which their Clients do not want to pay. Consider 2 paths:
1) The Client is very insistent on all manner of testing, including security testing but the Client's own technical staff advises, "It will never be an issue".
2) The Vendor is contacted to consult with the Client and the Vendor is insistent on all manner of testing, including security testing, but the Client finds great expense and no real value in it. The Vendor is thanked and sent packing.
Scenario 1 is outright laughable for a few reasons: a) this scenario will never happen, and b) if the Client were to stress the importance of testing, the Vendor would pick up more business from the Client.
Scenario 1 implies the Vendor would be saying something we never hear them say; things like, "I don't like money", "get that cash away from me", "I can't stand the idea of more funding", "no, please, no more".
Scenario 2, on the other hand, happens so frequently that we all have our own polished rebuttals. However, a proper Vendor would hire adequate testers and should, at the very least, advise the Client of the risks involved with not performing various aspects of testing post-implementation.
a) if this advice was not extended to the Client, the Client should now be reconsidering their relationship with the Vendor. This is a good time to lawyer-up.
b) but then there is the most common route: the risks were communicated to the Client and the Client dismissed those concerns outright. At this point the Vendor should have gotten the Client's views on security testing in writing, with signatures, thereby exonerating the Vendor of any wrongdoing or lack of performance in the area of due diligence.
3) Let's consider a third option though, the Client purchases the software and never consults with the Vendor. Certainly we could not expect a Vendor to contact each and every Client that purchases their wares in order to make sure each Client is happy. Here, the Client should be ordered, by whomever, to sit in their own poopy diaper.
4) And the fourth option; Client and Vendor are sitting around a table both agreeing, in a language that only businessmen can speak, to ram a project through to completion. Testing is burdensome and represents a great deal of cost to both parties. Everyone implicitly nods at one another in agreement that testing is important and not to test "too much". This happens most of all.
In this case, as in all cases like it, if a mediation body were to read the contract, assuming there is one, and weigh those terms against the advice of an outside software expert (like Stefan Jaskiel, Rick Craig, etc) it would likely reveal negligence on both sides of the relationship, however great or slight.
Conclusion: Here it seems both the Client and the Vendor didn't perform proper testing. Any good comp-sci professor would stress the importance of testing - on both sides - as much as possible and as often as changes are made.
But the question of a broad measure to remove any and all developers' right to waive responsibility for security flaws in their software would [i]ultimately[/i] result in companies having to hire their own development teams (full-time, with salary plus benefits) and write their own in-house software. These companies would have to, if development Vendors were looking down the barrel of a lawsuit every time they made a release. More importantly, a company like the one described above would only find itself in the same predicament again, only this time at the hands of its own staff.
A better approach would be for Vendors to fully learn their business, write contracts accordingly, and have fail-safes along the way, like offering consulting services when Clients want to implement their software and getting a Client's signature when they opt-out for security testing.
When it all boils down, Vendors should be the experts, not the Client; for human cooperation to work at all we all need to be adept in our given fields. To change the scenario a bit, what if the Client were buying a vehicle? Is the buyer expected to be an expert mechanic?
It is the Client's responsibility to partner with the vendor in order to avoid problems like this. After all, who would know the internal workings of a clock more than the person who constructed it?
Assuming that software problems are not new - it is a 70-year-old profession, after all - it would behove Vendors to draw from some software standards in order to get closer to being true "professionals". Here is some light reading to start:
Software Reqs. Eng. IEEE 29148
Software PM Plans IEEE 16326
Testing Definitions IEEE P29119-1
Testing Process IEEE P29119-2
Test Documentation IEEE P29119-3
Test Techniques IEEE P29119-4
These standards, and there are many more, are documented and approved by review boards comprised of experts, in order to keep simple mistakes from happening over and over. Vendors ignore these standards at their own peril.
Software Development might be a great creative release and has the potential to make piles of money but, make no mistake, it is a serious business. Given the availability of the above standards, it's a pity that simple mistakes have to be repeated at all.

Security is an ever-changing area - it changes too frequently. But this does not stop the dev house from giving a list of the security holes they will cover by release. From there on, it's part of the Service Level Agreement.
What dev houses should get sued for is producing poor-quality code that is not maintainable.

A Security Hole (Backdoor) is usually left by software developers so they can fix a glitch, or work around code that can't be debugged because it has been compiled for run-time and burned to media. If they forget to make sure the software is really ready for distribution, then they would be liable for any damages if they promoted their product as bug-free. It would become a tort case.

Software developers are already as accountable as other engineers. The EULA forms no legally binding agreement in court with respect to responsibility. Mostly it is a reminder of the intended conditions of reasonable use (e.g. on normal hardware).

This politician is talking nonsense; all competent security is layered, because eventually all security measures fail. It is an unavoidable, perpetual arms race - the Red Queen principle. Our DNA has the same problem: multiple challenges from the environment, predators, prey, parasites and both sexes; ironically we have sexes because sex is the only way most animals can outwit parasites, and the war of the sexes partly exists to assist sex.
Research into sexual computer malware already exists, so soon the countermeasures will have to use sex too; any proposed regulations are already an obsolete farce.
The computer industry has grown so fast precisely because Statist, Stasist, Centralist dictatorial morons like this backward Socialist nincompoop have not yet been able to regulate away its productivity and profit margins, unlike other areas of business. If legislation is introduced, the computer industry will slow while the security issues explode!
What we probably need is to develop an immune system against parasites like this politician; there is disturbing evidence that some are developing into a camouflaged, international threat to human survival, called "Watermelons" (Green on the outside, Red on the inside) by one book author.

The EULA is not a legally binding document. The document is only a reminder of what the developer considers his rights, but it is not legally binding. The copyright and re-engineering part is only a reminder, so that someone violating copyright cannot claim they did not know it was wrong. Likewise, the parts indicating the responsibilities of the end-user only indicate what the developer considers correct and do not constitute any loss of statutory rights. If you believe that the developer is directly responsible for damages caused, then you can go to court, and if you prove that you are correct then you may be awarded a suitable amount. Most companies will sack employees who produce bad work and, even if no case for damages is brought to the courts, most companies investigate the cause of the trouble and ensure that they will avoid it in future.

But then you'd have to sue the managers who insisted the software be delivered based on a random date on a calendar (regardless of whether it was done or not), their bosses for selling vaporware to clients, users who provided useless specifications on the front-end and poor QA on the back-end and, of course, the final users of the product for not having virus protection, not using encryption on their WiFi, using passwords that are easily guessed and downloading every possible piece of malware they find while browsing porn at lunch.
Golly, it's great to work in an industry where the government says it's perfectly okay to not pay us any overtime. Now that people want to sue us, too... that just makes the job that much sweeter. Let us all shrug.
Then go on strike.

When Ford Motor Company chose to release the Ford Pinto, management knew of the potential for an explosion should the car be struck in the rear. The outcome: people were killed. The courts went to the lowest point in management in seeking remedy for the victims' families and survivors. We may consider software a less volatile tool, but let's say the software is driving your checking account, or your dentist's instruments, or the surgeon who is performing your or a family member's heart surgery. What say you?

and flaws and faults, thus coming under the same controls as other engineering projects. What is amazing is how many people think software developers should be allowed to get away with doing poor-quality work and not pay for any damage done by their incompetence.

in a piece of software as long as you do NOT advertise it as bug-free? Does that mean Ford can sell cars with faulty brakes as long as they don't advertise them as having great brakes?
Faulty engineering is faulty engineering, regardless of where it is. Many years ago, when I attended a course on designing computer software, I was told about the practice of putting certain types of break points and back-doors into the software in the early development stage so you can better debug and test the code while it's still under development. However, one point that was heavily stressed was how important it was to remove those break points and back-doors, and thoroughly retest the software, before sending it out for production and sale. The ones that left such things in were seen as unprofessional in their behaviour and actions.
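The kind of development-only back-door described above can be sketched in a few lines (a hypothetical illustration; the flag name, password, and bypass string are all invented for this example):

```python
import hashlib
import hmac

DEBUG_BUILD = False  # development builds set this True; it must be False
                     # (or the branch removed entirely) before release

# Stored hash of the real password (illustrative value).
STORED_HASH = hashlib.sha256(b"s3cret").hexdigest()

def check_password(attempt: str, debug: bool = DEBUG_BUILD) -> bool:
    """Return True if the login attempt should be accepted."""
    if debug and attempt == "letmein":
        # Development-only back-door for quick testing. Shipping with this
        # branch still reachable is exactly the avoidable flaw the course
        # stressed must be removed, and the software retested, before sale.
        return True
    attempt_hash = hashlib.sha256(attempt.encode()).hexdigest()
    # hmac.compare_digest avoids leaking the match position via timing.
    return hmac.compare_digest(attempt_hash, STORED_HASH)

print(check_password("s3cret"))   # → True: the real password works
print(check_password("letmein"))  # → False: back-door is off in release builds
```

The retest step matters because the back-door is invisible in normal use: only a test that exercises the debug path (or a review that removes it) catches it before distribution.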

frequently used to load trojans onto systems, you feel the software developer should be held totally blameless for not taking basic security measures to stop that vulnerability being exploited in their software?
Such behaviour is as bad as the hacker who hits the person with the trojan, because if the known flaw had not been there the hacker could not have exploited it.

You'd be off your gourd. You might want to entrust a company that would guarantee you the application was scanned for vulnerabilities, eh?
Well, even saying that, the Predator Drone was hacked, and I'm sure every precaution known to man was taken in the development and testing process.

it's "I paid my money and it shouldn't have anything wrong with it". Academia doesn't really enter into it. Reality should represent a possible solution and not "I want it so give it to me". But do you really know who is a good developer? I don't believe that software producers really want to invite trouble by producing software that is open to known security vulnerabilities. It reduces their profit margin and increases costs.

I have never heard of anybody being able to sue a car manufacturer because consistent known bad design meant that motor vehicles would rust easily, thus rendering your vehicle undrivable and illegal after 5 years. Why do you think most vehicles come with only a 2-year guarantee when they could easily be designed, for the same price, to last much longer? The same can be said for many consumer electronics products. I don't see anyone jumping to make it any easier for us to sue these manufacturers either, even though those cases are even simpler to prove as bad design.

Now you need to pony up. Either you prove the simplicity derived from your accusations, or you bow out gracefully knowing that you are out of your league and never understood the broader scope of the statements you keep making.
"Known" - "Common Knowledge" - "Everyone already does it" - These are cop-outs, and the intellectual equivalent of replacing everything in your statements with F-Bombs.
No, I will not let you get away with arguing using generics. You did that to me for two pages.
Either list all the Exploits, their Variations, and the mandated Rectifications you are supporting, or there is no value to your argument, except for argument's sake. After all, if a bill were to be passed using such vagueness as "known," "common place," and the likes, there would be an endless slew of court cases, and the development community and software companies throughout the entire world would be destroyed, and we would face a shockwave that would knock out the future of all technology.
No. No. No. Put the generics and vagueness aside, and make the case based on FACTS for what you are stating, OR bow out and admit that your arguments have no founding in actual provable knowledge.
And a -1 is not an answer to the problem with your arguments.

Either that or we must all submit to your willful ignorance.
EVERY COMPANY ON EARTH in the F500 list (not one, not a few, not many, but ALL) has been compromised.
You speak of legislation against this, but you cannot list the specifics, nor the specific remediation, except to make generic foolhardy statements like "everyone already knows them all."
You speak of remediation as being so simple, yet you can't list it here. You say that every developer already does this, but you can't show me how.
Stop with the generics and get to the specifics. Do you even know what a bill would look like that included every current known exploit, their variations, and the required remediation for them all?
If you can't produce that, your arguing has amounted to hot air.

They threaten to take people to court when that person has violated their copyright. They don't take you to court if you had a problem with their software. Where does it say in any EULA that YOU can't take them to court? The big thing in the media is about copyright, not what YOU are talking about.

Is ANYTHING without faults?
I haven't found anything in this life, human built, or otherwise, that does not have some sort of defect. I can name a defect with everything I own. Doesn't mean I'm angry and want to sue the livelihood out of each individual Chinese laborer that worked on those items, but it means I noticed.
Perhaps it's a discussion about entitlement.
You are correct that no software engineer WANTS to produce buggy software. Most true developers that have a passion in their field are extremely hard on themselves when something goes awry, or a bug is found (or is that just me?).
A serious note on passionate developers is that they are harder on themselves than you ever will be. Add a few lawsuits into the mix, and they will flat out stop serving the public. It's hard enough to recognize our natural fallibility without the world looking for ways to stomp us into the dirt.

What exactly are they asking that many software programmers are doing? Since the whole subject is very vague, you have no proof. No developer is going to charge more; it will just be harder to find developers certified and experienced enough to know how to put all of the software practices into practice and avoid a lawsuit. They will also be offered special insurance, just as some other sectors already have. I think you underestimate the task and responsibilities of a developer.

The process of following real software practices is far more complicated than flying an aircraft or being a surgeon. Perhaps the nearest to it would be a lawyer, since they have to use so much documentation throughout their job. I think you'll find that more security holes have been created by good-quality programmers at Redmond and Silicon Valley, and the likes of MIT graduates, than by bad programmers. Bad programmers normally make code so bad that it breaks before a hacker could ever find a security hole. Normally the security holes are in the business analysis and not in the coding. One example is the exploitation of browser plugins, where it is a direct feature that creates the opportunity for a security hole. It's not as black and white as you think and depends on a number of variables. In addition, most holes are the result of some unforeseen interaction between one application and a totally unrelated application.

Poor quality and vulnerability are not necessarily the same thing. Many vulnerabilities are due to the platform, not the code. .NET vulnerabilities would affect many programs, but the application coders have no control over that. There has been tons of poor-quality mainframe code over the years, and even the well-written code was vulnerable. There were very few people who knew how to exploit it or could get access. Those in charge of security, usually a systems group, kept the bad guys out.
Security is a field in itself. It is unrealistic to think a programmer would also be a security expert and could anticipate all present and future exploits. The home version of the security group is the antivirus program and firewall.
Your idea sounds great for the lawyers and insurance companies however.

Sue the developer... he will have insurance... prove your case... if you can. I still can't see where the problem is. Do people really think they can't sue a software manufacturer now? Do they think that the EULA is all legally binding?

[i]There is a certain company that I know of that is named after a fruit that willfully turns their backs on exploits, denies them, and even punishes any hacker who makes their errors known to them.[/i]
Why would it? That is exactly what professionals expect to happen to them: if they ignore things that incur a cost, they end up having to pay for the mistake.
If your argument held true there would be no neurosurgeons or obstetricians, both of which are the most-sued parts of the medical profession. Yet while the legal actions may dissuade some from entering those fields - because they would expect to be sued and don't want to put up with constant legal actions - there is as yet no shortage of either discipline in the medical profession.
What would be wrong with Suing Apple?
If they stuff up they deserve to be held accountable for their actions.
Besides, I very much doubt any of the [b]True Believers[/b] will even accept that Apple has any issues - remember the "you are holding the phone wrong" excuse?
Besides, Apple will not be producing anything very soon when Mr Mitt gets elected and stops the theft of American jobs to China - that is all that happens at Apple. ;)
Col

Whatever world you guys live in, it would be incredible to meet the human that had the millions of known exploits locked in their head. Makes me wonder if even Hawking does. We'd have maybe one programmer for the entire world if everyone had to know everything.
You guys live in fairy tales. This isn't the old west when programming, the web, and servers were straight forward. We're not talking about a few exploits, heck, we're not even talking about a few thousand exploits.
There is a certain company that I know of that is named after a fruit that willfully turns their backs on exploits, denies them, and even punishes any hacker who makes their errors known to them.
It would be a big outcry for this company if they were sued back into the stone-age (not from me). Their customers make excuses for them when they cover their tracks. They ASK for the known exploits.
But that's really besides the point. You could send me to school for 10 years, and I probably still wouldn't remember every potential exploit in technology.
This whole conversation is ridiculous and downright stupid. It's made by a bunch of ultra-ignorant individuals with no knowledge of the development and technology realm, with an axe to grind because some high-school student high-tailed it on a project that was too big for them, and now these ignoramuses want to sue the world.
Just what we need. Our court system is a mockery with patent lawsuits, why not fill it up with more absurd nonsense?

Car companies don't get sued over bad design unless they physically harm people. If they did, Metros wouldn't still be running around the streets, and every car that broke down would be another court case.
You want apples to apples in your fairy tale world? Give me a call when software starts killing people.

That very much depends on where you live. Here, for instance, the only time you'll find any salt on the roads is when you go to the beach and there is a wind roaring in, carrying the water spray with it. Not a very common occurrence, and even then not many cars are adversely affected.
Even then with the coatings that are applied in the factory salt will not cause corrosion till the coating is partially removed so if you keep the car on the Black Stuff and don't run over things rust isn't so much of an issue.
The more that you damage the coatings that were placed there to prevent corrosion the more likely it is to rust. Even then the more likely to rust things are Hot Dipped Galvanized things like Fuel Lines and Brake Lines, and the fastenings that hold them in place under the floor tend to be Galvanized, which causes its own problems with Dissimilar Metals and Electrolytic Decay, but even that only occurs a long time after the design life of the vehicle has been exceeded.
Also do you really want me to comment on Safety? For instance where exactly was the leak and what did the fuel spray onto? Then did this vehicle have Low Pressure Fuel Delivery or High Pressure Fuel Delivery? That basically means Carburettor or Fuel Injection. Carburettors use about 4 PSI Fuel Pressure and even the basic Fuel Injection used somewhere around 80 PSI. Provided that any fuel spray doesn't go over the Exhaust or in the vicinity of sparks it's generally speaking safe. Though if the Fuel Line was subject to failure I'm sure that the maker released a fix, but to get it you need to have the car serviced by its Maker's Agents who include that type of thing when they do services, so generally speaking the owner doesn't know that a change was made and charged to the Maker or the supplier of the failed part.
Several Years ago I worked for FORD and we had a new model released with 1,200 Mods introduced before the car even hit the dealers. Not one of those repairs was paid for by Ford; they were all paid for by the component suppliers who provided substandard components.
Does that give you any idea of just how badly designed/made modern cars actually are? OH and that car wasn't a locally Produced Vehicle; it was a Fully Imported Vehicle from one of the better Car Makers who had a Good Build Quality, unlike the locally produced cars. ;)
Col

Likewise, software manufacturers do not deliberately use faulty designs. Salt on the roads is an everyday occurrence. Your point is like saying that a car does not need to be designed to avoid everyday problems that are not created by the manufacturer, such as protection from everyday weather and its consequences. (Not tornadoes.) By the way, I once owned an Austin Allegro where the fuel line passes over the engine. Shouldn't I have received a patch to run it around the engine? One day the pipe chafed on the air filter and started leaking on top of the engine while I was driving.

When the software is flawed every software manufacturer releases a patch. Also, you can still sue the software manufacturer but, like suing the car manufacturer, don't expect to win. It took years and lots of resources to prove that the Ford Pinto was flawed. Like I already said, even in software the manufacturer can be sued. Do you know that there is a known security flaw in my grandmother's pace-maker? Should she sue the hospital that fitted it? Who should I sue if the airport security damages her pace-maker? Perhaps it's the pace-maker's manufacturer that we should sue because they didn't make it safe under interference of known everyday sources of radiation? Oh! I forgot. You aren't supposed to go through the airport if you have a pace-maker. Her civil liberties have now been violated. Who should we go to now?

:0
Even then they may have run them in a tunnel that had the same thickness of metal between the Passenger Compartment and the lines themselves. But then I only worked on the 959 so what would I know? :^0
I do know that [b]The Old Man[/b], as we knew him, still came in at least once a week and had to have his hand in on everything we did. He wanted detailed explanations of why we did things the way that we did.
However, saying that, Brake Fluid absorbs moisture and is very hard to get burning. It's possible but it takes a lot of effort, and it tends to only burn at its ignition point, not up the line like fuel does.
The only fluid we had in the car was water in the Water Cooling System that provided Cool Water for the racing suits. It was a dedicated closed system divorced from everything else and it was not possible to run any heat into the car.
Col

Under all the Vehicle Design Rules I have ever seen, you must not have any Fluid inside the cockpit that is a fire hazard.
They must be run outside of the passenger area where if they do fail and start a fire they will not harm the occupants of the vehicle.
Nowadays with Vehicle Design you are basically given a piece of Graph Paper with dots, and to come up with the final design you join the dots. There is a very good reason why so many new cars look the same even though they are made by different companies. ;)
Though to be perfectly honest, other than 3.5 days working for GM I have not been involved in Production Cars but race cars, and they most definitely never allow any dangerous fluids inside a sealed compartment where people sit. It's much better to have fuel lines under the vehicle where they can get torn off by something than lose a driver. It's very expensive for the teams to lose people, whereas losing a car is time consuming and expensive but you'll likely be back in a week's time racing again; lose a driver and that's not likely to happen. ;)
Col

You can't sue anyone for something that they didn't do. For instance, salting the roads to prevent ice build-up is the biggest cause of rust in motor vehicles in the Northern Hemisphere.
It is just as easily argued by the lawyers for the car maker that the owner knows of this and has failed to take the necessary steps, like having their car rust-proofed (which is an option), so they accept the damage.
Of course, if, like Porsche, they hot-dip galvanised their chassis after the bare structure was welded together, the costs would be greater and fewer people would be able to afford that model.
Though if they build in a design fault, like the go pedal sticking and causing the car to go as fast as it can, or a fire starting because a bit of plastic has collected flammable fluids and allowed them to ignite, then the car maker does get sued.
What we are talking about here is deliberately continuing to use a known fault after it has been discovered, which is different from deliberately designing in problems and not taking any steps to prevent them being there to begin with.
Sorry, but after working in the auto industry as a designer I can honestly say that they do not deliberately use faulty designs. That doesn't mean that things are not faulty, just that they don't know they are when they are specified. There is a very big difference between being sloppy and not caring, and not knowing of an issue.
Col

recalled when a new one is found, but they don't happen often. The most famous case in the USA of a car company being sued for a KNOWN fault involves the Ford Pinto and its rear-end fuel tank puncture fault, in the case Grimshaw v. Ford Motor Company. See the wiki article:
http://en.wikipedia.org/wiki/Grimshaw_v._Ford_Motor_Co.
If something fails to perform to the intended specification due to a known fault, the company or person who engineered it can be held responsible for the problem - except in software, it seems.
There is a big difference between bad design, like using cheap metal, and a known fault.

Now I've been almost as anti as you as far as this foolishness goes, but you just put some serious space between our positions.
Fair Trade
That's the sort of trade where "we" get to trade how we want, and any booger else gets to trade how we want as well.
The US has never believed in fair trade; it believes it should be able to impose tariffs to benefit itself, and doesn't believe it's fair for any other trading power to do so.
When Britain was Great, and fair trade was a cornerstone policy of successive Liberal governments, nobody else, including the US, wanted it - mainly because we could have used our economic might to make ourselves unassailable, were it not for the various import and export tariffs imposed by other nations to stop us doing so.
Fair trade, is one of the great lies.

The government will always find a way. They've already infiltrated most of the ISPs for the purpose of finding individuals breaking the law. It shouldn't be hard to apply that at a micromanagement level to filter for bits of data. Not that I'm saying any of this will realistically happen, as I'd like to believe governments have enough sense not to pass laws such as these.

The European Commission is meddling in this idea. If something like this were to pass, you'd see tariffs and sanctions placed against other countries. It would be the end of fair trade when it came to the code world.
Likely what would happen is that most of the coding would go underground, and people would exchange ideas and code through "technically illegal" channels.

A EULA defines how the end-user may make use of the software when purchased, and an SLA defines the quality and scope of the services, as in SaaS. There would be some clause(s) indemnifying the developer against consequential damages or losses caused by the programming (say, a client database or patent compromise), as in M.A. Mortenson Co. v. Timberline Software Corp., right? Well, what if in either of those the developers didn't recognise there were flaws in their software and distributed it anyway? What if at some point in the lifetime of the software - excluding patches, fixes, updates, etc. - some fault shows up? You know what I mean? It doesn't matter: the damage would be done, and at that point you have some serious ramifications.
So anyway, that's what I meant by negligence per se.

In other industries it is not like that. In all cases you have to PROVE negligence; that is the basis for all court cases. Like I said, this is a non-issue. It will be just as hard to prove negligence in the case of a deliberate security hole as for a vehicle design flaw.

First you raised the point about the EULA, which includes wording designed to release the software company from any and all liability in any way associated with the use of the software, while also restricting how the person may use it - well, that's the way it was the last time I read a EULA from Microsoft, Adobe, or the like in full, a few years ago. I don't see many EULAs now, as most of what I use is Open Source and covered in a different way.
Regardless of the EULA and its legal validity or otherwise, the current legal system in the US (which is where most of this stuff happens at the moment) makes it clear that the only way you'll be allowed to sue a software developer is if you can prove 'beyond a shadow of a doubt' that they were deliberately negligent in the making of the software. About the only way you can do that is if you can find legal evidence showing they intentionally loaded a trojan to get stuff from your system without permission, and the likelihood of doing that is so low it's not worth worrying about.
Yet, in every other field of engineering the responsibility is the other way around. Unless the designers can prove they took all reasonable precautions to make their gear safe, you can sue them for any damages or any sort of harm done.
The whole basis of the thread is that some people, like myself, say the software engineers/developers should be held responsible for any damage done because they did not make sure the software excluded any flaws or faults that were KNOWN about at the time of development and release - they cannot be held responsible for what becomes known afterwards. I also believe they should be held responsible for any damage done when they take an excessive amount of time to fix any new problems that are found. Some people take the position that a software engineer/developer should never be held responsible for any damage done when they put out flawed or faulty software; as part of this line, some are saying it's impossible to ensure the software is free from known flaws at the time of release - despite some people being able to do it.
The real crux of the matter is the pair of linked questions: does not ensuring software is free from known flaws at the time of release constitute negligence? And can it be viewed as deliberate or intentional negligence if they made no serious effort to ensure the software is free from known flaws?
I regard this as the same as a professional driver making a safety check of a transport vehicle just prior to use - failure to do a proper check is seen as professional negligence.
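That kind of pre-release "walkaround" for software can be as simple as scanning the code for calls that are already known to be dangerous. The sketch below is a toy illustration of that idea; the banned-function table is a small invented subset of the sort of list some development shops maintain, not any official catalogue.

```python
import re

# Illustrative subset of a "banned API" list: C functions with
# long-documented misuse hazards. The reasons given are informal.
BANNED = {
    "gets": "unbounded read into a fixed buffer",
    "strcpy": "no length check on the copy",
    "sprintf": "no bound on the formatted output",
}

def check_source(source_text):
    """Return (line_number, function, reason) for each banned call found."""
    findings = []
    pattern = re.compile(r"\b(" + "|".join(BANNED) + r")\s*\(")
    for lineno, line in enumerate(source_text.splitlines(), start=1):
        for match in pattern.finditer(line):
            name = match.group(1)
            findings.append((lineno, name, BANNED[name]))
    return findings

sample = 'char buf[8];\ngets(buf);\nstrncpy(buf, src, sizeof buf);\n'
for lineno, name, reason in check_source(sample):
    print(f"line {lineno}: {name}() - {reason}")
```

Note that `strncpy` on line 3 of the sample is not flagged: the word-boundary match only fires on the exact banned names. Real tools (linters, static analysers) do far more, but even this level of check catches the "known at the time of release" class of flaw the thread is arguing about.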
..........
Just checked and found this in a copy of the Windows 7 EULA - it clearly tries to limit or stop any damages claim in any way, and you can bet the value they place on an OEM installation copy is damn low.
26. LIMITATION ON AND EXCLUSION OF DAMAGES. You can recover from Microsoft and its
suppliers only direct damages up to the amount you paid for the software. You cannot
recover any other damages, including consequential, lost profits, special, indirect or
incidental damages.
This limitation applies to
· anything related to the software, services, content (including code) on third party Internet sites,
or third party programs; and
· claims for breach of contract, breach of warranty, guarantee or condition, strict liability,
negligence, or other tort to the extent permitted by applicable law.
It also applies even if
· repair, replacement or a refund for the software does not fully compensate you for any losses; or
· Microsoft knew or should have known about the possibility of the damages.
Some states do not allow the exclusion or limitation of incidental or consequential damages, so the
above limitation or exclusion may not apply to you. They also may not apply to you because your
country may not allow the exclusion or limitation of incidental, consequential or other damages.

had to abide by it whilst doing training on software development, and later on quality control processes and the checking of software, back in the late 1990s - before we could write the first line of code we had about a dozen different sets of documents to complete, setting out properly what was wanted and so on. However, it seems a lot of people don't do that now for some reason; they just jump in and write code - or so it seems from the results.
If they create a feature for a browser that allows the system to be hijacked, then it's not a feature but a vulnerability. What is interesting is that other companies making browsers can get theirs to do the same thing without that vulnerability. I also find it interesting that some companies can change their OS to create vulnerabilities in a third-party browser just because the third party used the interaction code the OS company told them they had to use. That sort of behaviour is wrong and criminal, in my mind, yet there are no laws or regulations to stop them doing it at the moment.

And trust me, you don't want to go there.
What Ernest is suggesting is legislating something he has no intimate knowledge of, can't explain, and can't produce rectifications for (it's not within the scope of a law to figure out what it's governing, is it?).
Basically, his statement about "mush" is equivalent to the legislation he's imagining in his head.

and it was not possible to even approach the complexity that we see today, precisely because there are now tools that do the assembly coding for you. These days all code has to access loadable library components, which are in themselves a separate subsystem. Most of the assembly code that is written now is device drivers and optimised parts of code. The new tools treat other subsystems as black boxes, but it is only when your system and the other system interact that a security flaw is opened. In 1992, when I did my Telematics MSc after working in the space industry, my thesis was on this subject. It concentrated on how the specified interactions between those black boxes could produce security flaws or deadlocks in telephony subsystems (such as using the call-back option to allow callers to make illegal calls without them appearing on any telephone bill). Now we see much more complicated systems on a single PC, at higher levels of abstraction, causing much more serious problems. The study of these interactions is a complete science.

true. Although it seems a different sort of complication. 25 years ago we assembler programmers had a deep knowledge of a few specific things, and there were many experts. Now you need to gather together pieces of many different technologies to cobble together a product. You don't become an expert on most of them; you just learn to use them until tomorrow, when the next hot thing comes along. A lot of the new programming tools are truthfully no harder to use than a spreadsheet, and lots of "programmers" using them have no knowledge of what is happening at the machine level. You are so right about "features."

I've been coding for 25 years, and the complexity of software has increased exponentially over that time. If I had not continued to code, I would not understand how complicated the task is these days. There are a number of documents OUTLINING best practices, but there is no definitive list of known vulnerabilities. Besides, a plugin can be considered a vulnerability, though in most cases it is considered a feature. 15 years ago code was worse, and the task of protecting code then was grossly underestimated. It's because coders 15 years ago created problems that we have so many problems today.

before, a vulnerability due to the OS, such as .NET, is the responsibility of the OS developer; thus a vulnerability in .NET would be the fault of Microsoft, not anyone else.
I do NOT ever expect the programmers to be able to devise code that is safe against all future vulnerabilities. What I do expect is for them to write code that does NOT include any existing KNOWN vulnerabilities, which they can do, many still do, and which used to be standard industry practice - but it has vanished in the last 15 or so years. Since the industry will no longer self-regulate, it's time for external regulation - just the same as in many other industries.
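The distinction between "all future vulnerabilities" and "existing KNOWN vulnerabilities" is easiest to see with SQL injection, a flaw that has been publicly documented for well over a decade and is entirely avoidable with parameterised queries. A minimal sketch in Python (table and data invented for illustration):

```python
import sqlite3

# Toy database standing in for any application's storage.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, balance REAL)")
conn.execute("INSERT INTO users VALUES ('alice', 100.0)")

def get_balance_unsafe(name):
    # KNOWN, avoidable flaw: concatenating user input into the SQL
    # lets crafted input rewrite the query (SQL injection).
    return conn.execute(
        "SELECT balance FROM users WHERE name = '" + name + "'").fetchall()

def get_balance_safe(name):
    # The long-established fix: a parameterised query, where the
    # driver treats the input strictly as data, never as SQL.
    return conn.execute(
        "SELECT balance FROM users WHERE name = ?", (name,)).fetchall()

attack = "' OR '1'='1"
print(get_balance_unsafe(attack))  # leaks every row in the table
print(get_balance_safe(attack))    # returns nothing
```

The safe version costs nothing extra to write, which is the point being made above: failing to use it today is not ignorance of a future flaw, it is skipping a contemporary standard of the trade.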
I