Posted
by
by timothy on Wednesday March 13, 2002 @03:43PM
from the sue-the-bastards dept.

satch89450 writes: "SecurityFocus had a column that I missed when it was first published a few days ago, titled 'Responsible Disclosure' Draft Could Have Legal Muscle, but I discovered it when researching an answer to a comment on the CYBERIA mailing list. In this article, Mark Rasch discusses how the Draft would set the rules for reporting security vulnerabilities, and in particular define the boundaries of liability assumed by bug-disclosers. By adopting a "Best Practices" RFC, the IETF could help the reporters of security-related bugs do their job, and put the onus of fixing the bugs on the vendors who make the mistakes, where it belongs. (The RFC draft described in the article, 'Responsible Vulnerability Disclosure Process,' is here at the ISI repository.) This is, of course, in direct opposition to the process that Microsoft's Scott Culp, Manager of the Microsoft Security Response Center, would like to see. As Microsoft is more part of the problem than part of the solution, I believe that the path to a formal process would better serve the entire community - and that community includes Microsoft's customers. I'm taking this seriously because the mainstream press is talking about the issue, and what it's going to take to fix it. Here is an example from BusinessWeek that scares me silly. I'm glad I'm looking to change careers from software development to something safe, like law."

The problem is not with programmers but with how they are managed and how the results of their labours are marketed.

Linux, and the software included with it, is not generally provided with massive claims about its improved functionality. It is no secret that there are bugs in Linux, but you can find out what they are. Linux has very successfully done its own marketing without making claims - the users do it for them. Only recently have IBM, Oracle, and others joined the Linux bandwagon, and they make no claims about Linux and its stability. Torvalds himself sits by and says nothing...

Let MS post its change logs for Windows. This is not demanding open source from MS - does anyone really want to see it? It would be like trying to keep the memory of a loved one alive while gazing at his or her rotting corpse.

Software is copyrightable. Books are copyrightable.

Software is manipulating a language to control a microprocessor. The purpose of a book is to manipulate a mind.

Which is more important?

If I write a self-help book which contains principles to supposedly improve my life (let's say a get-rich-quick book), and I sincerely follow those principles and can document it, and these principles don't work - can I sue for the immeasurable pain caused by dashed hopes, and the immense amount of time wasted putting those principles to the test???

Probably not, the legal experts will say, because the principles are not warranted.

Read the EULAs of practically every shrink-wrapped software package.

MS doesn't warrant its software for use in nuclear power plants and other places where things can get critical.

No one warrants Linux, but the FAA is rumored to be testing Linux for later deployment in flight control centers. Does this say anything?

Programmers are idealists. Programmers do not generally like criticism of their efforts, but they do appreciate the capabilities of more experienced programmers who do not brag about their position in a company or exert authority based solely on having had a few beers with the manager and discovering they were passionate about the same football team.

A good senior programmer or team leader does not innately want to criticize the work of another. The goal of the leader is to promote learning - evaluation is involved. The best way to do this is to provoke the implementor to ask questions of himself, and perhaps to find a better way on his own.

In a code farm like the one MS maintains, with deadlines imposed by managers who have shown in their resumes that they can drive cattle, and marketers whose job it is to use Pavlovian techniques to make the masses want more RIGHT NOW, how can good programming techniques be taught and good programs be written without knowing what goes on in a programmer's mind? But probably the concept of 'mind' is non-existent among managers and marketers...

I agree. I would never consider contributing to the OSS movement if I knew I could be held liable - and there is no reason I wouldn't be, just because I did it for free rather than for pay. Linux will not be held to be above this process.

If you purchase software, and the purchase amount benefits the commercial entity that develops the software, you are entitled to legal recourse in the event of failure due to the software. A guarantee of serviceability, if you will.

On the other hand, if you wish to be absolved of legal liability for software you create, then offer it for free, like most GPL software is.

I think this would be great for some of the excuses for shareware out there. If you charge a shareware fee, it had better work. I've often found freeware that works better than the shareware alternatives.

This would create a huge barrier to entry for the entire software industry. Joe Blow could no longer write software 'just because the world needed it'. If you aren't hiding behind a corporate shield, you simply couldn't write software.

IMHO, even as buggy as Microsoft's software is, they are the best suited to defend themselves. In an industry with liability, they might stand the best chance of surviving.

In theory, this should help the little guy and open source, because they could be more responsible to their customers.

But in fact, it will have the opposite effect. It means that software will have to be "certified" before it could be released.

Little developers (guys in their basement) could never afford this. Big guys (Microsoft) could. Again, this favors big, established companies over upstarts.

But more seriously, let's look at the worst issue with having liability for insecure software:

If I have a Firestone tire (as mentioned in one of the links), I expect that it will be safe to put on my car and drive up to the speed rating on the side. But if I used the tire as a swing in my backyard and I fell off and broke my arm, should Firestone be liable? After all, a lot of people use tires for swings, and they didn't do anything to make them safer for this purpose.

Silly? Maybe. But now apply it to something like a computer operating system. What is its intended purpose? Basically, its purpose is infinite: it allows a piece of hardware to have practically infinite possibilities. So now I have to make sure my software is safe in every possible circumstance, including ones I can't even foresee!

Mind you, I'm not excusing bad software, but I don't see how this proposal will do anything, because a new license will come out that people will simply have to accept, something like:

"I accept that if I use this software it is completely insecure and will allow bad people to do bad things to me and my computer. I completely waive all rights to bring legal action against the makers of this software, even if they knew there is or was a problem."

This is silly. First off, the Firestone thing caused DEATH. So if a software malfunction/bug caused DEATH because of the malfunction/bug, whoever wrote it should absolutely get sued for writing bad software. Just like malpractice suits. It's DEATH because of poor quality.

Off that tangent: I think this is a great idea. Maybe software will come out slower because people are being more thorough. Maybe software will have higher quality because people spend the time rather than rush it. Maybe it will create a whole new insurance industry for programmers' insurance.

Do you want missile guidance systems to have software bugs in them? Do you want your financial institution to "lose your accounts" because of bugs in the software? This is serious stuff folks. It's time to get serious about it.

I personally don't think it'll hurt the little guys at all unless they're creating bad software. In which case, maybe it should hurt them.

It will definitely hurt the companies that can't afford to hire a full time lawyer. The exact effects would, of course, depend on the details of the law. I suspect that one of the reasons for the degree of apprehension about this is that we have recently seen so many laws that were only to the benefit of whoever was the highest bidder.

(Well, that's not strictly true. MS has benefited from laws designed to aid Disney. But if you consider categories of bidders rather than individual bidders, then it appears to be true.)

Recall that an American destroyer was rendered dead in the water as a result of NT crashes, and space shuttle missions were rendered write-offs because of NT crashes. Not to pick on NT, but these are cases where lives did depend upon software. Death is just one example of liability.

It would be crazy to say that "Open Source" has no liability while "Closed Source" does...

It's perfectly sane to hold Open Source software less liable than proprietary software.

Open Source software is more likely to be free (in price) than proprietary software. If you get software for free (open or proprietary), a lack of liability makes sense. Someone (or some company) gave you something for nothing; it seems a bit unfair to sue them when the free thing didn't meet your expectations.

Also, Open Source software is, well, open source. The software is guaranteed to behave as described in the source code (given a properly functioning compiler and computer). You're free to audit the software for fitness for your use, and free to adjust it (or pay someone else to adjust it) to make it fit. With proprietary software, you're at the mercy of the supplier. If it doesn't work, well, tough luck.

Incredibly, the latest proposed UCITA modifications (to make it acceptable to more states) are the exact opposite of this.

Commercial software is exempt from all liability. Even if the vendor acted in bad faith and consciously lied to you about the presence of critical bugs, you have no recourse.

Open source software is held to the highest legal standards.

The legislation doesn't state it this nakedly, but it moves commercial software out of the "product" category and into a new category, so none of the consumer protection or product liability laws apply. Especially if you never release the "final" version of your software.

In contrast, other definitions apply to all software. But since there's no exchange of "items of value" with OSS, there's no contract, and OSS gets hit with the full power of the law.

This is totally indefensible for the reasons mentioned elsewhere. Microsoft has the ability to test its software better, and denies me the ability to protect myself, yet it gets a free pass. Meanwhile the guy who spent his weekends trying out an idea, and who posted it with warnings that the code is not yet well-tested, could lose his house.

For every active open-source project, there is a maintainer. It is the job of this maintainer to ensure that released software is bug-free.

I think that, if we're going to have penalties for insecure open-source software, we should:

1) Hold the maintainer liable.
2) Only have penalties for release-level software: no alphas, betas, or CVS nightly builds.

I also believe that a vendor or maintainer should be given a reasonable amount of time to fix a bug. There shouldn't be a penalty for a security hole that exhibits itself at one second after midnight on a full moon if the year is divisible by 7 when an attacker uses the root password as a user name. However, if this combination is discovered, and isn't fixed, then hold the maintainer/vendor liable.

I'm the maintainer of a piece of open source software, and no way in hell will I ever say any of my releases are bug-free.

I'm giving my code away, you don't have to pay a penny for it. How can I possibly be held liable for it breaking? How would I ever get anything out of beta if I had the constant threat of being sued if my "release" code contained a bug?

Do you fix security bugs immediately? If you re-read my comment, you'll notice that I mentioned penalties for insecure software. Period. If the software you write allows a hacker to crash the operating system or get access to personal data, I would say that you are liable if you don't fix it immediately.

Although I am only human, I would like to believe that I would feel the same way if I were the one who was liable.

There should be multiple levels of liability for faulty source code, just as there are multiple levels of liability in other areas of the law.

Open source should not automatically be excused of all liability. If a bug exists and a sizeable amount of time passes with no fix, while new users are downloading and using the product *without being warned*, then the maintainers of the source should be held liable. Open source vendors should be required to post an updated list of bugs as they appear, and to fix them before releasing the next version of the software.

Commercial software vendors should be given a certain amount of time to remedy the problem based on the severity and spread of the problem, and for each day/week/month incur fines until the issue is resolved. Registered users of the software should be notified both when the bug is discovered and when the fix is released. All users should be able to access the information via the internet. A new version of the software cannot be released until known bugs in the last version are fully patched.

The liability of vendors should be clearly outlined and have the same tiers and exceptions that current liability laws have. It should be clear that vendors are not responsible for misuse of intended features of their systems (e.g., Linux developers are not responsible for warning people that rm -rf / will trash their system), and vendors' liability should be determined on a set of criteria: a) the software version number - it should mean something again; b) the intended impact of the software - vendors of backup software will be held to a higher standard if their software fails than would the creators of games or graphics software.

Vendors should not be allowed to attempt to silence those who make bugs public knowledge. There should be fines for companies that try to initiate lawsuits against third parties for publishing bug reports, examples of exploits, or other information. Perhaps there should be a certain set of guidelines as to the "release schedule" of those bug reports, however: exploits can only be made publicly available after a patch is available, bug reports can be made as soon as the bug is discovered, etc.

I think software liability is a good idea as long as it's not a loosely interpreted law that is applied equally to all vendors regardless of software genre and company size.

So if a company has to hire an army of QA software analysts to review the code, where do the savings come in from this supposedly free software? Just because there aren't licensing fees involved doesn't mean it's cheaper.

Liability for a bug that's been known for more than X amount of time and is not publicly disclosed is one thing; the company here is obviously being negligent in neither fixing the problem nor alerting its customers.

However, I shudder at the thought of a day when a company can be sued simply for a problem in its software. There are enough ways to sink a small company as it is, without the threat of "If your software isn't perfect, you're gonna pay".

We've all seen errors tied not only to products, but just as often to the people installing, maintaining, and using those products ("What do you mean I shouldn't use my anniversary date as a password?"). These would be not just vendors, but also middlemen and in-house people. Any standards for legal culpability would have to take into account the role of every person who touched a faulty product, not just those who built it and shipped it. The communications and expectations between these folks must also be addressed, especially in the form of documentation (e.g., the User Manual dang well better spell out how to test a product for a correct installation and give corrective action, not simply assume it went OK).

On principle, I welcome the concept, but the implementation for this is likely going to be messy indeed.

1) Higher development costs
2) Far fewer small companies in consulting
3) Shrinking job market for new grad coders
4) Larger legal costs on both sides of the fence

On the brighter side, it would also include:

1) Lessening of age discrimination - experience outweighs youth
2) Alteration of programming education to focus on security
3) Higher standard of programming excellence
4) Self-policing. Companies who fail to adhere will run themselves right out of business in short order.

Finally, legal liability for Open Source projects is not a bad idea at all.

You should not be held liable for a product whose distribution and use are voluntary. What you should be worrying about is how companies would probably use free-as-in-beer software less, because they would be unable to hold the creator of that software liable for damages incurred from its use.

I am talking about OSS, and yes, I know the difference, you pompous jackass.

If I write some code and it ends up in someone else's product (via the GPL, or BSD, or because I sold it, or gave it away, or anything), then even the idea that I might be liable will be chilling, to say the least.

Example: I write an anti-spam filter and post it into the public domain (Open Sourced). Microsoft uses it in their next whiz-bang mail server. My filter accidentally lets certain viruses through. Nimda III strikes. I am on the hook for $40B. Great.

I write an anti-spam filter and post it into the public domain (Open Sourced). Microsoft uses it in their next whiz-bang mail server.

Who sold it, you or Microsoft? The one selling it bears the liability. Same as when a component of a physical good is defective: the end user sues the seller, and maybe the original component manufacturer. The seller may also sue the manufacturer to recover their own legal costs.

But end users always sue the guy with the deepest pockets. In your example, I don't think many people would waste their time suing you.

Well, you don't have $40B. However, even if you did, your 'spam filter' is not held to 100% perfection. People can't sue condom manufacturers when they get pregnant, because the condom industry isn't so stupid as to say, "Hey, these things are 100% effective." Presumably, as a sane individual, you wouldn't be selling a '100% effective spam filter', but a 'spam filter that can cut spamage, possibly up to 100%'.

Companies are liable when products do not live up to claims, or when they release products with known defects that cause damage. The OS community, if anything, would be less susceptible to releasing shoddy code under liability laws, because there is no 'rush' to release unready code, nor any sales- or market-driven motivation to make claims about the product that don't stand up in the real world.

So, in closing, you would be on the hook if you distributed your software under the guise of unrealistic claims. Liability would not result in shitloads of lawsuits; it would result in companies having to think twice about their 'claims' about their products. OS developers would benefit, since they don't have sales teams and revenue expectations forcing them into situations where they must lie about the functionality, safety, or power of their solutions. Companies would finally be able to rein in their fucked-up management and sales guys, because suddenly *they* would be the ones responsible for lawsuits, not 'buggy software'. Developers in companies would finally get to say, "Fine. I'll release it. Just understand that you might incur $40B in damage on our company."

The problem isn't that software isn't perfect - EVERYONE knows it never will be. The problem is finding an honest-to-god reason to tell the sales team to go shove it up their ass and stop breathing down the software engineers' necks. Because, FINALLY, EVERYONE IN A COMPANY would be responsible for shipping software that didn't live up to claims, not just the developer for not 'inventing' an extra 200 hours a week to make the software live up to the brochure that was printed 4 weeks ago by a bunch of suits who didn't know dick all about software.

To reiterate: as long as your product does not undermine your claims (absolutely no problem in the OS world, as you aren't trying to 'sell', so you can be realistic), you're safe. This will just rein in companies selling one thing but distributing a whole other thing (read: Windows, Oracle...).

Yes, it would be the end of Open Source. Who in their right mind would code for a project part time if it meant they were legally liable for anything that might go wrong with it?

There is no software you can write which I cannot make faulty with the right (wrong) compiler. And it doesn't matter how good a programmer you are, or how simple the program you wrote.

It makes no sense to hold the author of the software liable for faults, because the faults could be introduced by the compiler, or by the later stages of deployment and configuration. So there should be blanket immunity for anyone who vends software in source form, under the theory that anyone who has access to the source must exercise due diligence to ensure that the software is appropriate for the situation in which it is deployed.

On the other hand, vendors who deliver software as a pre-compiled binary must assume some liability, as the consumer is no longer in a position to exercise due diligence.

This would be a win for free software developers: as long as they deliver code only as source, there is no liability.

This would be a win for companies like RedHat, who would be able to offer pre-compiled free software, and assume some of the liability for making sure it was compiled correctly.

This would be a win for anyone who uses software, because vendors would ensure their products have fewer faults, under threat of liability.

This would be the death blow for Microsoft, because (as a company which vends primarily pre-compiled binaries) they would be fully liable for the software they ship, and would be fully responsible for detecting and correcting their own faults.

No offense, but in almost every case, there are legitimate reasons for choosing youth over 'experience'.

In most companies, technical experience gained 10, 15, or 20 years ago will be inversely useful today. Even business has changed in the past two decades. Things learned long ago obsolesce.

Experience is very, very useful and desirable, but sometimes companies forget that experience doesn't always equate with ability. It is a better barometer for how mature the worker is, and how well he understands standard policies and the unwritten rules of the workplace.

Given that, most companies will hire a 25-year-old with 4 years of experience over a 35-year-old with 14 years of experience for a common coder job if they have similar talents. Why? Because the 35-year-old probably has a wife and kids and is asking $120k. The 25-year-old is probably not as needy, and given the 'experience' factor, is probably only asking $80k.

If I'm using a tool, component, or class library from a third party, what happens if the vulnerability is in their code? As a contractor, would I have to spend $10,000 in legal fees just to prove it's Borland or MS or Sun's fault? Besides, how can you guarantee 100% that anything is safe? In the lawsuit-happy society we have today, the smallest mistake could put even a medium-sized company right out of business. And if you think this will help open source, it won't. Would you use "free" software that carries no liability while commercial software does? Would you get a "free" operation from a doctor with no liability, or pay for one from a doctor who has it?

Any liability law should offer an exemption for software that is distributed along with buildable, commented source code.

The reason is simple. The end-users of open source software are in a position to verify the integrity and correctness of the software. Even if such an end-user is not a programmer, they could, if they were concerned, pay someone else to inspect the code. They have been provided with the ability to protect themselves, because the source code accurately describes the actual operation of the product.

The end-users of proprietary software are in no such position. They are absolutely dependent on the software vendor to verify the integrity and correctness of the software. They are powerless to protect themselves, and without the source code, they are left with only a representation of the operation of the product. This is far less information than the source code, which specifies the actual operation of the software.

Therefore, only proprietary software vendors should be held liable for bugs in their software.

Not reasonable. For a project of any complexity, verifying the integrity and correctness of the code is a financially gigantic undertaking. If you disagree, I have a favor to ask.

I'm kind of concerned about using this Apache product. Would you mind trundling off and verifying the integrity and correctness of all the source code please? Oh yeah - and if it includes standard libraries I need those verified as well.

Can you get that done before the weekend? I was hoping to install on Saturday.

The end-users of proprietary software are in no such position. They are absolutely dependent on the software vendor to verify the integrity and correctness of the software. They are powerless to protect themselves, and without the source code, they are left with only a representation of the operation of the product. This is far less information than the source code, which specifies the actual operation of the software.
Therefore, only proprietary software vendors should be held liable for bugs in their software.

Interesting, but this is all unnecessary.

See, it's all about what we need. For example, if I need a redundant system that never crashes, runs 24/7, and will power my mission-critical applications, I can certainly make that happen. I need the proper mix of software vendors, hardware vendors, support contracts, and local personnel.

The bottom line is that I can, of course, procure that level of quality. It will, however, cost me lots and lots of money.

On the exact opposite side of the fence, if my little brother Jimmy wants to buy a video game with the proceeds from his part-time job, he can certainly purchase that video game. He goes into the store and looks over the selection - there are games ranging in cost from $20 to $100 - and he can of course choose the game which best suits him.

The point of it is, software security/reliability is a feature: look at Oracle, for example, and how they advertise their software ("Unbreakable").

The market has created clear categories of software that range from the rather unreliable (Windows, piddly silly games, etc.) to the extremely reliable (commercial Unices, VxWorks, QNX, etc.). Interjecting liability laws into this arena will only throw that balance off and eliminate the lower-cost alternatives (including maybe boxed Linux distros!).

The bottom line is that liability, for any vendor, will artificially move the well-defined lines around and alter the software industry.

Which means no one would use open source software. If you've got two competing products, one open and one closed, the possibility of a bug in the application could lead to millions of dollars in damages, and you know this from the beginning. With a liability law in place, even a lopsided one like the one you suggest, a company is going to go with the closed solution. Why? Because if you know a bug will cost millions of dollars, you're going to go with the product that gives you a chance of recouping damages. You'll pick the software with a vendor you can sue. Bah, source code shmorse code; the cost of fixing potential problems in code is damn high if it is done well. You've also got the fact that you can never squash all bugs in software. Yet again the user of Free software gets shafted: not only can they not sue the vendor for a million-dollar bug, but they also have to spend their own money to try to fix it.

Therefore, only proprietary software vendors should be held liable for bugs in their software.

If this happens, we've codified into law the current myth that already plagues open source/free software: that if something goes wrong with free/open source software, there's no one to sue, and thus there's a business liability in choosing free/open source software.

Right now this is a myth, and is completely untrue, because if something goes wrong with ANY software, there's no one to sue. I've been opposed to software liability in my /. sig for some time now. Here's a journal entry on software liability [slashdot.org] that I wrote. It has three comments. Unfortunately the comment period expired some time ago. But I think we still need to talk about how to do software liability without putting open source/free software at a disadvantage, either directly or indirectly.

Let's say that a company is giving away free sound cards. However, the sound card, when used for more than 20 hours straight without rebooting, will melt. Now let's say I had some MOBOs damaged by this... can I sue?

Sure, you can say that I bought the service contract and not the software, but I suspect you're going to have a really, really hard time convincing a judge of that when there's a box sitting on the store shelf that has "Red Hat Linux" printed on it. Specifically, that convincing is going to cost $ in lawyer's fees which RH can ill afford.

Seriously, this is a *terrible* idea. MS has lawyers out the wazoo and the cash to pay them to tie up any such suits forever. (See the antitrust case.) RH and other small companies don't, and they are going to get hammered the first time a major problem comes along.

This topic, matched with the previous story about the USAF laying blame on Microsoft, seems to indicate a new trend... blame the software company! Does this mean that small companies will get sued out of existence? Yet another advantage for big business?

What about considering alternatives and choosing wisely? If it's such a crucial point, why not take open source and audit it thoroughly, like the NSA?

I think a lot of people will probably see this as a good thing, forcing M$ to take responsibility... but I think it has the potential to put the competitors at a huge disadvantage as well.

This topic, matched with the previous story about the USAF laying blame on Microsoft, seems to indicate a new trend... blame the software company! Does this mean that small companies will get sued out of existence? Yet another advantage for big business?

Unfortunately, Microsoft is the root of the problem. While plenty of other software from Sun, IBM, and even Free Software has had bugs, no one has been remotely as irresponsible and, yes, negligent as Microsoft, particularly when it comes to security.

So while the trend of blaming software companies is unfortunate, it is a trend that the practices of the most well-known, and notoriously shoddy, software company, Microsoft, have virtually invited.

Alas, the "cure" being proposed by these fools could very well kill the only real cure to this problem - Open Source and Free Software subject to public audit - in favor of legal hoops through which only the big players can jump and which, in the end, won't really result in better software anyway.

In short: businesses and people are beginning to notice an ugly symptom of a monopoly-controlled market (high prices and shoddy quality). Rather than addressing the root of the problem (Microsoft, and in particular their desktop monopoly), they grope for a solution that ignores the underlying issue, will stifle the entire industry to the point that expertise will almost certainly go overseas, never to return in any significant quantity, and will exacerbate the entire problem by killing the one solution that has, quite naturally and organically, already developed but to which they appear to be willfully blind: Free Software and Open Source.

I'm sure that if such a lawsuit were to happen, there's gonna be a huge arseload of blame-storming going about. I can't count how many times I've said, "It's there, but I've got a few things that need to be sorted out," only to be told, "Screw it! We need it now. Just send a patch later." After that comes having to prove that said person did say that to you (usually with a saved email. You did make sure that person sent you the go-ahead in email, right?). That person goes on to say that his/her higher-up gave the go-ahead or set time restrictions, and so on, and so on...

OTOH, even if it is the programmer's fault, how are they going to prove an intentional oversight, and what sort of testing standards will they have for software before it's unleashed?

Software companies don't spend enough time on design and testing the product before it's made public.

No. Managers at software companies don't spend enough time listening to their own #%*@$*)(#^& engineers who are ignored, ridiculed, shouted down, laid off, downsized, or outright fired when they point out repeatedly that the product is being developed WRONG.

Of course, it's always better to be a team player. Just sign up for the donut list, keep your mouth shut and wear a big smile, BIG SMILE at all the meetings. That's how the job is kept.

Competence, craftsmanship and professionalism are no longer of any value in the workplace, and until they are, it will be impossible to fix these problems.

I'm not a developer, BUT, I can understand the amount of work that goes into developing a large product. I would hate to think that you would be liable for some unforeseen circumstance or combination of things that could cause an exploit that you would get sued for. We'd quickly see development slow to a crawl.

Also, what happens now when a company compares a commercial product to open source? Let's see, we pay Microsoft $800 for Win2K Server...but if it's broken we can sue them. We use Linux.... That means we can't sue them since we didn't pay for it and have no contract, but our customers can sue us. They'd go with the $800 of "insurance".

Microsoft employees pointed the finger at users who didn't safeguard their systems... But they left that task to users and, more often than not, it was ignored. "People didn't spend the two clicks to do it," says Craig J. Mundie, Microsoft's senior vice-president.

If I give you a car, am I liable for the fact that it has no brakes? What if I sell you a car?

What if I give you a tool? Am I liable that it breaks and breaks whatever you were trying to fix with it, too? What if I sell you one? What if I sell you one and say that it's rated for the work you're trying to do, but it still breaks?

See the differences?

Now for software:

What if I give you a binary? Am I liable that it doesn't work? Am I liable that it has flaws?
What if I sell it to you? Am I liable then?

Now for something completely different: Source Code
What if I give you source code? It's available for your inspection... Can we say that source code documents itself? If you are worried about what the code does, you can read it, compile it, debug it, step-trace it. Source code is NOT a program; it's closer to an algorithm than to a program. Can I be sued for giving you instructions on how to tell your computer to do something?

If source code is just instructions, directions for a computer, then source code starts to look like something different, and precedent must come not from binary software but from things like legal advice.

And you know how that goes... IANAL, so I can say anything, you take my word if you want to. So, if IANAP (not a programmer), can I give you whatever source code I want, and I won't be liable?

IMNSHO, this would be a really good thing. One of the current problems with software (and a lot of other things) is that costs are shifted away from where they belong in order to make a product cheaper.

It is cheaper to write software that works most of the time but has a few bugs than it is to have a proper design, implementation, and testing process that prevents buggy software from being shipped too soon. In general the industry has the feeling that it is cheap and easy to release a patch for a bug later, so the cost of not catching it early is small.

This is the exact opposite of hardware engineering, where companies go to extreme measures to debug the design before committing to silicon, since it is very expensive to fix afterwards.

Increasing the cost of bugs to the software developer will decrease the quantity of code and increase the quality of code, something that is sorely needed.

This could have a wonderful effect on upgrades. No more mixing fixes and feature adds -- too dangerous (aka Service Packs).

Can you imagine Microsoft's position? New license agreements with WinXP require users to upgrade every two years. MS will be held legally liable for the stability of those upgrades. They'd better damn well get it right.

Remember that U.S. Navy ship that switched to NT and was dead in the harbor? Imagine the Navy sending a bill to Bill. :-)

At heart here, and often forgotten, is the issue of "merchantability". What is that? It's the assurance that something is saleable, that reasonable expectations of performance can be made, and that the product does, in fact, perform its intended function.

Because of this, it can be SOLD. If I sell you a keyboard for $20, you now have the expectation of merchantability. It is expected to work, and both reasonable business sense and many local and federal laws require that if it does not, I either provide something that works, or give you your money back, within a reasonable period of time. (14 days in California)

If we re-institute the concept of merchantability in software, all that would happen is that you could get your money back - thus little to no effect on OSS software.

Red Hat may be impacted, but since they are already selling services rather than products (you can download all their stuff for free) even they would be minimally affected.

So, as an advocate of open source and "free" software, I welcome the issues of product liability and the enforcement of merchantability. It would improve the industry, force it to get better, and would finally provide its customers what they've been promised all along - a better, easier life!

What should happen? A date set for a software "merchantability horizon". Any products released/sold after that date would have to fit the definition of merchantability; products sold before that point can continue on their merry way.

Can you imagine how many people would upgrade their Windows if they knew that MS would be liable thereafter if it screwed up?

1. Software vendors exist today that will provide you with necessary levels of support and uptime/reliability. That's a fact.

2. Merchantability is not liability. As far as I can see, this already covers software, correct?

3. Holding software vendors liable for how a client uses the software is a wrong idea. If I use inappropriate software to do a job then I should be accountable, not the vendor.

4. Liability for software would likely hurt and chill free software development. The chance that a person giving away code could be held responsible for flaws/data loss/bugs/security problems would cause many many many people to stop writing code. Additionally, companies who sponsor OSS development (like, for example, the now-defunct Loki) would be hesitant to release open source code if they thought it might be a legal liability in the future. Imagine if SDL caught on and 99% of games were based on it. What happens if a bug is discovered that causes those other games to crash? Loki would be liable for damages to all those other game companies who built from SDL. That'd be a very bad thing.

5. Defining what software actually caused a fault can be nearly impossible. In a complex system, with perhaps software from over 50 vendors, the finger pointing would NEVER end. What will happen is that vendors will force clients to use entirely controllable environments - you'll get single use servers for everything and actually increase proprietary software levels. What vendor would allow/encourage interoperability if it could widen their liability?

6. Finally, I agree at least on merchantability. Software should do what it says: "serves webpages" or "effectively blocks attacks" or "writes backup data to tape".

Merchantability is not liability. As far as I can see, this already covers software, correct?

Most modern EULAs specifically disclaim merchantability for any purpose whatsoever. The poster you're replying to is simply saying that if your software doesn't do what the seller said it would, then they owe you your money back.

You downloaded it for free? Then they don't owe you anything. You paid $50,000 for multiple installations and several hundred user seat licenses? They owe you a refund.

But if I've read the EULAs correctly, it's been multiple decades since any mainstream software product was sold. They've just allowed you to access it for a while... Of course, they say that it is leased, but it doesn't look like any lease that I've encountered in any other context...

Those products were sold, before you got to see the EULA. Thus what the EULA says is irrelevant.

The only software that is licensed is that which is agreed to before any money is paid. If you call up Microsoft and ask for a site license, they can hand you a list of restrictions. If you walk into CompUSA and buy the software, you've bought it free and clear.

(And are only bound by existing law. You can't copy it, but you also can't use it to bludgeon someone with, and not because of any restriction from the vendor.)

I believe a good model for liability in the software field is to move to the service-and-practitioner model.

A customer asks a practitioner of the software field to solve a particular problem. The practitioner then writes and/or reuses and/or adapts existing software to solve the customer's problem. The provider is then liable for having provided a wrong solution according to the current practices of the field.

For example, delivering closed-source software with a poor security track record as part of a contract specifying security as critical would rank as an obvious cause of liability; since the provider chose it among various solutions, he or she will have to justify that choice before a court.

I believe the regular mechanism for covering potential liability damage in other fields, insurance companies, will play its cleaning-up role by refusing to cover software solution providers with poor practices.

It will probably also make the free software code base the center of most of these service providers, since it is easy to customize, most of the code base has well-known status, and there are no hairy licensing issues when you use it.

As for shrink-wrap software, it should install on the designated system, but after that you probably have no recourse at all if it doesn't work that well.

I attended a lawyers' conference on software licenses and liabilities. There are vague texts and no case law, and most lawyers were quite sure that the standard warranty disclaimer was with high probability invalid (under French law). They talked about services and "open source", and some recognized that treating it as scientific knowledge, with practitioners using it to deliver solutions, is like architects building bridges versus people creating mathematical models of gravity: the scientist is not responsible if an architect uses his model (reviewed and published in good faith) to design a bridge and it falls down; it is obviously the architect's responsibility to choose a model that works, to the level of the accepted practice of the field, of course. If the architect has a solid track record, and the phenomenon is beyond current knowledge, then it is up to the insurance companies.

Since a piece of software shares a lot with a theorem, applying to symbolic information, I find this model of liability very pertinent to the software field.

Now if you want to give away software you'll really have to pay for it. Sooner or later a responsibility document was going to happen, but the areas where it's going to hit hardest are not in the mainstream press but in free software, where programmers won't have enough money to release anything in the first place.

Unlike the 'real world' example of the tire mentioned in the BW article... software developers have a much harder time controlling the environment in which their software is used.

For example, if I buy a car tire from Firestone but instead use it on some home-built dune buggy that I use to drive over lava fields in Hawaii, and the tire blows (flipping me into the lava), should Firestone pay? I wasn't using the tire according to the specs they published for it.

Imposing liability on software will only force software manufacturers to list hardware/software configurations on which they are willing to accept liability. If you use the software outside of that configuration, then you're on your own. My guess is that this would disqualify just about everybody, as they'll only be able to certify a limited amount of equipment (as it will entail actually owning that equipment to test).

I mean, would you accept liability on a product that can be used on a multi-use computer that may have god-knows-what software/hardware config?

So this will lead to something like:

the back of the software box listing the exact system requirements that the software is good for (and liable on), and if you use it outside of that environment, you're no longer using the software as it was intended.

Which then just gives software companies even more reason to offer less support, as they'll then only need to offer support on their specific hardware, or risk the liability of condoning the use of their software on unsafe/untested environments.

More incentive to legislate the demise of the multi-use computer in favor of locked computing appliances... which is exactly what a number of people would like (think DRM).

I've said it once and I'll say it again. CowboyNeal should be held responsible for these vulnerabilities. *grin* Anyway, here's a very similar slashdot discussion [slashdot.org] and the related article [eweek.com] at eWeek which I don't believe is referenced in this new incarnation.

It will lead to VERY VERY strict licensing terms for software and software development tools - sort of like civil engineering.

Let's say I was Microsoft (or ANY other software vendor).

You buy a new motherboard - my answer is, "I do not approve of my software being installed on that hardware." You will very quickly see things like "Approved Configuration Lists" - X brand motherboard, with Y brand video card, Z keyboard - ONLY. The ONLY other software I approve on the box at the same time is AAAA. Make any changes and you're on your own.

Heck, buy a car, change the suspension parts yourself to NON factory parts. Flip over due to your front wheel falling off - good luck suing the car mfg, you'll have to prove it was not YOUR changes

Heck, buy a car, change the suspension parts yourself to NON factory parts. Flip over due to your front wheel falling off - good luck suing the car mfg, you'll have to prove it was not YOUR changes

Sure, now buy a new car. Modify the alternator to serve as a welder (this is doable). Now have a rollover. You will have no problem convincing the courts that your modification didn't cause the problem.

When your example is to modify the suspension parts and then roll over, of course you have problems. Changing any suspension part changes the way the vehicle acts. With any suspension modification I would expect the vehicle to behave differently (sometimes better!) when driven to the edge of rollover. Of course, if your change is swapping the shocks for homemade shocks, it is impossible to prove the manufacturer is liable for the problems. If your new shocks are just an aftermarket brand, the manufacturer will stand beside you to prove it isn't their problem (or, alternatively, if it is their problem, you are suing the wrong guy). Or are you arguing that if I install an aftermarket brand of shock (assume a quality shock) but screw up the installation, the car maker should be liable for my goof? I won't agree to that.

This article is talking about security problems. That's only one kind of bad. Other kinds of bad include unreliable (hangs, bsods, whatever), incompatible, obfuscatory, and so forth.

Microsoft might be able and willing to remove security bugs from their software; no downside for them there. But what if Microsoft engaged in some obvious "good software practices" to make their software less bad? Like what if they made their software simpler? More modular? Like if their OS could run whatever window system, window manager, or file browser you wanted, a la UNIX. Or whatever web browser. Imagine.

What kind of idiotic system design is it that has all these user-mode applications inextricably woven into the fabric of the OS? What unfathomable nonsense. What person who ever studied software engineering buys this silly story?

How about if MS would use unobfuscated data formats, so that it would be easy to work with document data (let's grep through my .doc files!) or multimedia data (let's convert between .wma and .mp3!).

How about if they had a simple and stable API for writing software, so that it would be easy to port software between the MS OS and other OS's. Fat chance.

These are some of the things that make MS bad. Will they ever address them? Magic 8-ball says, "Outlook not so good."

Ok, so I'm currently working on an auction system that is in use by at least one company. They ask for a change in the software so the commission percentages charged to their consignors are handled in a slightly different way. I make the change, and under certain conditions it's now possible for the consignor to be charged half of what they should be. I can see there should possibly be some liability here, especially if I were "selling" the product.
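As a purely hypothetical illustration (the function name, rates, and the "split" condition are invented, not from the actual system), the charged-half failure mode described above can be as small as one misapplied branch:

```python
# Hypothetical sketch of a "slightly different" commission rule going wrong.
# All names and rates here are invented for illustration.

def commission(sale_price, rate_percent, split_sale=False):
    """Commission owed by the consignor on one item."""
    fee = sale_price * rate_percent / 100.0
    if split_sale:
        # Intended: split the fee between two consignors, billing each half.
        # Actual: only one consignor is billed here, so the house collects
        # half of what it should -- the "certain conditions" bug above.
        fee = fee / 2.0
    return fee

assert commission(200, 10) == 20.0                    # normal case: correct
assert commission(200, 10, split_sale=True) == 10.0   # edge case: halved
```

The point is that the code is "correct" for every test the customer thought to run, and wrong only under conditions nobody exercised before shipping.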

Now, they ask for a change that resizes the storage size for the Notes for each customer. I make the change, but my code does not also make the change to their database schema. I provide a separate script that does that. The customer installs the upgrade, but does not upgrade the db. Who is liable? Can I be held liable for not making my upgrade *easy* enough if the client forgets to run the db upgrade script and loses data?

Let's go even further. I use MySQL for the db, python-mysql for the db module, Python for the language, and Qt for the interface. ReportLab is being used for pdf generation, lpr for printing, X Windows for launching the program, KDE for the desktop manager, and Acrobat Reader to parse the pdf files into ps for printing. Without these things, the program will not run.

Now, due to a bug in MySQL, the company finds that it is losing n*$50, where n is the number of items in the auction, for every auction. Perhaps the $50 entry fee is not getting stored correctly, and suppose that's a database problem. Who's liable? Me, for leveraging off an existing system without it being totally stable? The db? Maybe in this case it's clear the db maker would be held responsible.

Now let's lose some data because MySQL was not *configured* correctly. Whose fault now? Customer, me, or MySQL?

Lastly, let's lose some data due to a bug in the database that was caused by an ambiguity in the API of glibc that allows a function to be called in a way that was not intended and works as expected most of the time, but is clearly not a bug when it doesn't work the expected way. Who now? MySQL? The library they used? Me, for using MySQL? The customer, for being stupid enough to hire me when I'm not even competent enough to ensure the tools I use have absolutely no bugs in them? ARGH!

I'll tell you one thing... I'm never associating my name with a general library if this kind of thing goes through. Blame would very often be passed back down the chain as far as possible, trying to find a scapegoat other than yourself.

As a programmer, I have often given a simple explanation of why I can't write reliable software. On most vendors' computers (Microsoft obviously, but also Sun, HP, IBM and most of the rest), the inner workings are totally hidden from me. I can't even in principle know what a lot of my code will do in all cases, because I must make calls to the underlying system and its libraries, and the code for these things is a proprietary secret.

What I usually use as a parallel is: Imagine that the people who built buildings or bridges were required to use commercial steel and concrete, but the specs for these materials were trade secrets. Imagine that construction firms had to use whatever material was delivered, and were not permitted to see its specs. There would be no way that anyone could calculate the effect of loads and stresses, and things would fall down under load.

This is how software is built.

On Open Source systems, it's somewhat different, because the source is available. But even there, you can only understand the system "in principle". You usually don't have the time it would take to thoroughly investigate all the components that you use. Open Source software does generally work better, true, but it's not because every programmer has examined every piece of the source. It's because a lot of them have examined a few pieces, and they can tell each other about problems (and fix them).

This probably has significant legal impact. Consider the construction parallel again. If I design a structure and specify materials of a certain quality, those materials are used, and the structure collapses, I am probably liable. But if the material vendors substitute material with different properties (usually for cost reasons), all I need to do is show in court that the material didn't meet my specs. I'm not liable, and the vendors end up facing some serious fraud charges.

With software, this sort of fraud happens routinely, with all sorts of system components that are delivered knowing that they don't do what the manuals say they do. Or the vendors don't even bother checking that things work right, because they know they can't be held liable. Then people hire programmers like me to write software using such shoddy systems, and expect us to write reliable software on top of it. Then it turns out that some parts of the system have "undocumented features", and the code doesn't work right.

Until we find a way to force reliability on the Microsofts and Suns and IBMs of the world, the way we have with companies that sell steel and concrete, there's no way whatsoever that programmers can ever write reliable software.
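One partial defense implied by the construction parallel is to transcribe the vendor's *documented* behavior into executable checks, so a deviation from the manual is at least caught, and evidenced, at integration time. A toy sketch (the "vendor" function is a stand-in, not a real library):

```python
# Toy sketch: pin down an opaque component's documented contract in
# executable form. You still can't see inside it, but when it breaks
# its own manual, you have proof of where the fault lies.

def vendor_sort(items):
    # Stand-in for a black-box vendor call whose source we can't read.
    return sorted(items)

def check_contract():
    """Assertions transcribed straight from the vendor's manual."""
    assert vendor_sort([]) == []                  # manual: handles empty input
    assert vendor_sort([3, 1, 2]) == [1, 2, 3]    # manual: ascending order
    assert vendor_sort([1, 1]) == [1, 1]          # manual: keeps duplicates
    return True

assert check_contract()
```

This is the software analogue of testing a delivered batch of steel against its claimed spec: it doesn't make the material reliable, but it moves the liability question onto evidence instead of finger-pointing.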

First of all, I don't like these "soft" RFCs (aside from joke ones) that are not technical.

Second of all, the RFC really has no force given the RFC language. The two key provisions, that companies SHOULD fix holes within 30 days, and that customers SHOULD apply patches in a timely manner, can both be ignored since "SHOULD" in RFC-speak is different from "MUST".

Thirdly, this RFC is a bit too targeted at Microsoft:

1) The Vendor SHOULD ensure that programmers, designers, and testers are knowledgeable about common flaws in the design and implementation of products.

2) Customers SHOULD configure their products and systems in ways that eliminate latent flaws or reduce the impact of latent flaws, including (1) removing default services that are not necessary for the operation of the affected systems, (2) limiting necessary services only to networks or systems that require access, (3) using the minimal amount of access and privileges necessary for proper functioning of the products...

This is too "ripped from today's Microsoft headlines". This stuff about removing default services is bogus. Something like UPnP in Windows (designed to make things easy for novice users) is useful only if it is turned on by default. Anyway, what does "not necessary for the operation of the affected systems" mean? You can run Linux without a GUI... so if an exploit is found in KDE or GNOME, will someone jump up and say, "You enabled the GUI by default and it wasn't necessary and you violated the RFC"? The solution to flaws in UPnP is to not ship the flaws, not to disable everything in the box.

Can we please keep the social engineering out of the RFC -- this is an absurd requirement to put in there. Why not just say "Customers SHOULD give preference to open source software because we think it's k3wL"?
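For what it's worth, the "minimal amount of access and privileges" item in the provision quoted above does map onto a long-standing Unix idiom: acquire the privileged resource first, then shed root. A minimal sketch in Python (the port and the uid/gid values are arbitrary placeholders; 65534 is merely the conventional "nobody" id on many systems):

```python
import os
import socket

def bind_then_drop(port=0, uid=65534, gid=65534):
    """Open a listening socket, then drop root privileges if we have them.

    port=0 asks the OS for any free port. The uid/gid defaults are
    placeholder values for an unprivileged account.
    """
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.bind(("127.0.0.1", port))   # the only step that might need root
    s.listen(5)
    if os.getuid() == 0:
        os.setgid(gid)   # drop group first, while we still can
        os.setuid(uid)   # then user; after this, root is gone for good
    return s

srv = bind_then_drop()
assert srv.getsockname()[1] > 0   # bound to a real port
srv.close()
```

Whether an RFC "SHOULD" can compel vendors to ship this way is exactly the open question in the thread; the idiom itself is cheap either way.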

I think a lot of software is released buggy as hell simply because investors and customers expect development houses to show results very quickly. Many contract jobs are six months or shorter, barely enough time to come up with a dog & pony slideshow of great software, let alone develop a secure product. Most developers depend on tools from other companies to cover the gaps in the process -- tools like IIS and Apache.

The problem lies with the fallacy of internet time -- that software advances can keep up with hardware advances. The difficulty here is that Moore's law is based on years of research -- an advance in memory that doubles the speed next year will have begun five years or more ago with tons of R&D. Software doesn't really have that luxury -- it's all about the now.

One might say that this sort of demand is a requirement in business -- but in many ways, it's a self-maintaining fad. Look at biotech -- a biotech company might do research for dozens of years before it can release a new drug or procedure. They have amazingly tedious checks and balances. Why? Because human lives are at stake. Because a single slip-up will cost them millions in malpractice.

Holding software companies liable for security failures is a great idea in the respect that it will force dev houses to make better software. But in the process something will have to be done about the expectation that software is a need-it-now sort of deal.

As a side note: this sort of legislation would be a godsend for contract programmers. If company X has to either wait years for a secure product to come out of Microsoft or hire somebody now to do the work cheap and sign off on the liability, they'll probably choose the latter. It'll also cut down on the feature blitz of new products that is leading to the increased need for pay-for-play software licensing.

There is a lot of "sky is falling" rhetoric going on about this that is just wrong-headed. Clearly, it would be a bad idea to make a company liable in perpetuity for a software product, with that liability beginning the moment a vulnerability is reported to them, or worse yet, discovered.

However, it is possible to write reasonable legislation around this. Consider: you can do any software task in hardware, albeit possibly less efficiently and frequently less easily and at higher cost. If you were to make a circuit which performed some function, and that circuit were to have an error which caused economic harm to someone, that person could sue you for damages. Thus, why should it not be legal to sue for damages a company which makes a product which *could* be reduced to a circuit, provided that the other circumstances were the same?

If a law were written to allow users to sue a software company for liability, under the conditions that the company had known of the vulnerability for some time (say, 30 days just to be arbitrary, or say 3 years - whatever), and knowing that, had neither produced a fix nor issued a recall to all registered customers, I don't see a problem.

You would certainly want a grace period for the company to fix the flaw or recall the product. You would probably want limitations on liability to the provable immediate losses, or the cost of the software, whichever is higher (possibly with some limited damages above that). You would likely want such a law to exempt programs distributed as or with complete and understandable source code, on the same basis that you couldn't sue someone who printed a design from which you built your own circuit. (That is, including source code would transfer liability from the producer to the user.)

This would allow companies which depend on commercial products that they cannot inspect to have legal protection, while not bankrupting companies who act responsibly by fixing problems within a short period after they are found.

I think making software companies liable for their products so they would be forced to fix reported bugs would be a great idea. I remember a year or so ago I found a bug in a game by Activision, and I dutifully reported it to them. I didn't make it public at the time, since I wanted to give them a fair amount of time to issue a patch, but their complete refusal to do anything about it leaves me little choice. Maybe they don't think that fixing the bug in Ghostbusters that prevents you from entering one of the buildings on the map from a certain direction is worthy of their attention, but dammit, I paid $30 for that game back in 1984, and it interferes with my enjoyment of the product! The customer support rep's excuse? "I'm sorry sir, I've never heard of a 'Commodore,' so I must assume we do not support it." Where does it end?

Speaking for myself, I'm all for this. How many times have you wanted to do a better job but were given impossible deadlines, leading to shipping something you knew wasn't tested well enough, and hoping to fix the bugs later? Most programmers WANT to produce good software, but are not given time or tools.

I hope that something like this will cause managers and execs to provide proper tools and sufficient time to produce truly stable programs. I do believe that, like other forms of liability, though, unless intentional negligence is shown, liability must stop at corporations, not individual programmers.

Also, there must be still a way for free software to escape liability. If you're getting something for free, you can't expect the author to take liability.

I would think that in this situation, Microsoft should WELCOME liability law; it would be a great selling point for them in the face of Linux, if they could say "if you use free software, nobody is liable if it destroys your business, but Microsoft IS liable for any harm caused your business by our software." I imagine that many corp execs would give that argument a lot of weight.

However, at the same time I don't know if it would be 100% effective, because by now enough CTO's have realized that Linux (and other free solutions) is a more reliable platform for many applications, and it's still better for all involved to use something that works than to use something that causes you monetary loss and then try to recoup it in court.

If you want liability for software, kiss the GPL goodbye and look forward to a stifling of developmental progress in software. Under a liability law the GPL would be unenforceable, because it provides that the author is in no way responsible for the software you're using. One of the two isn't going to work out, and I think the liability law would have a little more clout. That is assuming people even develop software anymore. I'm not going to put myself in a position to get sued because of a bug in my software. I'm not going to go through the hassle and effort to try to start my own business if any software we write is going to lead to our legal raping because we couldn't possibly squash all the bugs in our code.

The GPL, and free software in general, would be forced the way of the Dodo. If your license couldn't absolve you of responsibility for your code fouling up, a whole tenet of the GPL would be meaningless. Beyond being impossible to develop, no one would continue to use it. If a software glitch can cause monetary damage, are you going to pick a vendor you can sue or one you can't? Managers are going to go with the folks they can slap a lawsuit on in order to recoup damages. Why would you use an open source application in which a bug could cause you millions in damages you couldn't recoup? The only reason managers go with open source software now is that they can't sue proprietary vendors for bugs either, so they go with the lower TCO (whichever option that is).

It is also ridiculous to compare an operating system like Windows to an RTOS or firmware system that controls hazardous equipment. Windows and Linux aren't designed for use in hazardous environments, and they are not cleared to operate certain pieces of equipment. If a system doesn't pass a safety inspection, it isn't going to get sold. A heart monitor isn't going to run Linux, and the control equipment for a nuclear reactor is not going to have Clippy morphing into a bicycle.

Great, another revenue source for lawyers. Does anyone else see a problem with this?

Imagine someone suing every time they got a blue screen. The ONLY way to make software super-duper lawyer-proof would be to overly control the hardware, thus stifling innovation and the creative process as a whole. Remember that the original IBM PC and its clone makers were more successful than Apple because the box was open and could be added to and hacked with relative ease. No one's box will have anything "easy" about hacking at it after the lawyers are finished.

For almost any problem where litigation has been the answer, the solution has proven worse than the initial problem.

IANAL, but IIRC, the courts have found source code to be speech. Software liability would create a prior restraint on the expression of that speech, so I don't think any liability law will be upheld against people who release source code. They can claim that it's simply the exercise of their First Amendment rights.

But this will impact the distributions, which release software in binary form. I don't believe binary code is considered speech, so the Red Hats, SuSEs, Mandrakes, and Debians of the world might be in trouble with their current distribution method. The authors, though, probably are not.

All told, I still find the idea of software liability to be discomforting. Unless it can be done in such a way that it doesn't immediately disadvantage free/opensource software, either directly (by holding authors/distributors liable) or indirectly (by making free/opensource software a business liability since there's no one to sue), I think it's a really bad idea. See my journal entry [slashdot.org] for more details.

As it is, software companies get off scot-free, with only their reputations at stake (and those with deep pockets can afford the advertising budget to counter the bad experiences and boost their reputation). But it would be nice to see some sort of financial incentive to produce quality, reliable software instead of just a lousy implementation of the latest greatest big idea. Just as there are contracts that reward on-time completion and penalize lateness, we could have mandated licensing terms under which a major bug (like the UPnP hole) VERIFIED by a disinterested third party would result in a partial refund, to partially cover the expense of patching. I would not go so far as making a company legally liable for the always-overinflated 'costs' that show up in class action lawsuits; no one should have to code in fear that a missing comma is going to cost the company a million dollars. But a simple system of rewards and punishments could push the industry past the 'flashy crud' that so many consumers fall for, and toward a more stable, robust, secure world.

A large proportion of these security problems would simply go away if the subroutine return address were stored in a separate memory area from the data area. That would make the buffer-overflow / stack-smashing type of attack impossible. It's such a simple idea that I am amazed it was not implemented long ago. There must therefore be something wrong in my thinking; what is it?
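The separate-return-stack idea above (essentially what is now called a shadow stack) can be sketched in a toy model. To be clear, this is a simulation, not real machine semantics: the frame layout, the addresses, and the `vulnerable_write` helper are all made up for illustration.

```python
# Toy model of stack smashing. In the classic layout, a call frame keeps
# a fixed-size buffer directly below the saved return address, so an
# unchecked copy (a la strcpy) can overwrite the address.

def vulnerable_write(memory, data):
    """Copy data into memory with no bounds check, like strcpy."""
    for i, byte in enumerate(data):
        memory[i] = byte

# Classic layout: buffer[0..7] followed by the saved return address.
frame = [0] * 8 + [0x4000]
vulnerable_write(frame, [0x41] * 9)   # one byte too many
hijacked = frame[8]                   # now 0x41: attacker controls the return

# Separate return stack: the same overflow still corrupts data, but the
# saved address lives in a region the buffer write can never reach.
data_memory = [0] * 16
return_stack = [0x4000]
vulnerable_write(data_memory, [0x41] * 9)
intact = return_stack[-1]             # still 0x4000: control flow preserved
```

The sketch shows why the separation defeats this particular attack class: the overflow can still trash neighbouring data, but it can no longer redirect execution, because no amount of writing through the buffer reaches the return address.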

If you want your software to be guaranteed to have feature 'x', then demand that your vendor sign on the dotted line a promise that the product meets your expectations. And be prepared to pay money to get what you want.

Otherwise, read the damn license. You know, the one that says "NO GUARANTEE OF FITNESS TO RUN NUCLEAR POWER PLANTS BLAH BLAH BLAH". If a vendor is explicitly telling you that they are NOT promising you anything, then you are just plain stupid to think that you have the right to demand more. If you don't like it, put your money back in your pocket.

Where you might take issue is with products that hide the fine print inside the shrink wrap. Of course, you have no such problem when you can see the source.

I seem to recall, in big bold letters, a statement at the end of the standard EULA that says without question that installing the software makes the user assume any and all responsibility for loss due to the installation or use of the software being licensed. Even if the law generally requires people to give reasonable disclosure, I don't see why a vendor couldn't point to the EULA and say, "Sorry, bud. You read the agreement, and there's your notice."

One thing that is encouraging is that Software Engineering may become a real discipline and not just a buzzword. It is inevitable that Software Engineering will take the same course as other traditional engineering disciplines. Our reputations depend on it.

One thing that is discouraging is the possibility that hobbyists will be shut out depending on what sort of legislation occurs. This is something that hasn't happened in many other disciplines. Would wonders like TLC's "Junk Yard Wars" be possible if the Mechanical Engineering industry were regulated to death? What about model rockets? Home chemistry sets? Do-it-yourself electronics? Helping your neighbor build a tree-house for the kids?

I hope the people behind any new legislation understand that purely non-commercial efforts, where the would-be customers pay nothing and nothing is promised, should not be regulated.

Free Software is non-commercial and nothing is promised to the end-user, so it should be left as-is. However, those who choose to commercialize it, such as Red Hat or IBM, should be willing to accept some liability. After all, they are making money off of it.

In conclusion, software should be treated just like any other product. If money is being made off of it, then the customers are due what they paid for. If no money is involved, the lawyers and politicians should just keep their hands off.

Obviously, since simple software is both more reliable and easier to prove correct, I'd limit myself to simple software. Good-bye GUI, hello command line. Also, since most software these days is heavily dependent on someone else's libraries, I'd either have to have the source or roll my own: black boxes, no matter how well guaranteed by the vendor, won't fly because of the cost of litigation. So what we end up with are small, simple programs whose source is widely available and easy to tinker with.

Let's say MS buys some code from a small competing company. MS runs the code, it crashes one of their servers, and it causes some minor damage. MS then, using these new accountability laws, sends its massive legal department after the small competing company. The small company, having no finances to put up against MS, will cease to exist.

Sure the new laws of accountability sound nice but it takes money to enforce them.

Sigh. There are safety regulations in place that prevent software like Windows from controlling your car's fuel-air mixture. The reason for this is explicitly liability. No one would stick Windows on a system requiring real-time control and processing. No one would stick Linux on one either. Go tilt at windmills someplace else.

Actually, since software bugs have not sufficiently been responsible for lots of deaths yet (unless you count the self-induced forehead traumas incurred by windows users), this would be a good thing.

It would finally level the 'expectations' playing field for software. Overpromising and unreasonable expectations are what is KILLING this industry right now. Kill a few people with it, and maybe the suits might start wondering whether software really CAN be what it's cracked up to be in the business world.

If you are a responsible employee, your situation doesn't change. Right now, if you can't deliver quality on time, you should:

1) Inform your supervisor that the demands cannot be met with quality.
2) If that is ignored, inform him in writing.
3) If the expectation still does not change, then let the deadline pass. "I'm sorry, it's not ready. Here's why" - and show him the letter from #2.

I work in a similar environment to you - I design internal software for an industrial company. Deadlines slip - it happens. Life goes on. I will not provide half-assed code just because the timeframe is running long.

Except that as a programmer for a company, you wouldn't be the one being sued -- your company would. First time that happens they will wake up and adjust the deadlines, or they will soon be out of business.

Simple fact of business is that most marketing types don't know how to say "no, we can't do that". Customers need to be conditioned to hearing "no, that timeframe isn't reasonable".

See, there will be a bad side effect if legal software liability comes to pass.

Companies like RH, who sell the fruits of individual coders' labours, will be in a tough spot.

For example, take the latest zlib bug - the people here at work spent about two hours fixing our boxes because of it. Should RH have to pay a $125/hr fee for that time? I mean, we bought the software from them! It's broken! They should fix it! Just like a car or anything else!

And that will be the problem. Suddenly, companies like RH have to charge 10x or 100x the amount they charge now for boxed distros.

If you want RedHat or any other software vendor to go out of business, then sure, opt for the idea of software liability. If a relatively simple bug in a $100 piece of software I have on 10,000 machines causes me millions of dollars in damage, I can sue you for my damages. If you're found liable for the fault (say, forgetting to close a back door), you're going to owe me millions of dollars. How many software companies do you think can survive after paying out compensation of several million dollars? Do you want to see RH tank because there was a bug somewhere in their code?

A check in the Campaign Donor box guarantees the holder insulation from legislation which may find the card holder liable for any damages; further, the card holder may be eligible for assistance from the Department of Justice in legal matters.

IAAL (I am a lawyer): Best practices are one thing, liability is too much, and this proposal just seems like a stretch. Generally, liability requires a duty, breach of that duty, and damages.

As for the duty - the draft RFC says the company has the duty to disclose and either provide a patch or admit there's no fix. This proposal doesn't aim to create better software from the beginning, only to inspire software companies to quickly and responsibly respond to reported vulnerabilities. So, in essence, the software company is not liable for the trashcan software's initial release. What if there's no fix? If the company admits it as the RFC calls for, does the whole thing stop there, no liability? Also, the RFC rests on the premise that the vulnerability is reported to the company, but not publicly. If someone finds a bug and decides to report it publicly, is the company excused from responsibility because the reporter didn't follow the RFC process? That's one huge loophole.

The biggest problem in my mind is damages. Even if the RFC becomes a legally recognized standard of care and the software company drops the ball, what's the loss? Except for critical systems used, for example, in medical facilities or some such that might get someone killed (Mr. Rasch's unsafe car analogy is a red herring; generally, the security holes in software don't put my life at risk), it's generally going to be lost data and the person-hours required to restore it, perhaps lost business during the downtime. The damages will be pulled out of the sky by some bean-counter.

Tort law (liability for you non-law talking types) generally involves the concepts of assumption of risk and contributory negligence as well. How far will this proposal go in forgiving the people who aren't keeping backups of their critical data but lose it due to some reported vulnerability the company did not correct? Are these irresponsible users excused because the programmer missed a bug or security hole? What about Company X's system that gets hacked through some glaring security exploit and my personal information gets swiped. In addition to Company X suing the software company, am I also eligible to sue?

Many of us protest every time a cracker's jail sentence and fines are jacked through the skylight based on some victim company's astronomical estimate of the "loss" from an attack where no actual damage is done, but some files are swiped. A security breach by some acne-faced script kiddie isn't worth $1 million - what's good for the goose . . . The hackers (non-pejorative) of the world will become what they have beheld.

Best Practices and voluntary standards are great, but the liability bandwagon has travelled too far already. Accountability in software should be driven by the market, not the courts. When someone sells worthless software - stop buying from them. We don't need something more to sue people over.