The National Academy of Sciences (NAS) is asking Congress to make it easier to punish companies that release insecure software. The panel concluded that there should be consequences for companies producing software that puts consumers and businesses at risk. NAS' Computer and Telecommunications Board stated that the laws in place today fail to provide enough incentive for the market to respond adequately to information and software security issues. The board's suggestions include taking steps toward making software and systems vendors, as well as system operators, more liable for security breaches.

Although these solutions have been voiced before, the fact that the NAS is now officially recommending them could spark urgency among legislators, as some decision makers contemplate what might have happened if a cyber terrorist attack had been synchronized with the attacks of September 11. Congress has already set the precedent that companies could not be held liable if their software was not Y2K-compliant, and technology companies have naturally steered clear of laws that would increase their liability.

Mark Rasch, vice president of cyberlaw for security firm Predictive Systems Inc., states that security accountability laws would be difficult to nail down. For example, what should the security standards be for Solitaire? He also asks why companies buy products that are not up to the job, or fail to purchase a product that would be secure. He asserts that education is the best route to improved security. The NAS also finds fault with mainstream businesses, making the point that they do not take security seriously, and recommends that security administrators be given more money and the authority to tighten the security of the information vital to the survival of the business. Another recommendation of the board is that ordinary workers be trained in individual computer security and be given the tools with which to shore up that security.

RON'S OPINION
How many comments have I read on this site about competent network and security administrators making decisions to buy the right, secure software, patching the software that is out there to make it secure, and educating the masses about basic measures they can take to shore up security on their own systems? How many times have I read on this site about certain companies being held to a higher standard of security because their products are more popular and more targeted? What incentives will cause the market to gravitate naturally toward solving security problems quickly?

Although I do not believe we can “legislate morality” (or information security, for that matter), I do believe there needs to be some “incentive” (punishment/consequences) for those who are not diligent in releasing reasonably secure software, and for those who are not diligent in patching and maintaining the information security of their systems, so that the rest of the world is not infected with the latest new virus and their machines are not unknowingly used in the perpetration of DDoS attacks or other malicious hacks.

I believe something needs to be done; unfortunately, I think those of us on both sides of the industry (software and administration) should have taken the initiative to fix the problem before it got this big. I really don't believe that Congress will be able to craft laws that correctly balance the liabilities of those who write software and those who administer it. For example, we shouldn't punish a systems operator who was never authorized to buy the most secure solution for the problem.

That said, something must be done, and I believe the NAS' Computer and Telecommunications Board has made some good recommendations, some of which we already know are working in certain cases now.

USER COMMENTS (19)

MS (9:27am EST Thu Jan 10 2002) get ready to pay – by Nit.

Economics… (9:38am EST Thu Jan 10 2002) Shouldn't the market itself regulate this? If a company is selling a bad (read: insecure) product, shouldn't the consumer switch to another product? If the consumer is too lazy, or too stupid, to switch, then is it not the customer's own fault if he gets hacked? Similarly, if the customer is once again too stupid or too lazy to install the patches/updates for said product, is that not the fault of the customer? Granted, this only addresses the business customer, as the home user neither knows better nor has any viable alternative, but it seems the article is mainly addressing the business community. – by StaticShock

With vulnerabilities being found one after another and patches for these vulnerabilities creating more vulnerabilities, the market should have sent MS sales into the tank. But, despite all the problems, MS is showing increased sales.

As for keeping up with the patches, this is very difficult and may indeed require a full-time position just to watch for updates. After applying an IE patch, even I would want to think that I don't need to worry about further IE patches for a while. But in this case, that while was only a few days.

Despite the fact that the market is NOT doing its job of regulating bad software, any law governing this is highly unlikely and would certainly be hard to enforce. It would also significantly stifle innovation. Think about it: you develop a desirable software package, to the very best of your abilities, and you want to go into business selling it. But think of the risk. If you missed something (I know this never happens to “real” programmers), you could be ruined by such legislation. If you used personal assets to get your business going, as many entrepreneurs do, you could lose everything. That's a major risk that would likely stop a lot of great ideas from ever coming to market.

– by Get a Grip

Re: Get a Grip (10:30am EST Thu Jan 10 2002) We are in total agreement over the (foolish) attempt to 'legislate security.' However, I would have to disagree with your statement that the market has failed to do its job. The fact of the matter is that markets rarely 'fail.' This perceived failure may actually be an indication of something else. In this case I see two possible options:

1. Corporations do not value security.

2. Corporations do not understand security.

Neither of these options is excusable. However, this should partially shift the onus of responsibility from the corporation to the consumer. If it is our data that they are 'leaving out,' then the consumer should demand change. If it is their own 'trade secrets' that they are letting loose, then it's their own issue and they will be 'punished' accordingly. However, if they are a public corporation…

My simple point is that it is not the market that has failed, but the consumers. – by StaticShock

I think it would help if there were an unbiased resource that evaluated products and released bulletins. We have the Gartner Group and CERT, and others. But it's amazing how little known they are among IT managers.

Punishment may not be the best thing yet; first we need to do a better job of educating managers and purchasers about the associated risks.

There is one other mistaken perception, though: many managers believe that by sticking with certain companies they have someone to hold accountable, and therefore if they do get hacked or have crashes, they can go to the vendor. This just isn't true. If you got hacked and then tried to sue the maker of the software, they'd probably laugh at you… they have plenty of money and influence. Heck, they bribed their way out of a DOJ investigation!

– by The Scavenger

not a good idea (11:13am EST Thu Jan 10 2002) Legislation like this would only make it harder for an up-and-coming programmer to release his/her software. Now, not only does the developer have to devote resources to improving their product's stability and functionality, but they also have to devote resources to paying a law firm to make sure they don't get sued by some company that didn't know what a firewall was.

Large corporations (MS, Cisco, Sun, et al.) will adapt quickly — they have plenty of resources to go around. But the independent software developer will suffer greatly, if not become extinct entirely. – by fishbert

a good idea (11:28am EST Thu Jan 10 2002) Why not develop better ways of finding the people who exploit software insecurities? Going after the software developer because some script kiddie got some credit card numbers off of your server is like going after gun makers because little Timmy found daddy's gun and thought it was a lollipop.

There is incentive built into the economics of software development. Nobody (no, not even MS) wants their name to be synonymous with insecure/buggy software. If that were the case (and if security were a high priority to the consumer), people would be eager to adopt somebody else's software. It's like the airline industry. They don't need to have the government do their security. As recently witnessed, people dying on your planes is bad for business. Planes crash = fewer people get on your planes = less money in your pocket. Don't go around crying “Oh, those bad people! Look what they did to my company! Give us money so we can survive.” If you don't take care of the reputation of your business, you deserve every bit of the backlash. Same with software. More security breaches = fewer buyers = less money in your pocket. And 'money in your pocket' is what companies care about most anyway.

Don't security holes that get exploited the most have readily available patches from the vendor (that never got applied because some over-paid IT folks got lazy)?! I'm pretty sure I read that somewhere before. – by fishbert

Legislation For Lawyers (11:55am EST Thu Jan 10 2002) I think it's good that software exists on a spectrum. Good code is very expensive, so it makes sense for software to exist that is pretty good but not very expensive, so that small businesses can afford it. Small businesses can take the risk since the risk is smaller. Larger corporations would need expensive, highly secure software since they're at higher risk. – by joe

Defective product… (12:04pm EST Thu Jan 10 2002) If I buy a Ford Explorer with defective tires which subsequently blow out, causing the car to roll over and kill my mother, I can sue the manufacturer for selling a defective product, right? If I buy an MS product which is defective and my business suffers, why can't I sue MS? Similarly, if my credit card # is stolen and misused when I purchase a product online from a company which uses the defective product and hasn't applied appropriate security patches, aren't they liable?

One hates to have to rely on the lawyers to make companies do the right thing, but that's the way the law works. – by GaryG

Legislation For Lawyers (12:17pm EST Thu Jan 10 2002) Congress will agree (after all, they're a bunch of lawyers) and no good will come of this.

These laws will only serve to make a bunch of class action attorneys richer. Large companies like Microsucks will be subject to class actions that do nothing more than fatten the lawyers' purses and shaft the consumer.

Look at the Iomega case. The settlement was: the lawyers walked away with a multi-million-dollar cash payoff. The consumers got discounts on merchandise that could only be bought through the Iomega store. But if you looked closer, you discovered that even after the discounts it was still cheaper to buy the merchandise through the normal retail outlets. No smart consumer would have used these discounts. THE ONLY WINNER IN THIS CASE WAS THE LAWYER. He or she should be sued by another class action lawyer (representing the class of consumers screwed in the Iomega class action).

Smarter consumers is the only solution to fixing the security problem. – by LaughAlot

Not the Same (12:27pm EST Thu Jan 10 2002) Sorry GaryG, the Explorer issue is NOT the same. First of all, a computer virus or security hole has yet to attack and kill a user. Second, there is an issue of personal responsibility. If I go out and buy Firestone tires or a Ford Explorer today, it is MY FAULT. I know better but have willingly made a bad choice. The same goes for tobacco and other dangerous products. If you believe that MS products, or any others, are faulty, don't buy them. Once you do, it's YOUR fault.

So, how many of you sys admins are willing to be fined or go to jail if you make a security mistake? Let's see a show of hands.

The FDA tried something like this back in the old days when I was doing programming for several drug companies. You had to sign a statement declaring that the software you wrote was free of bugs. There were penalties and jail time listed as punishment. Commonly used packages, such as SAS, were “pre-approved” and assumed safe but anything written with that software was the programmer's responsibility. Guess what, it did not work! There was just no way to apply the regulation that made sense.

Security is at the user level, both in purchasing and maintenance. Only when we are allowed to throw politicians and bureaucrats in jail for bad choices and poor actions would I be willing to embrace something as dumb as this. – by tjack

GaryG, think of it this way: (part 1) (2:10pm EST Thu Jan 10 2002) I think this hypothetical scenario has more parallels than the Ford/Firestone analogy. Suppose I own a corner store. Now, I'm not too rich or up on cash register technology, so I buy some middle-of-the-road cash register. It's not too expensive, but it's not too cheap either; I know I'm not just buying plastic crap to hold my money. Now, one night while the store is closed, some guy jimmies the lock on the door of my store. Then, he takes all the money that's in the cash register because I forgot that I left the cash register key in the machine. I notice this in the morning, and what do I do? I call the cops and they start looking for the robber. Do I sue the maker of the cash register (whose manual clearly suggests not leaving the keys in the register)? No. Do I sue the maker of the lock that's on the door to my store? No. Why not?! Because these two manufacturers have done nothing wrong. The only person that did anything wrong was the robber, and that is who should be held accountable (well, and myself for leaving the key in the register).

How does this relate to software and security holes? Many corporations today (large and small) use software to protect and secure their funds, information, and products. I used the cash register to secure my funds and I used the lock on the door to secure my store (my products and information). The robber probably broke the window on the door (a brute-force script) to get in, and he used the key to open the register (a well-known exploit for which the vendor has a workaround that I didn't pay attention to). – by fishbert

GaryG, think of it this way: (part 2) (2:11pm EST Thu Jan 10 2002) Now, as a rebuttal to the Ford/Firestone analogy: the problem in that case is that a product malfunctioned of its own accord through normal and expected use. The loss of property and/or life in this case is a direct result of the product itself. There was no third party (robber/cracker) involved. The product is responsible for what happened, and therefore the maker of the product can be held accountable for such events.

For illustration purposes, I'll use an example of software responsible for monetary transactions at a large bank. If the software malfunctions of its own accord, through normal and expected use (i.e., what it was designed for), then yes, the persons/company responsible for the software should be held accountable for whatever damages occur as a result. However, let's say a cracker breaks through the software's security and steals a bunch of money. This loss of property did not occur as a direct result of the software, but as a direct result of the cracker's actions. The persons/company responsible for the software should not be held responsible for what happened as a result of a criminal's actions.

Will the bank be upset that their money is gone? Of course! Will they want someone to pay for what happened? Of course! Does that mean that it is right to make the author of the bank's software foot the bill? Hell no.

I do not believe in blind justice. – by fishbert

GaryG (5:39pm EST Thu Jan 10 2002) If you buy an MS product which is defective and your business suffers, you can't sue MS because you agreed to the EULA when you installed the software. As I recall, the MS EULA states, among other things, that MS isn't liable for any defects in their product.

In my opinion, a reasonable legal solution to this problem would be a law which overrides this type of clause in the EULA. Treat software vendors like any other manufacturer and I'd bet there would be a noticeable improvement in a short amount of time.

The potential problem with this solution is that it could open up Open Source and Free Software developers to lawsuits. Maybe there should be a provision for free software, but then where do you stop handing out provisions? – by mrresistor

One little point I would like to add (8:33pm EST Thu Jan 10 2002) Microsoft touted XP as the most secure operating system the company had ever made! It had a gaping hole that they kept quiet about for 2 months and did not admit to until outed in the press. If they weren't advertising the security virtues it would be one thing, but they did make claims that they obviously could not back up. If they had issued a statement about it and the consumer didn't patch it, you could blame the consumer. But they left a lot of people wide open when they knew about the problem. If that is not irresponsible, I don't know what is.

The software industry is the only industry that can do this kind of thing and not be held liable. Why? It's the same thing as Ford and Firestone knowing they had a problem and burying their heads in the sand. – by kuhndog

Freedom (1:28am EST Sun Jan 13 2002) We live in America, well at least I do. I believe a person/company has the right to do or make anything they want. I also believe that a person/consumer/company has the right to buy and use anything they want. It is called Freedom. If someone buys software with security issues, then that is their problem for not researching first. If the security issue shows up later, and the consumer does not wish to do anything about it, then again it is their problem.

Nobody is destined to use one software package, despite popular belief. This includes operating systems. I highly disagree with the government and the public telling a man/company what he can or cannot make and how it should be made. If I or even you were to make something, would you like it to be governed by someone else? This goes against FREEDOM.

Example: Windows is a security nightmare. I know this, I hate this, I hate Microsoft's business tactics. I use Windows. No one forces me to. I have a choice. I also use Linux and Apple. – by ViLenT[LG]

Only one problem… (9:52pm EST Sun Jan 13 2002) “THE ONLY WINNER IN THIS CASE WAS THE LAWYER. He or she should be sued by another class action lawyer (representing the class of consumers screwed in Iomega class action).”

In most states, the outcome of a lawsuit cannot be the sole basis of another lawsuit. – by Bill

Security (9:20am EST Fri Jan 18 2002) What tools would a home PC user have for assessing the security of their system? How educated must a home user be in order to effectively use any and all such security tools? And what about deliberate security problems vs. unintended holes? Quite the issue. – by mmmna