Information Anarchy: The Blame Game?

Full disclosure of security risk information is still under fire—this time driven by the recent outbreak of malicious worms such as Code Red and Nimda. Last week, Microsoft published an essay written by Scott Culp, manager of the Microsoft Security Response Center. In the essay, Culp refers to full disclosure as "information anarchy" and says that Microsoft is working with other industry leaders to form a consensus protesting such information release. The company will ask its customers to support the adoption of the resulting consensus.

The central concern with full disclosure is that people often take vulnerability demonstration code—sometimes released in fully functional form—and use the code to create a weapon against unsuspecting users. "But regardless of whether the [security vulnerability] remediation takes the form of a patch or a workaround," Culp wrote, "an administrator doesn't need to know how a vulnerability works in order to understand how to protect against it, any more than a person needs to know how to cause a headache in order to take an aspirin." Although he's right to a certain extent, we need to consider a larger perspective.

Worms such as Code Red and Nimda definitely played upon well-known bugs for which patches had long since been available. Those worms showed us how many administrators don't consider security to be a priority in operating their systems. Granted, the worm writers seem malicious in releasing such nuisances, but is there a silver lining to those dark clouds? I think so. As a result of regularly demonstrated administrative complacency, Microsoft has adopted significant new policies and practices. The company has expanded its customer support efforts and is committed to providing even more robust security in its products and more robust tools to help automate and manage security. For example, because of these worms, Microsoft is now giving in a bit to the habits and needs of its customers instead of the somewhat idealistic visions of its software architects. So who benefits in the overall scenario? Everyone does. Culp wrote, "Customers who are considering hiring security consultants can ask them what their policies are regarding information anarchy, and make an informed buying decision based on the answer. And security professionals only need to exercise some self-restraint."

In reality, Microsoft doesn't benefit by condemning the sharing of detailed vulnerability information. Instead, the company should be scolding the misguided focus and relative complacency of its customers' administrative efforts. It seems that Microsoft is doing that now indirectly with its new Strategic Technology Protection Program (STPP). The effects should benefit information security in general, but getting a new program fully operational takes time. Perhaps any new consensus is going a bit too far too soon. In any event, a new consensus will benefit Microsoft by buying the company some time to get STPP into full swing. So again, who benefits from any new consensus in the long run? As Culp pointed out, "Even in the best of conditions, it will still be possible to write worms." So a new consensus won't eliminate the core problems of administrative latency and faulty code.

The full-disclosure problem comes down to timing on three fronts: Researchers publish explicit details in many cases without enough consideration for the time required for companies to develop a patch and coax customers into loading the patch; users wait too long to apply patches, if they apply them at all; and Microsoft product cycles are probably still far too quick to market for effective code development.

What do you think about full disclosure? Is it a detriment or a benefit to the user community, or does it seem to balance out fairly equally in the bigger picture? Go back to our home Web page and take the Instant Poll. We're eager to learn your perspective. And if you want to express detailed comments regarding any new consensus, you can post them in response to this editorial—you'll find a copy posted on our home page, too.

Discuss this Article

Steve (not verified), on Oct 24, 2001

As a systems administrator, I find the idea that any vendor would not want its faults exposed to be ridiculous. By not exposing the faults, vendors become complacent and tend not to fix security flaws. The bad guys will share the information anyway, and then we will truly be vulnerable. Given current circumstances, covering up security flaws, or not reporting them, would not be good policy.

I thought that groups like the l0pht (now a commercial security organization) repeatedly contacted Microsoft directly, even offering to fix their code, but were ignored or rejected.
Knowing about security loopholes should be part of the admin's job, unless you can admit to being easily replaced by a monkey. Go read BugTraq; that's why it's published. If only the people who wanted to use the exploits knew about them, you'd probably be in a lot more trouble than you are now.
Microsoft *has* to point the finger somewhere, they often have trouble pointing at themselves.

Full disclosure is the only way we in the field can benefit due to many vendors, including Microsoft, who prefer to delay and obfuscate reality. Microsoft can, and likely will, provide better security and information in the future. However, I don't think this would happen if not for full disclosure. Aren't these the same guys who had Passport and their DRM cracked recently? How can they convince anyone with a straight face that they care about security? Thank God for full disclosure...

It is hard for me not to be cynical about a company that has been convicted in court of being a monopoly. Scott Culp seeks to place the blame for Microsoft's ill repute in the security sector on those who publish exploits. I hope no one is fooled by this transparent sophistry. Microsoft is first and foremost responsible for delivering insecure code, and systems admins are secondly responsible for keeping their systems patched.
But hey, most of the SysAdmins are probably MCSE's....
Ooops. Sorry. I said it was hard not to be cynical...

It is ironic that Microsoft should "blame the messenger" for the natural consequence of decisions Microsoft made long ago, probably an outcome of giving short-term marketing advantage veto power over long-understood security concerns:
- putting macro programmability in almost all applications, ensuring that viruses can be written for application environments;
- bundling applications such as web browsers and mail clients with the operating system, ensuring that application vulnerabilities are widespread and difficult to remove;
- withholding information on how to remove the bundled applications, ensuring that many administrators will not try and that those who do will have difficulty, even where local security policy bans those applications because of the number and severity of their bugs;
- installing unneeded, buggy, optional server software by default with insecure default settings;
- making glaring security holes, such as autorun on insertion of removable media, the operating system default, without even a requirement for console access;
- retiring availability and support for older operating system and application versions in an attempt to force unwilling customers to purchase upgrades, driving users to choose among transitioning off the Microsoft platform, freezing at a specific trusted version and accepting that newly discovered vulnerabilities will go unremedied, purchasing third-party software (such as SecureIIS) to protect against vulnerabilities, or gambling on newer, more complex versions with whole new classes of vulnerabilities that require expensive new hardware to run at acceptable speed.
All these issues have arisen in the course of my involvement in supporting my employer's network. I have three sisters; all have personal computers; none are computer wizards. The two PC users have lost their hard drives to viruses despite owning antivirus add-ons; the third owns a Mac.
I have reported vulnerabilities to Microsoft in the past; I never received a fix. Perhaps if I had used full disclosure, there might have been a response. Several years ago I was an advocate of NT. Microsoft has betrayed the public's trust. It should be broken up into several units, such as Office application, file server, Internet server, Internet browser, client operating system, and legacy system support companies, with the legacy system support companies receiving the source code for, and the right to modify, all software copyrighted 2000 and earlier, and being required to provide quarterly rollups of fixes, with the initial rollup of all currently known flaws provided to all customers free.
(the opinions above are my own and do not represent those of my employer.)

I would not restrict anyone's freedom of speech; at the same time, I see posting code that facilitates a crime, on behalf of those who are not able to develop the code themselves, as irresponsible. Those who possess the talent and determination to break the law by destroying others' property will continue to do so. This is just like posting instructions for creating explosives, for defeating locks, or any other textbook that facilitates a crime: it may be legal, but only the scum of the earth do it, and their motivation is not social good but personal enrichment.
Thanks
Paul

I agree with Microsoft that too much detail is being placed in posting security vulnerabilities on public forums. I also agree with your editorial that our industry needs to pressure suppliers to enhance their products to remove, as much as possible, culpability of administrators.
One item that was missing from your editorial is the totally lax response of law enforcement to these crimes against business. I can get ten years for knocking over a liquor store for $100. If prosecution and punishment were matched to the economic impact of these attacks, publishers of viruses and worms would be put away for life.
And good riddance to them too!

The only item I take issue with, and it is not mentioned, is that Microsoft seems chiefly concerned with exploit code being released as a "testing tool," and I agree. This practice is said to be part of "full disclosure." While it makes sense to fully explain an exploit, I see no legitimate reason to release working code that can easily be turned into a worm.

Here are the real problems:
1) Microsoft has shown little interest in producing quality products. Their software is so bug-ridden that any kid with a free afternoon can find a new security hole. This is Microsoft's fault and no one else's.
2) Microsoft has earned a reputation for responding slowly, if at all, to bug and security fix requests. Researchers release exploit code as the only way of prompting Microsoft to action. This is Microsoft's fault and no one else's.
3) If Microsoft produced a reasonable and efficient way to patch the many flaws in its software, administrators would do so. As it stands now, any site with even a few servers could dedicate fully half a qualified technician's time to the endless cycle of research, test installation, testing, production installation, and production testing. This assumes that all goes well. On too frequent occasions the bug fixes must be backed out because of still more bugs in the patches themselves. This is Microsoft's fault and no one else's.
As a response to these shortcomings, the Strategic Technology Protection Program is, at best, inadequate and, at worst, insulting.

To augment my earlier comment: if you would withhold this type of information, would you also tell the newspapers to withhold any publication about crime? Never mind that preventing disclosure is an infringement on the freedom of speech.

Hi Mark,
While I understand Scott's and Microsoft's concern about releasing the "blueprints for building these weapons," we as responsible systems professionals require such knowledge in order to better understand the threat and deal with it.
It is not sufficient to get a security email notice from Microsoft that merely says to "apply this patch to fix this vulnerability." It is most important that we as systems professionals understand the full nature of these threats and their impact on our systems.
Scott is implying that we not spread information about the threat and just blindly apply security fixes as they are provided by Microsoft. However, look at the problem we have now by relying on Microsoft to provide us with secure systems.
Also, as an exploiter I don't need to look on a website for information of what the exploit is and how to produce it. Heck, I only need to look at the code of one of the billions of replications of the darn thing to figure it out. It isn't very hard to get a copy (darn it!).
As Scott points out, no system will ever be 100% secure. I completely agree. However, I am also proactive in ensuring my systems are secure and I want all the information I can get in order to quickly and correctly minimize or eliminate risks to the business. And, I can't wait days for Microsoft for that information, or the latest security patch.
Sorry Scott, but this is the information age and there will always be those who use it for both good and evil. Rather than worry about the bad guys getting information on exploits (that they will get regardless) our time is better spent on thinking about the security up front. Please pass that along to the proper department at your company.
Thanks
Dennis Lancaster
Portland, OR

Most security professionals will tell you that before "full disclosure" became widely used, software companies didn't care about fixing security problems. Full disclosure puts heat back on the software company to fix the issue. Microsoft is just feeling the heat, and they don't like it.
Microsoft has demonstrated many times that they still don't care about security. They continue to push bad security concepts through COM/DCOM/ActiveX and now through .NET. I'm glad to see that they have increased their quality of code which can directly increase the security of the code. I doubt that security concerns had anything to do with this. Rather it was customer complaints about BSOD every 5 minutes.
If Microsoft would do the right thing and further increase the quality of their code, support good secure protocols, and work with the security community in supporting good security practices, then Microsoft wouldn't get stung so many times by full disclosure. (BTW, have you read Microsoft's software license? Even they don't trust their software.)

How can I be sure a patch worked if I don't have the code to test? Am I just supposed to trust the same company that made the mistake when they say they've fixed it? That said, I would not be opposed to a closed-source, compiled demo of the security hole. It would save me some time and be one step harder to reverse-engineer into something dangerous.

Culp would better serve his customers by railing against his employer's habit of pushing code that hasn't been sufficiently tested for security issues. Microsoft needs to take a fundamentally different approach to features in their software. Take a page from the Apache book and disable everything by default instead of enabling and installing all those bells and whistles.
Furthermore, if full disclosure were to stop for some reason, the alacrity with which vendors now respond would, I'm sure, deteriorate. Necessity drives the response, not a sense of urgency about cleaning up security holes.

While researching a security issue recently, I came across a disclosure policy that seems to fit the bill. It is a modified full disclosure, in which the bug-finder notifies the maintainer and makes no public mention as long as the maintainer keeps the finder "in the loop" with periodic status reports.
Microsoft has frustrated my efforts to monitor attacks on my site by making it difficult to identify known attacks against bugs that have already been fixed, so that I could filter them out and spot new attacks.
This effort by Microsoft can only be understood as an effort to enlist the world into agreement to protect Microsoft from disclosure of its errors. By blinding the world to the frequency and magnitude of Microsoft's errors, they can maintain the image of a competent software producer.
Full disclosure, modified as above, is necessary.
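[Editor's note: the log filtering this reader describes can be sketched in a few lines. The snippet below is illustrative only; the signature strings are the well-known Code Red and Nimda request patterns of the day, and the log format and function names are assumptions, not any reader's actual tooling.]

```python
# Sketch: drop log entries that match well-known, already-patched attack
# signatures so that new or unknown attack attempts stand out.
# The signature list and sample log lines are illustrative assumptions.

KNOWN_ATTACK_SIGNATURES = [
    "default.ida?",  # Code Red probes against the Index Server ISAPI flaw
    "cmd.exe",       # Nimda directory-traversal attempts
    "root.exe",      # backdoor left behind by Code Red II, probed by Nimda
]

def is_known_attack(log_line):
    """Return True if the request matches a known, patched exploit."""
    return any(sig in log_line for sig in KNOWN_ATTACK_SIGNATURES)

def novel_entries(log_lines):
    """Keep only the lines that do not match known attack signatures."""
    return [line for line in log_lines if not is_known_attack(line)]

sample = [
    "GET /default.ida?NNNNNN HTTP/1.0",
    "GET /scripts/..%255c../winnt/system32/cmd.exe?/c+dir HTTP/1.0",
    "GET /index.html HTTP/1.0",
]
print(novel_entries(sample))  # only the ordinary request remains
```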
