Zero-Day [wikipedia.org] would mean that Microsoft had zero days, i.e. no time at all, to patch the security vulnerability between the time they released the software and the time the bug went public. By that definition this would be best described as a "five-day exploit", or more, in fact, if they knew about it before Ormandy's notice.

Wrong again. Zero-day refers to the amount of time that the bug/vulnerability has been disclosed to the public, not patched. It is still possible to secure your system with just the knowledge of how the attack is reaching you.

Now see, I always read "Zero Day" as a vulnerability that either wasn't found until it was exploited in the wild, or was released before the vendor had a patch in place. In other words, the vendor has "zero days" in which to patch the bug before it is, or could potentially be, exploited. Strictly speaking, this bug would only have qualified as "Zero Day" if the guy had released it publicly before or at the same time as he notified Microsoft; but an argument can be made that since there isn't yet a patch, and the vulnerability…

I always assumed it to mean that the day the software is released, an exploit is found -- kind of like a zero-day crack to pirate software. Apparently I was wrong, and it means whatever the article author needs it to mean in order to sound as bad and scary as possible, like "z0mg! we have zero days before the end of the world!"

I have been led to believe that "Zero-day" refers to the amount of time that exists between public knowledge of an exploit and when you see it being used in the wild.

No, it's the time between public disclosure of the vulnerability and the time when the exploit is released. When you hear about it or when you see it is quite irrelevant.

It's kind of like "hacker" though, and gets thrown around to mean all sorts of shit that it does not.

Yes, as demonstrated by your comment. Zero-day cracks are cracks which come out on the release date, and Zero-day exploits are exploits which exist in the wild (whether you have detected them or not) the same day as the disclosure.

Not true, he says in his advisory that Microsoft acknowledged receipt the same day.

They didn't do their own advisory within 5 days (actually 4 1/2), which is perhaps what made him think it was the right thing to go public. Ormandy himself has begun to realize that he handled it badly.

Bear in mind that he reported it the Saturday before an especially heavy Patch Tuesday. It's reasonable to presume that people at the MSRC were busy.

And if anyone thinks Google is involved, they're obviously wrong. I'm sure the…

"Ormandy admitted that he reported the vulnerability to Microsoft only five days ago -- on Saturday, June 5 -- but said he decided to go public because of its severity, and because he believed Microsoft would have otherwise dismissed his analysis."

Microsoft was informed about this vulnerability on 5-Jun-2010, and they confirmed receipt of my report on the same day.

So they did respond. They just didn't fix it in five days:

Those of you with large support contracts are encouraged to tell your support representatives that you would like to see Microsoft invest in developing processes for faster responses to external security reports.

That's what he was complaining about, and I think it's a legitimate complaint.

Confirming receipt of the report sounds like "yes, we got your email of the report". I believe what we are looking for is if Microsoft provided any information (timeframe, severity, anything), so the point is still open. The fact that this article and every article I've read on it has not said anything about Microsoft giving some info is smoking-gunnish that it didn't happen. Still, until there's a credible source the question is still out there.

I submitted a security issue in how one of their management products generates a private key for signing internally distributed programs and other things. I gave them all the details, it took a while, but they patched it and included the fix in the release of the 2010 System Center Essentials (a mishmash of their pricier more specific products).

Full disclosure is, of course, the only way to go when you don't get a response. If they don't treat security as a serious matter, then don't waste your breath. But complicated bugs can be difficult to fix, and fixing those bugs requires not insignificant regression testing.

Those of you with large support contracts are encouraged to tell your support representatives that you would like to see Microsoft invest in developing processes for faster responses to external security reports.

That's what he was complaining about, and I think it's a legitimate complaint.

He did get a response. He didn't get a resolution (in the time frame he wanted one in).

Let's put a not-so-hypothetical situation out there to consider. You're working your ass off getting a project out the door, coding your…

You might want to pick a subject you know a little about before pontificating. Tavis Ormandy has reported dozens of critical security vulnerabilities to Microsoft and others. Just search for "Tavis Ormandy Windows kernel vulnerability" to get some of his top finds. And in these previous cases you can compare the report and disclosure dates to see that he's waited several months, or in some cases more than a year for the patch release. If you actually read Tavis' disclosure and note the trivial nature of this bug, you'll see that he just got sick of waiting on Microsoft's extremely long fix pipeline, and chose this as an opportunity to push back.

Now, I'm not saying I agree with Tavis' actions here, but the actual situation bears no resemblance to your uninformed framing.

This service is turned off by default on all systems I manage, both as part of the initial installation and, where possible, by Group Policy. Just another parasitic service which is not necessary... because everyone just uses Google anyways.
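(For what it's worth, the same thing can be pushed to XP boxes as a registry setting. This is only a sketch: it assumes the service short name is helpsvc and uses Start=4 for "Disabled"; verify both with "sc query" on a test machine before deploying.)

```
Windows Registry Editor Version 5.00

; Sketch: mark the Help and Support service as Disabled (Start=4).
; Assumes the XP/2003 service short name is "helpsvc".
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\helpsvc]
"Start"=dword:00000004
```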


You should turn off everything you don't need, but if you turned off every insecure component of Windows you would be left with a machine just running its BIOS.

"Public disclosure of the details of this vulnerability and how to exploit it, without giving us time to resolve the issue for our potentially affected customers, makes broad attacks more likely and puts customers at risk. One of the main reasons we and many others across the industry advocate for responsible disclosure is that the software vendor who wrote the code is in the best position to fully understand the root cause. While this was a good find by the Google researcher, it turns out that the analysis is incomplete and the actual workaround Google suggested is easily circumvented."

No, it's the "look, seriously, give me some time before you tell everyone how to pick our locks" approach. 5 days is a *ridiculously* short time in which to expect MS to turn around a fix, doubly so given they've been burned in the past by fixes hosing obscure configurations.

What's the "right" number? I don't know... 15 days is probably more reasonable, but it really depends on the scope of the issue. But 5 days is *clearly* too short... well, at least to anyone with half a brain and experience in the software…

Except these moves don't punish MS in the slightest. It punishes end users who are just using their computers and have no say in the policies here.

Not to mention, 5 days certainly is not enough time to do the testing MS needs to do to release a patch. I'd rather just perform a workaround (limited rights, removing functionality, etc.) than deal with a patch that will cause me further problems.

If he only gave five days before releasing it into the wild, he is recklessly irresponsible. It just shows a person can be intelligent in one way and a complete eejit in another. Could he be sued for this by someone who gets infected?

This could be one of those infamous bugs that MS has known about (secretly) for two years, but they never bothered to fix. If that's true and the programmer knew the bug had existed for two years, then I consider him a cyber-patriot for whistle-blowing. Maybe now MS will get off its 1200 pound ass and fix it.

It depends on the nature of Microsoft's response. Consider the following:

(a) "Thanks, this looks serious. We've got a team looking into it now, but we've found some difficulties with your suggested fix. If you don't see a security patch in the next several days, don't be alarmed. A patch is coming soon, but we don't want to release a fix that creates more problems. We'd appreciate it if you kept this under your hat while we're working on this. We'll be sure to credit you with finding this problem when the…"

I thought there was a big fuss a few years back about how vendors didn't respond to researchers and how they took forever to fix problems with closed-source software. So the industry decided that 5-7 days after letting a vendor know about a problem, everyone would release the information, so that everyone would know about it rather than just the bad guys, system admins would know to watch for that type of attack, and the vendor would be forced to fix it in a timely manner.

Seems like this is just standard timing, since vendors have gotten into the habit of ignoring researchers and not spending the time and resources to fix problems that they should have tested for in the beginning, and most of the time don't want to bother fixing. Historically, companies have not wanted to spend the manpower and money required to fix program bugs. They would rather fix them whenever they get around to having free time a few months later. After all, bug fixes don't make them any money. If I remember correctly there was a quote from Microsoft saying that exact thing: "People don't want bug fixes, they want new features and bells and whistles instead." So if Microsoft really feels that way, then this shouldn't bother them at all, since people don't care about having bugs fixed.

The quote was from German weekly magazine FOCUS (nr.43, October 23,1995, pages 206-212). Bill Gates was being interviewed when he made statements to that effect.

If you treat program bugs as a PR issue, then don't be surprised when people use PR against you over bugs you historically haven't bothered to fix in a timely manner.

So the industry decided that 5-7 days after letting a vendor know about a problem, everyone would release the information, so that everyone would know about it rather than just the bad guys, system admins would know to watch for that type of attack, and the vendor would be forced to fix it in a timely manner.

Except he doesn't really give 5 days. This guy minimizes the amount of time Microsoft has to respond to the issue while trying to stay within the 5-day window.

First, he could have given more than 5 days, i.e. at least a week. He chooses 5 days.

He chooses the worst possible day of the entire week to report the bug: Saturday. Even Sunday would have been better, since half the weekend is gone by then, and it would be easier to get a bigger emergency team on this the following day.

After all this, he discloses the bug publicly first thing on the 5th day!

This just shows how dirty the IT fighting has become (not that it was ever civil). And as many have pointed out, even if you don't like Microsoft, this affects XP and Server 2003 users the most.

The standard (called "responsible disclosure") is to give the vendors a chance to work a fix into their regular release schedules (be that monthly, quarterly or whatever). This includes making sure they have time for patch development and testing before the release.

"People don't want bug fixes, they want new features and bells and whistles instead."

I remember that interview: Bill Gates was asserting that people won't pay for bug fixes, but only for new bells and whistles. And he's right! People expect software with no bugs and they expect that the inevitable bugs will be fixed for free. The big problem, of course, is that Microsoft put new bells and whistles at a higher priority than bug fixes since they get paid for the former but do the latter for free.

5 days is plenty of time to issue a patch, even if it just closes the hole while a proper fix is worked on.

You live in a dream world. Yes, 5 days is fine if you have a non-OS product that isn't part of an ecosystem with millions of applications running on it. For example, to patch something like a text editor, 5 days is probably enough. But a responsible company with millions of installs (Microsoft, Apple) isn't going to rush something out that would break more than it fixes. That would be stupid.

There is apparently a simple registry edit that can fix this, as Secunia advised. Surely MS can do something stopgap? I mean, my goodness, a single Google guy found the bug, found a function partially responsible in helpctr.exe, offered a binary patch to partially fix the issue, and created PoC code. A Secunia guy then reviewed the patch, found the REAL culprit function, and offered a working registry patch. This all occurred within the last week, and a multi-billion dollar company hasn't done anything except…
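(The registry edit being referred to amounts to unregistering the hcp:// protocol handler. A sketch of what such a .reg file might look like; the key path reflects my understanding of the advised workaround, so export a backup of the key before merging anything like this:)

```
Windows Registry Editor Version 5.00

; Unregister the hcp:// protocol handler by deleting its class key.
; The leading hyphen means "delete this key"; back it up first, e.g.:
;   reg export HKCR\HCP hcp-backup.reg
[-HKEY_CLASSES_ROOT\HCP]
```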

Sure, some companies don't give a fuck about incompatibility caused by updates and that sort of thing; however, MS very much does.

Further, as they have such a large share of the desktop and server market that depends on Windows working, it would be irresponsible of them to throw out a patch in a mere 5 days that couldn't have been fully tested against countless configurations and could end up causing more harm to customers' machines than if they'd just not bothered to patch at all.

You can't reasonably build and test a patch that has minimal effect on your customer base in 5 days when your customer base is as large and varied as Microsoft's.

It may seem so, but reality seems to disagree. Most Linux distributions release security updates within a day or two after the vulnerability is announced, and while I maintain dozens of Linux machines, I have witnessed a security update breaking something at most once. On the other hand, I have seen problems caused by Windows updates countless times.

Most Linux distributions release security updates within a day or two after the vulnerability is announced

Which distributions?

Just last week Ubuntu released two kernel updates (at least for x86-64) for 10.04. I can't help but think the reason is that there was a flaw in the first release that forced a second.

This happens less often with Debian, because Debian uses its unstable tree (where Ubuntu gets its packages) to have users check for crashes or conflicts before promoting packages to the testing tree, where…

Perhaps for some that is possible, although clearly Microsoft has no process in place to do something in that amount of time. With analysis, design, implementation, unit testing, code reviews, and whatever else their software cycle involves, I don't think they have a chance at having anything at all releasable in 5 days. So this expectation is a known impossibility, and likely known to some degree by those responsible for releasing the information.

If an industry-leading OS vendor with a legally declared monopoly doesn't have a process in place to fix serious reported bugs in 5-7 days, after what, 10-15 years as an OS company, then they deserve everything they get. Microsoft is the largest software company in the world. They should have had a team in place for years to deal with these kinds of reports, and a process to get a "hotfix" out within a few days and a serious, stable, long-term solution within a week or two max. If Microsoft can't…

This story would be funny if not for the fact that the Google engineer may have put a lot of computer users, and probably Google's own customers, at risk in this little game of one-upmanship.

It reminds me of a quote from Robert DeNiro playing Jake LaMotta in the great film Raging Bull by Scorsese. He's sitting at the table of some mobsters who are needling him about the impressiveness of another fighter: "Maybe I'll put da two of ya in the ring together and you can fuck each other".

Sorry, but it seems that you are a little bit confused about the real cause. First of all, the blame lies with MS for creating the bug. Secondly, a responsible vendor should fix a security hole as quickly as possible, because security bugs are rarely discovered by a single person only. It is highly probable that the same bug is already being exploited by black-hat hackers in the wild. Five days is more than enough for the vast majority of security problems, and delaying the fix is completely irresponsible.

Dang, and here I'd always assumed "Zero Day" meant the bug had been there since the day the software was released. Like the bug in the .BMP rasterizer, revealed in 2004, that had been there since Windows 3.0.

Missing from the summary is that not only are they documenting the exploit in detail, but they are also providing a hack to patch the hole.

The point of releasing this "five-day exploit", for a hole that has been present for 9 years now (XP was released in 2001), is to point out that Microsoft needs to do a better job responding to security threats and that the closed-source model is less robust to these kinds of threats. Had this been open source, they could have simply issued a patch to a mailing list to close the hole.

No compiled software is safe from someone with the means and the motivation to modify it. Having the source code does not make it any easier or harder to exploit, but it does make it easier to patch exploits and allows for more people to examine the code for exploits.

The point of releasing this "five-day exploit", for a hole that has been present for 9 years now (XP was released in 2001), is to point out that Microsoft needs to do a better job responding to security threats and that the closed-source model is less robust to these kinds of threats.

Linux developers have issued a critical update for the open-source OS after researchers uncovered a vulnerability in its kernel that puts most versions built in the past eight years at risk of complete takeover.

Both the Linux kernel null pointer dereference bug and the malformed character escape bug we're discussing today were reported by the same guy: Tavis Ormandy. I think that this refutes the claim that a few people are making that today's incident is just an attempt by Google to sabotage Microsoft. It seems to me like this guy is disclosing vulnerabilities wherever he finds them and letting the chips fall where they may.

For some reason, my up-to-date Opera on XP SP2 just executes VideoLAN to load a (non-existent) JPG instead of the supposed WMP execution -> vulnerability trick that IE is vulnerable to. VLC then just errors out because the hcp:// protocol is obviously nonsense to it. I assume my copy of VLC is somehow associated with opening unknown protocols in Opera.

And in the IE case, WMP executes and then ZoneAlarm (ancient version) pops up and asks if I want Windows Media Player to access the local network. Twice. If I Deny, nothing happens. If I allow (both times), Windows Help and Support Center opens and then another ZA popup asks me to give permission for that too (and that says "Internet" rather than local, which would be blocked by default). If I allow that too, I get a copy of Windows Help and Support Center with a search for the nonsense page and not much else. "Computer Information for \\eval(unescape('Run("calc.exe")'))" is what's literally written inside it, and calc doesn't execute.

My IE, WMP, ZA and Windows Updates on this machine are NOT up to date by any means. The only thing that's up-to-date is Opera. Nothing untoward would have happened under normal usage. So it seems of dubious use at best, it's not a particular killer of a vulnerability.

However, the technical analysis was quite interesting and the problem basically stems from shitty programming at every level - not checking return values that indicate failure, continuing on and then passing arbitrary (and unescaped) strings to other functions, a cross-site scripting error within the Windows Help internals (due to insufficient escaping of data), allowing script execution to happen again on dynamically-generated script code because someone tagged "defer" (a Microsoft-only invention) to a script tag, and finally a way to avoid a security-related prompt on versions of IE, Firefox and Chrome by hiding the very same code inside an iFrame / Object which executes WMP. It's like a catalogue of errors, some of which have been previously reported and well-known for ages. It's just crap all the way down to actual execution of anything you like using wscript. And that's present in XP - a 9-year-old operating system with millions of deployments, Server 2003 and probably a lot of others using non-ancient version of IE, WMP, etc.
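The "insufficient escaping of data" failure mode described here is generic, not specific to Windows Help. A minimal sketch (in Python, with a hypothetical render function, not Microsoft's actual code) of why passing unescaped strings onward into markup enables injection:

```python
import html

def render_title(query: str, escape: bool = True) -> str:
    """Hypothetical page-fragment builder: echoes a user-supplied
    query back into HTML, optionally escaping it first."""
    if escape:
        # <, >, and & become inert entities, so the payload renders as text
        query = html.escape(query)
    return "<h1>Results for %s</h1>" % query

payload = "<script defer>alert(1)</script>"
print(render_title(payload, escape=False))  # script tag survives: injectable
print(render_title(payload))                # escaped: rendered as harmless text
```

The same principle, escaping at every trust boundary rather than only the first one, is what the chain of handoffs described above kept violating.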

Stop whinging Microsoft, and fix this crap. That's been in the OS that millions of people used for **years**, after all your patching and service packs, and you never even spotted it, even when you were the only people with the code to the damn thing. I'm not saying it's easy or you should find everything, but FFS - the problems there just show crappy programming and patchwork all the way to the OS core. That "defer" thing just REEKS of someone saying "But I need a way to bodge this...". Whether it's responsible disclosure or not - fix it first, whinge about their methods later. Where's my response saying when you'll fix it? Where's the estimated patch release date? Where's the hotfix? When you've put those out, you can whinge about them being irresponsible with security. And then they can say "But we're one of your main competitors!" and laugh at you, the same way you would if one of your researchers found a major bug in Google's websites / OS / browser.

Google, like Apple, is no longer any better/different than the companies they claim to be better than (from an ethical stand point).

I don't know about that. MS could have really used this to their advantage - 'We praise Google in finding and releasing this exploit of our windows XP OS. This is just another example of why everyone should transition to Windows 7. Insert fancy marketing for windows 7'

I'd also argue that anyone still using XP really should upgrade to a more modern OS, and Google was just trying to put XP out of its misery. Sometimes you have to do harm to not do evil, like cutting off a leg to save a life.

What?? Given Microsoft's history of fixing their bugs, I would have released it as a 0-day instead of a 5-day! Google's just doing everybody a favor. Look at all the other companies that are afraid of angering MS. Don't forget that Google's recent security breach was directly because of MS products.

Google, like Apple, is no longer any better/different than the companies they claim to be better than (from an ethical stand point).

Yeah, yeah. Apart from the guy not actually doing this as a Google employee:

"Finally, a reminder that this documents contains my own opinions, I do not speak for or represent anyone but myself."

And the fact that Google, Apple and everyone else have got a long way to go before they approach the utter moral bankruptcy required for the likes of the Halloween documents, the derailment of OLPC, the ODF/OOXML fiasco and so on.

True, the hole shouldn't have been there, but there is a difference between shouting "Hey! Everybody! You can break into the fort here, the wall's broken!" and quietly telling the fort owner "your wall is broken, people could get in through there".

Google, like Apple, is no longer any better/different than the companies they claim to be better than (from an ethical stand point).

Did you RTFA?
The Google engineer - who, btw, gave no indication that they are from Google other than the link back to code.google.com - also posted a hotfix. So... they told Microsoft 5 days ago AND GAVE THEM A FIX...
If this person were from a company that wasn't a competitor, would anyone call it evil to disclose a (NON-ZERO-DAY) issue on the security list, so that security professionals are aware, after giving MS time to see the vulnerability and test the potential fix? I'd expect a company that derives Microsoft-sized revenue from their OS to have someone readily available for these issues.

Not really, but I think his hotfix is a starting point, and testing would/should be at least partially automated. As another poster stated, they could put out an advisory or disable the service or do something more than they have done for the past 5 days.

Right, they won't use the security researcher who found the bug that their "evolved" process missed...
And that's why Microsoft has such a great and well-deserved reputation for producing secure products: Internet Explorer, SQL Server, IIS, the ActiveX framework, every version of the Windows OS before 2008/Seven.
Firefox has been a terribly insecure product, but they do make timely efforts to fix the bugs when they are discovered. For me, that counts for something.
I don't want to be an open source zealot, …

You are aware that said code was submitted to Microsoft by someone who works for what is currently Microsoft's biggest competitor, whom they are currently in a 3-front war with (Browser, Search Engine, Netbook OS)?

This is a moot point, though: Google could later claim copyright over said code and sue Microsoft over it. Something that doesn't apply to your fire analogy.

Actually, he recommended against using the hotfix, and instead suggested disabling the protocol handler. Speaking of, is anyone aware of the hcp: protocol being used for anything other than security exploits? Because in its 10 years of existence I've never once seen it used legitimately, but I've repeatedly seen it expose security vulnerabilities.

Microsoft is a gigantic company with gigantic resources. Is it possible their priorities are not in patching? If ONE GUY can whip up an exploit and a patch in a few weeks, MS should be able to review the damn thing in a few hours on a few thousand virtual machines running in Hyper-V. You cannot tell me they don't have the resources to test this quickly.

I'm sure his hotfix and one-man testing matches MS's extensive testing. Seriously, do you think any company would just release this fix immediately without serious testing?

I'm sure this was tongue in cheek. I'd safely bet there's a whole lot of "one man testing" that far exceeds MS's lack of testing based on these types of stories that keep coming out about MS's lack of quality control. After all, isn't MS the company known for selling software and letting their customers beta test it?

As for MS releasing the fix? How hard is it to test something when you've been pointed to the flaw, given all the test conditions and the fix, and it's in a relatively small piece of code? Granted…

In the unlikely event that you heavily rely on the use of hcp://, I have created an unofficial (temporary) hotfix. You may use it under the terms of the GNU General Public License, version 2 or later. Of course, you should only use it as a last resort, carefully test the patch and make sure you understand what it does (full source code is included). It may be necessary to modify it to fit your needs.

MS are the ones focusing on the hotfix and claiming it's flawed without providing an explanation. MS are also the ones desperately trying to frame this as Google, when it was Tavis operating independently on his own time.

I'm not saying I agree with what Tavis did, but MS' shady response certainly isn't making me less inclined to side with Tavis.

And why, exactly, is Google at fault here? The actual post on Full Disclosure states the following at the bottom: "Finally, a reminder that this documents contains my own opinions, I do not speak for or represent anyone but myself." He makes no mention of working for Google, posting this with Google's sanction, nor does he even post it from a Google email address.

The fact is, the guy posted this vulnerability in a private capacity, and he just happens to work at Google. Just because he works at Google, some…

I'm not going to bother defending Google here; I'd much rather you flesh out your argument. What is Google doing wrong, how does it benefit them, and how would you have handled it? Point number 2 especially I would be interested in: if Google had wanted to mess with MS, why not release it as a 0-day and not report the thing at all?

I mean, it's really popular to just throw flames at whatever company is in the articles posted here, but I think people should at LEAST be required to spell out what they're saying…

This issue has absolutely nothing to do with Google. Google has a strict policy that what you do on your own time and dime is yours. That's why they have a lot of really good security people there who all conduct independent research that's completely unaffiliated with Google. So, to be very clear, Tavis did this entirely on his own. MS mis-framing it as Google (and Slashdot buying it hook line and sinker) is just a smokescreen. Sorry, but you've been suckered.

I don't think his managers approved of his conduct. He doesn't believe in responsible disclosure, but it seems like Google as a company does. So I wouldn't be surprised if an apology or termination follows soon.

I don't expect any corporation to have morals, but I don't like Microsoft because I don't like its software. Well, Excel is ok, but that's only because all the other spreadsheets suck even worse.

What really bugs me about Microsoft is that you can hardly buy a non-Apple computer without getting Windows. How hard would it be for them to give me a choice of OSes? Probably pretty hard; MS has most likely made deals with the hardware manufacturers preventing it. THAT'S the immoral business practice that I hate, because…