And also, it's contradictory to what Google did earlier this year. They released a zero-day for Windows [threatpost.com] and gave Microsoft barely a week to patch it. And as a bonus, they made the disclosure public on a Sunday.

I am all for more industry-standard accountability, but this looks very one-sided, with Google picking the instances where it gets good publicity.

Microsoft OS and app vulnerabilities are the only internet currency better than eGold. If you travelled in those circles you'd see how bad the situation is. I've been there and back, so I'll tell ya: it's bad. Bad. Really, really, really bad.

If you'll pay $500, there are folks out there who will deliver the contents of your own email inbox unedited, for as far back as it goes, externally and without assistance. The most honest of them will sell you that info and let it go, but we all know there's a lot

This article is basically laying out a policy Google will follow in the future. Here is the most critical bit:

A lot of talented security researchers work at Google. These researchers discover many vulnerabilities in products from vendors across the board, and they share a detailed analysis of their findings with vendors to help them get started on patch development. We will be supportive of the following practices by our researchers:

Placing a disclosure deadline on any serious vulnerability they report, consistent with the complexity of the fix. (For example, a design error needs more time to address than a simple memory corruption bug.)

Responding to a missed disclosure deadline or refusal to address the problem by publishing an analysis of the vulnerability, along with any suggested workarounds.

Setting an aggressive disclosure deadline where there exists evidence that blackhats already have knowledge of a given bug.

Now, that "zero day" (well, 5 days really) the Googler gave Microsoft was only because Microsoft would not commit to fixing it. That is perfectly consistent with the article, which points out that "responsible disclosure" is a two-way street and only works when the vendor with the vulnerability acts responsibly as well (which Microsoft didn't in this case). You could argue that he should have set a deadline regardless of whether Microsoft agreed to it, but I would not say they are contradicting themselves. They also point out in the article that responsible disclosure isn't always the best route. So I'm going to have to support Google in this article, which is simply about laying out the "supported" disclosure policy for their security researchers in the future.

Now, that "zero day" (well, 5 days really) the Googler gave Microsoft was only because Microsoft would not commit to fixing it. That is perfectly consistent with the article, which points out that "responsible disclosure" is a two-way street and only works when the vendor with the vulnerability acts responsibly as well (which Microsoft didn't in this case).

That is twisting the truth more than a little. MS said they would get back to him with a timeline by the end of the week; he then went and published it anyway. The irresponsible party in that instance was definitely Tavis Ormandy.

I've reported dozens of critical vulnerabilities in Microsoft software over the years, and I still have multiple open cases with Microsoft security; this particular case wasn't as simple as you have assumed. I would not be so presumptuous as to explain the ethics of your work to you, but evidently you believe you're qualified to lecture me in mine.

If I were to read the sensationalised lay-press coverage of your latest publication or project, would it prepare me to write a critique of your work?

Google didn't release a Windows vulnerability; someone who happens to work at Google did. He took pains to point out that he was working independently, but MS and some in the media chose to imply that he was acting on Google's behalf. Don't take my word for it - read Tavis' original post [seclists.org] and spender's interesting, if bitter, analysis [seclists.org].

It was a potential conflict of interest [wikipedia.org], given that he is a paid employee of Google who works as a security engineer. If this was inconsistent with Google's policies, there would definitely be a problem. The problem would have been Tavis' fault (not Google's), but it would be up to Google to repudiate the actions if it believed Ormandy was not in compliance.

This article instead suggests that Google's policy is consistent with Tavis' actions, so it really doesn't matter.

HCP vulnerabilities are a well-known attack vector for Windows, and given that the specific vulnerability he found had existed in Windows XP for 9 years, he felt it was very likely that black hats had found the same technique, and as such there was a very high likelihood that it was being actively exploited in the wild. I'm sure the ease with which it can be executed factored in as well - it's literally just a one-line hcp:// URL with execution code in it. Therefore, he felt full disclosure, so security professionals could begin mitigating the issue (i.e. disable Help Center), was more important than giving Microsoft ample time to fix the problem.

Personally, I agree. Microsoft has a history of sitting on high-severity vulnerabilities for years if they aren't disclosed publicly, and this was an extremely easy-to-execute exploit. The prudent course here was to get the information out ASAP, with little more than a courtesy call to Microsoft before he did.

Dear Google,
We would like for you to meet us in the 'Square'.
All manner of issues have been squashed in the Square, and I am sure we can come to some kind of arrangement over this issue.
Sincerely,
Chinese government.

Apparently being the skinniest kid at fat camp is important to anyone who watches NBC's The Biggest Loser. Whoever loses the most percentage of body weight wins. And in any case, being the skinniest at fat camp is better than being the skinniest at concentration camp.

This is a sign of a truly competitive market. When Chrome and Mozilla are competing to the point where they need to bid on how much they pay for people to find flaws in their own software then there's serious competition. And the result is that we, the consumers, benefit the most. This is market dynamics with honest companies at their best.

Yes, great for a laptop if needed, or charity spending if you're a trustafarian.
The world gets better apps, Google gets to spread more ads, the SC community learns from bug fixes.
I find Google's views on privacy and their past mistakes point to some deep issues.
Still, better than the DRM lock-in/out and turn-off stagnation of MS and Apple for now.

This is a sign of a truly competitive market. When Chrome and Mozilla are competing to the point where they need to bid on how much they pay for people to find flaws in their own software then there's serious competition. And the result is that we, the consumers, benefit the most. This is market dynamics with honest companies at their best.

There's just so much wrong with your characterization of these bounties as "truly competitive" that I don't know where to begin.

In reality, the only competition these bounties represent is a competition for free publicity and goodwill.

Missing the point. The bounties aren't what's competitive. What's competitive is the browser market. That they need to keep upping the amount of money they are offering to find problems in their browsers is a function of the competitive browser market.

Mozilla is a foundation, not a company. That is, if we exclude the corporation subsidiary that deals with sponsorship etc.

So, this is not some capitalist ideal of two companies competing for our benefit, rather a very non-capitalist foundation that is encouraging what capitalism discourages. If Mozilla didn't exist, Google might not have bothered to up their reward.

This is a sign of a truly competitive market. When Chrome and Mozilla are competing to the point where they need to bid on how much they pay for people to find flaws in their own software then there's serious competition. And the result is that we, the consumers, benefit the most. This is market dynamics with honest companies at their best.

So if I were a conscienceless skilled hacker and discovered a vulnerability, then in a proper free market I'd be free to sell it to the highest-bidding criminal?

I'm sure a lot of people here will lament that 60 days is way too long to release a fix for most vulnerabilities, and I think that's true. On the other hand, it's probably a "reasonable upper bound" for very complex problems like the TLS session re-negotiation vulnerability, which required coordination between multiple vendors and the IETF in order to fix.

In other words, if you think you should get a 60-day head start to fix a security bug, your bug had better be at least as complex as CVE-2009-3555.

I'm sure a lot of people here will lament that 60 days is way too long to release a fix for most vulnerabilities, and I think that's true. On the other hand, it's probably a "reasonable upper bound" for very complex problems like the TLS session re-negotiation vulnerability, which required coordination between multiple vendors and the IETF in order to fix.
In other words, if you think you should get a 60-day head start to fix a security bug, your bug had better be at least as complex as CVE-2009-3555.

OTOH, it's a lot easier to say that if the product that needs fixing is a few megabytes of browser (and your customers do most of their complex processing on the server) than if it's gigabytes of operating system, with thousands of products running on top of it that are much more complex than a browser and that may be affected by the fix.

One: You are correct. Google is almost certainly taking advantage of the fact that browsers are substantially less complex (and people are comparatively tolerant of little rendering glitches, unless they scotch the whole page or "people" happen to be graphic designers...). It is a cynical, but very logical, tactic to talk most about the virtues you can cultivate most easily (though, conceivably, 60 days might actually be a much tighter limit for some of their server stuff; I don't know how hairy that can get).

Two: If your product is too large, and too tightly coupled, to turn around a fix in two months, you had better have a very compelling reason. Arguably, Microsoft's relatively tight coupling of an enormous number of pieces has been very good business, but not very good design. In the short term, Google's implicit dig is rather cynical. In the longer term, though, they are really scoring a point in a battle of architectural philosophies. Microsoft probably actually handles size, complexity, and tight inter-relation better than most (they'd be dead if they didn't), but the problems that it causes them are basically their fault. They made that mess; they deliberately coupled stuff for economic reasons that could have been decoupled for engineering ones...

It's not like one guy wrote Windows - there are teams and teams of engineers. Microsoft employs thousands of them - there were about 1000 who wrote Windows 7, for example. Each part of the operating system is divided up into manageable components. Each of those components is divided into manageable parts, and each of those parts is divided into manageable functions, on down until you have a handful of engineers responsible for one small section of the OS.

Not to forget, Windows is a paid-for OS. I would expect at least some reasonable support then, no matter the sophistication. Google's product is free to the public. How many free products do you know that provide amazing support, if any? Their only real responsibility would be image and PR. That is why open source is so great: anyone interested can do the patching themselves instead of relying on the owner of the software to care.

(Not just "behaving to spec" but "behaving exactly the same as it did yesterday")

I'd say that that has a great deal to do with loose or tight coupling... Your point is valid in that Microsoft is not, personally, responsible for all the tight coupling in the Wintel ecosystem. They do do some of it themselves, but their larger contribution is probably in aiding and abetting the producers of various large, expensive software stacks in doing tight coupling of their own.

This has been good business, Microsoft's customers generally like the backwards compatibility, because it saves them fro

It is not taking them 60 days to make a patch because of product complexity; the patch itself probably takes only a few hours. But because of the huge ecosystem around Windows, they have to do a massive amount of regression testing to ensure they are not breaking anyone's products. Imagine how much Adobe would scream if a security patch broke their products, or Apple if it broke iTunes - and you can bet the stories wouldn't be "Apple iTunes breaks because of poor Apple development practises", it w

But if Apple releases an update to OS X that breaks Photoshop, and the bug is fixed in Xcode (the approved way of building software for OS X), then shouldn't Adobe just update their Xcode/OS X, press the "re-package, patch" button, and ship the update? Why should MS be concerned about 3rd-party apps? Does MS software still work, and receive patches? Good, problem solved.

Windows is an OS kernel, a very large set of system libraries, plus a few hundred applications (everything from Calculator to Internet Explorer). Linux source is just the kernel. If you want a real comparison, compare a Linux distro (say, Ubuntu) to Windows. Wikipedia already did it for you [wikipedia.org].

XP is 40M LOC, its contemporary Debian 3.0 is 104M LOC. I don't have a source for the size of the Windows kernel source code, but Windows 7's compiled kernel is ~5.4MB; Linux's compiled (core) kernel tends to run abo

We should probably distinguish between simple coding mistakes and fundamental security flaws in standards-defined protocols.

Just because some flaws are complex enough that it may justifiably and reasonably take more than 2 weeks to deliver a patch does not mean there is a free pass to be laid back and wait that long to patch all flaws.

It's not justifiable to wait 30 days to fix a one-line coding mistake that allowed a buffer overflow or underflow condition.

It's a lot better than the 720-day upper bound* that Microsoft uses, though.

* I don't actually think they have an "upper bound" that starts from when they discover the vulnerability. The clock for the upper bound doesn't seem to start until the vulnerability is publicly disclosed. I'm sure the only reason that 720-day-old high-severity vulnerability was fixed was that some Microsoft employee was bored that day.

I think that 60 days is a good time frame, because on top of fixing the vulnerability, you also have to convince users to install the patch. Deploying a patch to a large userbase is going to take time, and probably longer than it takes to fix the problem in the first place.
That being said, maybe a more responsible approach would be to tell the vendor (and the world) that you're going to publish the vulnerability in X days or 30 days after they release a patch, whichever is longer. Now the vendor has serio

Although it's great to have a company pledge responsible behaviour, the logical next step for the industry would be to put security vulnerability reports in escrow, with an automated time release. This could be as simple as having a CERT server distribute unique encryption keys, with each key being publicly disclosed after a countdown from the time it is generated. A security researcher would encrypt each of their reports with such a key (a different one each time) and publish them on the web. Besides reducing the political squabbling between companies, this kind of system would also be great for priority disputes between researchers.
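To make the idea concrete, here's a toy sketch of that escrow flow. Everything here is hypothetical (the `EscrowServer` class, the XOR keystream "cipher" - which is illustrative only, not real crypto): the researcher publishes the ciphertext immediately, but the plaintext only becomes recoverable once the server reveals the key after the countdown.

```python
import hashlib
import secrets
from datetime import datetime, timedelta

class EscrowServer:
    """Toy CERT-style time-release key escrow (hypothetical)."""
    def __init__(self):
        self._keys = {}  # key_id -> (key, reveal_at)

    def issue_key(self, countdown_days):
        # Generate a unique key with a fixed public-reveal date.
        key_id = secrets.token_hex(8)
        key = secrets.token_bytes(32)
        self._keys[key_id] = (key, datetime.utcnow() + timedelta(days=countdown_days))
        return key_id, key

    def reveal(self, key_id, now=None):
        # Key is only released after the countdown expires.
        key, reveal_at = self._keys[key_id]
        if (now or datetime.utcnow()) < reveal_at:
            raise PermissionError("countdown not expired")
        return key

def xor_stream(key, data):
    # Illustrative keystream from SHA-256 - NOT real cryptography.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

# Researcher: obtain a key, publish only the ciphertext today.
server = EscrowServer()
key_id, key = server.issue_key(countdown_days=60)
report = b"Vulnerability details: buffer overflow in foo()"
ciphertext = xor_stream(key, report)

# 61 days later, anyone can fetch the revealed key and decrypt.
later = datetime.utcnow() + timedelta(days=61)
revealed = server.reveal(key_id, now=later)
assert xor_stream(revealed, ciphertext) == report
```

The timestamp on the published ciphertext also settles priority disputes: whoever published first provably had the report first, even though nobody could read it yet.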

Unfortunately, the major vendors will be violently against such a thing, so it's something the researchers themselves will have to implement. In doing so they'll probably lose friendly relations with major vendors - and by "lose friendly relations" I mean be slandered constantly.

Part of the idea would have to be having a REPUTABLE escrow service disinterested in publicity - a service that can work with both the vendor and the security researcher and balance the competing interests.

Every security researcher wants to maximize the severity rating of the bug, an instantaneous commitment to a fix timeline, and an absurdly tight deadline (expects vendor to drop everything to analyze the bug, fix it perfectly on the first try, and release immediately). Responsible

I think it's a terrible idea. It's like pre-planning your future investments and locking them in. What if you did that just before our latest recession? Putting your fate in a dead hand is never a good idea.

So google is defending the actions of an engineer who posted attack code on a Windows vulnerability 5 days after he reported it to Microsoft by saying that 60 days is more than enough time to fix a critical vulnerability...how exactly does that reasoning work?

Progress comes in little steps. I would say this is better than nothing: baby steps towards holding these megacorps' feet to the vulnerability fire, incentivizing disclosure of bugs instead of cover-ups and cease-and-desist letters against researchers who just want their shit fixed for the betterment of the world (or the glory of being 1337).

Google is saying that some companies *cough* Microsoft *cough* sit on security bugs for years until they're finally exploited, putting their users at risk. It's only by publicly disclosing the bug that these companies fix the problem.

So what is the business case for a Google researcher to be looking for vulnerabilities in Windows code other than for competitive reasons? Does Google guarantee that there are zero vulnerabilities in all of their own code? If not, why isn't he still looking at Google's code?

You forgot the part where the hypothetical researcher has privately reported numerous critical vulnerabilities to the hypothetical company and waited months or years for fixes. You also missed the researcher providing two different fixes for this particular vulnerability in his disclosure announcement. But hey, why make a fair comparison when you can resort to sensationalist slander?

Read the actual reporting on what happened. Tavis gave MS 60 days, but they refused to commit to any timeline. So he went ahead and disclosed immediately, along with a fix for affected systems.

It's also important to understand that Tavis has been reporting critical vulnerabilities to MS for years--and in some cases waited over a year for them to push a fix. This time he saw something trivial that should be fixed immediately and he put their feet to the fire. Oddly enough, they did push out their own fix in under 60 days after the vulnerability was made public. So you don't have to agree with his methods, but you should at least frame the situation correctly.

Tavis Ormandy reported the bug to Microsoft on a Saturday and wanted Microsoft to commit to a 60 day timeframe.

On Tuesday (a Patch Tuesday, mind you) Microsoft told Mr. Ormandy that they would be able to present a plan the upcoming Friday - i.e. 3 days later and 6 days after the bug had been reported.

On Wednesday, Mr. Ormandy went public.

Microsoft *never* refused to commit to a timeline. They didn't commit to a timeline within 3 days, so 4 days after reporting the bug, Mr. Ormandy went public. If he truly believed that 60 days would be reasonable, he could just have informed MS that he would go public exactly 60 days later. But no, Ormandy just needed an excuse to go public and show the world how much smarter than Microsoft he is.

60 days may seem long, but it is actually very close to the current average for the largest software providers - not just Microsoft. Mozilla patches much faster, but we have also seen several incidents where a Mozilla patch broke the browser and/or was ineffective. Consider the fallout if suddenly all French Windows XP/Vista installs were unable to boot. MS needs to regression test each and every combination. Remember what happened when malware caused Windows XP machines to fail to boot, because an old DLL had been patched and addresses assumed by the malware had shifted?

Well, since he posted 5 days later, it sounds like that's exactly what he did.

Actually, it says "five working days." Meaning you give them five non-holiday weekdays, and then disclose it on the sixth weekday.

The sixth weekday in this case would have been 9 days after the author originally reported the bug; because it was a Saturday, you'd have two weekends before the sixth weekday, during which you'd report it.

The black hats aren't limited by weekends and holidays. Once the black hats start exploiting a vulnerability on the Wednesday night before US Thanksgiving, the start of a four-day weekend, then what should a legitimate security professional do?

The black hats aren't limited by weekends and holidays. Once the black hats start exploiting a vulnerability on the Wednesday night before US Thanksgiving, the start of a four-day weekend, then what should a legitimate security professional do?

adtifyj was the one who suggested Tavis Ormandy should have waited "five days" according to RFPolicy. I was pointing out to Bigjeff5 that RFPolicy actually says "five working days." If you want to argue the merits of RFPolicy, I suggest replying to adtifyj's post,

It's not the security researcher's responsibility to cover Microsoft's ass. Anything he gives them is a gift, not a god-damned right. If you want to blame someone for all the exploits, blame the dumbass that decided to couple the HTML Help shit with everything and allow it to execute binaries. Just fucking stupid.

Sounds to me like Microsoft sat on its ass for three days and then told him /we will get back to you on Friday/ which would piss me the fuck off too. You can't fucking figure out if you can commit t

At my job (and at Google too), we use agile methodologies, especially TDD (along with more complete regression tests). TDD implies that you write the tests before writing the code, and this lets us quickly test any kind of component automatically. Regression tests are automated too, in order to locate any kind of problem early, and we run them on virtual machines to avoid installing tons of computers.
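The test-first workflow can be sketched in a few lines. The function and its tests below are made up for illustration: the test is written (and would initially fail) before the smallest implementation that makes it pass.

```python
# TDD sketch: the test is written first, against a function that
# doesn't exist yet. It runs only after both definitions, so the
# file executes top to bottom once the implementation is added.
def test_normalize_path():
    assert normalize_path("/usr//local///bin/") == "/usr/local/bin"
    assert normalize_path("/") == "/"

def normalize_path(path):
    """Collapse repeated slashes and strip a trailing slash."""
    while "//" in path:
        path = path.replace("//", "/")
    return path.rstrip("/") or "/"

test_normalize_path()  # passes silently once the implementation is right
```

In a real project the tests would live in a test runner (pytest, unittest) and run automatically on every change, which is what makes regressions cheap to catch.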

At my job (and at Google too), we use agile methodologies, especially TDD (along with more complete regression tests). TDD implies that you write the tests before writing the code, and this lets us quickly test any kind of component automatically. Regression tests are automated too, in order to locate any kind of problem early, and we run them on virtual machines to avoid installing tons of computers.

If your application has problems, the user's computer continues working.

People keep saying that Microsoft needs to regression test each language and version of their operating system. That is not true. A well-designed program operates independently from its translation files. All that is necessary in a well-designed program is to catch all instances of "translate('english string')" and create a library out of it. In most software packages, and even operating systems, you can drop in and out of different languages even on a per-user basis.

I reported a bug in Google Chrome (issue 45970) more than one month ago. They are not responsive at all. This bug is a regression that completely prevents usage of a local copy of the Java API documentation.
Recently, I found another bug in Google Documents: the AltGr key does not work in new documents. On a French keyboard, I can no longer type the characters {, [, |, \, ^, @,...
The workaround is to duplicate an old document.
I think they are drowning in an avalanche of bugs. Correcting security

Tavis gave MS a timeline, and they said they couldn't commit right that instant but would get back to him by Friday (pretty reasonable considering it was a Patch Tuesday for them). Tavis then published on Wednesday like a total douchebag. There is no way you can twist this that makes Tavis look like anything but a douche. The only possible way he could have looked less of a prick is if he had waited till Saturday and, with no further response, published it then - though even that goes against what he claims is responsible disclosure.

Tavis gave MS a timeline, and they said they couldn't commit right that instant but would get back to him by Friday (pretty reasonable considering it was a Patch Tuesday for them).

Reasonable, sure. But the point is, if Tavis offered MS non-disclosure in exchange for a guaranteed timeline for a fix, and MS's response is anything but "I accept", Tavis has no responsibility to keep the offer standing, any more than MS is obligated to keep the price on a copy of Windows 7 the same on Friday as on Tuesday just because on Wednesday yo

Because Google is putting out an article/official statement regarding vulnerability fix timelines and public disclosure with his name in the byline. The implication is that they fully support his view on the matter, though the timeline that's being touted as acceptable in the article is not one he stuck to. It's a lot of "do as I say, not as I do."

Just release the discoveries and let the sane companies adapt and start testing their software properly before shipment. Pussyfooting around companies that have no interest in security other than PR is never going to accomplish anything.

Though after the utter fiasco of getting the systems used to facilitate law enforcement requests hacked, one would have thought that, as Clement Attlee said, "A period of silence from you would be appreciated" might have been better for Google on security matters.