On Sat, Jan 2, 2010 at 1:45 PM, Larry Masinter <masinter@adobe.com> wrote:
> However, in the case of HTML, it seems that the
> "reverse-engineering costs" is mainly used as a reference
> to the cost of determining what popular software (IE) did,
> because other HTML engines wanted to be compatible
> with the market leader.
Speaking from personal experience, I've found it necessary to reverse
engineer the behaviors of IE, Firefox, and Safari. If you look through
the archives, you'll see some of the results of that effort posted to
this list.
> It seems, though, that reverse engineering only applies
> when either
> a) popular software does not follow already documented
>    standards and practices.
> b) the documented standards and practices are insufficiently
>    precise to determine interoperability.
Both of these are common causes.
> I would claim that if there are market forces that
> promote (a) (as happened at least in Browser Wars 1.0),
> that little or nothing that is actually written in the
> standard can matter.
The current dynamics of the browser market appear to be that Firefox,
Chrome, and Safari are incentivized to converge on a common behavior.
The spec is a useful vehicle for negotiating and documenting that
common behavior. Opera is likely on the same convergence path, but I
don't have as many personal contacts there, so it's hard for me to say
for sure. IE is stuck at a different equilibrium: they appear to be
required to support many non-standard APIs and behaviors.
Whether they will eventually converge to the consensus behavior
remains to be seen.
> * What are the costs associated with reverse engineering?
> When are those costs "vast"?
Vast is of course relative to some scale. As a rough estimate, it
seems safe to say that browser vendors burn at least millions of
dollars a year on reverse engineering. I suspect this cost was even
higher in the past.
> Although there is some literature on the general costs
> or processes of reverse engineering, this particular argument
> seems to be addressed at the past cost of determining
> Internet Explorer behavior.
You seem to be assuming that everyone is reverse engineering IE.
That's not really true. The Chrome team has a specific reverse
engineering protocol that goes something like this:
1) Observe that a web site doesn't work
2) Does the web site work in Safari? If yes, then adopt the Safari behavior.
3) Does the web site work in Firefox? If yes, then adopt the Firefox
behavior and try to convince Safari to do the same.
4) Does the web site work in IE? If yes, the story gets more
complicated, because matching IE often involves deviating from
standards. Sometimes sites in this category are handed to evangelism,
sometimes we implement hacks, and sometimes we convince Firefox and
Safari to adopt the IE behavior.
Notice that the protocol causes the web platform to converge on a
consensus behavior.
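The triage order above can be sketched as a small function. This is
purely illustrative (there is no such function in Chrome's actual
tooling; the browser names and return strings are mine), but it makes
the precedence explicit:

```typescript
// Illustrative sketch of the triage protocol described above.
// It encodes only the decision order: Safari first, then Firefox,
// then IE, falling through to "probably a site bug".

type Browser = "safari" | "firefox" | "ie";

function triage(worksIn: Set<Browser>): string {
  if (worksIn.has("safari")) {
    return "adopt the Safari behavior";
  }
  if (worksIn.has("firefox")) {
    return "adopt the Firefox behavior and lobby Safari to match";
  }
  if (worksIn.has("ie")) {
    // Matching IE often means deviating from standards, so the
    // outcome varies: evangelism, a targeted hack, or convincing
    // Firefox and Safari to adopt the IE behavior.
    return "escalate: evangelism, hack, or cross-vendor adoption";
  }
  return "site is broken everywhere; likely a site bug";
}

console.log(triage(new Set<Browser>(["firefox", "ie"])));
// → "adopt the Firefox behavior and lobby Safari to match"
```

Because Safari always wins when it works, and Firefox's behavior gets
pushed back to Safari when it doesn't, the procedure pulls every
vendor toward a single shared behavior.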
> In general, though, the cost of reverse engineering
> existing HTML software's behavior in responding to
> version indicators has already been paid; whether it
> was or wasn't "vast" in the past, is there any reason
> to believe that allowing an optional version indicator
> in HTML would add any additional "reverse engineering"
> costs at all?
Imagine if content were able to target Safari 2. Those sites would
come to depend on quirks of Safari 2 that have been corrected in
later versions. Instead of converging, the platform would diverge
into a sea of sites that each work only in the ancient modes of
various browsers. To enter this market, a new browser would need to
reimplement all the quirks of all previous versions. Disaster.
> In any case, I don't see how this argument applies to
> THIS actual change proposal. It's an interesting
> data point, a sore spot, a rationale for providing
> more precision than has been the norm in previous
> HTML specifications.
Precision is very useful in standards. Engineers often ask me how to
implement obscure functionality and I can point them to the exact
sentence in the spec that says what to do. For example, when creating
a web worker, should relative URLs be resolved against the lexical
or the dynamic scope? The spec says what to do, and that's what we do.
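To see why the question matters at all: the two candidate base URLs
give different answers whenever the worker script is served from a
different location than the page that created it. A quick sketch with
the standard URL API (all URLs below are made up for the example; I'm
not claiming here which base the spec picked, only that the choices
differ):

```typescript
// The page lives at one origin, the worker script at another.
const documentBase = "https://example.com/app/index.html";
const workerScript = "https://cdn.example.net/lib/worker.js";

// Resolving the same relative URL against each candidate base:
const viaDocument = new URL("data.json", documentBase).href;
const viaScript = new URL("data.json", workerScript).href;

console.log(viaDocument); // https://example.com/app/data.json
console.log(viaScript);   // https://cdn.example.net/lib/data.json
```

Two implementations that guess differently here will fetch two
different resources for the same page, which is exactly the kind of
ambiguity a precise spec eliminates.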
> Yes, if version specific behavior were allowed or
> encouraged, that *might* increase the relevance of the
> "reverse engineering costs" argument, although
> even then the behavior of market-leading implementations
> against specifications is really out of the control
> of specification writers.
It is in their control if they listen to vendors and spec things the
way the vendors want to implement them.
Adam