Exploration

I’d like to tell you a little story, if I may, from way, way back in 2002. (The exact date is lost to the mists of time, but the year is pretty solid.) Like a lot of stories, it’s a little bit long; but unlike some stories, it’s true.

As the engineering staff at Netscape prepared a new release of Mozilla, the browser off which we branched Navigator, those of us in the Technology Evangelism/Developer Support (TEDS) team were testing it against high-ranked and partner sites. On a few of those sites, we discovered that layouts were breaking apart; in one case, quite severely.

It didn’t take much to see that the problem was with sliced images in layout tables. For some reason, on some sites they were getting pushed apart. After a bit of digging, we realized the reason: the Gecko engine had updated its line-layout model to be more compliant with the CSS specification. Now images always sat on the baseline (unless otherwise directed) and the descender space was always preserved.

This was pretty new in browserdom, because every other browser did what browsers had always done: shrink-wrapped table cells to an image if there was no other cell content. The only problem was that behavior was wrong. Fixing the flaws in the CSS implementation in Gecko had broken these sites’ layouts. That is, it broke them in standards mode. In quirks mode, Gecko rolled its behavior back to the old days and did the shrink-wrap thing.

We got in touch with the web team at one of the affected sites, a very prominent social networking site (of a sort) of the day, and explained the situation. We already knew they couldn’t change their DOCTYPE to trigger quirks mode, because that would break other things they were doing. We couldn’t offer them a simple CSS fix like td img {vertical-align: bottom;}, because their whole layout was in tables and that would throw off the placement of all their images, not just the sliced ones. All we could offer was an explanation of the problem and to recommend they class all of their sliced images and use CSS to bottom them out, with assurances that this would cause no change in other browsers.
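Concretely, the classing approach we recommended would have looked something like the following sketch (the class name and image names are hypothetical). In standards mode an image sits on the text baseline, leaving descender space below it; `vertical-align: bottom` drops a classed image to the bottom of its line box, closing the gap without touching images elsewhere in the layout:

```html
<style>
  /* Hypothetical class applied only to sliced layout images */
  td img.sliced { vertical-align: bottom; }
</style>

<table>
  <tr><td><img class="sliced" src="slice-top.gif" alt=""></td></tr>
  <tr><td><img class="sliced" src="slice-bottom.gif" alt=""></td></tr>
</table>
```

Because `vertical-align: bottom` is honored the same way by shrink-wrapping browsers, the rule is harmless where the gap never appeared in the first place.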

Their response was, in effect: “No. This is your problem. Every other browser gets this right, and we’re not mucking around in our templates and adding classes all over just because you broke something.”

The truth, of course, was that we were actually fixing something, and every other browser got this wrong. The truth was not relevant to our problem. It seemed we had a choice: we could back out the improvement to our handling of the CSS specification; or we could break the site and all the other sites like it, which at the time were many. Neither was really palatable. And word was we could not ship without fixing this problem, whether by getting the site updated or the browser changed. Those were the options.

Let me reiterate the situation we faced. We:

Had improved standards support in the browser, and then

Found sites whose layouts broke as a result

Whose developers point-blank refused to alter their sites

And we had to fix the problem

We couldn’t back out the improvement; it affected all text displayed in the browser and touched too many other things. We couldn’t make the site’s web team change anything, no matter how many times we told them this was part of the advance of web standards and better browser behavior. Two roads diverged in a yellow web, and we could choose neither.

So we found a third way: “almost standards” mode, a companion to the usual modes of quirks and standards. Yes, this is the reason why “almost standards” mode exists. If I remember the internal argument properly, its existence is largely my fault; so to everyone who’s had to implement an “almost standards” mode in a non-Gecko browser in order to mirror what we did, I’m sorry.

We made “almost standards” mode apply to the DOCTYPE found on the offending site—an XHTML DOCTYPE, I should point out. While we were at it, we rolled in IBM’s custom DTD. They were using it to make their site validate while doing all kinds of HTML-invalid stuff, and they were experiencing the same layout problem. And lo: a third layout mode was born. All because some sites were badly done and would not update to accommodate our improvements. We did it so as not to break a small (but popular) portion of the web while we advanced our standards support.
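For readers who never had to care, this is roughly how the three Gecko modes were keyed to DOCTYPEs; a simplified sketch, since the real sniffing list is much longer:

```html
<!-- No DOCTYPE at all (or an ancient one): quirks mode -->

<!-- XHTML 1.0 Transitional (and Frameset, and the IBM DTD):
     "almost standards" mode -->
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">

<!-- HTML 4.01 Strict: full standards mode, descenders and all -->
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
  "http://www.w3.org/TR/html4/strict.dtd">
```

In other words, the mode switch was hung off a string that authors had copied into their templates for entirely different reasons, which is exactly the fragility discussed below.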

Now take that situation and multiply it by a few orders of magnitude, and you get an idea of what the IE team faces. It’s right where we were at Netscape: caught between our past mistakes and a site’s refusal to accommodate our desire to improve support for open standards.

Some have said that Microsoft is in a unique position to take leadership and spread the news of improved standards and updating old sites to its customers. That’s true. But what happens when a multi-billion dollar partner corporation refuses to update and demands, under the terms of its very large service contract and its very steep penalty clauses, that a new version of IE not break (for whatever value of “break” you like) its corporate intranet, or its public e-commerce site? It only takes one to create a pretty large roadblock.

For all we did in publishing great content to DevEdge, proactively helping sites to update their markup and CSS and JS to work with Gecko (while not breaking in other browsers), and helping guide the improvement of standards support in Gecko, we could not overcome this obstacle. We had to work around it.

Looking back on it now, it’s likely this experience subconsciously predisposed me to eventually accept the version targeting proposal, because in a fairly substantial way, it’s what we did to Mozilla under similar conditions. We just did it in a much more obscure and ultimately fragile manner, tying it to certain DOCTYPEs instead of some more reliable anchor. If we could have given that site (all those sites) an easy way to say “render like Mozilla 0.9” (or whatever) at the top of every page, or in the server headers, they might have taken it.

But had we offered and they refused, putting us back to the choice of backing out the improvements or changing the browser, would we have set things up to default to the specific, known version of Mozilla instead of the latest and greatest? The idealist in me likes to think not. The pragmatist in me nods yes. What else could we have done in that circumstance? Shipped a browser that broke a top-ten site on the theory that once it was in the wild, they’d acquiesce? Even knowing that this would noticeably and, in a few cases, seriously degrade the browsing experience for our users? No. We’d have shipped without the CSS improvement, or we’d have put in the targeting with the wrong default. We didn’t have version targeting, but we still made the same choice, only we hinged it on the DOCTYPE.

A short-term fix for a short-term problem: yes. Yet had we not done it, how long would Netscape/Mozilla’s standards support have suffered, waiting for the day that we could add that improvement back in without breaking too many sites that too many people would notice? Years, possibly. So we put in a badly implemented type of version targeting, which allowed us to improve our standards support more quickly than we otherwise would have, and it has been with us for the more than half a decade since.

So maybe I’m more sympathetic to the IE predicament and their proposed solution because I’ve been there and done it already. Not to nearly the same degree, but the dilemma seemed no less daunting for all the difference in scale. It’s something worth keeping in mind while evaluating what I’ve said on this topic, and whatever I will say in the future.

Robert O'Callahan wrote in to say...

The important thing is, we’ve shipped several major Gecko revisions since then without introducing any more modes.

We keep modes to a minimum, and we keep the differences between modes to a minimum too — limited to a small set of specific must-have-for-Web-compatibility quirks. Almost all of our thousands of bugfixes and new features apply to all the modes.

Paul Armstrong wrote in to say...

Robert, Microsoft is making a major change to their rendering engine, fixing hundreds of problems that they themselves created in the past. Gecko has always been a lot closer to the accepted standards than Trident has, and there are still a lot of sites that were written to support only IE5 or IE6, so it’s difficult for Microsoft to push a browser update on its customers and then break the websites those customers need. It makes sense.

Eric, I can understand Microsoft’s motives. But that doesn’t mean that targeting is a good thing. What disturbs everybody the most is that you have to opt in even if you want to opt out, or be cutting edge, as Microsoft puts it.

Can Microsoft guarantee that all IE7 quirks are fixed and their rendering engine is fully standards compliant (or at least as compliant as Mozilla’s)? If so, there’s absolutely no need for that opt-out thing. Microsoft dismisses the principle of progressive enhancement. And that’s neither good nor fair to forward-thinking web developers.

I see more problems with that. What about templates for popular blogging software? They could just as well hardcode the meta tag. And if the server itself is misconfigured, then you could end up with a screwed-up layout.

The default mode must be standards compliant mode. Full stop. There’s no way around that. And I’m disappointed that major web standards advocates are showing the white feather on this issue. Zeldman’s argument is: “At least Microsoft is doing something about web standards.” But I doubt their commitment. If there is a long enough beta phase, each and every web site owner has time to look at their web sites and apply a meta tag which tells IE to use the “we couldn’t make it the first time” IE7 rendering. Or get their conditional comments in place.

Web sites breaking in IE8 will then most likely break in every other major browser, too, and have for years. Standards-compliant browsers don’t appear all of a sudden. If the author didn’t care over the years, he will insert one (1) meta tag or get his web site standards compliant (whichever is more likely).

Personally, I found that story incredibly helpful, so thanks. It seems that there’s more history to politically-sensitive standards support judgements than I’d realised.

I assume that this decision was made before Mozilla was open source — right? Perhaps the whole targeting furore serves to highlight the disadvantage that traditional, closed source companies have when developing interoperable, free software like browsers. I guess Mozilla and Webkit have an inherent long term advantage here — even though they have corporate backers — because there is no client shouting down the phone at them, trumping all other considerations.

Eric, I’ve said time and again that I want the default to be “latest”. Nothing can be bettered or changed by telling me what I already believe.

But my experience shows me why that might not be what the IE team will choose, regardless of what they might prefer. I would have preferred not to add a new rendering mode, but advocated it because the alternatives all sucked more.

I thought I’d share that experience because the only way to get the default changed is to show why it’s a good idea (or at least not a bad idea) for the IE team to do so, within the framework of the problems they must solve; or else to find a better solution altogether, again within the framework of their needs.

SJ wrote in to say...

Thanks Eric,
This is an interesting example of how these things work.
It’s become clear that despite the optimistic tone with which this was presented on ALA, there isn’t really much benefit to come of this for those developing with web standards. It is a small inconvenience that we’ll have to deal with from now on, and if we have no choice in the matter then that’s that. It’s unfortunate (and surprising to hear) that IE has made these promises based on non-standard rendering but I suppose it’s in their best interest to follow through.
So, ‘edge it is’ I suppose….

I’m glad to see you state that explicitly Eric, because that really wasn’t the impression I got before!

I do remember table based layouts breaking because I had an extra space in the doctype…and IIRC didn’t BBEdit ship with a broken doctype for while? But I digress. Once that doctype was sorted it provided a benefit to the site developer in allowing validation.

I think most standards-loving developers understand Microsoft’s problem and agree that version targeting is a good solution for Microsoft and its customers – it’s just the default to IE7 mode that causes such outrage, because it’ll force us to tag all our existing sites and all our future sites (at least for a few years anyway) with code that’s of no direct benefit whatsoever to us; it’s purely there for the benefit of Microsoft and its corporate customers. It just seems wrong that we’d have to do so much work on their behalf.

Unless my memory has gone completely to hell, I’m sure Mozilla was open-source by then, Jonathan; it was mid- to late 2002 at the time.

And come to think of it, we did the whole thing as a small group “behind closed doors”, to use the popular phrase, and the first substantive word the community got of it was the DevEdge note I linked above. Ironic that the IE team of 2008 has brought their plan before the community ahead of deployment, whereas the Netscape/Mozilla team of 2002 presented it after release as a fait accompli—and that we had next to no pushback over it.

I asked this same question on Zeldman’s site and on my own blog, but I’ll ask it here, too, so that someone can help me better comprehend the proposed solution regarding the IE7 default:

If IE7 is the version of Internet Explorer that “broke” millions of IE6-fashioned sites, why isn’t the default rendering engine going to be IE6, in order to fix that particular problem?

The logic doesn’t add up to have IE7 rendering as the “cure” for broken IE6 sites. Is there something special about IE7 that corrected that issue that I don’t know about? It’s entirely possible that such is the case. My question is sincere, not smarmy. :)

Sandy, this isn’t an argument, it’s an observation based on experience. And I really don’t think we want to start basing our decisions on market share. Believe it or not, I’d feel and have felt the same way about this idea had it come from another browser maker. That’s what I believe, anyway.

I don’t know the answers to your questions, Bridget. Those are things the IE folks would need to answer.

Good story Eric. But I also think experiences like this are exactly the reason why the CSS WG and WHATWG are so focused nowadays on designing specs for new features/elements/properties in a way that they don’t break existing sites. If the CSS property ‘vertical-align’ was being developed now, the default behavior of rendering a single image in a table cell would have been specced as shrinkwrapping. And an additional value for vertical-align might have been added for the case where you actually wanted to use a cross-cell baseline.

“Giving in” now means that these concerns can be declared moot, so we can start developing multiple webs: the IE8 web, the IE9 web, etc, each with their own HTML and CSS and JS spec. Which I find very scary.

I really think that the “multiple webs” claim, which I’ve seen in a number of places, overstates the situation, Rijk. Each with their own HTML/CSS/JS? You make it sound like those specs will somehow be completely different, when in fact they’ll be successively built on one another, just the way browsers are already built over time. The difference in IE will be that a page will be able to tell it at which stage of IE’s evolution it was developed, and should thus be treated. Is that a fair summary of the process, or am I missing something?

I have kept a very open mind about this topic and plan to continue to do so. However, your story highlights one reason that I think is purely bogus.

Show me a company large enough to have the clout to push Microsoft around whose network admin pushes out IE-Next without testing it and breaks the entire intranet, and I will show you a network admin who just lost his job.

Huge companies are notoriously slow to change. Companies have publicly said they are not switching to Vista for many years; even the government said that. A large portion of major corporations still use mainframes with COBOL and Fortran. Why? Because it works, and they don’t want to spend money to upgrade and fix something that works.

The same goes for browsers or any other piece of software. New updates are tested before the decision to deploy is made. I know many companies to this day that are still using IE6. The web world is no different. Many hosting companies are slow to upgrade their distros of Linux, PHP, Perl, etc., because the current stuff works and if they upgrade there is a possibility something might break.

The part about developers who will not upgrade their site is also not a reason that should be taken into account. That is just whining, not a valid reason. I am sure the very same developers later upgraded their site and complained (as every developer does) about browser issues and bugs, except they were probably too stupid to see that they were the ones that caused them. I hope that those developers, wherever they are, read this article and understand what they did.

I am open to all reasons and ideas but to me those two do not hold their own weight.

P.S. I am also open to hearing if someone can give those 2 reasons their weight.

Bridget: while I speak only for myself and have no affiliation with any of the IE Team, I would reckon that this is because IE7 was released to the wild with its most Standards-compliant rendering mode as the default. That caused sites to break — but those sites have fixed themselves by now.

Hence, there would be no benefit to having IE8 render as IE6, because the current Web is generally working for IE7 which, while still a far cry from being as Standards-compliant as Mozilla/Opera/Safari, is a lot better than IE6.

Robert: what you’re forgetting is that your situation with Mozilla is not directly comparable. Mozilla has never radically adjusted (think a complete overhaul) its Javascript engine without breaking ties with its existing User Agent string. IE8 is getting a significantly different JS engine but the user agent string will still contain Internet Explorer and/or MSIE.

Surely you can imagine how many JavaScript-heavy sites would break if IE8 were to just use that new engine by default?

Eric, I have read many articles debating the target meta element. Now I have a better understanding of the problem, and frankly I cannot think of an alternative. So I could accept that solution, though it’s not perfect.

What still concerns me is that we will end up with a bloated, gigabyte-sized browser because it tries to be backwards compatible with the Internet Stone Age. If the Microsoft team offered a perspective, like “we do this thing but keep it to the current browser plus the latest major version predecessor,” that would be a compromise I could live with better. There would be some progress, while still giving developer teams time to adjust websites to a new browser version. After all, major websites are relaunched about every two years anyway.

Any meta like “IE=8”, “IE=9”, “IE=10” will add to the number of sites that are locked-in. No meta means that the site is locked-in; any meta other than “IE=edge” means locked-in, too. Can you imagine what will happen if 95%+ of the web sites in 2010+ are locked-in to a version of IE?

What will the very same marketing manager of MS say when he sees the statistics that 95%+ expect IE8 or less? What will any client, any creative director say? If you give them the opportunity to lock-in the web, they will not unhand it anymore. This process of locking-in the web is self-energizing, and it will influence the discussion in the CSS-WG and other standards bodies.

If there is any risk that this will bring IE’s standards commitment, and therefore the standards process, to stagnation, then why can’t we say that it’s a bad idea because the possible consequences are irreversible? You can’t take it back once it is implemented without breaking larger parts of the web, they will say.

Let them have their opt-in to a real standards mode because their marketing manager wants IE8 to not break the intranets (aside: even if a simple one-liner opt-out meta would be all they ever needed to fix their intranets for real-standards-mode IE).

Every other browser acts as if “edge” were implemented. Even if some voices in this discussion fancy a site-lock to a specific engine to reduce costs, even if some think the meta is optional because MS says so: it is not optional. You are locking in if you set a meta, and you are locking in if you don’t. “edge” is not an option but mandatory, if you want to make sure that the standards remain an open process not determined by an IE version.

The mode “edge” is what the other browsers are in. There is no “Fx=4” mode planned, no “Saf=4”, no “Op=10”. Coding for standards means coding for the edge. Forget “IE=8”, “IE=9”, “IE=10”; it’s a very bad idea, because the consequences are not foreseeable but include the risk of bringing standards to an agonizing halt.

“IE=edge” as the only promising meta value forces the IE-Team to come up with a very good browser. They will need a public testing phase with a functional bug tracker this time. Let us help them.
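For readers following along, the opt-in under discussion takes this form, per the published proposal; `IE=edge` requests the most recent engine available, while a version number locks rendering to that engine’s behavior:

```html
<!-- Lock rendering to IE8's engine, even in future IE versions: -->
<meta http-equiv="X-UA-Compatible" content="IE=8" />

<!-- Or always request the latest engine ("edge"): -->
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
```

The same value can also be sent as an HTTP response header (`X-UA-Compatible: IE=edge`), which lets a server opt in an entire site without touching every template.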

Back in 2001 I argued that Gecko’s standards mode should behave like what became the Almost Standards Mode a year later. In 2001, the argument against being legacy-compatible was that the CSS line box model wasn’t designed for table layouts but was the way forward.

In retrospect, regardless of what one thinks of table layouts, CSS2 failed to Support Existing Content, and that’s a bug in CSS2. A better way to address the table cell problem would have been to define the CSS formatting model’s default behavior to match what now is the Almost Standards Mode, and to define a per-cell property that the author could set to obtain the behavior that CSS2 actually specified.

Since hindsight suggests that introducing the Almost Standards Mode was not the best fix, I think the Almost Standards Mode is not a solution pattern to be imitated.

Rob Waring wrote in to say...

Eric, I understand what you are trying to achieve here by giving a relevant (and quite illuminating) example, but three posts in three days plus the ALA article is starting to look like firefighting for MS when they would be better off doing it themselves. Probably the ‘No’ crowd just shout the loudest, but there are a lot of contradictions in Microsoft’s arguments, such as Bridget’s point about IE6 vs. IE7 as the default, or rollback, rendering engine.

Personally I don’t get the logic of making sites default to IE7 unless specified; however, neither do I think that there will be implementation problems between IE10’s and IE11’s versions of IE7, because in all probability the default will get rolled forwards, albeit more slowly than the latest versions, so we are more likely to be wrestling with different interpretations of IE8+ at that point, better implementations of the spec and all.

What I’m wondering is how long the IE7 mode will be the default. Can we expect IE10, 11, and 12 to all still have the IE7 rendering engine built in, and be the default? Or will it eventually be phased out? The trouble is that if old sites continue to render perfectly forever into the future, there’s nothing to encourage the developers to fix their sites. Standards based sites are better for the user as well as the developer, and we should be encouraging people to embrace them.

There has to come a time, some day in the future, where we say “you’ve had X number of years to fix your site, if you still haven’t then tough, it’s going to break.”

On a separate, and mostly irrelevant note – and I completely understand what you’re saying in this post, and agree – just how short-sighted and stubborn was that particular developer, who point-blank refused to fix their site, seriously! I mean, these days it certainly isn’t realistic for every developer to patch up every site they manage each time a new browser comes out; but if a member of the IE or Mozilla team contacted me directly to tell me I had done something wrong in a site, I’d be more than happy to fix it.

“Can you imagine what will happen if 95%+ of the web sites in 2010+ are locked-in to a version of IE?”

And there we have it. Microsoft want to make the web IE-dependent. We must fight this all the way. Standards or nothing. No fixes, no hacks, just well-written sites that work in all major browsers.

Eric Meyer wrote:

“Their response was, in effect: ‘No. This is your problem. Every other browser gets this right, and we’re not mucking around in our templates and adding classes all over just because you broke something.’”

Of course they were stubbornly wrong. It was their code that needed to be updated. You were introducing standards and they didn’t like it. You should have told them where to go. Eventually their customers would have seen the site break in new browsers that applied standards and they would have been forced to update the site.

It’s a tricky one though. Browsers like Opera annoy people when they render strictly to standards, causing problems with sites coded for laxer browsers. But where do you draw the line? If you continue to allow for badly coded sites then standards can never win. Microsoft need to see this. But they’d rather hold back the web, as they have done in the past, by making it IE-centric.

I truly hope that they drop this targeting idea before IE8 ships, due to the overwhelming outcry from so many people who hate the idea. Here’s hoping.

“But what happens when a multi-billion dollar partner corporation refuses to update and demands, under the terms of its very large service contract and its very steep penalty clauses, that a new version of IE not break (for whatever value of “break” you like) its corporate intranet, or its public e-commerce site?”

So large contracts and penalty clauses dictate browser development? This can’t be right. The intranet sites need to update their code. There can be no excuse.

“IE8 is getting a significantly different JS engine but the user agent string will still contain Internet Explorer and/or MSIE.”

It occurs to me that this provides an alternative solution (to version targeting) for not breaking older sites. (Forgive me if this has already been proposed elsewhere.)

If the sites in question work in Firefox and Safari (say), the reason IE8 would break them is entirely due to browser sniffing / conditional comment fixes for previous IE versions. If an entirely new user agent string were used (say, a code or brand name for the new engine), then older sites would assume they were talking to Firefox et al., and should render fine in IE8 standards (edge) mode. Right?
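A sketch of the idea being proposed here; the function name and user-agent strings are illustrative, not from any real site. A typical circa-2008 sniffer keyed on the `MSIE` token and served IE-specific workarounds; a rebranded engine string without that token would fall through to the standards code path:

```javascript
// A legacy sniffer: serve IE workarounds whenever "MSIE" appears in the UA.
function needsIeWorkarounds(ua) {
  return ua.indexOf("MSIE") !== -1;
}

const ie7 = "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)";
const firefox = "Mozilla/5.0 (Windows; U; Windows NT 6.0; rv:1.9) Gecko/2008 Firefox/3.0";
// A hypothetical rebranded IE8 string with no "MSIE" token:
const rebrandedIe8 = "Mozilla/5.0 (Windows NT 6.0) Trident/4.0 NewEngine/8.0";

console.log(needsIeWorkarounds(ie7));          // true:  gets the old workarounds
console.log(needsIeWorkarounds(firefox));      // false: gets the standards path
console.log(needsIeWorkarounds(rebrandedIe8)); // false: also gets the standards path
```

The catch, of course, is that the same fall-through would bypass workarounds those sites genuinely still need for the parts of the new engine that remain IE-specific.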

Does this actually come down to the politics of user agent identification strings, or have I missed something?

The comparison with “almost standards-mode” is interesting and worth thinking about. I remember very well when that new mode was introduced, but I didn’t know that Eric was behind it. I do remember that I opposed it strongly, even to the point of mocking the whole thing (sorry about that!).

So is this new meta opt-in super-standards mode different from the almost-standards mode that triggers on certain doctypes? I think so. I still don’t like almost standards mode, but I do think that it’s more acceptable than the new meta thing.

The almost standards mode is triggered by HTML and XHTML transitional and frameset doctypes (and some IBM doctype no-one’s actually using). Those are doctypes that are supposed to be gradually phased out, and that was the intention from the beginning (even though the phasing-out is taking a lot longer than one would hope for). But at least ideally in a couple of years they won’t be used any more. Thus the almost standards mode will (ideally) disappear when the web moves forward to a bright and shining standards compliant future. It’s a temporary kludge.

The proposed meta opt-in super-standards mode is different. It will apply to HTML and XHTML Strict, doctypes that are not temporary transitional solutions. They are not supposed to go away. They are good and recommended (now and originally). But anyone who uses them, and wants the pages to be rendered according to standards, will have to use the meta opt-in kludge forever (in order to get IE8 etc. to render correctly)! There is nothing wrong with sticking to e.g. HTML 4.01 Strict for the next 30 years or more. That doctype is not something we’re supposed to be using temporarily just to be able to transition from something bad to something good. It’s a good doctype all by itself.

So it still boils down to the wrong default. Anyone who uses a good non-temporary doctype, and wants the pages rendered according to standards, should not have to add any kludges for that to happen – at least not any kludges that will stick with us forever.

But I can see how Eric reasoned way back in 2002, and I can understand how he reasons now. There are some good arguments for the proposed new kludge. But in my opinion they’re not good enough.

So I vote for an opt-out solution. Some pages will break in that case, and that will hurt a little, but I don’t really think that it will be all that bad. And the solution won’t be all that hard to apply. Many pages had to be updated for IE7, and they were mostly updated. I think those were many more than the ones that will have to be updated for an IE8 with a default standards-compliant rendering. And the quick and dirty solution (adding a meta kludge) won’t be very difficult to apply. And it will – in that case – be a temporary kludge, not something that will haunt us forever. And it won’t mean locking pages to certain browser versions for the coming 30 years or more.

Don Field wrote in to say...

I don’t understand the relevance of your story. Browsers have always used whatever information is available to improve the user experience. (And they will continue to do so in the future.) What you are telling us now, however, is that we, web *authors*, must change *our* pages. If we want to use standards, we need to use a vendor-specific extension to do so.

For pages like Acid2, this seems impossible to do. You said you hadn’t thought of this when considering the scheme. And you said Acid2 wasn’t of much interest to you. Could you elaborate on this?

To me, it seems that Acid2 has been very successful in making browsers converge. The meta tag, unfortunately, goes in the exact opposite direction.

Ingo, I know that every other browser effectively defaults to ‘edge’ and that’s one of the (several) reasons I want it to be the default in IE. Hopefully I gave some insight into why that might not be an easy choice to make, regardless of the personal desires of the team members.

But I don’t understand how pages will be locked to IE when any browser can load and display them. It’s not as though Firefox or Opera will stop loading pages with this meta element. Which is why I don’t understand the arguments that this will have the effect of locking up the web, or that it will make the web dependent on IE. I don’t see how version targeting (or anything else a browser might do in markup) could have that end result.

Nor do I understand how this could bring standards to a halt, if by that you mean all standards in all browsers, which seems to be what you’re arguing. I suppose there’s a risk of it bringing standards support to a halt in IE, but I believe the risk of that happening in IE is greater without version targeting or something that meets the same needs.

eric, interestingly your last two posts here on your blog are far more valuable than your article on ALA, which really seemed far too one-sided and far too “hey, isn’t it great?”. maybe it’s just me, but if the ALA piece had contained far more insights like these posts, and a similar tone, it would have had a far better reception in the community…

anyway, with that said, here’s a question (already posted it as a comment on ALA and WaSP, but indulge me):

in the article, aaron states:

“We could specify the version of the languages we use, such as CSS 2.1 or JavaScript 1.5. Unfortunately, browser vendors often implement only part of a spec and the interpretation of a specification often differs from browser to browser, so any two contemporary browsers may offer completely different renderings of the same CSS or may trigger completely different events from the same form control.”

well, how’s this: what if we were to version the languages, and carry on doing EXACTLY what we’re already doing to accommodate for differences in browsers, i.e. use conditional comments and possibly a tiny amount of hacks/javascript-based sniffing (capability sniffing and/or browser sniffing)?

personally, my gut feeling here would be that this is more “right” – you’re defining which W3C spec you’re assuming for the page to work, and make slight accommodations where you know for a fact that a specific browser hasn’t implemented it right. it’s specifying the capabilities expected of a browser, rather than the exact browser and version number that the page assumes.

old legacy/intranet sites can stay as they are (without the versioning), and then it can be assumed they’re using the current JS spec and CSS 2.1. that would be the frozen bit: if you don’t version, this is the assumed spec. IE can then do whatever it likes when it comes across those pages…kick in a separate IE7 renderer / JS engine or whatever.

yes, developers would have to modify their code to add versioning, even to their existing sites. but this feels descriptive (similar to doctypes, it’s something you add to your page to explicitly describe what it IS, not what it should do…a subtle, yet fundamental difference in my eyes).

so…somebody explain to me again why this wouldn’t be far more desirable? am i missing something?
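to make this concrete, here’s a purely hypothetical sketch of what such spec-versioning might look like — the meta name and value syntax below are invented for illustration and were never proposed or implemented; the conditional comment, by contrast, is real IE syntax:

```html
<!-- Hypothetical: declare the specs the page is written against,
     not a browser version. (Invented syntax, illustration only.) -->
<meta http-equiv="Spec-Versions" content="CSS=2.1; JavaScript=1.5" />

<!-- Known implementation gaps are then patched the way we already
     patch them today, with conditional comments: -->
<!--[if lte IE 7]>
  <link rel="stylesheet" href="ie7-fixes.css" />
<![endif]-->
```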

Henri, hindsight actually suggests that “almost standards” succeeded, because it let us advance standards support while not breaking existing web content (layouts, if you prefer). It was certainly not an ideal solution, but it was a practical one. It got the job done.

I agree in principle that the best thing would have been to get CSS redefined to handle existing content. I’m pretty sure it had already been tried by then, however, and been rejected. The time it would take to overcome that rejection and devise a new solution that the WG agreed upon would, I suspect, have taken longer than if we’d just backed out the change and waited for the problem sites to eventually redesign using different methods. Given the last eight years of CSS WG progress, I have to wonder if we’d still be waiting for that change today.

Chris, please allow me to reiterate: telling them where to go was not an option. Even if it had been, it would have changed nothing. We were to either find a solution or back out the change. That was it. Breaking the site was not an option for us, even though it’s what we wanted to do. The people responsible for deciding on Netscape releases had made it clear that was not an option.

That’s pretty much the same situation the IE team faces, as I understand it: make sure sites don’t break, or don’t ship the browser. Making sure sites don’t break means either not updating the browser to any significant degree, or finding a way to be backward-compatible with existing sites while still updating the browser.

Don, if your objection is to the default behavior, then this story isn’t relevant to that objection. If you don’t see its relevancy to the overall situation, I’m not sure what I can say to change that.

I was never a big fan of Acid2 because I felt it did far too much for a single test. The original Acid test was of the CSS float model. It didn’t try to test JavaScript, or obscure HTML rules, or even all of CSS. Just the float handling. Of course it implicitly tested things like the box model, because without good box model support you can’t hope to have good float support. But its focus was the float model.

To me, that’s what a good test does. It tests one thing, or one closely related set of things. Acid2 was nothing like that, and I thought encouraged the wrong approach to standards support. But that’s me, and I fully recognize it’s a view not many share.

You’re no doubt right, Patrick, and I wish I could change that. The ALA article was written pretty quickly—slightly less than a week from first seeing the proposal to working through the consideration process to writing it up. I’m sure I could have done much better. Mea maxima culpa (I blame my Nissan).

Though I find it interesting that you thought the tone of my piece was “hey, isn’t this great!” when others found it to be “I’m tentatively and reluctantly in approval”. The latter being much, much closer to my actual state of mind.

Anyway, language versioning in the browser works great as long as you’re confident that the browser gets the language version absolutely and totally right the first time. In other words, it works to say “CSS=2.1” if the point at which a browser turns on recognition of that value is the same point at which it has fully and accurately implemented CSS 2.1.

If that’s not the case, then you end up in the same soup. Pages developed with “CSS=2.1” but dependent (intentionally or not) on bugs in a browser’s CSS 2.1 implementation get broken when the next version comes out with fixes to the CSS 2.1 implementation.

And you’re still left with the question of what to do in the default case, when a page doesn’t report its language envelope. Default to old-and-crusty and you freeze pages in the past. Default to new-and-shiny and break pages from the past. Same soup, different bowl.

keeping it IE specific (as they’re the ones with the biggest problem, as they stated themselves)

If that’s not the case, then you end up in the same soup. Pages developed with “CSS=2.1” but dependent (intentionally or not) on bugs in a browser’s CSS 2.1 implementation get broken when the next version comes out with fixes to the CSS 2.1 implementation.

then you use conditional comments as we already have to do now (plus possibly capabilities sniffing and, as a last resort, browser sniffing – when the browser lies about its real capabilities). yes, it puts the onus back on the developers to ensure that they choose the most appropriate version of technology (so they can “freeze” what they use) for their particular audience.

And you’re still left with the question of what to do in the default case, when a page doesn’t report its language envelope. Default to old-and-crusty and you freeze pages in the past. Default to new-and-shiny and break pages from the past. Same soup, different bowl.

as i said in my comment above, IE (if it feels so inclined) can adopt the same “render it as IE7” approach in that case. if they’re stuck with having to bloat their browser with old-school alternate rendering engines, so be it. so: “if you come across a page which doesn’t use versioning, revert to IE7 rendering” (in essence, no different from the “if you come across a page without the META or HTTP header, revert to IE7 rendering” that they’re proposing).

essentially, this is still an opt-in, but instead of specifying browser and rendering engine, you specify nominal spec version and carry on with the (small, in most cases) conditional fixes. yes, it still puts the onus on standards-aware developers to accommodate their pages to make IE8 behave like IE8, *but* it’s browser-neutral in its implementation and targets nominal capabilities based on spec version. yes, it leaves the onus of accommodating for flawed implementation of the spec with the developers, but they’ll still have to do that anyway (since, by the look of it, Firefox/Safari/Opera/etc are not playing ball with the META, so will still need to be catered for anyway).

in short, i can’t see any drawbacks here…but maybe i’m myopic. and maybe, just maybe, it’s completely futile for us to discuss this anymore anyway, as i get the feeling that MS already took the “you’re either with us or against us” stance…

So let me sum up: because the webmasters of one popular site refused to fix their non-standards-compliant design six years ago, we have an extra quirks mode in every single Firefox, Seamonkey, K-Meleon and Iceweasel? Now, that’s a good argument for backwards compatibility. If we don’t want to suffocate under the mass of broken websites, we will have to have the courage to “break the web”.

Eric, I did not say that Netscape should have waited for the CSS WG to come to the conclusion that CSS2 has a bug. :-) Also, the crux isn’t that the Almost Standards Mode has succeeded but that the Full Standards Mode hasn’t. And this issue is going to come up again when people notice that the HTML5 doctype triggers the Full Standards Mode in Gecko, WebKit and Opera.

I’m hoping this all calms down again soon enough. Personally I think it matters far less about which behaviour the default value is than many seem to believe. Whichever way it works, it’s not going to be hard work for any of us to adapt to it and carry on exactly as we want. We are the knowledgable ones that can handle it.

My only concern is whether it ends up with a repeat of the DOCTYPE problem – whereby ignorant developers will produce non-standard mark-up, but their authoring environment throws in a ‘modern’ tag value anyway. Which is why I wish that triggering ‘really standards’ mode required valid mark-up detected by the browser in addition to the meta tag.

But, until all this pans out, I would like people to relax a little more. *hands out chocolate cookies*

The doctype says that this page is expected to be rendered in conformance to the standards (“all=edge”). The meta “IE=8” probably means to a large number of developers: this page is expected to look like IE8 sees it. Both expectations will diverge over time.

If the other browsers proceed and standards evolve, they will break these sites with meta “IE=8” some day, if not from the beginning. But the more sites with meta “IE=8” are in the wild, the more sites will “break” in this situation. And the site owners will tell you to stop breaking their pages, like they told you before in your Netscape example. But this time they are right – see, didn’t they say how this page’s rendering is expected to be: like IE8 sees it? All this will influence decisions in the CSS WG.

That’s what I call a site being locked in to an IE version. I know that there are other browsers around. But IE’s market share, in conjunction with the meta statistics of 95%+ “expected” rendering like “IE=8” or “no meta: IE=7”, may indicate that the web is in danger of being locked in to IE versions, and that may bring standards to a halt in the worst case.

That’s why I believe an opt-out would be best, but it is very unlikely to be implemented by MS, so the best choice is “IE=edge”, because that is what the other browsers do. In contrast, the metas “IE=8”, “IE=9” etc. may turn out to be fatal one day.

David wrote in to say...

From reading many discussions of this topic I’ve seen a recurring concern about requiring a META tag to make IE8 enter standards mode. An alternative that should be easy enough to implement and provide an easy change for people with sites that would break is to only require the META tag to trigger “almost standards” mode. This way a web developer only has to use this “hack” to keep the site working until they redesign to work properly in standards compliant browsers that don’t require a special tag to function properly.

“The most important part of communication is hearing what is not being said.” —Peter Drucker.

1. Microsoft has stated that IE8 will default to IE7. [Not IE6]
2. Microsoft has stated that non-standard DTDs, e.g. custom DTDs and HTML 5 DTDs, will default to IE8.
3. Microsoft has stated that if standard DTDs want to default to IE8 standards rendering to use the meta tag and specify version IE8.
4. Microsoft has stated that if standard DTDs want to default to IE6 rendering to use the meta tag and specify version IE6.

Finally,

5. Microsoft has stated that when using the meta tag to NOT specify all future versions within IE, i.e. IE="edge".

What is not being said by Microsoft?

Logic tells me, at least, that Microsoft may be setting the stage to “move the ocean” and that the IE8 meta tag is a temporary course correction to “maneuver the ship” until the “ocean can be moved”.

Sidebar: for content that will use the meta tag, the reasonable presumption is that such content is created by professional-level designers/developers and that, historically, such content turns over or is overhauled every two to three years or thereabouts.

Henri (and wortwart), this story is not meant to justify what’s being done. It’s meant to illustrate the constraints at hand, and to provide some historical evidence. Whether that’s evidence for or against the wisdom of version targeting is going to be up to the individual, of course.

I get the sense that most everyone thinks I’m invested in version targeting and am doing all I can to defend it. No. If Microsoft chooses next week to throw out this entire plan, I will honestly be partly relieved. Also partly concerned, unless it’s replaced with another plan that solves the problem as well or better.

I’m trying to explain the situation, to define the parameters of the problem that version targeting was designed to solve. That’s the only way there is any hope of finding a solution that makes everyone less unhappy. And by everyone, I mean us standards-aware developer folk and the IE team.

Because we have to live with what they ship, whether we like it or not. It’s better to work with them to find a way to like it, in my opinion. But working with them means understanding their needs and working to address them. Otherwise the two sides will just talk past each other, and we’ll be more likely to not like the end result.

I guess one big thing that bugs me is that this wasn’t done with the full knowledge / support of WaSP, or the W3C, or other browser vendors. It’s feels so clandestine and black-box, just dirty.

@#42 thacker That’s sort of been my point all along: we have to keep making these special accommodations just for MS and IE. First it was conditional comments, now it’s a special meta tag / server setting. What next? Let MS rewrite all the specifications they don’t like to suit their whims…? God I hope not.

I have to agree with Rachel Andrew here though in that MS’s track record is less than spectacular here (advancing and advocating standards), and I can easily see this kind of meta tag thing leading right back to “This site best viewed in Firefox 3”-style browser / site specific design. Is this now to become the new 800 pound gorilla in the room that no one talks about?

Richard Fink wrote in to say...

I tend to look at this whole issue from an economic point of view.
In terms of the chain of distribution.

Firstly, if Microsoft were to release IE8 and “break the web” as some people would like them to do in the false belief that this would force all sites everywhere to re-design and that such a thing is actually moral, I believe management at the IE team would and should lose their jobs.
Why anybody expects Microsoft to do something that is so clearly against their business interests is beyond me.
If I am understanding correctly, the IE team did not consult the standards bodies for permission to do this. They HAD to do this – the only question was the implementation.
Microsoft is a business; if you want to force MS to implement standards compliance and then force it down everybody’s throats, then the appropriate entity to approach is the government. Hey, it works for the auto industry. Maybe we can get browsers with better mileage.

Second, if you view standards-based features as a product (which it is), the supplier chain looks like this:

1) Browser makers
2) Authors (the people who create sites)
3) Users

Now, it seems to me that all the versioning tag does is shift the burden of responsibility for making use of standards from browser makers to the very people who make up the web: authors and users.

What seems to be sticking in some people’s craw is, incredibly, this shift.

If you’re using IE8 and the site you’re visiting cuts off at IE7 and doesn’t make use of standards-based features that you think would be helpful, well, then complain to the site!

What seems to be creating a lot of anger is that with IE8, there will no longer be a single target to shoot arrows at.

Without getting into the detail of my logic and interpretation, in my opinion, of what Microsoft is not saying, I see the following occurring:

1. IE9 – defaults to the standard rendering and requires the meta tag to default to the IE6 version if necessary.
2. IE10 – defaults to standard rendering and falls back onto quirks mode based upon DTD or lack thereof. The meta tag requirement is abandoned.

Keep in mind, three upcoming events:

1. IE7 being pushed, again, as an automatic update next month, 12 Feb 08 [could very well see IE8 Beta around that time, also].
2. Continued adoption of Vista and eventual sunset of XP — both upcoming Service Packs will include IE7 [if not mistaken].
3. Roll-out of Windows Server 2008 in April [incidentally, I am betting IE8 will roll-out the same month, possibly May]

For the record, if my assessment is wrong, I will then put on the hat of Carnac the Magnificent and blame it upon someone tampering with the hermetically sealed mayonnaise jar that was stored on Funk & Wagnall’s back porch.

Kevin H wrote in to say...

Eric, I’m curious to know how well versed you feel you are in the work that has been going on in the WHATWG, and whether or not you followed the thread which foreshadowed the creation of this new meta-tag switch, over at public-html?

I ask because although I only have a passing familiarity with those things, I find myself siding with Ian more than I do with you, and I have to wonder if your opinion is informed in the same way that his is.

It seems, to me, that Microsoft could support their IE7 quirks by default in IE8 for current doctypes, and that they could use the new Super-Standard IE8 mode for HTML5 (something they are apparently already going to do). It also seems to me that HTML5 can be molded to be backwards and forwards compatible, so it falls on the HTML5 spec to “not break the web” instead of falling on Microsoft. “Where HTML5 does break pages, we need to fix the spec”


Joshua Cender wrote in to say...

I’ve been brooding over this issue for the last few days. I’ve read as much as I can from both camps in order to try to make an informed opinion.

I strongly believe the default behavior of any browser should be the most standards-compliant behavior available.

However, as a software developer, I can understand Microsoft’s predicament. Some large corporations may refuse to update their code, and this is a problem. Why can’t the answer be a client side switch for quirks mode?

This will allow companies to use their intranets with the quirks mode switch enabled, and not hold back the rest of the world. The switch could be set via domain whitelist (or manually), similar to how pop-up blockers work, and should be designed to be easily deployed to network machines by IT departments.

Of course, this is still a little tricky for companies with public-facing websites that still rely on incorrect coding methods, but the switch could be an obvious part of the UI to allow users to “fix” the layout if one of the sites they frequent appears broken.

Wouldn’t this be the best of both worlds? No matter what happens on the server side, the clients still have a choice. And most importantly, it allows us to continue to push standards compliance as the default in today’s browsers.

First of all, I want to thank you for all the time you’ve spent considering, explaining, and re-explaining this topic. I have to admit that after reading Aaron’s article on ALA I was rather…miffed.

But through reading all of your posts, comments, the ALA article – I feel much better now about this entire topic.

Oddly enough, I think one small semantic change from “IE=edge” to something that doesn’t include the letters IE in it at all would alleviate some of the tension on this topic. If the meta tag value for standards-compliant pages were simply SC, then the need for anti-IE resistance would be removed.

Also, I feel my last issue with this topic is what you mentioned at the end of your ALA article, and that’s the possibility of IE bloatware. If that can effectively be avoided, then I’m ready for this implementation.

Eric,
On the default behavior:
I understand the concern that companies may not want – or know how – to update their code to standards for IE8. However, I find it hard to believe that they would be unable to add a meta tag to freeze their own sites in a past rendering model. Even a few months of warning before IE8 should be sufficient for these companies, no? Even sites that do break could be back and running in no time. I am not convinced by the argument that a fix like this requires the knowledge of a standards-aware developer.

elf wrote in to say...

this makes me glad im still using lynx as my primary browser. (speaking of which, if you could link a single-image version of the browser timeline, it would be much appreciated.) so, being as im not exactly up to date with whats going on with IE, this may be a naive question:
a) is it possible to have equivalent screen layout with code that would be valid in IE6/7/8 but not in some earlier/later version?
b) if so, what is the problem with translation engines and some form of warning, so any developers would immediately know there was a problem, what it was, and how to get the equivalent structure?
c) if not, why not?
apologies if this is missing the point entirely.

Tobias Güntner wrote in to say...

What about “render mode detection” based on CSS?
Consider, for instance, a stylesheet containing generated content or some well-known IE-specific hacks/workarounds. Wouldn’t it be reasonable to assume its author otherwise strived for standards-compliance and therefore switch to “real IE8 mode” in IE8, regardless of any doctype or (missing) X-UA-Compatible declarations? (Detected IE-hacks must be ignored, of course.)

Microsoft wants to default to 3. What about bug fixes for IE7, isn’t there a possibility those will break the web? Also, since Microsoft has to maintain IE7 mode, won’t that take precious resources away from improving IE? It sounds like a bad idea to me.

Why not default to 4 (IE8) and, as a short-term solution, only in IE8, allow developers to “opt in” to IE7 rendering mode until they can upgrade their sites? After all, IE7 “broke the web” but the sky didn’t fall. There are far fewer IE7 users than IE6 users (the company I work at still uses IE6), so won’t this affect a far smaller number of users? It was okay to “break the web” with IE7, why not with IE8?

If I were a manager I would say default to IE7, because I wouldn’t have to spend as much of my development budget. That will stagnate the move to standards. If I only have a temporary solution, I’ll have to make sure that changes are made to “meet IE8 standards”.

I think edge should be the default and no other value than IE7 would be valid to those who want to temporarily opt-in.

Nearly all of the outrage is about IE8 defaulting to IE7’s buggy rendering mode, not about the basic idea of making IE8 somehow backwards compatible with broken web design. Nearly everyone can pretty well understand the necessity of some backwards compatibility, because it’s a matter of fact that many of those big sites will not be redesigned to be more standards compliant.

There would be two solutions for it. First, MS should include a tool when shipping IE8. That tool could insert the extra version meta info into all pages of a site, if the site owner wants it. This would be the absolute minimum effort. Those sites then might break in FF as they do break now in FF, but that’s a completely different problem. Second, MS should offer the user a way to switch back to old rendering modes. If a website owner for whatever reason does not even want to use the MS tool to “downgrade” their pages, then the user might do so. The downgrade user setting could be stored in the registry on a per-domain basis. The user would thus be made aware, once, of that company’s complete inability to fix their broken pages, but after a single click the page would still be usable with IE8.

Ray McCord wrote in to say...

Breakdown of the Problem, Call for Alternative Solutions

The DOCTYPE was more of a “natural opt-in” than the current proposal. It didn’t feel like an opt-in because the split between pages written to quirks and pages written to standards fell (more or less) dead on the line of who used a compliant DOCTYPE and who didn’t. So, opt-in was essentially what happened last time with DOCTYPE switching, and so that is the approach this time around — to keep those millions of old documents that already work in quirkier versions of IE working in newer versions of IE, without changes representing phenomenal expense.

So, Microsoft WILL NOT break any pages that currently work with any version of IE to date and WILL NOT require them to be edited to accommodate moving forward with more complete standards compliance.

This means IE MUST be able to tell which standards mode a document wants, and can’t use just the DOCTYPE to do it, because it’s already been screwed up with IE 6. That means a switch must be located in the document (like a META tag or a conditional comment) or must be linked to the document (like HTTP headers, external stylesheets, scripts, images, or other objects).

The short of it is, if we can’t find some other way to signal that distinction in a more granular manner than a blanket opt-in/opt-out trigger, web standards takes a big blow and proprietary IE rendering gets a big shot in the arm — as IE 7 standards mode rendering becomes the default for all documents that don’t have some trigger in them that sends IE 8 and above into another mode.

MS CAN’T change the user agent string, because too much code uses that to determine the DOM and/or generate the actual page and/or dynamically generate the style rules embedded in the page and/or determine the availability of the IE feature set (ActiveX, VRML, COM interface, CSS expressions, VBScript, and more). If they change the UA string, they break IE-specific pages. This would devastate their intranet market and MS WON’T do that, as it has a honking load of back-office, front-office, server-room, and middleware market share depending on IE for web integration.

MS CAN’T just kill the existing bug-based style hooks because of open conditional comments like this:

<!--[if IE]>…<![endif]-->

Gotta love that. Such shrines to a pandemic of shortsightedness are throwbacks to a day when, if you targeted styles at IE 5, you knew it was quirks mode regardless of DOCTYPE. Likewise, if the page was being rendered in IE 6 and the DOCTYPE was valid, you’d assume IE 6 standards mode, and without a valid DOCTYPE it was quirks mode like IE 5. This logic assumed there would be no more IE versions after 6, or that things like Chris Wilson mentioned (about boxes overflowing against standards) would not be fixed in a future “standards mode”, because that is “just how IE does it”.
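The difference between an open-ended match and a version-bounded one is the whole problem here (the conditional-comment syntax is real; the stylesheet name is illustrative):

```html
<!-- Open-ended: matches every IE version forever, so styles written
     against IE 5/6 bugs keep being applied in IE 8, 9, 10… -->
<!--[if IE]>
  <link rel="stylesheet" href="ie-hacks.css" />
<![endif]-->

<!-- Bounded: the hack sheet stops applying once IE moves past 7. -->
<!--[if lte IE 7]>
  <link rel="stylesheet" href="ie-hacks.css" />
<![endif]-->
```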

If that whole thing about assuming “there would be no more IE versions after 6” sounds unforgivably naive, recall that MS announced no further development on IE after 6 came out. The IE development team was disbanded. People thought “standards mode” in IE would forever be the broken implementation that it was in IE 6. People coded to this assumption and a heck of a lot of pages got made in the interim between that announcement and MS announcing the IE team was getting back together to make another album…er, version.

Regardless, the damage had already been done. So, we have a plethora of deployed code that assumes “standards mode” would always act like it does in IE 6. Regrettably, the differences between IE 6 “standards mode” and the following IE versions will become more and more divergent. Unmaintained pages will become more and more broken over time, and less and less usable.

Another unfortunate factor is that, since IE 7 is already out without any new switching mechanism, nothing can be done about what IE 7 broke of the unmaintained pages out there. However, more pages will become unmaintained during the life cycle of IE 7. This makes for changes to standards mode behavior in IE 8 even more of a problem. With damage to maintained pages that broke with IE 7 getting fixed, the only logical fall-back point for the current “standards mode” is the interpretation as it is now in IE 7. This limits the damage to what has already occurred and is not being or has not already been fixed.

So, the question becomes this: Is there any way current and future pages can be distinguished from pages made for IE 6 or 7 “standards mode” implementations?

That is what the solution proposed by the IE team addresses.

More to the interest of standards on the web, I and others like me are refining this question to become: can we get around the problem using what is already available with current or soon-to-be standards and/or existing switching mechanisms?

Some initial ideas:

I think it’s safe to assume that a right vast majority of developers/maintainers/webmasters/what-have-you will apply different styles for IE 8 than IE 7. Further, it is likely that, in the absence of a great many parser bugs and other CSS syntax bugs to exploit, targeting multiple versions of IE for different rules using CSS hacks and the like will be untenable, and that the most logical and practical choice for accomplishing this feat will involve using conditional comments. That said, it should be looked at as a more granular switching mechanism than an explicit opt-in mechanism that assumes a blanket attitude of ambivalence toward web standards on today’s web. That is not the case, as the resurrection of IE and its slow lurch toward standards compliance makes so blindingly clear. So, let us not pretend it is 1998 anymore.

Good. Now, then, I have a thought that media queries, in conjunction with conditional comments, could be an option to look into for signaling IE 8 to use the latest standards mode. If one could target for IE, using conditional comments, a stylesheet using media queries — and since the only versions of IE that would know about media queries would be 8 and up — IE 8 would know to switch on maximum standards-compliance mode when it saw them. The very fact of pulling a sheet in to IE explicitly through a conditional mechanism like that makes the developer’s intent perfectly clear when all other switching tests would result in the use of “standards mode”. Further, having the media queries in a site-wide stylesheet pulled in explicitly to every page on a site makes the solution maintainable across successive browser releases. Existing conditional comments that inclusively match any version of IE greater than 7 can call the stylesheet implementing this switch and allow for progressive enhancement of standards compliance to be feature-based within @media blocks or additional stylesheets imported using a media query conditional on the relevant @import rule.

This is good in that media queries are already implemented, at least in part, in some major browsers, and so are likely to remain viable and part of some future standard even if CSS 3 never makes it out the door (and we have absolutely no reason to think that will happen). Also, the media query mechanism used in this switching scheme is, by default, relevant to more than just one browser, giving it a primary purpose in its own right; the switching behavior in IE would be only an ancillary use of its capabilities. Talk about your win-win situation: it justifies and encourages IE to take a massive stride forward in standards and capabilities.

Better still, conditional comments have been implemented in IE since version 5, with a commitment to their longevity. So there we have an existing proprietary targeting mechanism being extended in utility, and enhanced by standards, to become even more useful for developers. That beats a new burden on developers any day, in my book. How about yours?

The icing on that approach is its philosophy of operation and its support for “progressive enhancement”: it targets rendering capabilities and features rather than dictating them. Targeting this way lets you state which rules to use at a given level of rendering capability and available features. Versioning (the current IE team proposal) tells the browser which rendering capabilities to use and which features to make available for the given rules. THAT IS A HUGE DIFFERENCE! Read that again.

“OK”, you say. “That’s one option. But, I won’t use conditional comments because they are evil, proprietary, and smell bad.”

Fair enough. How about another?

Maybe Mr. Wilson and the gang could just implement an evidence-based decision tree that errs on the side of IE 7 rendering unless there exists evidence of explicit consideration for an IE version greater than or equal to 8 (such as using an explicit major version number greater than 7 in an inclusively matching conditional comment) or something like that. The important thing is to determine the intent of the author — that is to say, beyond a reasonable doubt. Since millions of pages hang in the balance, this would obviously necessitate a fairly conservative weighing of evidence. That comes pretty close to requiring the evidence be irrefutable. That’s a tall order, and I’m not sure developers would feel comfortable with the browser making any such “judgement” of their intent, no matter how simple, explicit, or well-known the rules of engagement were.

Some have mentioned that the HTML 5 DOCTYPE is our knight in shining angle brackets. It is also true that MS has indicated that any currently unrecognized DOCTYPE would be treated with the maximum available level of standards-compliance in IE 8 and up. This is the equivalent of the “edge” value for the original proposal’s META tag/HTTP header. Apparently, it has been confirmed that we get super-standards by default if we use the HTML 5 DOCTYPE. Good. Great! Right?
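For reference, the two opt-ins under discussion would look something like this in practice (the META form follows the versioning proposal’s “edge” value; treat the exact placement and tokens as illustrative, not as anything MS has finalized):

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- The HTML 5 doctype above is unrecognized by current IE, so
         under MS's stated rule it would trigger the most
         standards-compliant mode available in IE 8 and up. -->
    <!-- Alternatively, the proposal's explicit opt-in META: -->
    <meta http-equiv="X-UA-Compatible" content="IE=edge">
    <title>Example</title>
  </head>
  <body></body>
</html>
```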

Well, that DOCTYPE isn’t without baggage. (What is?) It does represent your document type as being what is currently a draft W3C standard, one which purports to be backward-compatible with previous versions of HTML and forward-compatible with direct successors to HTML 5. Okay, so we should be able to use current HTML 4 code with it, then. Maybe. It rips out some stuff and adds some new stuff. My take is that the default handling of unknown tags and attributes (ignore them) is the new way, same as the old way. That’s not backward compatibility. That is backward tolerance. Backward ambivalence. I guess it works if you don’t need anything that got ripped out, though. Most of the surgery was presentation stuff and redundant attributes like “name” on non-form elements. But I digress.

Now, depending on whether we end up with fallout from using the HTML 5 DOCTYPE as a switching mechanism too soon, we may find we need yet another breaking point, or we may end up constraining HTML 5’s development with limiting factors brought on by widespread premature deployment. Serious thought needs to be given to any potential ramifications before recommending this course of action. It is still a markup change on every page, and there is no way to centralize it for existing static files, as there would be if our switch were included via external global styles or some such.

Well, I’m drained just writing this. I think I’ll take a wee nap and turn the mic over to you. Whatcha got?

Let’s all put our heads together and see what else we can come up with that is more palatable than versioning and markup-based switching. I think we’d all like to see visual rendering issues kept separate from structural changes, if possible. It is a presentation issue, after all.

Thomas Tallyce wrote in to say...

Ingo Chao said:

Any meta like “IE=8”, “IE=9”, “IE=10” will add to the number of sites that are locked in. No meta means the site is locked in; any meta other than “IE=edge” means locked in, too. Can you imagine what will happen if 95%+ of the web sites in 2010+ are locked in to a version of IE?

This is surely the biggest problem with the whole versioning proposal. Having IE7 as the default that people have to opt out of will lead to immense stagnation over coming years.

Jack Sleight said:

What I’m wondering is how long the IE7 mode will be the default. Can we expect IE10, 11, and 12 to all still have the IE7 rendering engine built in, and be the default? Or will it eventually be phased out?

In other words, years down the line, the problem is still going to remain. It’s better that we get MS to move to a more standards-orientated future quicker, rather than just delay the answer to the same problem year on year.

As a software developer, I have serious doubts as to how feasible it is for MS to release IE in such a way that it includes a rendering component for each version of IE. That sort of approach surely is a complete maintenance nightmare, and I can well imagine some MS executive in a few years saying “this is ridiculous – we can’t keep shipping ancient buggy code like this”. It also introduces the problem of having to put security bug fixes (if there are any inside a rendering engine) into multiple places.

I think this whole argument would be much clearer if we had a vague idea of what exactly is proposed to be broken if IE8 were issued without versioning. No-one seems to have answered the question I posed on the ALA comments page:

“People using table tags and other stuff are surely not likely to be affected [by any upgrade to IE8] for the simple reason that IE already implements all of the basics, and the vast majority of CSS stuff to the standards, with only more complex floats and other more complex areas less consistently supported (but again, IE7’s Trident got a lot of that closer to other browser vendors’ implementations).”

I would agree with Eric that if versioning is to be introduced, “edge” needs to be the default. The way forward for MS is to make IE8 as standards-compliant as possible (i.e. as they are currently proposing, but without versioning) and implement the “revert to IE6 or IE7” switch by use of an HTTP header. They could issue a very clear set of instructions about the way to do this, e.g. a clear step-by-step guide to creating an Apache .htaccess file that adds the header, and the equivalent in every other server-side environment. That way, they can reassure site owners that there is a temporary fix while sites using the sort of complex CSS that MS claims will break are fixed up.

The way forward for MS is to make IE8 as standards-compliant as possible (i.e. as they are currently proposing, but without versioning) and implement the “revert to IE6 or IE7” switch by use of an HTTP header. They could issue a very clear set of instructions about the way to do this, e.g. a clear step-by-step guide to creating an Apache .htaccess file that adds the header, and the equivalent in every other server-side environment.

The use of an HTTP header for this is very problematic. If a page is saved to disk, the HTTP header is lost, and later viewing of the saved page will get the wrong rendering in IE8. I’m not the first to point this out. It might seem like a small detail, but it’s actually one of the major flaws in this whole proposal. Maybe IE8 and later versions will add the HTTP header as a meta element when saving a page, but earlier versions of IE won’t do that, and other browsers won’t either. We can’t count on IE8 only ever rendering saved pages that it saved itself.
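For concreteness, the server-side opt-out being debated could be a near one-liner in Apache’s .htaccess (the header name comes from the versioning proposal; the module guard and the “IE=7” value are a sketch, not anything MS has published):

```apacheconf
# Requires mod_headers. Pins every page served from this directory
# tree to IE 7's rendering engine in IE 8 and later.
<IfModule mod_headers.c>
  Header set X-UA-Compatible "IE=7"
</IfModule>
```

Note that this does nothing to address the saved-to-disk problem described above: the header travels with the HTTP response, not with the file.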

Brian Warshaw wrote in to say...

People seem to be worried about sites being “locked-in” to a particular rendering engine of IE. What’s the problem with this?

The whole point of the tag is to prevent existing sites from having to change code to comply with a new browser. The whole point is locking a site into a present rendering engine so that it doesn’t have to be adjusted until the coder is ready to adjust it, at which point the coder can adjust the meta tag accordingly, either to a later version or to edge. You should want your site locked into a working rendering engine until you’re ready to change it. If you are constantly ready to change it, then you opt in to “edge”. Tell me how forcing the opt-out won’t break the pages of people who aren’t ever-vigilant and ready to fix their pages for their users.

Brian Warshaw wrote in to say...

Thomas,

But you’re still saying that MS should tell site owners to make changes to their sites. It doesn’t matter how easy it is to make a .htaccess file; large corporations don’t like to be told what to do, even by other large corporations, and you can’t make them do it. It would be even harder to get them to start working on fixing the bugs.

Jason McLaughlin wrote in to say...

Thanks for sharing; that’s a sad but fascinating story. I think it’s a shame such things happen, but perhaps sometimes it’s well nigh unavoidable.

In this case, however, I completely agree with Joshua (comment #50), and expressed something similar at the IE blog. It seems like the solution to this problem could be a lot less onerous than versioning or most of the other ideas being bandied about. If Microsoft would default IE8 to standards rendering but allow the end user to selectively render content in legacy mode, and then make sure everybody’s made aware of the improved default rendering and this legacy shim before IE8 is officially rolled out, surely that would serve the needs of those who want such behavior, yet avoid holding back more standards-compliant development and rendering in the least. Or so I believe!

You obviously have experience with this sort of development, Eric. Do you see any fundamental flaws with that idea? I know it could be some small challenge to spread the word, but other than that…? I’m interested in your take.

[…] Eric Meyer discusses how, when he was working at Netscape, they wanted to release a standards-compliant browser but were forced to “quirkify” it due to its breaking websites. They tried contacting the producers of these websites, but they were not interested in conforming to standards to make things work in a browser: “it’s your problem”. […]

I don’t like this for a lot of reasons. Many of them already stated by others.

But if there is no stopping this on the whole, I’d like to suggest it be a switch that could be placed inside a CSS file, and not in the HTML.

I have sites that have a lot of template files/master pages, and these would all have to be updated, however CSS files are few, and there is usually a single master (or could be!) that could contain this switch.

This would make testing/QA easier since I would only need to update one file to test an entire site. It could be as simple as:

html { filter: XUACompatible.Microsoft.RenderMode( ie='8' ); }

It would also give us a way to target previous/future browsers.

I could have a ‘rendermode.css’ file with this in it, and simply import it into my global stylesheet.

In a different scenario developers could also do:

html { filter: XUACompatible.Microsoft.RenderMode( ie='8' ); }

html { filter: XUACompatible.Microsoft.RenderIEMode( ie='edge' ); }

But like I said, I’m against having to put this in the HTML – this however would simply be an alternate. Right now there is only the META tag. I prefer having more options.

The last thing I think this might prevent is mass proliferation. CSS files are not typically CMS/application templates, so this being built in by default is less of a possibility (it could still happen though, I’m sure). It could certainly be a code snippet, but only the developers who know what it is would use it.

JeffW wrote in to say...

I’m pretty strong for meeting the standards when possible, and when browsers implement them, but this brings to mind a quote I saw recently.

I don’t have the exact quote handy, but it was to the effect that standards writers aren’t the top of the food chain, they’re at the bottom, and what they write has no meaning unless someone decides to use their implementation. (fwiw, I think it might have been by someone involved with the html v5 team)

I’m a hardware design engineer, and we’ve long had to face the difference between documented standards and ‘accepted/implemented standards.’ Sometimes the documented isn’t ultimately the right way to go, either due to poor decisions made in that standard, or due to the need to match other not-quite-spec implementations.

So, ultimately, while we usually want browser makers and others to meet the written standards as closely as possible, there may be times when that is the wrong choice. Perhaps the spec should be changed. Contrary to popular belief, they aren’t gospel written in stone and blessed by God. ;-)

In this case you mention, I’d ask:

… did anyone consider changing the html/css standards to better reflect a working and established reality?

(instead of branching the implementations still further with more modes?)

… and was there some reason that the way the spec changed the behavior from the accepted/established was wise or advantageous technically?

JeffW wrote in to say...

Richard Fink said:
Second, if you view standards-based features as a product (which it is), the supplier chain looks like this:

1) Browser makers
2) Authors (the people who create sites)
3) Users

… don’t forget those who make the tools that help authors create sites. That’s another level, whether they be editors, syntax checkers, WYSIWYG editors, or the blog and CMS software that runs on the site itself. They have to adjust to changes too.

Ray McCord wrote in to say...

@ JeffW in # 72

CSS 2.1 is, in fact, a revision to bring the standards in line with the existing implementations. In doing so, it broke compatibility with a few implementations of CSS 2.0 and made some changes that were technically inferior to the original specification.

It was recognized that standards are only so when relevant. Thus, web standards do adapt to our shared reality. The W3C had retired HTML and started touting XHTML as the way forward. The development of HTML 5 recognizes that not everyone in reality wants an XML-derived markup language as the basis for the web. The rigidity of the standards is not the problem.

The problem is that the standards now agree with all other existing implementations, except IE. Standards compromised already. It is now the turn of IE to make the effort to come in line with the standards and other implementations.

@ JeffW in #73

The supply chain you state is incorrect. The browser “product” is not modified and redistributed by the Author of each site to the End User. That is just not how it works.

The browser is between the Author and the End User. It is not a source producer. It’s middleware. The source producer is the Author.

It’s important to note that, in terms of IE, if the Author is a third-party web developer or tool maker (who is not partnered with Microsoft), they are not viewed as customers of the browser maker. The browser maker’s customers are viewed as being the End User and the Site Owner. Period.

Even though we web developers provide the service which gives the browser its value to the end user and enables the site owner to extract value from the browser, we are rarely, if ever, given much consideration.

From the IE point of view, the supply chain is like this:

End User <- Browser Maker <- Site Owner <- Third Party (Web Developers and Tool Makers)

What power we have to demand change comes through our expertise, but only when we realize, and take the attitude, that the site owner came to us because they *need* us: we are experts in a legitimate profession that deserves respect. Only when we can sway the thinking of the site owner can our influence and demands be extended down the line to the browser maker.

Leszek Swirski wrote in to say...

Brian Warshaw: You seem to be the first person to be speaking any sense.

The whole point of this proposal is to “not break the web”, not to “not break the web after every broken page adds a meta tag to unbreak it”. Some pages are dead, some have idiot developers who refuse to accept that new browsers are out, some have idiot developers who don’t know that new browsers are out. Whatever the reason, saying that pages won’t be broken because you can unbreak them is very naive.

Austin Cheney wrote in to say...

I agree with the direction IE8 is taking. The problem is that people are confusing standards for requirements. According to the W3C the standards are recommendations and not requirements. The W3C provides standards that are relegated to mere guidance while the browser vendors provide the requirements. This is not my desire to favor browser vendors, but the reality in which the internet exists.

Identifying the problem – faulty reasoning
The problem is confusion between those "R" words. If the standards are not requirements, then idealistic, standards-lust-based arguments lose their appeal. The rational response is to ensure the recommendations become requirements, or at least enforceable options.

Identifying the solution – a migratory path for browser vendors
Unfortunately the standards are written as doctypes and not schema, so validation of code cannot be enforced. Browser vendors are more capable of enforcing the standards, which are otherwise unenforceable, merely through field use. The solution to the noted problem is to help browser vendors carry the torch of standards enforcement. Notice that I said standards enforcement and not standards compliance. In order to achieve this most desirable solution without immediately alienating backwards conformance, browser vendors must adhere to a three-part plan:

1) – Abandon slackware: Browser vendors must implement an optional standards mode that fails on non-compliant pages. This will allow designers to create future compatible content that is nothing less than well formed.

In the case of IE8, the IE8 processing engine must be optional if it actually wishes to be strict about the standards. Internet users must be made aware that IE8 does not yet require conformance to its standards, but that it will. This allows content owners to get their assets up to par before they are exposed as flawed, archaic, or simply incompetent. This would also satisfy the requirement of point 3, timely notice.

2) – Version control: Browser vendors must implement a version control system to warn web content owners that the browser will significantly limit backwards compatibility in future versions. This will allow time necessary for content owners to conform to the standards before their pages begin to break.

3) – Timely notice: Browser vendors must make public what a browser version will no longer support and what new features it will support 9-18 months in advance to allow the market to prepare for those changes. Without timely notice the previous two points are irrelevant. The point of timely notice is not to benefit designers or asset owners, but to protect the browser vendors. Browser vendors must view themselves as always legally liable and take steps accordingly to continually provide innovation while ending support for older versions that impair such innovation.

Intended result – standards become requirements
The goal is not to improve code. The goal is to make content asset owners aware that archaic content and markup methods will expire and that it is their liability to stay up to date. Without liability there are no requirements.

The only alternative – abandoning HTML
If standards are important they would be enforceable. If browser conformance to those standards were important then they would be given the liability of enforcement. If the mentioned steps are too much to ask for then HTML creation will never be standards conforming. Requirements are extreme in their strictness and lack of compromise or they are not requirements. If standards are not entirely followed then they are not standards.

If the mentioned steps are not appetizing then simply abandon HTML. Create a new markup language that is more semantic and defined by schema so that validation is a requirement.

[…] they were even interested in my opinion in the first place. Plus, having read what Roger and Eric have to say on it, as well as the 600+ comments on the IE blog entry, plus what Chris Wilson had to […]

[…] on the digital one. Eric Meyer has a beautiful example of a problem that has no right answer: You can either conform to the specs or you can make it work. In a perfect world these two actions would be identical, not mutually exclusive. This intrusion of […]

[…] had their say in it already. Now it’s my turn, d*** it! It was kind of weird to see Zeldman, Meyer and PPK all agree on something that seems it would have to be appalling to standardistas, in the […]
