March to Your Own Standard

So what’s up with the little grey button at the bottom of this site? It is my official Invalidation Badge. Its mere presence on every page of this site renders my entire domain XHTML 1.0 Non-Compliant. Invalid. Erroneous. Whatever you want to call it. Here are the various crimes this one line of code commits:

An ampersand is not properly encoded

An alt tag is missing

An attribute called “myfavoritetag” is made up

An attribute is missing quotes

A script tag is missing its type and language attributes

A non-closing tag is missing its trailing slash

A tag is upper case… gasp!
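For illustration, a hypothetical one-liner committing all seven crimes at once might look something like this (the actual badge markup isn’t reproduced here, so the tag choices, filenames, and values below are made up):

```html
<!-- Hypothetical example only; not the actual badge code. -->
<P myfavoritetag=badge>
  <IMG src="badge.gif?a=1&b=2" width=80>
  <script>var invalid = true;</script>
</P>
```

Uppercase tags, an unquoted and made-up attribute, an unencoded ampersand, a missing alt, a bare script tag, and an img with no trailing slash: seven violations in four short lines.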

By invalidating my entire site with this one line of code, I ensure that I am made aware the instant it matters. The instant this stuff starts to break anything in the real world, I will know. If I only had a few small errors on a few random pages around my site, I could easily miss the day when “the big switchover” happens and wind up with broken pages I don’t know about. And since this code is in the form of a server-side include, I can freely remove it with a few clicks.

It’s kind of like carrying a canary down a mine shaft with you. As long as the canary is alive and chirping, you know you’re okay for air. Actually, I guess it’s not really like that.

You moron, what are you trying to prove?

Nothing to prove really. It’s just a reminder that web standards are about a lot more than validation. Web standards are about all the processes involved in publishing information over IP. If you have a big red button at your company that employees press to make their pages live, that’s a standard. It’s your own strange and puzzling standard, but it’s a standard. If someone is going to work at your company, they need to learn how to push the big red button to publish their pages.

So how do we pick what standards to follow when we’re publishing on the web? If we pick standards that nobody else practices or recognizes, the benefit of the standard is limited to our own little world.

Who can we look to for guidance?

First we look to the W3C. We don’t look to them because they have any authority. We don’t look to them because we have to. We look to them because their very charge is to help us and their existence is for our benefit. They are not owned by Microsoft and they are not paid by the NRA.

W3C specifications are usually (but not always) detailed and well-thought out. They give us sets of tags with which to classify our data. They give us proper syntax with which to use these tags. They give us methods of styling our information with external stylesheets. And finally, they give us the ability to add intelligent behavior to our content using the document object model.

After several days of studying, some people could get a rough handle on the above methods. After several weeks, the same people could probably claim they have intermediate web skills. And after several months, these individuals might have more web skills than anyone in their neighborhood.

So what have these people learned sitting in the closet with their laptop and their O’Reilly book? How to deal with deadlines and workplace personalities? How to integrate art, editorial, and marketing? What web publishing is all about?

Not at all.

They’ve merely learned the building blocks of deploying 0’s and 1’s on the web. That may sound insignificant, but it’s not. It’s more than 99% of the world knows, and probably a good amount of the entire code-writing profession.

Alright, we’ve got our W3C stripes, now what?

First let’s go over what we shouldn’t do:

Don’t join the validation army

Just because you can validate your code doesn’t mean you are better than anybody else. Heck, it doesn’t even necessarily mean you write better code than anybody else. Someone who can write a banking application entirely in Flash is a better coder than you. Someone who can integrate third-party code into a complicated publishing environment is a better coder than you. Think of validation as using picture perfect grammar; it helps you get your ideas across and is a sign of a good education, but it isn’t nearly as important as the ideas and concepts you think of and subsequently communicate. The most charismatic and possibly smartest person I’ve ever worked for was from the South and used the word “ain’t” quite regularly. It didn’t make him any less smart, and, in fact, it made him more memorable. So all I’m saying is there are plenty of things to judge someone on… validation is one of them, but certainly not the most important.

Don’t move on to other technologies thinking you’ve mastered the web

A true mastery of the web takes years. You know all the rules, but only through trial-and-error will you learn all of the many exceptions. In fact, one could argue that the web has more exceptions than rules when it comes to how things display in browsers. One would probably lose that argument, but nonetheless…

Don’t get all Unabomber-like on us

Running your Olsen Twins Fan Club site out of the broom closet is not going to teach you anything about the collaborative environment of electronic publishing. Spewing validation manifestos on message boards isn’t going to show you how to listen, negotiate, compromise, and execute. You need to get out and work with the designers, writers, and businesspeople who will make your own work much much better.

Can we get back to web standards please?

Yes. We discussed earlier how anybody can have a standard. So besides using the W3C as a baseline, whose standards do we follow? The first thing we do is look to the masses. What are the really smart people doing? What sort of publishing trends are going on right now? Are people using floats or absolute positioning? What browsers are people finally shuttering off into irrelevance? We must look to the masses because it is the masses who we work with and it is the masses who we work to reach. The people linked to in this paragraph are not jedi masters because they validate their code. They are jedi masters because they have thrived within the incredible constraints of the browser world to produce beautiful pieces of code, design, and communication goodness. Todd uses a lot of Flash. Doug’s pages can get heavy. Shaun uses a lot of javascript. Jeffrey sometimes creates and deploys color schemes before they are actually in fashion. Dave gets daring and controversial on redesigns. And Dan has committed the ultimate in validation blasphemy by working with us at ESPN.

These people are important because they are uniters. They bring the designer and the coder together. They bring the business team and the production team together. They bring the rules and the exceptions together. It is their ability to mitigate in a world of competing interests which sets them apart. And they are nice guys to boot. We look to people like this to help us learn the best practices the W3C cannot teach us. We learn things like how to best integrate Flash into a publishing environment, how to use multiple stylesheets to change the color of a site every day of the week, and how to replace vomit-inducing browser text with anti-aliased typographical goodness.

Standards exist for the benefit of the web worker almost more so than the end user, and by following the best practices set forth by the best people in our industry, we ensure we are equipping ourselves with a versatile skillset which we can take into any environment. We may never work at a company which requires us to push a big red button to publish our pages, but we almost certainly will work at one which requires some of the methods set forth by these and other pioneers in the web publishing world. If you think standards are all about helping the disabled, you’re wrong. Accessibility is about helping the disabled, and there are both good and bad accessibility standards in use on the web today. Standards, on the other hand, exist so that we can use the minimal amount of labor and energy to create the greatest impact possible.

True enlightenment comes from within

The next and possibly most important standard we follow is our own. Tantek Celik followed his own standard when he brought us the Box Model Hack. Suckerfish created the gold standard for dynamic navigation with their CSS/JS dropdowns. And we at ESPN brought typographically rich scalable headlines to the world using Flash and Javascript. None of these things were being preached by the W3C, nor were they in use on the web up to that point. That makes them our own little standards. Standardlings, if you will… or, “rules we follow which others may eventually follow.”

How a standardling sheds its ling

In order to become a meaningful standard on the web, a method must be judged as a generally positive thing, and then deployed on a widespread basis. A good example of this is how Macromedia used the object and embed tags to display Flash seamlessly within web pages. It didn’t conform to W3C specs, but it worked, and it worked well. The method displayed Flash seamlessly on any browser and failed over silently when set up correctly with javascript. Additionally, users on PC Internet Explorer were able to download the plug-in virtually transparently. As a result of Macromedia’s smart yet “invalid” implementation, the Flash plug-in now permeates over 85% of the world within 14 months of whenever a new version is released. How happy would you be as a web developer if someone told you that everybody in the world would have a shiny new browser within 14 months? Macromedia created a standard from within, and now everyone is reaping the benefits of it. And we haven’t even touched the subject of Flash itself being a standard. That is an incredible accomplishment as well.
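The dual-tag pattern in question looked roughly like this (sketched from memory; the filename and dimensions are placeholders). IE picked up the object tag, Netscape-lineage browsers picked up the nested embed tag, and each ignored the one it didn’t understand:

```html
<!-- Sketch of the classic dual-tag Flash embed; values are placeholders. -->
<object classid="clsid:D27CDB6E-AE6D-11cf-96B8-444553540000"
        width="300" height="120">
  <param name="movie" value="movie.swf" />
  <!-- <embed> appears in no W3C spec, hence the "invalid" part -->
  <embed src="movie.swf" type="application/x-shockwave-flash"
         width="300" height="120"></embed>
</object>
```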

If you still need more evidence that breaking the rules can be okay, take a look at the aforementioned Flash headlines we put to use on ESPN.com in 2001. Our method allowed us to create dynamic, scalable, and anti-aliased headlines using any typeface without adversely affecting any browsers. The code used inline javascript and wasn’t totally ideal from an “under the hood” perspective but we felt the attractiveness and functionality gain was worth the paltry fee.

But what happened next is what’s really important. Shaun Inman, an ESPN user and churner of much butter in the web design world, came up with a way to deploy these Flash headlines in a much better way from a coding standpoint. It’s called IFR, and if you haven’t seen it yet, check it out. It’s what Shaun uses on his site, it’s what I use on this site, and it’s what we’ll be using on ESPN.com and other Disney properties in the very near future.

So what we have now in IFR is a very promising young standardling which is about to be adopted by a very big company. It’s the result of our original imperfect implementation, followed up by a near-perfect implementation at the hands of the I in IFR. Do I really care how many people adopt it? No. It benefits us and our users and that’s all I care about. If it benefits you and your users, then great, use it too. And don’t forget to shoot a nice thank you note to Shaun. Maybe one day it will be the de facto standard of how to enhance typography on the web.

And now back to that Invalidator Badge thing

So aside from IFR, I am excited about another standardling which makes its way into this world on this very page. I’m officially opening up the Invalidator Badge to the public domain. As Al Gore would say, “It’s Open Source.” Simply copy and paste this code…

… anywhere on your site and let the fun begin. If you’re really good, you can even further invalidate the code in other harmless ways. Remember, this project is what you make it and my code is but a starting point. You’re welcome to use the badge as is and I think I might even publicly shower with compliments the person who is able to put together the most invalid benign code sample in 100 characters or less.

And then maybe after you’re done with that, you can get some real work done. :)

68 comments on “March to Your Own Standard”:

Second I wouldn’t go so far as to completely vilify validation. While I agree that validation isn’t something worth wagging fingers over, it is extremely useful in a production environment.

As you’ve beautifully illustrated, it doesn’t make a spit of difference to today’s user agents, which are still accustomed to bruised mark-up, but validation is an essential first step in troubleshooting rendering or display issues. Once all the simple, obvious things are accounted for (encoding the required entities, closing open tags, correcting tag case) it’s far easier (and less time consuming) to track down the source of the problem. Otherwise, I need to sort through line after line of the validator harping on inconsequential errors in order to find the real cause—assuming it’s even a problem with the X/HTML! But once the source is valid then I know it’s either a problem with the CSS or a particular browser’s misrendering of the CSS.

Being able to eliminate some guess work on the way to a solution and the resulting time savings is reason enough to write valid code.

I validate pages to look for errors in code. If I find no errors, then I know it is a CSS problem. If the CSS validates, I know it is a browser problem and I can begin to work around it.

Validation matters in a production environment, but sites are not always going to be valid. Why? Say you design a site and hand it off to a client. There is no telling what they will do to the site once they get a hold of it.

Yep. I agree with both of you. Validation, right now, is most useful as a tool to help you debug your own stuff. But if you’re perfect like me, you never have to debug. :)

Honestly though, I find a lot more of my display errors during design phase are related to browser quirks than invalid code. What I really need is Doug Bowman or Dave Shea built right into my OS. Kind of like Clippy the Microsoft Office assistant, but with useful stuff to say.

One feature I wish the W3C validator had was the ability to ignore certain errors. For instance, we still have hundreds of unencoded ampersands littering ESPN because of our ad server, and any attempt at debugging based on the validator’s output is tough.

Anybody know of any client-side validators out there which let you turn off certain errors?
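One stopgap, short of the feature I’m wishing for, would be to filter the validator’s report after the fact. A sketch, assuming a plain-text report with one error per line; the report lines below are invented for illustration:

```javascript
// Hypothetical validator report, one error per line (real formats vary).
const report = [
  "line 12: unescaped ampersand in attribute value",
  "line 30: required attribute \"alt\" not specified",
  "line 41: unescaped ampersand in attribute value",
];

// Errors we have consciously decided to live with (the ad-server ampersands).
const ignored = /unescaped ampersand/;

// Everything that remains is worth actually debugging.
const remaining = report.filter((line) => !ignored.test(line));
console.log(remaining);
```

Crude, but it would at least keep the hundreds of known ampersand errors from drowning out the one error that matters.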

You made some very interesting points here and you’re right: in some cases validation becomes its own purpose, which it shouldn’t be. My opinion is that pages should be valid, but you made me think, and you will probably win me over not to put validation links up in the future.

Ouch, I wrote a long bit. So, I put it online in my weblog, but for your viewing pleasure I’ll also paste here:

Mike Davidson writes about invalidation. He introduces a ‘this site does not validate’ image on his site, which has totally worthless code in it. The basic line of thought is: when the browsers start to actually care about such things, you will notice immediately instead of tiny bits of your site breaking down one by one… or something. Well then, I thought, let’s put this on grauw.nl…

Er, look at it with Mozilla (Firefox) or Opera please ^_^. You’re looking at the regular version without that piece of code right now. The same would apply for the MSX Assembly Page by the way.

The thing is, these sites are made using XHTML 1.1. ‘So what’, you or Mike Davidson might say, ‘my site is made using XHTML as well’. Indeed it is, but the page’s content is still sent using a MIME type of text/html, while the actual MIME type for XHTML is application/xhtml+xml. This is an obvious choice, as Internet Explorer understands nothing of the latter type and offers to download the web page instead. No harm done; XHTML is specifically designed to make this possible.

However, web browsers send information in their headers about which content types they accept. These sites of mine check whether the XHTML MIME type is accepted/supported, and if so present their content using application/xhtml+xml instead of text/html. Currently, both Mozilla and Opera support this (I don’t know about Safari). When they display the site, they use their XML parser instead of their SGML parser.
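The check described above can be sketched like this (an illustration of the idea, not the actual site code, which is PHP; real negotiation would also weigh the q-values in the Accept header):

```javascript
// Serve application/xhtml+xml only to browsers whose Accept header
// advertises support for it; fall back to text/html for everyone else.
function pickMimeType(acceptHeader) {
  const accepted = acceptHeader
    .split(",")
    .map((part) => part.split(";")[0].trim());
  return accepted.includes("application/xhtml+xml")
    ? "application/xhtml+xml"
    : "text/html";
}

console.log(pickMimeType("text/html,application/xhtml+xml;q=0.9,*/*;q=0.8"));
// → application/xhtml+xml
```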

Now one might say, ‘then why bother with XHTML’, but despite this there are several advantages of XHTML over HTML… I’ll name a few. First of all, the point of XHTML is that it is compatible with XML and can be processed by standard tools. Anyone with a scripting language that has an XML library available can easily syndicate all the content from my site, and if I wanted at some point in the future to extract my blog entries into Word documents, I could (theoretically ;p) use XML tools such as XSLT. If my page doesn’t validate as XML, none of those tools will work with it either.

Another reason for using XHTML is that you can easily embed other content defined in XML, such as formulas beautifully rendered using MathML (don’t use IE for that page). Or use XSLT, but this time client-side as a stylesheet. And finally, I’m using XHTML because I am just a sucker for standards :). (Yes, I like unicode too).

Actually, this built-in validation is also quite useful to make sure I immediately notice any nonvalidating code and can fix it :). However, remember that this is only XML validation. It checks for stuff like proper nesting of elements, quoting of attribute values, and escaping of &, < and > characters. It does not check, however, whether an img tag has an alt attribute. Though now that I mention it… that is expressed in the DTD. Guess it just doesn’t bother to check the entire DTD then. Ah well.

Wanting to validate however means a lot of extra trouble you have to go through, as discussed several times earlier (forgot link). For the MSX Assembly Page I mentioned earlier XHTML is a breeze – the content is static, and I can simply tell the other maintainers to write valid code and test in Firefox or else I’ll be angry.

Difficulties start to emerge when you lose control over the content. This happens, for example, with user comments which allow style to be applied to them, which I am working on for this site. Those suddenly need to make sure style tags are opened and closed correctly, that characters are escaped correctly, and that all the other rules of validation are met. And then there is the choice: will I force my visitors to write correctly styled comments, or will I try to make the best of it if they make any errors? The first is obviously the easiest to realize, and the question is how much of a bother it will really be; otherwise it will probably not look as it is supposed to anyway. ‘And if it doesn’t suit you, then just don’t style.’ ;p
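The escaping part, at least, is mechanical. A minimal sketch (a real comment sanitizer would also have to balance or strip whatever tags it allows):

```javascript
// Escape the three characters that break XML well-formedness in text content.
function escapeForXml(text) {
  return text
    .replace(/&/g, "&amp;")   // must run first, or we double-escape
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;");
}

console.log(escapeForXml("Fish & chips <b>rule</b>"));
// → Fish &amp; chips &lt;b&gt;rule&lt;/b&gt;
```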

Another thing is JavaScript. I am absolutely not against using it as a means to an end… When used with care, it can do really cool and useful stuff, such as the comment preview system on Mike’s site. Unfortunately, document.write is an absolute no-no in XML. So this nice comment preview system… ain’t gonna happen. The way it is written right now, at least. XML definitely increases the complexity of some scripts, depending on what you want to do, and much more so than XHTML vs. HTML. Also, there are very few pre-made scripts available which work in a strictly validating XML environment, so you do need to know how to program JavaScript, or know someone who does.

In any case, XHTML is really nice, but XML and in particular validation don’t make life a perfect and smooth experience. They make it more difficult in the sense that you must follow the rules more strictly. But the way I see it, it really is just a matter of making my web site code ‘better’ (I am referring to both the HTML and the PHP scripting). It is basically the difference between ‘it works’ and ‘it works correctly’. Not producing valid code is, IMHO, a bit of lazy programming. In ‘real’ programming (think C++, C#, ASM, etc.) there are exact rules too which have to be followed to the letter or else it won’t work, so why should I ignore those rules when it comes to web design? Hence, I don’t (or try not to).

If you don’t agree with that, fine, that is a choice you make. I can understand not everyone is used to the strict regime of programming, and that somewhat looser rules are appealing to many. It also makes creating web pages easier to learn (though at the same time it teaches bad habits). What would make me personally very happy is having a valid doctype present on your site, preferably XHTML marked-up code or at least HTML Strict, and a careful eye to keep your code structured, not use tables for layout unless really really necessary, and keep things more or less validating using the occasional W3C validator check (alt IS important for disabled people). And the occasional misstep – I won’t lose a night’s sleep over it.

Hi Laurens. First off, congratulations on being the inaugural dissenting opinion on this site. You are the first in what is soon to be a very long line.

I just have a few responses:

1. First and foremost, although the invalidation badge is somewhat tongue-in-cheek, deploying it on your site without a Transitional doctype is probably not advisable. That said, I only tried clicking your example in Safari and everything looks fine to me. From all the exclamation marks, I thought someone had died or something.

2. Choosing a doctype is a personal decision. And the best part about it is that you can choose Strict and I can choose Transitional and our users will both be able to see both of our sites. Not only that, but they won’t even know the difference unless they see our validation/invalidation badges at the bottom. So I agree with your decision to use Strict, and I also agree with my own to use Transitional.

3. I see what you’re saying about how there is a future in strict XML documents, and I’m ready for that to happen, but none of the reasons you stated in your post seem compelling enough to really worry about it quite yet. I’m sure MathML is really great and everything, but I don’t need it. I do, however, need to do things which the Transitional doctype lets me do. The doctype works for me, I don’t work for it.

4. You said that document.writes are a no-no in XML and I agree. And that is all the reason I need not to code in strict XML right now. We use document.writes all the time to insert all sorts of useful things and there is no reason to give up that utility or any other utility just to say we’re Strict.

Anyway, that’s about all I have. I’m not really disagreeing with you. It’s just that invalidation is not for everyone…

I’m not any kind of web guru, and getting into an apparently big-names conversation makes me feel guilty, but anyway… I just want to say that this is the most rational post about standards I’ve seen in a long time; thanks for taking the time to write it. Maybe I will use your invalidator if you give me permission to translate it into Spanish (¡hola! as you may have guessed by now, my first language is not English).
Thanks.

Before Inman has a chance to replace anything else, I would just like to say bravo. Great and well thought out, Mike. From the first bit I ever learned about standards there was a nagging in the back of my head. I am all for things conforming in the way my work gets displayed… but how many times have standards left you cursing in the middle of the night, staring at code on a white screen? I’m on board your train, Mike.

It’s not even a matter of bucking standards; it’s something I will agree to until I am blue in the face: open yourself up to new ideas and inspirations, don’t just follow the same path as everyone else, become an innovator. Standards are a way (and do fight for the good of the general public), yet certainly not the only way. But then again, what do I know? It’s in my nature to try to come up with new avenues to use; I am coming at this from a designer’s perspective… I just moonlight with some code here and there :D

I agree with most people’s views on the matter: validation is a very useful tool in the debugging of webpages.

Validation has its uses; I just think it’s important that these uses be recognised. There are too many starting-out web gurus who have been led to believe that webpages should be valid under all circumstances. The truth is, validation isn’t always necessary. What do you guys think?

Mike – you really should look at it with Firefox or Opera. In Firefox you get a big orange screen telling you that the XML parser found an error, and nothing of the actual page shows. In Opera, you get to see the parts of the site before the error, but there too, a huge error notice appears.

As always, quality content wins out. I found my way to this site because someone thought this was a quality post… I’d refer back to the 6 smart people you linked to, whose quality content is the real reason why anybody visits their web sites, and is not affected at all by validation issues.

I’m sad to see your comments, because it is obvious you don’t know the difference between a standard, a best practice, a gruesome hack, and an abstraction. And yet, you lecture your readers on these concepts and more.

Those who do know the difference will obviously ignore your post, which is full of some of the most ridiculous confusions I’ve run across in someone so otherwise talented. Validation doesn’t matter? To whom? A standard followed by only a few people is qualitatively the same as a standard followed by hundreds of millions? The W3C is not the source of Web standards, and has no authority? Web standards are just about zeros and ones and don’t have anything to do with Web publishing? Grammar isn’t important to computers? Do you know anything about how computers work? What sort of ill-advised nonsense is this, anyway? Do you add letters to the alphabet when you write your local newspaper, because the 26 letters that “someone” came up with are too limiting?

I hope that anyone reading this deeply misinformed rant of yours will have the good sense to ignore it, and you, and focus on what matters – producing high quality work, using the highest possible adherence to the fundamentals that govern all programming and markup and design, and respecting the fact that just because you do not like or appreciate a truth does not in any way diminish its power over you.

You may think that it is cute to deliberately invalidate your pages – and it is, in a sort of childish and foolish way. You show the world that you can’t tell the difference between validation as an act and the proper and informed use of Web standards, of which validation is just one building block.

But in so doing, you have essentially guaranteed that new errors will go unnoticed (not, as you claim, the contrary) because you have abandoned one of the firm underpinnings of good design: consistency and adherence to the rules (otherwise known as “syntax”, or “well-formedness”, et cetera, depending on context).

You may not think it matters, due to some misunderstanding of design: some high-school writers workshop platitude about “the best writing starts by breaking the rules”, perhaps.

Or perhaps you think, as you admonish the rest of us not to, that you’ve somehow mastered the rules of Web design and now you are free to break them.

Well, congratulations, if so – the rest of us will be right here waiting for you when you come round to the error of your ways. I know I made the same mistake, despite knowing better, back in 1995 – “validation doesn’t matter”, I said – “these browsers don’t support SGML anyway”. And this from someone who was using tools that wouldn’t even open non-valid SGML documents; who was writing perl scripts to help convert invalid documents into valid ones. I knew better, but I still went along with the “I’ll make my documents validate when it matters, when the browsers actually use SGML parsers” crowd.

And, so, with the rest of the short-sighted idiots, we went through five or six years or more of painful, browser-specific, hackish and invalid design work, costing everyone more for no good reason. We wasted time dealing with trivial crap when we could have been focused on the things you correctly seem to believe really matter: pushing the design and creativity envelopes. I personally have spent hundreds, if not thousands, of hours, including the past hour or so, having to argue for something that should never have been an issue in the first place; dealing with browser vendors and journalists and developers and designers and conference attendees, just trying to convince them why standards matter.

Web standards aren’t a religion (to me, anyway – I’m sure there are zealots, just as with any other arbitrary conflict) they are a building block, like the underlying protocols that make TCP/IP and the Internet work, or that allow me to type on this keyboard from my Macintosh, and know that my exact words will be properly represented on your Unix server and display properly on someone’s Windows desktop.

At best, I guess your essay falls into the class of work I’d call “the don’t sweat the little things” oeuvre. The slacker ethic. The folks who think that they can trade off more attention paid to things that are fun, cool, easy or interesting, over and against neglect of the boring, icky, square stuff like correctness, syntax, and logic. As such, I don’t feel the need to pay it any more mind than this.

But I wanted to make my stance clear and hope that you can come to some more sensible position after you have had a chance to think about it more, cleared of some of your more deeply wrong misconceptions and biases.

Oh, and:

standard: e. g. ASCII, ANSI C, ECMAScript, PPP (or, loosely, the Recommendations provided by the W3C). Standards are ideally created with the intention of widespread adoption to prevent pointless diversions of approach and implementation, for the sake of efficiency and to reduce waste and incompatibility. That there are often many standards for a given practice is no drawback, but rather a function of competition between those with investment in the one or the other, and offers the market a choice. Often several are adopted, by those not forced to agree on one (as in VHS, a consumer grade video standard, and Beta, the higher quality, still used in high-end production).

best practice: e. g. IFR, as a replacement for the Fahrner Image Replacement technique, is a best practice. It is the best of many ways of achieving similar results, that satisfies the most of the possible goals.

gruesome hack: e. g. the fact that you couldn’t use Flash in both IE4 and NS4 without using both the <object> and <embed> tags. Not something to be celebrated, rather something to be endured.

abstraction: deliberate ignorance of the differences among a class of objects, to allow focus upon the common aspects; also useful for hierarchical organization. e. g. Web standards as a phrase for describing both the letter and spirit of the “law”, as a means to a commonly shared and supported platform for Web publishing.

I didn’t even bother to count the number of errors because there appear to be so many. And this is the site that Mr. Champeon is CTO of! Mr. Champeon, a leader of the WaSP! Not just a leader of the WaSP either… but the WaSP member who singlehandedly turned me off from his organization with an e-mail exchange we had after the ESPN.com relaunch. Luckily I was contacted separately by other WaSP members wishing to offer their sympathies, so I don’t condemn the organization as a reflection of Steve. There are clearly other people involved in the WaSP who I share ideals with.

I think you probably would have been better off taking your own advice and ignoring this post since according to you, it is nonsense. Now you just look like a hypocrite.

http://www.hesketh.com validates perfectly for me as XHTML 1.0 Transitional using the W3C validator. My guess is that the htmlhelp.com one sends a user-agent string matching an extremely old browser (NS4 for example) and the Hesketh site is set up to serve old crufty HTML to old crufty user-agents.

Hmm… after a bit of poking around there doesn’t seem to be a user-agent sniffer on there after all. I don’t know where the discrepancy between the two validators is coming from, but the page is certainly valid.

hesketh.com validates fine for me. If Steve just fixed it to avoid looking bad in this discussion, it only goes to show how easy it is to fix most validation errors.

Of course, whether it validates or not, whether or not the rest of the WaSPs agree with him, and whether or not Steve is a hypocrite, an angel, or a child-killing satan worshipper with bad breath has absolutely no bearing on the validity of his argument.

Thanks for the validation, Mike, if you’ll pardon the pun. I’ve fixed all the errors you found, and now I have something to take back to the team who did the markup. The site validates now.

Nearly every error you found, for what it’s worth, was caused by weaknesses in the tools used to produce the documents: copy-paste errors introducing non-standard characters, or duplicated IDs, or a mixture of HTML and XHTML, or just junk markup (one presentation given in Powerpoint and then saved as HTML).

I wish they all followed the same standards; then I wouldn’t have had to waste another fifteen minutes hunting all these errors down.

What you missed, however, in your eagerness to paint me as a hypocrite, is that the vast majority of the site is valid, is coded in XHTML 1.0 Strict, and is usable in everything from lynx and the lowliest handheld browser (such as EudoraWeb on my Kyocera Smartphone) to the most powerful desktop browser. And because the markup is, with very few exceptions, structural and semantic, and the presentation is achieved by way of CSS, it is easy for us to keep the same underlying structure while providing a completely different look and feel. And it’s a lot lighter than our old site, so we’re saving on bandwidth and CPU and in a variety of other ways.

Can you say that about ESPN.com? Can you say that about any of your sites? How much money do you waste every day because of your stance on standards? Does your boss know about all this waste? How many people visit your sites every day and find they can’t use the site due to your approach to “good enough for most is better than nothing” Web design? Do they even tell you? Would it matter?

Anyway, I made my point, you made yours, and it’s up to the next reader to decide whether a few validation errors on my company’s site that took me fifteen minutes to fix invalidates my arguments above, or if you win because you got to call me a hypocrite and make uncalled-for comments about the Web Standards Project, on whose behalf I did not post.

The discrepancy comes from the fact that the WDG validator validates entire sites, while the W3C validator just validates an individual page.

While Mr. Champeon’s front page may be perfectly valid, apparently countless other pages around his site are not. This is the beauty of the WDG validator… it doesn’t let you get away with that kind of stuff.

Great article… sure to stir up all kinds of fun. I may post more on the topic later on, but before I forget… you mentioned looking for a validator that allows you to turn off errors in #3. CSE HTML Validator lets you control whether an issue is reported as an error, warning, or comment, or turn it off completely. It will also follow links and return reports on any page it follows (restrictable to directories). Anyway… we find it quite useful to make it report on only what we want it to, so I thought I’d pass it along.

Steve Champeon should just serve his pages with the application/xhtml+xml MIME type when the browser indicates it accepts that; then you will KNOW, instantly, whether your page is at least well-formed or not :). But his page having some validation errors is really the browsers’ fault, not his. When you are programming something, errors sneak in every now and then. They are often ignored by the browser, and one doesn’t run the validator on every page one creates, so often they go by unnoticed.
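[Editor’s note: the content negotiation described here can be sketched roughly as follows. This is a hypothetical helper, not Steve’s actual server setup, and it deliberately skips q-value parsing (a client sending `application/xhtml+xml;q=0` would be misclassified); a production version should parse the Accept header properly.]

```python
# Minimal sketch: serve XHTML as application/xhtml+xml only to clients
# whose Accept header mentions it; everyone else gets forgiving text/html.
def choose_content_type(accept_header: str) -> str:
    if "application/xhtml+xml" in accept_header:
        # Strict mode: a single well-formedness error stops rendering.
        return "application/xhtml+xml"
    return "text/html"

# A typical Gecko-era Accept header vs. one from a browser (like IE6)
# that never advertised XHTML support:
print(choose_content_type("text/html,application/xhtml+xml,*/*;q=0.8"))
print(choose_content_type("text/html,*/*"))
```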

In any case, again, my stance on this subject is that (X)HTML is a programming language just like any other, with just as strict a syntax as any other. If you forget to add a ; after a statement in Java or C++, it will not compile. Similarly, if you forget to close a tag in XHTML, it doesn’t validate anymore, and when the browser is in ‘strict’ mode, such as when the page is served with the above-mentioned MIME type, it won’t render at all. Basically, saying you can’t handle such strict rules doesn’t make me think highly of your programming skillz0rz. Just like my bank’s website officially not supporting non-IE browsers anymore makes me place strong question marks over the quality of the underlying code and the security thereof. I mean, if they aren’t even capable of producing a good web page…

Anyways, the fact that most browsers ignore such errors (or rather, try to make the best of them when the page has a text/html MIME type, for legacy reasons) doesn’t mean it is the best solution. It makes people lax and fosters the idea that writing bad code is fine – it is not. I really think that if browsers had not accepted invalid code from the inception of the internet until now, we would live in a better world :). True, it would perhaps be less easy for beginners to start web design and it might have dampened the success of the internet; on the other hand, instead of trying to edit code by hand, those ‘n00b’ users would have used WYSIWYG tools, which obviously would also have created conforming code.

Or there might be some other web expression language. Similarly to C++ vs. Basic, one for the professionals and one for the dummies. But I actually guess we got that now… XHTML vs. IE-ish-HTML-without-DOCTYPE ;p.

As Steve, ever friendly (tsk), pointed out: “…is usable in everything from lynx and the lowliest handheld browser (such as EudoraWeb on my Kyocera Smartphone) to the most powerful desktop browser” (…and the rest of that paragraph). That is one of the really important things. For that, validation really is just a tool, not the be-all and end-all (or however the saying goes ;p). But having valid code and following the standards in the places the validator can’t touch (such as using tables for tables and not for layout) takes care of 99% of the effort of creating an accessible site.

In CSS groups, if people ask a question like ‘hey, this doesn’t work as it’s supposed to’, or something similar, the first answer that comes to mind is ‘have you validated your code?’. In many cases this takes care of the question. Having valid, standards-based code solves many interoperability problems, and problems with web sites in general.

And about that WDG validator – no wonder such numbers of errors scare you. Whole-site validation, pfff… Often errors are introduced by scripts running on many pages, so you see the fault in one script recur many times over. Validating and fixing one page at a time is much more doable and less discouraging. Also, one fix usually takes care of multiple errors: a page with 30 errors probably only needs 10 small issues fixed, and if those errors come from a script used on 20 other pages, one fix removes 600 errors at a stroke.
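[Editor’s note: the arithmetic here is easy to see if you group a whole-site report by error message. A small sketch with a made-up error list, shaped the way a site-wide validator report often looks:]

```python
from collections import Counter

# Hypothetical report: the same template bug shows up once per page
# it is included on, plus one genuinely page-local error.
errors = [(f"page{n}.html", "unescaped ampersand in nav script") for n in range(20)]
errors.append(("index.html", "missing alt attribute"))

# Group by message: 21 raw errors collapse into just 2 distinct fixes.
distinct = Counter(msg for _page, msg in errors)
for msg, count in distinct.most_common():
    print(f"{count:3d} x {msg}")
```

Sorting the grouped counts puts the shared-template bug at the top, which is usually the single fix with the biggest payoff.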

~Grauw

p.s. Feel free to validate my page ^_^.
p.s.2. It is brand new and still in development, so forgive me the not-so-sparkly looks of it.
p.s.3. I don’t hate you for not having validating pages ;p.
p.s.4. I actually really dislike the idea some people have that if your page doesn’t validate 100%, it isn’t good. Take a look at the code, see if it is structured, nicely separates layout from content, has a doctype, degrades well when you disable the styles and/or view it in a more limited browser or device, and only then form your opinion about it. Needless to say, I have no doubt that Steve’s pages leave little to complain about, being the person he is.
p.s.5. I also really don’t see a need to intentionally break your page’s validation just for the sake of it. :) I wouldn’t say childish, because I really have no intention to offend you, nor do I think it will make you any more willing to share my opinion, but it is rather pointless and does nothing except make a bit of a fool of yourself.

Paul, yes, there are CSS errors – and it’s very well documented as to why they are in use. As older browsers are retired, we’ll be able to fix our broken CSS in one place, rather than having to scrap an entire site and start over. We consider this better than introducing invalid markup across the entire site as it allows us to restrict our deliberate workarounds to a single file or small set of files.

Feel free to read the comments at the top of the stylesheet you pointed to, it may explain things a bit better and maybe give you some ideas as to how to work around browser bugs (or exploit them to work around still other bugs).

As parser workarounds take into account the way things actually work in actual browsers, and can be easily hidden for pure validation purposes, just as Mike’s little button can, I feel quite justified in using them when necessary.

Some feel differently. There’s an argument to be made for both sides, and which side you come down on depends on what matters most to you – using baseline Web standards to save time and money while providing universal access to the end user, regardless of their choice of platform, or hewing to the strictest possible interpretation of the standards themselves.

Thank you for the reply, Steve. To be honest with myself, I should have taken a look at the actual stylesheet before saying anything, for that I apologize.

I think people are making some really great points here, especially you, Steve. I love reading what other people have to say about semantics and web standards, even though it seems to bore many – which I don’t particularly understand.

Interesting post indeed, and I must say I’m not surprised that we’re already seeing the clashing factions appearing. While sites I create are, at the time of going live, valid XHTML 1.0 Strict, the loss of control as they pass over to the client for upkeep and the adding of content (often simply copied and pasted) usually means this lasts for a matter of days.

While I personally code to the w3c standards, I know exactly where Mike is coming from and why he’s reached his decision. The amount of preaching that goes on from those involved in the web standards arena is almost intolerable at times, and it was inevitable that those well versed in the theory and implementation would rebel eventually. Indeed, I’ve actually been emailed before by a standards zealot who validated a site I’d made and lambasted me for it not being valid xhtml – a problem caused by that old favourite, an unencoded ampersand (which was added outwith my control).

I think that those of us who *choose* (important part there) to author valid code have no grounds to judge those who don’t – especially when it’s someone such as Mike, who is undoubtedly capable of producing code at the same level as, or above, that of the somewhat arrogant Steve Champeon.

On an unrelated note, while I like the live-preview on the comments, in Firefox 0.9 (not sure if it’s a specific browser problem) I can’t use the arrow keys to navigate the text field, for editing, etc. Apologies if that’s been covered elsewhere.

Cameron, I am using Firefox 0.9 and it works just fine. Sounds like an installation problem to me (you didn’t upgrade from 0.9rc did you? try a clean profile, or uninstalling then reinstalling).

What you say about ‘lambasting’: really, just ignore such comments. If people can’t look beyond the validator at the actual code, they’re not worth your attention. Btw, if the data was input using a script, there’s always the PHP ‘htmlspecialchars’ function which easily takes care of unescaped & characters. An unescaped & will prevent your page from being usable with an XML parser…
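[Editor’s note: for readers outside PHP land, the same fix exists in most languages. A sketch using Python’s standard-library analogue of htmlspecialchars(); the input string is made up for illustration:]

```python
import html

# Escape &, <, > (and quotes) before writing user-supplied text into a
# page, so a stray ampersand can't break validation or an XML parser.
raw = "Fish & Chips <deluxe>"
safe = html.escape(raw, quote=True)
print(safe)  # Fish &amp; Chips &lt;deluxe&gt;
```

The key point is to escape at output time, once, rather than trying to hunt down every literal & in hand-edited markup after the fact.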

Just to be perfectly clear about this, Cameron, I was not criticizing Mike for not producing perfectly valid code so much as I was criticizing him for making a flawed argument against same.

By confusing the meaning of the word “standard”, even beyond the widespread confusion that already reigns among many developers, Mike is undermining the basic point: that Web developers and designers need the browsers to support a baseline set of standards, and the best way to convince vendors to provide that support is to use it on the sites they create.

I share the common distaste for nit-picky folks who expect 100% compliance, regardless of practicality. I sympathize, like Laurens above, with the basic idea that markup should be treated like code, and that it is better to be syntactically correct and as elegant as possible. The great thing is that this is finally, to a large and powerful extent, possible in recent browsers, and with a few tricks you can extend that to older and underpowered browsers, too.

But Mike, in his piece above, doesn’t argue for developers to support the very standards we fought so long and hard for – instead, he mocks the idea of validation, laughs at the W3C as a source of this agreement and concord (such as it is) we now enjoy, confuses standards with conventions, procedures, and deliberate sabotage, and encourages all of us to make our own standards. Notwithstanding that this is absurd and impossible unless we’re browser vendors, in the end it boils down to this: if you enjoy the current state of affairs and expect any improvement, it is necessary to make use of the tools we’ve begged for and, to some extent, been given.

I don’t know if any of the folks here remember the browser wars, but I sense that, having arrived at a relatively stable state of affairs, they’ve forgotten how difficult it used to be to simply make something work in both Mosaic and Netscape, or IE and Netscape, or across platforms. That it is no longer as difficult now is directly due to the vast amount of pressure and work that went into getting vendors to adopt that baseline of support.

Mike and the others who have recently made noise about how much they hate Web standards and its advocates quite simply do not get it. And so I am doing what I can to make sure that for every ill-founded argument against standards there is someone around to point out that without standards, we wouldn’t have the nifty numbered backgrounds on posts, or the live preview, or the comment feature, or the ability to even read this site on more than one platform. And anything said to the contrary is another failure to recognize the fortunate position we occupy, and why.

Great article, thanks for the end-of-the-day laugh. Seems you and Señor Champeon have some history. I can’t decide which humors me more: his ecclesiastical WaSP preachings or your refreshing view on standards.

Steve, please don’t take this the wrong way, but you are kind of like the really strict teacher who makes kids not want to go to school.

I, and a lot of other people, could really learn a lot of good stuff from you, but we’re just not going to sit through an hour of you telling us that the web is black and white and that we’re either perfect or we suck. Instead, we will sit in the back of the class, draw caricatures, and rebel if necessary. And in the end, we will get our information and our influences from other places.

I draw my influences from people who solve problems creatively. I again point to Shaun Inman as a shining example of such a person. I also draw influences from people who recognize and respect what I do, regardless of what disagreements they may have with me. Zeldman is an example of such a person. If you told me to take down my Invalidator Badge, I’d say no way Jose. If Zeldman did, I’d take it down and wouldn’t even ask why. And that is what mutual respect will buy you.

I teach in a new media/web design program and we force our students to always validate their pages. If nothing else, it forces the students to better understand the HTML/XHTML markup languages and helps produce good CSS troubleshooting techniques.

You are correct in your assertion that validating will probably never make you the greatest coder, but I believe it will certainly help make you a better coder.

Back in June of last year, Mike announced that the ESPN site redesign was standards compliant, which was false, except perhaps in the loosest sense of “doesn’t use tables for layout. much.”

He then, as now, in an interview with Eric Meyer at Netscape’s DevEdge, mocked the idea of validation with the comment:

Telling me my site needs to validate in order to be standards compliant is like telling me I need a flag in my yard to call myself an American.

Simply put, no. Telling you your site needs to validate to be standards compliant is following the definitions laid down by the people who wrote the standards, which is what being in compliance means. Whether the browsers are more forgiving or not than the standards themselves is irrelevant.

In essence, like he does above, Mike gave legions of Web developers reason to stop worrying about basic syntax, and use the current browser as the only benchmark that matters.

He then linked to the Web Standards Project site from the page to which he forcibly redirected the users who lacked “standards compliant” browsers.

Did he give us any advance warning? No.

When angry ESPN users emailed the Web Standards Project, instead of ESPN, who had to answer that email, saying, in essence, that I wasn’t even aware that ESPN had linked to us? Me.

Did he ask us for permission to use the WaSP logo on his upgrade page? Did he thereby give us an opportunity to decide whether we wanted to allow the use of our name and image in association with a site that did not even validate? No.

He did eventually remove the logo, and he did give the idea of standards some greater public airing than perhaps had been done before, but in the end I was extremely unhappy about the fact that he essentially boiled things down to “validation does not matter” and “standards compliance is what I say it is”, neither of which is true, and both of which miss the point I’ve tried to make above.

Which is, perhaps, why I’m even bothering to post here – this is not the first time Mike has said things that, whether through ignorance or malice, threaten the future of the Web.

As an aside to Mike: you may be right in that by insisting that you actually make sense, or use terms in the ways they were intended, I’m going to make you not want to listen to me. Oh, well.

I have, over my ten years in this business, done my best to come up with novel and interesting approaches to solving Web design and development problems. I’ve written and edited articles and books, and spent time involved in efforts, like the WaSP, that have helped make it possible for a site to even be somewhat “standards compliant”. I’ve tried to popularize others’ findings as well, such as in my article on Progressive Enhancement, and I’ve encouraged my team to do the same. One of our guys wrote an article that is now required reading at a university, for example, and we’ve tried to lead by example (such as with our own site, which you so easily mocked, without bothering to look deeper). I’ve spent seven years hosting and listmothering the webdesign-L list, and spent thousands of hours giving away advice and so on. Many have found my contributions useful or helpful.

I’m sorry you don’t find me involved in “creatively solving problems” or inspiring to you personally. Perhaps Zeldman will see this article and, by appeal to authority, get you to see why what you are saying is so dangerous and stupid.

Fundamentally, there is a difference between saying you are standards compliant when you are not, and saying you’ve achieved as much compliance with standards as is pragmatic.

The latter is just fine, and leaves room to grow. The former is just false, and the risk is that by so visibly lowering the bar even as you hone your leap, you encourage others to stop trying to leap at all.

Laurens: thanks for the reply, unfortunately it wasn’t through a form, they had taken it upon themselves to edit one of my templates!

re: “Mike and the others who have recently made noise about how much they hate Web standards and its advocates quite simply do not get it.”

Oh I think he does get it, only too well. I’d be very surprised if any of those contributing to this little discussion aren’t thankful for the platform standards give us for multi-device/software development.

The point he makes in the article that sticks with me is that by adopting techniques such as IFR, while we may be doing so for “selfish” reasons (putting design control back in the hands of the designer), if enough people were to do it then there is every chance it could become a universally adopted standard. This, to me, is a very interesting concept, as while (for example) the w3c does a very good job indeed with their recommendations, they are by no means perfect and fall short of offering the level of control I’d like to see, especially from a typography standpoint.

When faced with the choice of an admittedly imperfect solution or no solution at all, I know which one the majority of designers would go for.

“This is not the first time Mike has said things that, whether through ignorance or malice, threaten the future of the Web.”

Oh my… I just had a serious power trip. Am I really able to threaten the future of the web? I thought only Microsoft could do that.

Here is the difference between you and me: I see things as getting better. You see them as getting worse. Because ESPN doesn’t validate, that means people will start writing worse code? What?

How about this: because ESPN made a good business case for designing with standards, other companies could now more easily make the same business case in their own redesigns? Do you like what Sprint has been doing better? I do. I think it’s great, and I would only be so flattered if we helped them justify their redesign in some small way.

No, I don’t see them as getting worse – on the contrary, I see them as standing a chance to get a lot better.

Why? Because of the efforts of Dave Shea and Eric Costello and Owen Briggs and Doug Bowman and Craig Saila and all the folks on css-d and more I haven’t named.

The people who have invested the time, because they thought it was the right thing to do, to investigate and experiment with and publish their findings with respect to CSS layouts and the separation of presentation from structure.

These few made it possible for the many to skip to the end of, or at least avoid a lot of pain involved in, the CSS learning curve.

They didn’t claim “standards compliance”, or redefine words like “valid”, if their code wasn’t valid, and they certainly didn’t tell people validation didn’t matter. What would be the point, unless they felt that by failing to achieve valid markup or CSS, they’d somehow failed in a greater way, that they had to justify that failure by mocking its importance?

On the contrary, they kept their standards high (if you’ll pardon another pun) and were honest about whether they’d actually achieved what they set out to achieve, and didn’t feel the need to denigrate that which they failed to achieve. And when they compromised, like Zeldman advocating tables for layout for the time being, they made rational arguments about cost and pragmatism and the state of browser support.

I haven’t been following the Sprint redesign, though I did talk at some length with Jessica Hutchinson at SxSW last year and it seemed like they were doing great things, or would if their toolset let them. I’m glad to hear they’re still pushing, still trying to reach the goal.

And just to clarify one other thing – I’m very familiar with the constraints we operate under. I’ve worked on sites where for some reason or another we couldn’t achieve valid documents or standards compliant design approaches, due to tool constraints or other reasons.

My team is working on a site right now where we have to deal with LTR/RTL bugs that threaten to make all of our efforts – at reducing the markup load and increasing the flexibility and reach of the site – for naught. We have to justify everything we do in terms not only of present investment, but future maintenance. Hopefully, by the time we’re done, we’ll have achieved our goals in all contexts where it’s possible to achieve them, and compromised in those areas where it’s not. But when we’re done, if the pages don’t validate, we won’t say they do or that it doesn’t matter; if the site is leaner, and the backend easier to maintain even with whatever workaround we’ve added, we’ll focus on that.

I can relate to those who continue to deal with such constraints. But we do, when possible, shoot for the brass ring. And when we do achieve something, we try to be honest with ourselves as to how far remains to go. What we do not do is redefine pi to 3 because it’ll be easier on the farmers.

I can’t help but think that you’ve made up your mind that I’m a rabid standards advocate of the sort that Cameron decries. I’m not. But I do think that they’re important, and that part of keeping them important involves being honest about what you’ve been able to achieve and not simply yanking down ideals to your level because you’re stuck in the real, or redefining terms to suit your achievement, or by denigrating the stuff you haven’t been able to achieve as unnecessary.

What with Keith Robinson retiring from his position as web standards advocate and this article over at Mike Davidson’s site, it seems the rebellion against the preachers of web standards may be beginning. I think most of us are aware…

I don’t want to open a can of worms here, but I thought it might be helpful to share my experience over the last week trying to put in place a better system to keep Asterisk valid. Don’t worry I wont bore you with all the details. As you may…

*gasp* You Can Mute Sound From Flash?!?!? – Proclaiming that You *Hate* Standards

I’ve just been reading a couple of posts by Virtuelvis. Apparently there’s an app that runs in the background that can make sure you never have to endure the shocking terror that is flash-embedded-music. I honestly tend to jump whenever…

Well, first off welcome to elliotswan.com. And second off, I encourage you to bookmark this site (or even subscribe to my RSS feed), and keep on comin’ back. There should (hopefully) be lots of good content.
As my first official post, I’v…

[…] 11th, 2007 You know, for some reason this guy makes a whole lot of sense. Mike Davidson – March to Your Own Standard You know all the rules, but only through trial-and-error will you learn all of the many exceptions. […]

[…] Have you run it through the W3C HTML and CSS Validators? Whether you believe in standards or make a great argument for breaking them, the W3C validators are still a great way to catch minor errors in your code. We designers are […]

[…] Secondly, opacity is a CSS 3 property, so your stylesheet will not validate. On a lighter note, I honestly don’t think your stylsheets should necessarily validate. I’d rather my site works like I want it to (across most browsers), than have a nice shiny validation badge. But hey, that’s just me… and others. […]

[…] to semantically describe content. Once mastered, the web developer is able to make intelligent and conscious decisions on the “right” compromises to be made for a given project. We are constantly working […]