December 29, 2012

W3C and IETF coordination

This is the third of a series of posts about my personal priorities for Web standards, and the relationship to the W3C TAG.

Internet Applications = Web Applications

For better or worse, the Web is becoming the universal Internet application platform. Traditionally, the Web was considered just one of many Internet applications. But the rise of Web applications and the enhancements of the Web platform to accommodate them (HyBi, RTCWeb, SysApps) have further blurred the line between Web and non-Web.

Correspondingly, the line between IETF and W3C, always somewhat fuzzy, has further blurred, making it difficult to assign responsibility for developing standards, interoperability testing, performance measurement, and other work.

Unfortunately, while there is some cooperation in a few areas, coordination over application standards between IETF and W3C is poor, even for the standards that are central to the existing web: HTTP, URL/URI/IRI, MIME, encodings.

W3C TAG and IETF coordination

One of the primary aspects of the TAG mission is to coordinate with other standards organizations at an architectural level. In practice, the few efforts the TAG has made have been only narrowly successful.

An overall framework for how the Web is becoming a universal Internet application platform is missing from AWWW. The outline of architectural topics the TAG did generate was a bit of a mish-mash, and then was not followed up.

7 comments:

As long as there are different opinions on how standards should be implemented, the standards will be hindered by bastardisation, because someone will always want to do things their way.

These organisations need to come to an agreement on how to implement web standards as a matter of priority. Otherwise this will burden browser makers and end users with bloated web browsers, because each browser would need code to handle every variant implementation. It would be like the days of IE6 all over again.

"different opinions on how standards should be implemented" points to a couple of common fallacies.

Systems are built from components which connect through interfaces. The architecture is the overall framework of how the components and interfaces are organized. Protocols (like HTTP) and languages (like HTML and CSS) and APIs (like Canvas 2D) are interfaces. Implementations (like Webkit or Gecko) are connected to other implementations using the interfaces.

Standards (for the most part) describe the interfaces. In a good architecture, if the standards are good, then the system will support widely varying opinions about how the standards should be implemented, while still ensuring that these varying implementations can interoperate.
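A small sketch may make the distinction concrete. The interface and class names below are hypothetical, invented purely for illustration: the interface plays the role of a standard (a contract both sides agree on), while two internally different implementations remain interchangeable at that interface.

```typescript
// The "interface": what a standard specifies. Here, a hypothetical
// contract for parsing a query string into key/value pairs.
interface QueryParser {
  parse(query: string): Map<string, string>;
}

// Implementation A: split-based.
class SplitParser implements QueryParser {
  parse(query: string): Map<string, string> {
    const result = new Map<string, string>();
    for (const pair of query.split("&")) {
      const [k, v = ""] = pair.split("=");
      if (k) result.set(k, v);
    }
    return result;
  }
}

// Implementation B: regex-based. A different internal design
// ("a different opinion"), but the same observable behaviour
// at the interface, so the two can be swapped freely.
class RegexParser implements QueryParser {
  parse(query: string): Map<string, string> {
    const result = new Map<string, string>();
    for (const m of query.matchAll(/([^&=]+)=?([^&]*)/g)) {
      result.set(m[1], m[2]);
    }
    return result;
  }
}

// A consumer written against the interface works with either engine.
function lookup(engine: QueryParser, query: string, key: string): string | undefined {
  return engine.parse(query).get(key);
}
```

As long as both implementations honor the interface, a page (or a test suite) cannot tell them apart; that is the interoperability the standards are meant to guarantee.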

"These organisations need to come to an agreement" is interesting, because the standards organizations are, for the most part, volunteers from the same organizations.

So you'll see people from each of the browser makers show up at IETF and (usually different) people from the same organizations show up at W3C or participate in WHAT-WG.

I'm less certain of the motivation, or even whether it is a conscious plan. But I think it's reasonable to treat the divergences less as a matter of entrenched corporate positions and more as a matter of individual technical judgment coupled with ambition.

Unfortunately, it took me years of studying architecture to understand this; meanwhile, the explosive growth of the Web has wrought a new generation of developers who are, as a group, dismissive of any and all architectural lessons learned before their time.

Repeating historical mistakes seems to be the human condition. Although the TAG is tasked with avoiding exactly that, I don't see how a democratic process will lead to any result but relegating those who do grok this architectural nuance to a minority vote.

So I expect the pendulum to swing back to library APIs, until such time as the majority re-learns the lessons of the past; at which point I expect the pendulum will swing back to network-based APIs for the Web.

That things are bound to get worse before they get better imbues me with feelings of sadness, but not of despair. Once we've "rediscovered" the reasons for architectural sanity, we'll return to sane architecture. For a time, anyway.

I wonder if it is a bit early to say the web is becoming the universal application platform. Less than 5 years ago people were saying the web was dead as mobile apps gained steam. Desktop widgets were also once an important factor in how people accessed the internet. I think before we say something is becoming universal we should wait at least 5 years after it has become dominant (given the popularity of mobile apps, I am not sure the web application is dominant right now).

The tricky thing about the latest generation of technology is that the popularity of a technology is as much a function of fads as it is of technological superiority. That is why the W3C must be careful about which technologies they embrace.

Whether the web _could_ become the universal application platform depends, in no small measure, on how good a job the standards community does in designing a well-architected, modular, scalable, extensible platform (http://www.w3.org/QA/2009/06/orthogonality_of_specification.html). I think a lot of current standards design work is short-sighted, fighting the old battles of the first browser war or the new multi-billion-dollar IPR wars for control of the mobile platform (iOS, Android, Boot to Gecko, Windows 8).

This is no time for W3C to be a cautious follower, waiting carefully to see which technology will "win". Perhaps your company or mine might wait cautiously, but it's time for W3C to lead the web to its full potential, for us to work together to fix the problems that might be of temporary advantage to some of the players but ultimately harm the common good.