a web whose major content formats are not controlled by a single vendor

A goal which I agree with, and the basis for my series of Flex posts, which he also referenced. So far, so good. As he continued, I got confused. He asks us to:

Consider just the open standards that make up the major web content languages: HTML, CSS, DOM, JS. These mix in powerful ways that do not have correspondences in something like a Flash SWF.

I agree with his assessment of the powerful ways in which these technologies combine. But much of what he finds laudable are technical properties; they don’t derive from the fact that these are open standards. It’s just a fortunate (or perhaps designed) outcome that those are the technologies that are combined in a browser. After all, Java, C#, and even C++ have been standardized (well, at least if you believe that the JCP is a standards body), so being an open-standard technology is no guarantee that you’ll have the properties that make the web “alive” according to Brendan. It seemed like what was really being discussed was the “live web”, not the “open web”.

The place where I really got lost was when he started discussing the future of the open web,

Implicit in my writing is the assumption (conclusion, really) that browsers can adopt the necessary advanced rendering and faster virtual-machine programming language support that the “rich client platforms” boast (or promise in version 2.0). … There’s no technical reason this can’t be interoperably supported by other browsers in the near term.

There’s no technical reason, but there are plenty of political/business reasons. Every browser implements each of the open standards to a varying degree. They implement different versions of the specs. They implement each spec imperfectly. That translates into lots of debugging and testing when building an application atop the open web. I like the improvements that are likely to come in Firefox. The problem is that until many of those improvements appear (if ever) in Safari and IE, it will be hard to justify using those improvements, because it means writing multiple versions of the same code and then qualifying those versions. Contrary to Brendan’s assertion, big companies with armies of developers might have the resources to devote to all that additional work, but small development houses are the least able to tolerate that additional labor. Since Microsoft has an interest in advancing WPF/E, part of the Closed web, it’s hard to imagine that they will be motivated to improve IE quickly enough for innovative Live web features in Firefox and Safari to make a difference to application developers versus something like WPF/E or Flex. The risk to Microsoft is that instead of collecting those developers themselves, they lose them to Adobe.

6 Responses to “The Open Web, the Closed Web and the Live Web”

Ted, I think you are on to something. I’m curious about your thoughts… If what we really want is “a web whose major content formats are not controlled by a single vendor” then isn’t the solution to this with Flash to have a significant Flash Player competitor (like one with near 50% market share)? (And when I say “competitor”, I mean like Gnash, not WPF/E.) If Adobe were to just Open Source Flash, I don’t really see how that keeps us from still having the SWF format controlled by a single vendor. Adobe would certainly still control the distribution of Flash Player, which to some means that they still control the content format. Because even if someone forks Flash Player, it may never reach significant adoption. As you point out, to some extent MS controls HTML, CSS, JavaScript, etc. Because they control the browser that 80% of the world uses, if some new feature isn’t added to IE, you aren’t going to see significant adoption of the feature. Look at the battle that is happening over HTML 5 right now. It doesn’t matter one bit what standards we create and the W3C accepts. If MS doesn’t add support for the standard into IE, then very few people are going to use the standard. On the other hand, when MS does put features (standard or not) into IE and the other browsers follow suit, then we get things that aren’t standard, yet used everywhere (XHR is an example of this). Competition is always the healthiest way for an ecosystem to grow and live. But that same competition can create an unstable and inconsistent foundation to build on. So the real “live web” is the one where no single organism dominates the ecosystem, or where the dominant organism is trustworthy. There are many parallels to the natural world. Anyway, this is a great conversation to have. Please let me know your thoughts (maybe even in person at Web 2.0!).
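The XHR example mentioned above is worth making concrete: before XMLHttpRequest was standardized, scripts had to probe for Microsoft’s ActiveX object in older IE. A minimal sketch of the feature-detecting constructor that libraries of that era shipped (the function name and the `env` parameter are illustrative; real code probed `window` directly):

```javascript
// Create an XHR object in a cross-browser way, circa the IE6 era.
// `env` stands in for the browser's global object so the detection
// logic can be exercised outside a browser.
function createXHR(env) {
  if (typeof env.XMLHttpRequest !== "undefined") {
    // Standard path: Firefox, Safari, Opera, IE7+.
    return new env.XMLHttpRequest();
  }
  if (typeof env.ActiveXObject !== "undefined") {
    // Legacy path: IE5/IE6 exposed XHR only through ActiveX.
    return new env.ActiveXObject("Microsoft.XMLHTTP");
  }
  throw new Error("XMLHttpRequest is not supported");
}
```

This is exactly the kind of "de facto standard first, written standard later" feature the comment describes: every library ended up carrying some variant of this branch.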

I did mean to conjoin (not conflate) open and live. You don’t get “emergent compositionality” or “distributed extensibility” or whatever it’s called with closed formats from runtime vendors who realize significant cost savings by version-locking their runtimes to their for-$$$ tools. They may have some kind of extension story, but it’s usually lame, at best a knock-off of a competing, more open (and therefore more live) system.

I ran into someone just today at Web 2.0 Expo who used to be at Adobe. While there, he heard me argue strongly that the “strong types ™”, “compile/edit/debug” far from the deployed runtime, eval-free nature of Flex targeting Flash distorted its ecosystem away from live software and toward dead hardware. He didn’t get it at the time, but today he spontaneously volunteered that — having done recent prototype work for his new venture in Firebug with Ajax frameworks — he seriously gets it now.

I also spoke to Aptana folks, who have a full, Eclipse-based, Firebug-friendly IDE for JavaScript. They gave a convincing demo that suggests your pessimism about the costs to small developers of coping with web browser interop problems is exaggerated. Even without an IDE, modern Ajax libraries hide the browser differences for most users.

I’ve done TCP, NFS, Unix/standard-C portability as well as web standards. Interoperation is hard, but it can be tamed. It is being tamed now, in the midst of “Web 2.0”. MS and Adobe have tools to sell that claim to solve some of the problems that are laid at the feet of Ajax, for which they are both now delivering cross-platform runtimes. But this means they’ll have cross-platform interop bugs, certainly in their early versions.

And anyway, their target markets have to trade the wins possible with their tools and runtimes against the risks of lock-in, the lack of interop utopia even with those expensive tools, and the lack of reach compared to the web. Zero install is huge; I heard that today at Web 2.0 Expo too.

As for when other browsers will adopt Firefox features that are all based on WHATWG specs — Apple and Opera already are committed to the WHATWG. Your question may really be: when will IE adopt those emergent mini-standards? (E.g. the offline web app support forthcoming in Firefox 3, or the canvas tag already in Firefox 2 as well as Safari and Opera.)
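The canvas tag is a good example of how pages adopted these emergent mini-standards defensively: feature-test for the capability and fall back when the browser (IE, at the time) lacks it. A hedged sketch of that detection pattern, with the document object passed in as a parameter purely so the illustration is self-contained:

```javascript
// Returns true if the given document can create a working <canvas>
// element, i.e. one whose getContext("2d") yields a drawing context.
// IE6/IE7 had no canvas, so pages fell back to VML or static images.
function supportsCanvas(doc) {
  var el = doc.createElement("canvas");
  return !!(el.getContext && el.getContext("2d"));
}
```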

That’s a great question. I can’t speak for Microsoft, but I will point out again that the WHATWG strategy involves uplifting IE with platform-based (JScript if possible, no compiled code) add-ons to emulate the new standards. And indeed, Brad Neuberg with Dojo Offline Toolkit is uplifting IE and even older Firefoxes to have the missing offline support forthcoming in Firefox 3. So we don’t need to rely on Microsoft to do the work, since they built a robust (if non-standard) platform.
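The "uplift" strategy described here is, in miniature, the polyfill pattern: emulate a missing standard API in script, installing the emulation only where the native version is absent so real implementations are never shadowed. A generic sketch (the `uplift` helper and `trimEmulation` example are my own illustrative names, not from Dojo or WHATWG code):

```javascript
// Install an emulation of a standard method on a target object,
// but only if the target doesn't already provide it natively.
function uplift(target, name, emulation) {
  if (typeof target[name] !== "function") {
    target[name] = emulation;
  }
  return target;
}

// Example emulation: a trim() for environments that lack one.
function trimEmulation() {
  return String(this).replace(/^\s+|\s+$/g, "");
}
```

Real uplift libraries of the period used exactly this shape against `window`, `document`, and the built-in prototypes, which is why a sufficiently scriptable (if non-standard) platform like IE could be dragged along.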

But to return to the point I was making: open => live is no coincidence, nor is it the result of grand-planning by Netscape or Microsoft ten years ago. It’s an evolutionary effect induced by scale properties of the Web combined with developer and user economics. I will blog about this again in the near term.

A very interesting conversation. I agree with much of your analysis. In particular, the problem with a single vendor still controlling up to 80% market share is that vendor’s ability to stifle any forward progress, or to direct potential progress down only those paths which disproportionately benefit itself.

What I wonder about is a sense of defeatism that I pick up from your post. Maybe I’m reading this wrong. If so, just let me know. This is an area where I’d love to be wrong. The post seems to suggest that anything innovative is painful, can perhaps be absorbed by the giant software development houses, but is too disruptive for developers who aren’t giant companies.

But the history of Firefox to date suggests that the opposite is true. The role that Firefox has played in encouraging web innovation is not because the big software houses decreed that Firefox was their tool of choice; precisely the opposite. Firefox has been effective because tens of thousands (maybe hundreds of thousands) of developers have adopted Firefox as their development tool, and tens of millions have adopted it as their browser.

This doesn’t change the fundamental unhealthy state you describe, where a single vendor has such a deadening ability to stall or direct innovation. But it does suggest to me that we must rely on ourselves: on the distributed, decentralized choices of many players to effect change.

So a few points of agreement. Zero install is huge, particularly for people coming from a desktop software view of the world. I totally support the effort to drive the future via WHATWG.

Re: Aptana – It’s not just the development, it’s the testing effort as well. Most of the JavaScript IDEs are at what I would generously call an early stage. I fully expect this situation to improve over time. But our code has bugs/problems that manifest on particular browser/operating-system combinations. We’ve seen bugs in things like date functionality (Safari), stuff that modern Ajax libraries shouldn’t have to paper over. You might call that par for the course in terms of the level of effort required, in which case I think we’d have to agree to disagree over that.

If WHATWG is going to uplift IE to match new stuff that comes out of WHATWG, I’d consider that to be a huge mitigation of some of the risks that I pointed out. A follow-on question would be, “Who’s committed to do that work?” People (well, me, anyway) will want to know that such uplift work has a high probability of appearing. It certainly helps the story.

I don’t contest that the open web and the live web are coincident, but I am not convinced that openness automatically leads to the liveness properties. I agree that it opens the possibility, but I think it’s a stretch to say that just because something is open it would become live. Would the issues you describe with Flex targeting Flash go away if Flash/Flex were open/open source? That wouldn’t change the eval-freeness of the technology, for example. This is a more general point about the connection between openness and “good technology properties”, not just the discussion of the open web.
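To make the “eval-freeness” point concrete: in JavaScript, eval and its cousins (such as the Function constructor) let a running program turn source text into new behavior at runtime, with no compile/deploy cycle in between, which is part of what makes the web platform “live”. A toy illustration (the helper name is mine, not from either post):

```javascript
// Liveness in miniature: construct and run new code while the
// program is already running. The Function constructor evaluates
// source text into a callable, much as a REPL or an injected
// <script> would.
function defineAtRuntime(argName, bodyExpression) {
  return new Function(argName, "return " + bodyExpression + ";");
}

var square = defineAtRuntime("x", "x * x");
console.log(square(7)); // → 49
```

An eval-free runtime can still load precompiled code, but it can’t grow new code out of strings this way, which is the distinction being drawn here.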

I don’t feel defeatist about the situation, but I do think there are real challenges, and that there are obstacles on the road to a desirable/acceptable future. Innovation opposes the status quo, and part of upsetting that status quo can mean inflicting pain on people who are in that place. Based on my recent experiences with the Cosmo project, where we have sharp developers, are using modern Ajax libraries (Dojo), and so forth, there is more expenditure of labor than I would like to see (isn’t that always true?): working around browser incompatibilities, running into bugs, etc. That’s somewhere between annoying and painful, depending on your technical sensibilities. I’d be closer to painful than annoying, but I know people who are the reverse.

The whole reason that I wrote the initial postings on Flash was that I don’t think all is lost or anything like that. Rather, my goal was to raise awareness of the issues there (my set may be slightly different from Brendan’s), so that people could have the information that they will need in order to make choices that lead to a desirable outcome. I’m not advocating that Mozilla do anything differently than what it is doing. What I would urge you to do is to make sure that what Brendan wrote about uplifts comes to pass. That’s a very important part of the story to tell: that even though one can imagine scenarios where Microsoft wants to stay in the past, people are committed to bringing it along into the future.

Ted: eval-freeness is not inherent in Flash once it is generalized and freed from the plugin prison.

If Flash were open source, we wouldn’t need Adobe to engineer merged rendering with WebKit as locked into Apollo, or do our own advanced 2D rendering implementations in Gecko. We would blend the Flash SObject and other code into our browsers (agreeing on standards for the new compositions of Flash and HTML); eval would be implemented in the AVM in short order; you might see uncompressed MXML with scripting support alongside crunched SWF; and XUL and MXML would be easier to converge. It’s hard to predict what other great things might happen that can’t right now with Flash closed.

Don’t get me wrong, I’m not a Flash hater. It’s clear that Flash has advantages as a single-vendor plugin over the canvas and SVG features (and any forthcoming video tag support) of the minority-share browsers. But those advantages do not always or even often outweigh the costs of being in the plugin prison, of poor accessibility and text usability, of strange focus modalities and other problems associated with plugins — in short, of not being “the Web” in the way that HTML-based content is. The single vendor can only do so much, and in the case of Flash must make money off of tools that tend to aggrandize control over the runtime and content format.