On September 18th, the Electronic Frontier Foundation (EFF) announced that it was leaving the World Wide Web Consortium (W3C), effective immediately, over the consortium's stance on DRM. The announcement was published as an open letter from Cory Doctorow, which is available on the EFF's website.

There are several facets to the whole DRM issue. In this case, Cory Doctorow seems focused mostly on the security side of things. Creating an architecture that attaches code to manipulate untrusted data is sketchy, especially at a time when browser vendors are limiting that attack surface by killing as many plug-ins as possible, and, in this case, a legal minefield is layered atop it due to copyright concerns. Publishers are worried about end users moving data in ways that they don't intend... even though every time content is pirated before its release date, it is a testament that the problem lies elsewhere.

We can also get into the issue of “more control isn’t the same as more revenue” again, some other time.

As for the consequences of this action? I'm not too sure. I don't really know how much sway the EFF had internally at the W3C. While they will still do what they do best, fighting the legal side of digital freedom, it sounds like they won't be in a position to officially guide standards anymore. This is a concern, but I'm not in a position to quantify how big a concern it is.

HTML is a format that translates text into a hierarchy of special objects, called elements, that can be arranged into Web content. The specification is controlled by the W3C, which just promoted HTML 5.1 to "W3C Recommendation," the final stage for a standard short of errata or a wholly new version.
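To make that concrete, here is a hypothetical JavaScript sketch of the kind of tree a parser builds from markup like `<ul><li>One</li><li>Two</li></ul>`. The `tree` object and `tags` helper are illustrative stand-ins, not part of any specification; a real browser builds DOM nodes instead of plain objects.

```javascript
// Hypothetical: the element hierarchy a parser produces from
// "<ul><li>One</li><li>Two</li></ul>", modeled as nested plain objects.
const tree = {
  tag: 'ul',
  children: [
    { tag: 'li', children: [], text: 'One' },
    { tag: 'li', children: [], text: 'Two' },
  ],
};

// Walk the hierarchy and collect every element's tag name, depth-first.
function tags(node) {
  return [node.tag, ...node.children.flatMap(tags)];
}
```

Calling `tags(tree)` walks the whole hierarchy and yields `['ul', 'li', 'li']`, which is exactly the "hierarchy of special objects" the format describes.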

Because standardization, intentionally, takes a very long time, this is not about new features or anything like that. In fact, one of the changes that I found interesting was the removal of appCache. This feature was originally designed to let web applications operate offline by ensuring everything they need is stored locally. The removal wasn't really surprising, since Firefox has warned users that the feature is deprecated since version 44, but it is notable nonetheless. (If anyone is wondering, the Service Worker API replaced it. Yes, I am aware of the Web standards joke: "there are two standards for everything, but one is deprecated and the other is experimental.")
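For the curious, the replacement works roughly like this: a page registers a worker with `navigator.serviceWorker.register('/sw.js')`, and that worker intercepts network requests. The cache-first logic at its core can be sketched in plain JavaScript; the `cacheFirst` helper and the `Map`-as-cache below are hypothetical stand-ins for the real Cache API and `event.respondWith()`.

```javascript
// Sketch of the cache-first strategy a Service Worker's fetch handler
// typically implements. Hypothetical helper: a real worker would use the
// Cache API, but the decision logic is the same.
function cacheFirst(cache, request, fetchFn) {
  if (cache.has(request)) {
    return cache.get(request); // serve the stored copy, even offline
  }
  const response = fetchFn(request); // otherwise fall back to the network
  cache.set(request, response);      // and store it for next time
  return response;
}
```

Unlike appCache's all-or-nothing manifest, the worker decides per request, which is a big part of why it won.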

Recently, the W3C has officially recommended the whole HTML5 standard as a specification for browser vendors and other interested parties. It is final. It is complete. Future work will now be rolled into HTML 5.1, which is currently on "Last Call" and set for W3C Recommendation in 2016. HTML 5.2 will follow that standard with a first specification working draft in 2015.

For a website to work, there are several specifications at play from many different sources. HTML defines most of the fundamental building blocks that get assembled into a website, as well as its structure. It is maintained by the W3C, an industry body with hundreds of members. CSS, a language that describes how elements (the building blocks) are visually laid out on the page, is also maintained by the W3C. JavaScript, on the other hand, controls the logic and programmability, and it is (mostly) standardized by Ecma International. Khronos has also been trying to get a few specifications into the Web ecosystem with WebGL and WebCL. This announcement, however, only covers HTML5.

Another body that you may hear about is the WHATWG. WHAT, you say? Yes, the Web Hypertext Application Technology Working Group (WHATWG). The group was founded by people from within Apple, Mozilla, and Opera to propose their own standard while the W3C was preoccupied with XHTML. Eventually, the W3C adopted much of the WHATWG's work. They are an open group without membership fees or meetings, and they still actively concern themselves with advancing the platform.

And there is still more to do. While the most visible change involves conforming to the standard and increasing the performance of each implementation as much as possible, the standard itself will continue evolving. This news sets a concrete baseline, allowing implementations to experiment around its bounds -- and they now know exactly where those bounds are.

The main benefit of open Web standards is that they provide a stable and secure platform for any developer to target just about any device. Still, under the law of no pain, no gain, those developers need to consider how their application behaves on just about every platform. Internet Explorer was once the outlier, and now Microsoft is one of the most prominent evangelists. It has been barely two months since we reported on the launch of modern.IE, Microsoft's effort to integrate existing testing solutions into their product.

When we first covered modern.IE back in February, Microsoft had created the initiative to help developers test web apps across multiple versions of Internet Explorer and check for typical incompatibilities. With the addition of Sauce Labs, Microsoft hopes to provide better testing infrastructure as well as automatic recommendations for common issues encountered when developing for both "modern" and legacy versions of their web browser.

In my view, this perfectly highlights the problems with believing you are better than open architectures. At some point, your platform will no longer be able to compete on inertia. Society really does not want to rely on a single entity for anything, and it is almost guaranteed that a standard agreed upon by several industry members will succeed in the end. Had Microsoft supported the W3C from the start, they would not have experienced even a fraction of the troubles they currently face: struggling to comply with standards and, more importantly, to push developers away from optimizing for their particular implementation.

There are very good reasons why we do not use AOL keywords anymore. Hopefully the collective Microsoft keeps this grief in mind, particularly the Xbox and Windows RT teams and their divisions.

Microsoft has been doing penance for its sins against web developers over the past two decades. The company no longer wants developers to target specific browsers, and it opts to implement the W3C version of a feature whenever one is available.

Microsoft traditionally fought web standards, forcing developers to use ActiveX and proprietary filters to access advanced features such as opacity. Web developers would effectively write their websites multiple times to account for the... intricacies... of Internet Explorer compared to virtually every other browser.
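As a hedged illustration of what that double work looked like: helpers of roughly this shape (the function itself is hypothetical, but the `alpha(opacity=...)` filter syntax was IE's real proprietary mechanism) branched between the standard property and IE's own.

```javascript
// Sketch of the era's branching: standards-compliant browsers exposed the
// `opacity` property, while old Internet Explorer needed the proprietary
// alpha filter, which took a percentage instead of a 0..1 value.
function setOpacity(style, value) {
  if ('opacity' in style) {
    style.opacity = String(value); // the W3C way
  } else {
    style.filter = 'alpha(opacity=' + Math.round(value * 100) + ')'; // legacy IE
  }
  return style;
}
```

Multiply that by every advanced feature on a page and the cost of the "intricacies" becomes clear.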

Now Google and Apple, rightfully or otherwise (respectively, trollolol), are rapidly gaining in popularity. This rise leads to websites implementing features exclusively for WebKit-based browsers. Internet Explorer is no longer the browser that gets targeted for advanced effects. If there is Internet Explorer-specific code in sites, it is usually a workaround for earlier versions of the browser, and it only mucks up Microsoft's recent standards compliance by feeding it non-standard junk.

It has been an uphill battle for Microsoft to push users to upgrade their browsers and web developers to upgrade their sites. "modern.IE" is a service which checks for typical incompatibilities and allows developers to test their site across multiple versions of IE.

Even still, several web technologies are absent from Internet Explorer because they have not been adopted by the W3C. WebGL and WebCL seek to turn the web browser into a high-performance platform for applications. Microsoft has been vocal about not supporting these Khronos-backed technologies on the grounds of security. Instead of building out the web browser as a cross-platform application platform, Microsoft is pushing hard to keep its app marketplace from being ignored.

I am not sure what Microsoft should fear most: that their app marketplace will be smothered by their competitors, or that they will only manage to win the battle after the war changes theaters. You know what they say: history repeats itself.

I ran across an article on The Verge which highlighted the work of a couple of programmers to port classic real-time strategy games to the web browser. Command and Conquer, along with Dune II, two classics of PC gaming, are now available online for anyone with a properly standards-compliant browser.

These games, along with the Sierra classics I wrote about last February, are not just a renaissance of classic PC games: they preserve them. It is up to the implementer to follow the standard, not the standards body to approve implementations. So long as someone still makes a browser which can access a standards-based game, the game can continue to be supported.

A sharp turn from what we are used to with console platforms, right?

I have been saying this for quite some time now: Blizzard and Valve tend to support their games much longer than console manufacturers support their whole platforms. You can still purchase the original StarCraft at retail, and it is still being manufactured. The big fear over "modern Windows" is that backwards compatibility will be dropped and all applications will need to be certified through the Windows Store.

When a game is programmed for the browser -- yes, even hosted offline in local storage -- those worries disappear. The exceptions are iOS and Windows RT, which only allow you to use Safari or Trident (IE10+), respectively, and thus still leave you solely at the platform holder's mercy to follow the standards.

Still, as web standards get closer to native applications in features and performance, we will have a venue for artists to create and preserve their work for later generations to experience. The current examples might be 2D and of the pre-Pentium era, but even now there are 3D shooters developed for the web. There is even a ray tracing application built on WebGL (although that technically relies on both the W3C and Khronos standards bodies) that runs on a decent computer with plain old Firefox or Google Chrome.

Open Web standards reached a new milestone on Monday when the W3C published their completed definitions for HTML5 and Canvas 2D. There is still a long and hard road until the specification becomes an official standard, but the organization is finally comfortable classifying this description as feature complete.

The “Web Platform” is a collection of standards which form an environment for applications to target the web browser. HTML forms the structure for content and provides guidelines for what that structure actually means. CSS, JavaScript, Canvas 2D, WebGL, WebCL, and other standards then contribute to the form and function of the content.

HTML5 allows for much more media, interactivity, and device optimization than its 1999 predecessor. This standard, particularly once finalized and recommended by the W3C, can form part of the basis for fully featured programs which function as expected wherever the standard is implemented.

This is an important milestone, but by no means the final destination of the standard.

The biggest sticking point in the HTML5 specification is still the video tag's behavior. The W3C pushes for the standards it recommends to comply with its royalty-free patent policy. Video, however, has been pretty heavily locked down by various industry bodies, most notably MPEG LA, which is most concerning for open source implementations that might not be able to include H.264. There still does not appear to be a firm resolution in this recent draft.

Still, once the patent issues have been settled, video will not just be accessible in static ways. Tutorials already show how to manipulate the raw image data of a video frame to perform post-processing effects and other calculations. It should be an interesting abstraction for those who wish to use video assets in applications, such as for a texture in a game.
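As a rough sketch of the per-pixel work those tutorials describe: in the browser, you would draw a frame to a canvas with `drawImage(video, 0, 0)` and read it back with `getImageData()`, whose data is a flat RGBA array. The math itself, shown here as a grayscale pass using the standard Rec. 601 luma weights, is ordinary JavaScript; the `toGrayscale` helper is illustrative, not from any particular tutorial.

```javascript
// Sketch: a grayscale post-processing pass over an RGBA pixel buffer like
// the one ctx.getImageData() returns after ctx.drawImage(videoElement, 0, 0).
// The buffer is flat: [r, g, b, a, r, g, b, a, ...].
function toGrayscale(data) {
  for (let i = 0; i < data.length; i += 4) {
    // Rec. 601 luma weights approximate perceived brightness.
    const lum = Math.round(
      0.299 * data[i] + 0.587 * data[i + 1] + 0.114 * data[i + 2]
    );
    data[i] = data[i + 1] = data[i + 2] = lum; // leave alpha (i + 3) alone
  }
  return data;
}
```

Writing the modified buffer back with `putImageData()` each frame is what turns this into a live video effect.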

HTML5 is expected to be fully baked sometime in mid-2014. It would be around that time that HTML 5.1 matures to the state HTML5 celebrates today.