These were some of the questions that the participants were supposed
to address in the workshop:

What functionality is needed for Web applications? What should a hosting environment provide?

What APIs are needed for Web applications (e.g. retrieving and sending data over the network, parsing XML, client-side storage)?

How are they related to Web documents, which are normally static?

Should there be a set of predefined compound document profiles (e.g. XHTML Basic + SMIL Basic + SVG Tiny)?

What happens with event processing and style cascading across the boundaries of mixed content?

How can application semantics from different markup languages be mixed in an interoperable way (e.g. using XBL)?

XHTML+XForms+SVG

Note that the W3C-sponsored workshop is already biased
toward an XML-based approach. The questions already
presume that the basic components of any solution would
be XHTML, SVG and XForms. It's not hard to see why:
those technologies represent the bulk of the W3C's
work over the past few years. Also, given the current
direction of the XHTML 2.0 working group, you get a clear
sense of how much weight the W3C puts on backward
compatibility (that is, none at all).

Web Forms 2.0

Now the W3C isn't the only group considering such
issues. In particular Ian Hickson,
now with Opera Software, is the editor of the Web Forms 2.0
draft specification. Ian and a large group of contributors have been
working on specifying a solution to the very questions raised above.
Their work is not under the auspices of any standards organization,
and it is in part a direct reaction to the complexity and the
non-backward-compatible route that the W3C took with its XForms
recommendation. Web Forms 2.0 aims to build on current,
working technologies like HTML and to extend them to address
common needs and deficiencies, making it easier to author web
applications.
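To give a flavour of that approach (the markup below is illustrative of the draft's direction, not a normative excerpt), Web Forms 2.0 layers new input types and declarative validation attributes onto ordinary HTML forms. A legacy browser that doesn't recognise a type renders it as a plain text input, so the same page keeps working:

```html
<!-- Web Forms 2.0 style controls: new types and validation
     attributes on ordinary HTML. Older browsers fall back to
     type="text", so the form degrades gracefully. -->
<form action="/signup" method="post">
  <label>Email: <input type="email" name="email" required></label>
  <label>Birthday: <input type="date" name="dob"></label>
  <label>Quantity: <input type="number" name="qty" min="1" max="10"></label>
  <input type="submit" value="Sign up">
</form>
```

The point is the strategy rather than the syntax: the same document works in today's browsers and gains richer behaviour in newer ones, which is exactly the backward-compatible path that XForms abandoned.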

During the W3C workshop Ian presented a position paper
that represented
the consensus opinion of both Opera Software and the Mozilla Foundation.
The core of their position is to follow the lead of the Web Forms 2.0
development. Given the already biased scope of the workshop
and the list of participants, there wasn't much hope of
their proposal being adopted. In fact, it didn't stand
a snowball's chance in hell, but to really
understand why requires a short trip through history.

IBM 3270

The IBM 3270 is a class of terminals (known as
"Display Devices") made by IBM, normally used to
talk to IBM mainframes. The 3270 minimises
the number of I/O interrupts required by accepting
large blocks of data known as datastreams.

In a datastream, text and control (or formatting)
functions are interspersed, allowing an entire screen
to be "painted" in a single output operation. The
concept of "formatting" in these devices divides the
screen into clusters of contiguous character cells,
for each of which numerous attributes (colour,
highlighting, character set, protection from modification) can be set.

Further, using a technique known as "Read Modified", the changes from any
number of modified formatted fields can be read as a single
input without transferring any other data, a further technique to improve
throughput between the terminal and the CPU. Modern users, however, sometimes
find this system extremely bizarre, since it is very different from any
user interface encountered in the consumer market.
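The Read Modified idea can be sketched in a few lines of Python. This is a toy model, not the real 3270 datastream encoding: the Screen class, field names, and method names are invented for illustration. The essential trick is the per-field "modified" flag (the real hardware's Modified Data Tag), which lets the host read back only what the operator changed:

```python
# Toy model of a 3270-style "Read Modified" operation.
# The Screen class and its API are illustrative, not the real protocol.

class Screen:
    def __init__(self, fields):
        # fields: mapping of field name -> initial value
        self.values = dict(fields)
        self.modified = set()  # plays the role of the per-field MDT bit

    def type_into(self, name, value):
        """Operator edits a field; the terminal flags it as modified."""
        self.values[name] = value
        self.modified.add(name)

    def read_modified(self):
        """Return only the fields the operator changed, then reset the flags."""
        changed = {name: value for name, value in self.values.items()
                   if name in self.modified}
        self.modified.clear()
        return changed

screen = Screen({"name": "", "dept": "SALES", "phone": ""})
screen.type_into("name", "SMITH")
screen.type_into("phone", "555-0100")

# Only the two edited fields travel back to the host:
print(screen.read_modified())  # {'name': 'SMITH', 'phone': '555-0100'}
```

The unedited "dept" field never crosses the wire, which is the whole point: the screen is painted once, the operator works locally, and a single small input operation carries the deltas back.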

The 3270 interface is a page-driven terminal protocol
that served as the main connection to IBM mainframes
for many years. It has fallen into disfavor with the
rise of GUI clients, but at the time, if you used a large
business application on a mainframe, you spent all your time
on a 3270 terminal, or maybe a 5250 terminal, a similar beast
employed by IBM's AS/400 series servers. As the desktop
GUI took over, new interface applications were built,
some just screen-scraping the IBM 3270 screens and presenting
the contents in native widgets. Today there is a plethora
of such interfaces, with vendors such as SAP, IBM, and Oracle
creating their own custom clients for their applications.

Doomed to fail

I admire Ian Hickson for going through the motions of bringing
the Web Forms 2.0 specification in front of the W3C, but
the proposal was doomed to fail. Look at the list of some of the
companies that contributed to the XForms 1.0 specification:
SAP, IBM, Oracle, Sun, and Computer Associates. Why are these
companies so interested in this web standard?
Not because they have any interest in the web.
You would think that all these companies would be cognizant
of the limited deployment of XHTML, XForms and SVG, and that if they
were really looking at web-based solutions they would take
that into account. The problem is that the web is the
furthest thing from their minds, hence the huge disconnect
between Opera/Mozilla and the majority of the respondents.

The biggest problem, the root cause of the disconnect, is the
perspective these companies bring to the table.
Mozilla and Opera were born on, and live on, the web. They get it.
That's not to say IBM and SAP don't get the web, they might,
but when you say "Web application" they are already tuned to
a different channel. They're thinking of a replacement for their
current heavyweight clients. What they're trying to build is
a shiny new 3270 for the next century. In that respect you can see how
XForms, SVG and XHTML are ready-made pegs for them to hang their
thick-client hats on. It's not about the internet, it's about the
intranet now. It's about creating an XML-syntax replacement
for the IBM 3270.

Don't get me wrong, there are real problems to be solved here,
and the world will be a better place with these types of interfaces
standardized. It will reduce development time, lead to greater
interoperability, and open up whole new areas of data interchange.
Even so, this is not how the internet works, nor
is it the target environment. The W3C is about the Web, not
the Intranet. If corporations want to adopt and adapt the output of
the W3C for their intranet-based products, then all the better, but
right now it looks like the process is the wrong way around, with
intranet-based development needs trumping the needs of the
World Wide Web.

In The Open

We, of course, want the W3C to go down our chosen route.
Since there doesn't seem to be much consensus on doing that, though,
the question is what should we do now? Should we do our own thing (in public of course)
and then submit it to the W3C (or IETF or ECMA) at some future point once we
have initial implementations? Should we simply do our own thing (Opera, Mozilla,
and a few interested parties) and forget standardisation altogether?
Should we just take part in whatever Web Applications working group the
W3C sets up and implement whatever comes out of that in several years'
time, despite being fully aware that few people will ever use it?
(Which is a foregone conclusion since it wouldn't work in Windows IE6.)

The first choice is the best. Remember, the web wasn't
created by the W3C; the W3C only came into being when there was
a need to standardize incompatible pre-existing implementations.
That time of initial cleanup is always the high point for any
standards group: they do best at the janitorial tasks, not
at the innovation stage. Let Opera and Mozilla and other
browser groups innovate, providing real value for their
customers who actually live on the internet. Later, if the W3C is still
around, it can come in and standardize things if we end up with
incompatibilities, assuming it ever gets back around to being
interested in the internet.

It seems you don't understand how the W3C works. There was zero chance that Ian's proposal would be "accepted", whatever that means.

The W3C dance works the following way. First comes the workshop, where a bunch of companies present their opinions or their technologies in the target area. In parallel, a few of the companies may submit their proposals as W3C Notes. If there is enough interest in the technology area, a working group is chartered and told to use one or more of the submitted proposals as a basis for its work. Finally, three to five years later, a technology that bears some resemblance to the original submitted proposals becomes a Recommendation.

To think that a company could just show up at a workshop, present its proposal, and have it rubber-stamped by the W3C as a Recommendation is a misapprehension of how the W3C process works.