Loosely Coupled weblog

Friday, May 02, 2003

Presuming transparency

Loosely Coupled's news release feed brought back memories for Jon Udell this week, who can recall inviting PR people to submit releases to an application he created for the BYTE.com website back in 1995. Naturally, he went on to draw an up-to-date moral: "Now, think about the tool used to write the press releases that I used to collect, and that Phil is now collecting. It's Microsoft Word. Eventually, people will figure out that it's easier to save an RSS item directly from Word 2003 ..."

I wonder. I'm amazed at how many people still publish press releases on their web sites in a way that makes it difficult to link to them. If people in this day and age still don't get how the Web itself works, I'm sure that many of them will hold out against publishing their press releases in an even more accessible and easily transformable format.

In the hope of pushing them in that direction, the Loosely Coupled feed doesn't publish whole press releases. You can submit only the title, a subtitle and a summary, along with a link to the original document (what the weblogging world calls a permalink). This builds two implicit assumptions into the application: a) that vendors publish press releases on their websites in a timely fashion, and b) that they publish them at permanent archive addresses. But the application also has some workarounds built into it, because neither of these assumptions is in fact reliable:

- Some companies take two weeks or more after issuing a release over the news wires before they get round to posting it on their own websites (others, strangely enough, work the other way round, issuing releases with a fresh date long after first posting them online).

- A surprising number of websites still use JavaScript popup windows or frames to display press releases, which, unfortunately for them, means that when our archive deep-links to an individual release, it comes up in the browser without their site navigation around it.
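A submission of the kind described above maps naturally onto a single RSS item. Here's a minimal sketch of that mapping -- illustrative only, with the field values and the helper name assumed, not taken from the actual Loosely Coupled code:

```python
# Sketch: render one press-release submission (title, subtitle, summary,
# permalink) as an RSS <item>. Element names follow the RSS spec; since
# RSS has no standard slot for a subtitle, it is folded into the description.
from xml.sax.saxutils import escape

def rss_item(title, subtitle, summary, permalink):
    """Return an RSS <item> fragment for one feed submission."""
    description = escape("%s. %s" % (subtitle, summary))
    return (
        "<item>\n"
        "  <title>%s</title>\n"
        "  <link>%s</link>\n"
        "  <description>%s</description>\n"
        "</item>" % (escape(title), escape(permalink), description)
    )

print(rss_item(
    "Acme ships widget server",                      # assumed example data
    "New release adds SOAP interface",
    "Acme Corp today announced ...",
    "http://www.example.com/press/2003/widget.html",
))
```

Note that the permalink goes into `<link>` untouched -- which is exactly why the two assumptions above (timely posting, permanent addresses) matter so much.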

One of my hopes is that asking PR and marketing people to provide a URL when they submit releases will gradually drive home the importance of designing websites that a) are easy to update in real-time and b) use consistent, accessible links. Another hope, of course, is that they'll go that extra step and start using RSS themselves. But just as some companies don't want you viewing their press releases unless you're already on their website, so there will be companies who regard publishing to RSS or structured XHTML as a little more open and transparent than they're willing to be. posted by Phil Wainewright 11:40 AM (GMT) | comments | link

Wednesday, April 30, 2003

InfoPath and the chasm

Microsoft's controversial strategy for trickle-down adoption of InfoPath is classic Geoffrey Moore. But it's wrong, because Moore's advice in Crossing the Chasm doesn't take into account today's decentralized world.

The controversy has been caused by Microsoft's decision not to include InfoPath, its user-friendly XML forms generator, in mass-market versions of the next release of Office. It will only be in the Enterprise and Professional editions. Some of the reactions:

Don't segment desktop XML by Jon Udell: "The future of XML on the desktop is far from certain. Now is not the time to segment a market that has only just begun to grow."

At Microsoft's Mercy by Kendall Clark: "The XML industry must seek to avoid finding itself in a ... position ... in which it relies upon Microsoft to make XML creation tools for end users ubiquitous."

XML Brings Power to the People, my own ASPnews column this week: "Microsoft has a chance to be there first with InfoPath, but if it chooses not to be, there are plenty of would-be rivals who are more than eager to take its place."

In his column, Jon makes the point that "Microsoft now sees schema support and InfoPath as an enterprise play that might -- or might not -- trickle down," which is where Geoffrey Moore comes in. When Moore first published his classic guide to high-tech product adoption back in 1991, the visionaries who became early adopters of innovative, nascent technologies had to command enterprise-scale budgets to be able to afford to buy into them. Microsoft is counting on the same dynamics to launch InfoPath towards mass adoption, reckoning that if enough large corporations launch million-dollar projects to harness the potential of structured XML documents, then InfoPath will gather enough momentum to cross the chasm into mainstream adoption.

But what Moore couldn't have foreseen back in 1991 -- and Microsoft's present-day strategists haven't recognized -- is that in today's networked world, a visionary doesn't need the existing infrastructure of a large enterprise to impose adoption. A powerful interlinked trio of standardization, componentization and ubiquitous connectivity has drastically lowered the entry threshold for pioneering the commercial adoption of technology -- particularly for grassroots, decentralized innovations like structured XML. As Jon notes, "The most vibrant XML applications today are coming from the grassroots up, in the form of RSS-enabled weblogs. The network of RSS producers and consumers, which is growing like gangbusters, has become a laboratory for leading technologists at IBM and Microsoft ..."

What we haven't yet seen is evidence that commercial innovators can harness this network with the same success as technology enthusiasts. Microsoft's bet on Moore's classic chasm theory is a gamble that they can't, but I think the rules of the game have changed in the past few years. Although there's still a big chasm to cross before you can move beyond visionaries into mainstream adoption, the strategies you need to win those visionaries have changed out of all recognition, thanks to the new, decentralized dynamics of the network economy. Anyone who can harness those new dynamics while Microsoft chooses not to will find themselves able to secure a surprisingly powerful market position at the larger vendor's expense. posted by Phil Wainewright 5:13 AM (GMT) | comments | link

The irony of defining .NET

Last week, I was gratified that agile development took up position at the head of our ever-growing alphabetical list of Loosely Coupled glossary terms. This week, I'm slightly dismayed to see .NET steal that position, after we introduced a definition to accompany this week's feature story, Bringing .NET to Main Street. Agile development is an important concept that deserves more airtime; .NET is a vendor's proprietary brand, and Microsoft can afford its own advertising.

But there's nothing I can do about it (at least until we find cause to add a term that comes even earlier in the alphanumeric sequence). The underlying architecture of the glossary sensibly avoids the complication of a separate database to store definitions, using the native file system of the web server instead: the standard form of each term serves as the filename of its definition, and the alphabetical list is generated straight from the file system. The delicious irony is that now Loosely Coupled has added .NET to the glossary, we're committed to staying on a Unix file server, since Windows makes it awkward to create filenames with a leading dot (Explorer refuses them, even though NTFS itself permits them) and would balk at the filename for our .NET definition.

Tuesday, April 29, 2003

One time too many

Sceptical users vented their frustrations with IT during an inspired session at the Web Services for Business conference and exhibition in London today. Paul Druckman from the Institute of Chartered Accountants spoke on behalf of CFOs, making the proposition that "Getting connected is not worth the business investment." James Phillips of Actional, representing the Web Services Interoperability organization (WS-I), was charged with putting the opposite case, but by the end of the session it became obvious that it was the whole IT industry, not just its web services evangelists, that was on trial.

Given that this was a group of delegates who presumably were there because they thought web services might actually be useful, it was instructive to hear the depth of scepticism expressed in contributions from the floor. Even though a show of hands at the end of the session showed that most believed it was worth making the investment, no one was taking vendor assurances at face value. "How real is this or is it just another tick-list item?" asked one. Druckman said software vendors hadn't proven they could be trusted over standards. Another contributor wondered about vendor motives: "If it's so cheap and easy, why are all the suppliers so enthusiastic?" Surely, he was implying, vendors have a business model that's predicated on IT being expensive and hard.

Sitting there listening to all this, it struck me that IT vendors are like the boy who cried wolf once too often, or like the errant lover who strays just that one last time too many. The irony is that, this time, web services and service-oriented architectures really are the silver bullet that will finally deliver the long-awaited benefits that have been prematurely claimed for so many previous iterations of the next big thing. It's just that there have been too many next big things for users to believe in any of them any more. There is still some hope -- users do want to believe in it -- but they feel sceptical, mistrustful, even embittered. Businesses have had so many false bounces in the bear market of IT expectation that they just can't bring themselves to buy any more, however good the fundamentals look. And frankly, I don't think we've reached market bottom yet.

That question of vendor motivation is a telling one. The truth is, many of the big vendors are embracing web services without actually recognizing the full implications for their business models. They still don't get it, just like previous generations didn't get the PC, or the relational database, or various other disruptive technology innovations. They're in that 'horseless carriage' phase that ZapThink analyst Ronald Schmelzer recently mentioned, when the new technology is looked upon as merely an extension of what came before.

The parallel that resonates the most for me is the advent of the PC and distributed computing. I think the reason it resonates so strongly is that web services is the software equivalent of that earlier revolution, which was mainly driven by changes in technology at a hardware level. Recall how the big guys of the time reacted to the PC phenomenon. Their immediate response was to produce over-engineered, expensively produced versions of the PC concept -- think DEC Rainbow, IBM PCjr, or IBM PS/2. They saw and embraced the PC, but only outwardly. They didn't understand that what was really happening behind the scenes was not just a new computer hardware format, but also an entirely new way of making hardware, based on an open architecture, horizontally segmented product design, and a manufacturing process founded on assembling commodity components.

Fast forward to web services, and history is about to repeat itself. Everyone sees and embraces web services, but none of the big guys can see (or frankly even dare contemplate) the margin-destroying dynamics of the business model that web services is introducing to the software industry -- the move to standards-based, horizontally segmented software assembly, using largely open source components and low-cost agile development methodologies.

The trouble is, there's still plenty more scope for users to lose even more confidence in the IT industry's ability to keep its promises, even while migrating to this faster, better, cheaper, service-oriented architecture. Established vendors will lose their footing and take loyal customers along with them on disastrous false trails, while next-generation upstarts will grow faster than their quality control and support capabilities can keep pace with, leaving customers feeling aggrieved. I guess it'll all turn out well in the end, but I fear it's all going to be very messy and unpleasant in the meantime. posted by Phil Wainewright 11:59 AM (GMT) | comments | link

Monday, April 28, 2003

UDDI comes of age

Among a crop of articles this month about UDDI 3.0, which is due to be finalized later this year, Network World has just published the best so far. In Key Web services protocol gets help, John Fontana describes why enterprises are using UDDI and how the new features in v3.0 will consolidate its registry role as "the linchpin for the next wave of Web services, which ties together multiple Web services into composite applications."

This is a topic we covered a couple of months ago here on Loosely Coupled, in Keith Rodgers' article UDDI finds a role after all. What came out clearly then was that UDDI comes into its own once you have more than a few services to hook together -- in other words, the registry capability becomes essential once a service-oriented infrastructure starts to emerge. A quote from Ted Haeger, director of product management for Novell's eDirectory and Nsure UDDI server, sums it all up succinctly at the end of the Network World piece: "UDDI is the glue that ties together the whole web services idea of loosely coupled applications. If you can't locate a web service, you can't use it."

Keith's Loosely Coupled article also noted that vendors are increasingly building UDDI into their platforms -- at the time, it had already appeared in WebSphere, and as of last week, it's in Windows Server 2003. This means that many organizations may end up adopting UDDI without even noticing. But as I noted in my recent article, Tactics, not strategy, drive SOA adoption, adopting a service-oriented infrastructure without forming at least the outlines of an architectural strategy is likely to cause crises of management at some point along the road. Enterprises need to make sure they're going into this with their eyes open. posted by Phil Wainewright 12:30 AM (GMT) | comments | link
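Haeger's point -- locate first, then bind -- is the whole pattern UDDI standardizes. This toy sketch illustrates that publish/find/bind idea only; it is emphatically not the UDDI API (which is a SOAP-based inquiry and publication interface), and all the names and endpoints here are invented for illustration:

```python
# Toy illustration of the registry idea behind UDDI: a consumer looks up
# a service's access point by name before binding to it. Real UDDI adds
# taxonomies, bindings and a SOAP inquiry API on top of this core pattern.
class ServiceRegistry:
    def __init__(self):
        self._services = {}   # service name -> access point (endpoint URL)

    def publish(self, name, endpoint):
        """A provider registers its service's access point."""
        self._services[name] = endpoint

    def find(self, name):
        """A consumer locates a service; it binds to the returned endpoint."""
        try:
            return self._services[name]
        except KeyError:
            # "If you can't locate a web service, you can't use it."
            raise LookupError("no such service: %r" % name)

registry = ServiceRegistry()
registry.publish("credit-check", "http://services.example.com/credit")
endpoint = registry.find("credit-check")   # bind to this, not a hardwired URL
```

The point of indirection is exactly the one in Keith's article: with two or three services you can hardwire endpoints, but once services multiply, the lookup step is what keeps the coupling loose.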

Assembling on-demand services to automate business, commerce, and the sharing of knowledge