Management guide to SS tactics

The Shifting Standards

November 1998

Summary

The OS wars are not over yet, and the latest manoeuvres look set to
target standards. Corporate IT staff are all too well aware of the hazards
of wrapping up company operations within proprietary standards, and so
the new method is organised around creating so many, and such complex, 'open'
standards that they are not supported across the board. By having different
customer bases using different and ever more varied 'open' standards, system
software vendors may maintain their hold on individual market sectors.
This is good news for the vendors but a major headache for corporate
IT, who may find themselves stranded with unadopted standards and with
major inter-operability problems where standard diversification was not
necessary. They are also faced with ever more complex systems to manage.
This document highlights the tactics, and how to work around them.

The increasing success of the Open Source development model, and the
continued rise in deployment of systems such as Linux and FreeBSD in the
corporate computing environment, are becoming a serious problem for system
software vendors who would rather cling to the old closed-source/proprietary
methods. Much FUD has been targeted at Open Source Software (OSS), and
although much of this was ill-informed comment, the style and consistency
suggested a deliberate, marketing-orientated FUD campaign.

This FUD fell on stony ground: the subject is system software,
and the decision makers are battle-hardened professionals who long ago
learnt to evaluate the reality for themselves. This has forced unenlightened
system software vendors into their second line of defence, that of standards.
The most ominous indicator to date has been an unauthenticated document
that appears to be a leaked internal Microsoft communication (commonly
known as the Halloween document). Although the document is unofficial,
Microsoft watchers confirm its adherence to Microsoft style and suggest
that even if false it is nonetheless a pretty good representation of the
current state of mind within Microsoft. This document directly cites the
use (or abuse) of open standards as a tool to help maintain pseudo-proprietary
methods and market monopolies.

An active role in standards making has become an essential tool in software
marketing. In many industries there exists a David and Goliath scenario
where large corporations simply do not have the agility to match fresh
new startups with new ideas, or small companies who get lucky with
a product that hits the mark at the right time. Some industries, such
as automobiles and white goods, have built-in protection mechanisms; the
investment required in production facilities and distribution/support networks
is so large that a startup needs years to ramp up to a frontrunner position
even if adequate finance is available. The software industry is at the
other end of the scale: small companies can become bestsellers overnight,
and the internet allows software to be sold, distributed, and supported
on an international scale from a single site, whilst at the same time allowing
contributors to work from anywhere, at any time.

The dilemma for the 'Goliath' is how to invest their large earnings
in developments that will keep the 'David' out. The obvious choice, frequently
tried yet almost never successful, is to simply throw lots of software
engineers at the product, with the goal of adding in so many features
and gadgets that competitors' products look like cut-down or 'lite' versions
by comparison. The computer press has been a great aid to this technique
by invariably comparing software by means of tables of 'check-boxes'. But
it is a technique that can only go so far. Good software is like a small
sleek racing car, not a big truck, and just as there is a limit to how
many mechanics you can get round a Ferrari when it takes a pit-stop, so
there is a limit to how many programmers you can have on a module. Push
things too far and you get bloated software: code that should be neatly
intertwined finds itself in different modules, with different people pushing
it in opposite directions. The code-base mushrooms to a size where it is
impossible to get the 'big picture' and the bug-list starts backlogging.
Innovation becomes out of the question, whilst technical support teams
find themselves having to distribute ever more frequent 'minor upgrades'
and patches on CDROM because they are too big to download off the net.

Interestingly, it is not the large number of programmers involved that
creates the problem; it is the fact that they are engaged on adding new
features. Open Source Software has a reputation for being smaller, faster,
and more reliable than its traditional rivals, and yet the number of people
poking their nose into the code is overwhelming. The difference is that
in the Open Source model, most of those programmers are concerned with
fine tuning a limited set of tried and trusted methods rather than adding
new gadgets with new problems. In many cases they are simply fixing things
that annoyed them when they tried to deploy the software.

A far cleaner and more successful method of expending development resources
has been the use of proprietary standards. The software industry has continually
expanded its scope and usage over the years, and as new elements are added
into systems, new methods must be continually added to allow inter-operability.
The 'Goliaths' are able to offer an across-the-board range of products whereas
the 'Davids' may only offer a subset. So if Goliath links his range of
products with a closed proprietary technique, David is forced to reverse
engineer the protocols in order to enter the playing field. This reverse
engineering eats up a lot of resources and often leads to mysterious bugs
(often used as a base for FUD). Closed proprietary standards abounded through
the 1980s, and a few are still in common use today; but by and large they
are disappearing. One problem with closed standards is that they also affect
inter-operability with the specialist and bespoke software that very many
company operations depend upon. The 'Goliaths' are not interested in the
specialist markets, they are geared to bulk sales of standard products,
and although the problems may be partly resolved by partnership programmes
and NDAs, the problems appear to the user as a shortfall in the Goliath
products. But the biggest reason for the demise of closed proprietary standards
is their total lack of subtlety. Even the most naive purchaser can see that
such methods serve only the supplier, locking the purchaser in; for the
purchasers they are nothing but a headache, and this creates bad feeling.

The 'Goliaths' soon realised that there was no real need to keep the
standards closed; it was enough that they were proprietary. As they developed
new products, they would in parallel develop new standards to match, so
as the product was launched they would release the specifications and say
that the product inter-operated with an open, documented standard. Of course
they have a head start on the 'Davids' which in itself gives an advantage,
but more importantly such standards are typically very deep and very much
oriented to their own product architectures, making it difficult to add
in innovation. In order to get products to the market in time, the 'Davids'
are forced to use development tools supplied by the 'Goliaths', products
that are designed to orientate the 'Davids' products into add on modules
to the Goliath range rather than alternative products.

So the 'open but proprietary' standards solved the specialist/bespoke
problem, but they were still a blatant lock-in technique. Modern companies are
entirely dependent on the smooth running of their IT operations, and more
than ever before the concept of locking everything into a single source
is something to be avoided at all costs. At some point one has to decide
what an open standard is. Although many producers seem to deem that publishing
the specification makes it open, purchasers have been more demanding. The
rapid growth and success of the internet was made possible
by adherence to completely open standards: standards made by neutral or
balanced panels, where the decision-making process is transparent and contributions
may be made by anybody with suitable credentials. The internet showed this
model works, and software purchasers are now demanding such open standards
across the board. Until recently, network-based services on company LANs
were almost entirely based on proprietary protocols; now companies are
looking to 'intranets', where the inter-operability of the internet may
be exploited for the in-house systems. But companies are going further,
looking for the same level of openness at all levels of the system,
demanding true open standards at all points of interface at all levels
of products. Companies are also increasingly looking to Open Source software
as a method of assuring they have a way out of eventual problems. Open
Source software, by its very nature, tends to conform to truly open standards,
with proprietary techniques being reserved for 'backwards compatibility'.
This new tendency to give priority to true openness is undermining the last
line of defence of the more conservative 'Goliaths'; for it must be said
that many 'Goliaths' are throwing in the towel and adopting the Open Source
model, where they compete by superior service, something that is win-win
for producers and consumers.

But some 'Goliaths' are determined to fight on, notably those who have
virtual monopolies on particular market sectors. The new technique is to
adopt truly open standards, but overwhelm the standards mechanism itself.
'David' may be represented on a standards panel by a technical person who
approaches the panel with draft proposals ready to dispute technical merits
with his peers, whereas the 'Goliath' representative is more likely to
be a person carefully selected on the basis of their negotiating capabilities
and will have received specific training in communication skills. The 'Goliath'
member may be backed up by a team of analysts and technical writers and
is able to submerge the panel with large volumes of very well prepared
specifications. The role of 'David' on the panel is soon limited to objecting
to the most outrageous proposals (for which 'Goliath' will have a ready
prepared fallback strategy), and the net result is that the standards group
drifts towards proposals that are favourable to 'Goliath'. Another important
aspect is that the standards may be made as large and as sophisticated
as possible.

This helps keep the standards moving in the right direction, but alone it is
not enough. Another aspect of the software industry is that there are many
standards-making bodies. Many standards are set by ad-hoc industry groups,
and very often their scope overlaps and they compete for a while before
either one group gives up or they merge their common interests. The presence
of a 'Goliath' on a standards group gives it instant credibility, and 'Goliath'
is able to supply many resources, sponsor the group and push it to forge
ahead in the face of competition. They can do the same with the competing
groups, either directly or via subsidiaries and associates. The net result
is a plethora of highly detailed, conflicting standards. 'Goliath' has the
resources to comply with all standards, and make them ever more complicated
('embrace and extend' is the buzzword), whereas 'David' must try to select
the eventual winner, and even so will struggle to comply with the frequently
unnecessary sophistication, which also serves to stifle sudden innovation.
When the dust settles, only one standard will become 'de facto', and many
'Davids' will fold because they had not backed this option. The 'Davids'
who got lucky by being behind the right group will, nonetheless, have their
hands tied by the serious constraints of the specification.

Shifting standards are not just about protecting the big from the small,
they are important revenue earners, as a new standard means new versions
of software. Big vendors are quick to drop older products from their inventories,
forcing new installations to use the new package with the new standard,
which invariably exhibits a few quirks when working 'the old way' and thus
forces upgrades on existing systems in order to maintain inter-operability.
Very often corporate computer departments find themselves upgrading whole
sites because the old package is no longer available, despite the fact
that the new package yields little or nothing in terms of productivity
gains. So why should corporate computer customers be bothered? They can
simply follow Goliath and be assured that whatever the eventual outcome
they will be covered. The downside is that this tactic invariably results
in a continuous upgrade programme. Many end users are quite happy with their
current systems; modern client software such as word-processors, spreadsheets,
CAD etc. is already capable of supplying all the resources they require.
The adoption of new standards via upgrades forces upgrades across the board,
and upgrading to a new version will often not increase productivity, and
may actually harm it, as stable systems with well-known shortfalls are replaced
with new, unstable systems with hidden perils. Users must adapt to the new
environment, IT personnel training is often required, and much time is
often lost making existing data operate with new elements. Very often new
software requires a hardware upgrade, and stable, reliable equipment ends
up being scrapped prematurely. If all this assured companies that they
would not be left in limbo, it might be acceptable, but it does not achieve
that goal.

'Goliath' will support all standards during their emerging phase, but
as a winner emerges they must drop the lesser-used ones in order to help
contain their already bloated packages. They will provide migration paths,
but these are not necessarily easy. SS tactics ensure that 'Goliath' has
support for the winning standard, but do nothing to ensure that a 'Goliath'
customer adopts the winner during the emerging phases. The key to SS tactics
is the frequency and sophistication of new standards, and if taken to extremes
they may easily make open standards a worse option than proprietary ones. Instead
of being locked into a supplier's standard, customers find themselves locked
into the supplier's upgrade programme, with upgrades going non-stop and considerable
time spent on re-training.

Of course the big fight against SS tactics needs to take place at the
developer level, on the standards panels themselves, but the corporate
customer may still do much to protect themselves. The first line of defence
is to resist change. With systems increasingly meeting users' needs there
is little to be gained by adopting new methods; changes should be made
as and when they are absolutely necessary, and not because they are
the latest technology.

Software packages should be evaluated on the basis of the standards
they comply with, and it is a good move to see if other similar standards
exist. Corporate purchasers should shop for standards, and then decide
on the software they will use to meet them. They should wait till the dust
settles on a standard before adopting it, and check that the same standards
are supported by a number of software producers. They must also be wary
of proprietary extensions, and check that the package may be configured
so as not to utilise such methods automatically. They should also insist
on backwards compatibility with data archives. It is not enough to be able
to 'import' old data; it must be possible to work in the old format as well.
If new extensions are not supported by older software, it must be possible
to invoke a global backwards-compatibility mode, so that the new systems
may inter-operate with the old until the whole site has upgraded by means
of natural retirement of old equipment.

By being conservative about upgrading, and by only adopting new standards
that are already supported by multiple vendors (and especially open source
packages), it is possible to minimise the impact of change and maintain
open options.