As part of its ongoing absorption of Forte products, iPlanet has added new JMS and XML interfaces between its application server flagship and its EAI and B2B e-commerce products.

Specifically, the EAI product, iPlanet Integration Server 2.1 (formerly Forte Fusion), has added support for Java Message Service (JMS) and HTTPS to allow direct communication with iPlanet Application Server. (XML support was already built in as the product’s core integration building block.) That will make it easier to build web commerce applications that interact with enterprise applications such as SAP or PeopleSoft.

iPlanet ECXpert 3.5 (the B2B product), which serves as an integration and translation hub for documents in EDI, email, and other formats, has belatedly added XML support, including parsing, translation, and support for XML communications over HTTP (with SSL encryption). Not surprisingly, it does not yet support SOAP, a Microsoft-developed protocol for communicating XML transactions over HTTP that has drawn support from heavyweights such as Ariba, Commerce One, Compaq, and IBM.
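To make the parse-and-translate role concrete, here is a minimal sketch of the kind of XML translation such a hub performs. It is purely illustrative (in Python rather than the products' own tooling), and the element names and field mapping are hypothetical, not ECXpert's actual vocabularies:

```python
import xml.etree.ElementTree as ET

# Hypothetical inbound document; element names are illustrative only.
INBOUND = """
<PurchaseOrder>
  <Buyer>Acme Corp</Buyer>
  <Item sku="A-100" qty="5"/>
</PurchaseOrder>
"""

def translate(xml_text: str) -> str:
    """Parse one XML vocabulary and re-emit the same data in another."""
    src = ET.fromstring(xml_text)
    dst = ET.Element("Order")
    # Map each inbound field onto the outbound schema's names.
    ET.SubElement(dst, "Customer").text = src.findtext("Buyer")
    for item in src.iter("Item"):
        line = ET.SubElement(dst, "Line")
        line.set("part", item.get("sku"))
        line.set("quantity", item.get("qty"))
    return ET.tostring(dst, encoding="unicode")

print(translate(INBOUND))
```

In a real hub the mapping rules would be configured rather than hard-coded, and the translated document would then be posted to the trading partner over HTTP with SSL.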

With the new JMS support, the EAI and B2B modules can now communicate with iPlanet Application Server or any other J2EE-compliant application server.

The JMS and XML support are necessary first steps for iPlanet to cobble together its offerings into what it bills as a new “Integration Platform.” ECXpert is simply one of several iPlanet e-commerce modules, which also cover automated procurement and sell-side applications, credit authorization transactions, and market makers (its answer to Ariba and Commerce One).

In the long run, iPlanet plans to make all of these components of a common web commerce application platform. It still has a ways to go: iPlanet Application Server, ECXpert, and Integration Server are still three separate products with different architectures and look-and-feels. iPlanet says it will eventually provide integration, a sizable task, but has wisely not committed itself to deadline dates.

The primary advantage today is that all the products are supported by the same vendor. Today’s announcement adds J2EE-based communication, but for now that provides roughly the same degree of integration the products would have if they came from different vendors. It’s just a first step.

Software configuration management (SCM) systems have long been the Rodney Dangerfields (the American comedian who “gets no respect”) of the tools market. Although not at the top of most checklists, virtually every development team with ongoing projects has some version control tool in place to keep developers from tripping over old code.

In most cases, the systems have been limited to workgroups numbering at most 20-30 developers: either primitive homegrown applications and databases, or off-the-shelf products aimed at that scale. A couple of repository-based, enterprise-scale offerings, including Merant’s PVCS Dimensions and Rational’s ClearCASE, have emerged, but in most cases the tools have been taken for granted.

With today’s release of Rational Suite 2001, the latest in Rational’s semi-annual rollouts, competition in the entry-level SCM market has ratcheted up. In addition to other tools covering the AD life cycle, the new Rational Suite includes ClearCASE LT, a new low-end version of the company’s existing SCM offering.

Offered as part of the suite or a la carte at $1,500/seat, ClearCASE LT is roughly at price parity with PVCS Professional, the best-known workgroup-level tool, and roughly half the cost of enterprise-level offerings. Rational’s pitch is that, unlike other workgroup offerings (which include the free, open source CVS; Starbase, the web-oriented tool often embedded with Java frameworks; and Microsoft’s Visual SourceSafe), ClearCASE LT is actually a miniature version of the full enterprise product. The implication is that migration paths should be easier. By comparison, going from PVCS Professional to PVCS Dimensions requires an automated wizard that parses data out of the workgroup product’s flat files into the federated database and source code repository structure of its enterprise equivalent.

For AD teams, the choice is not just about which tool provides upgrade paths, but about what other goodies come alongside it. For instance, while Rational offers unified change management, a component of the Rational Unified Process, Merant has begun offering standalone companion tools that track changes in packaged applications, beginning with Oracle Financials (it might eventually integrate them). Other features high on AD team checklists are defect tracking, issue management, change management, and requirements management.

Thanks to e-business, AD teams are having to cope with more complex, highly intertwined applications that must be developed on increasingly short fuses. That’s placed a premium on buying suites or best-of-breed solutions, rather than point tools. However, due to workflow variability and the shortcomings of today’s repository technologies, no vendor yet offers a completely open, integrated solution spanning the entire AD life cycle, from designer to developer and tester. For that, we will probably have to wait for XML frameworks.

Nonetheless, the renewed attention to SCM points out that this is becoming one of the most hotly contested portions of the AD tools space. That’s ironic, given that Rational’s 1997 acquisition of PureAtria (the source of ClearCASE) initially drew such poor reviews from Wall Street.

The web application server business just got a bit more consolidated with HP’s acquisition of Bluestone Software, announced during Fall Internet World in New York. As websites add industrial-strength commerce and transaction processing, the application server has become a strategic linchpin. And as J2EE standards mature, the application server itself is becoming a commodity, leaving little room for niche middleware players. For too long, Bluestone has been the small software company with big technology.

Why are application servers so important to the web, and why have they become strategic application platforms for major e-business hardware and software vendors?

In short, the answer is scalability, availability, and reliability. When an ad appears on television or when B2B trading partners begin collaborating on their supply chains, web commerce sites cannot afford to slow down or fail. And lesson one for any e-business application designer is that, compared to internal applications, those that reach outside the enterprise cannot easily control or predict their user bases and transaction loads.

The logical approach to providing always-on capability is to distribute databases and business logic across different machines, so capacity can quickly be added wherever and whenever it is needed; in e-commerce, it’s hard to know when you’re going to need another database or application machine. In so doing, industrial-strength application and database management strategies become necessary, because loads must be balanced and failover must be applied. While distributed architectures did not start with the web, the web has made them more mission-critical.
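The balance-and-failover requirement can be sketched in a few lines. This is an illustrative toy, not how any particular appserver implements it: the node names are hypothetical, and real products add health checks, weighted routing, and session replication on top of the basic idea of spreading requests across machines and routing around dead ones:

```python
from itertools import cycle

class Pool:
    """Round-robin dispatcher that skips nodes marked as failed."""
    def __init__(self, nodes):
        self.nodes = list(nodes)
        self.failed = set()
        self._ring = cycle(self.nodes)   # endless round-robin iterator

    def mark_failed(self, node):
        self.failed.add(node)

    def next_node(self):
        # Try each node at most once per call; raise if all are down.
        for _ in range(len(self.nodes)):
            node = next(self._ring)
            if node not in self.failed:
                return node
        raise RuntimeError("all nodes failed")

pool = Pool(["app1", "app2", "app3"])
pool.mark_failed("app2")                 # simulate one machine going down
print([pool.next_node() for _ in range(4)])
```

Adding capacity is then just a matter of adding a node to the pool, which is exactly the elasticity argument made above.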

Because the web is inherently stateless, the housekeeping functions traditionally performed by the operating system must be performed by something else. Although the database could—and in many cases does—perform services such as persistence and object/relational mapping, the conventional wisdom in high-end web application design is to move that overhead to the application layer, because the database is already pretty busy. Enter the application server. With Oracle 9i, even the venerable database vendor has gotten the message, elevating the application server to the same prominence as its database.

With the emergence of J2EE (Java 2 Enterprise Edition), there is now a framework that specifies transaction and related services, including object/relational mapping, persistence, component generation, and web page generation, along with APIs for directory services, transaction management, messaging, and database access. Today, J2EE is the dominant architecture for web applications that are serious about scalability.
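To see what one of those services, object/relational mapping, actually does, here is a deliberately tiny sketch: shuttling an object to and from a relational row. It is written in Python for brevity, and the `Customer` class and table are hypothetical; J2EE itself expresses this mapping declaratively, through EJB entity beans and deployment descriptors, rather than in hand-written code like this:

```python
import sqlite3

class Customer:
    """A plain in-memory object that mirrors one database row."""
    def __init__(self, cid, name):
        self.cid, self.name = cid, name

def save(conn, c):
    # Object -> row
    conn.execute("INSERT OR REPLACE INTO customer VALUES (?, ?)",
                 (c.cid, c.name))

def load(conn, cid):
    # Row -> object
    row = conn.execute("SELECT id, name FROM customer WHERE id = ?",
                       (cid,)).fetchone()
    return Customer(*row) if row else None

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
save(conn, Customer(1, "Acme"))
print(load(conn, 1).name)
```

The point of pushing this bookkeeping into the application server is that developers write against objects while the container handles the SQL, caching, and transactions.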

That makes the application server an increasingly strategic buy. Ask most web teams about their short list, and BEA WebLogic, IBM WebSphere, and iPlanet (formerly Netscape) Application Server generally make the first cut. Significantly, two of the three are legacy buys: IBM, because of the natural link with mainframe operations, and iPlanet, because of its Netscape heritage. BEA earned its slot by being extremely aggressive in hopping on EJBs before everyone else. What’s also interesting are the once-popular names that are no longer there: NetDynamics, which embraced Java early but not Enterprise JavaBeans, and HAHT, whose architecture was C++ based.

Today’s second-tier survivors are players that were willing to embrace the J2EE transition and eat their young. SilverStream, which like NetDynamics embraced Java early, has recently migrated its architecture from proprietary Java components to J2EE; Iona and Gemstone, whose roots were in the ORB and object database fields, respectively, have repositioned their technologies as building blocks of J2EE servers.

Last, but not least, is Bluestone, which probably made the toughest transition of all. Like HAHT, it had its own proprietary development language (Sapphire), and it was largely C and C++ based. A couple of versions ago, Bluestone bit the bullet and made a wholesale migration to a complete Java architecture, rather than take the path of least resistance by “protecting” its legacy base and making only cosmetic changes at the API level. Yes, it would provide migration tools to Sapphire customers, but no, it wouldn’t continue the dated architecture.

It has made some smart technology bets, beginning with its early emphasis on load balancing and failover, which are now taken for granted in the space. Bluestone also hopped on the XML bandwagon early, initially with a giveaway XML server product that it eventually incorporated into the core platform. Recently, Bluestone made a shrewd acquisition of Arjuna, the British software firm that developed the first Java-based transaction server, which provides a needed counterweight to BEA’s trove of database-oriented middleware.

The dilemma remained: Bluestone generally got high marks for technology and was effective at getting its message out, but it remained stuck in a second-tier position in a product category that emerging standards were transforming into a commodity. In short, it faced the usual choice for niche software vendors: swallow quickly or get swallowed up.

HP to the Rescue

Bluestone, whose financial performance had been typical of a young public company, was beginning to show signs that it might finally enter the black. Annual revenues doubled year over year through 1999, and three quarters into calendar (and fiscal) 2000 they have already hit the 2x mark, at $28 million. But the company, which focuses on large accounts, has been heavily reliant on its top 10 customers, which last year accounted for over half its revenues. With Enterprise Java technology becoming a commodity, it needs the critical mass to convince large accounts that it is safe to buy Bluestone products.

HP was the last major software player without an application server strategy, so its acquisition of Bluestone at first glance appears quite logical. HP has recently begun making waves to challenge Sun’s dominance of the e-business space, courtesy of more powerful hardware, tighter relationships with Oracle, a new e-Services initiative, and a president, Carly Fiorina, who is striving to shake the company out of its plodding, consensus-driven culture and instill a more entrepreneurial focus that is more open to risk taking.

To the uninitiated observer, HP Software sounds like an oxymoron. The company’s tagline, “Invent,” harkens back to a history in which clever engineers devised electronic instruments in the backs of their garages. The company has always had a strong product engineering bias. This author is currently working with one of the fruits of that culture: a 10-year-old HP LaserJet II-P that continues to effortlessly turn out four pages of high-quality copy every minute, refusing to become obsolete.

But then again, the old AT&T had a similar reputation. Its products were built to last, and with Bell Labs it had state-of-the-art R&D. Chances are, if you took the company up on its offer to buy the phones it used to rent prior to its 1984 split-up, the units would still be ringing. Yet the technology business is littered with the corpses of engineering-driven companies that were tone deaf to marketing, with AT&T the poster child. The Baby Bells spun off from AT&T, though left with lower-margin businesses, have reinvented themselves, with many of them looking far more attractive to investors than Ma Bell herself. As this report was going to press, AT&T was trying to extricate itself from its latest missteps by once again breaking the company up into business units that might make smarter decisions.

When we look at HP Software, we think of OpenView, the product that invented the SNMP network management market. And, in spite of HP’s neglect, OpenView’s Network Node Manager has managed to remain the gold standard for network management, even as rivals like CA and BMC begin adding sexier neural network and genetic algorithm technologies to do forward prediction of network operations. This is the same OpenView that withstood the HP hardware company’s embrace of CA’s Unicenter two years ago as the preferred framework for HP-UX platforms.

Since then, HP has formed a software business unit around OpenView. Like IBM Software, HP is striving to form a real software business from a vast array of market-leading and obscure point products. With OpenView providing the pillar into enterprise computing, HP began enunciating its e-business strategy around two other software pillars: the e-Speak XML framework, and e-Process, which provides high-level tools for designing and managing e-business workflows.

e-Speak, a framework of XML-related services, covers XML communications and the management of XML transaction services. In other words, the concept is sufficiently broad, yet vaguely defined, that at this point it is difficult to tell whether it will compete with rival frameworks from players such as Microsoft. e-Process is more tangible, currently consisting of ChangeEngine, a tool for developing e-business processes and workflows, and HP Service Delivery, a telecommunications solution that embeds the WebMethods Enterprise EAI engine.

Bluestone fills a key gap in HP’s e-business solutions strategy. Previously, HP attempted to partner closely with BEA/WebLogic, but found itself in the passenger seat, not the driver’s seat, when it came to selling e-business solutions. HP realized that, to make its way to the table, it had to put skin in the game.

For both HP and Bluestone, the acquisition didn’t come out of the blue. Both have been gradually ramping up their relationship since they announced an interface between Bluestone’s Total e-Business and HP’s ChangeEngine earlier this year. And the possibilities of leveraging Bluestone’s XML server capabilities could go far in helping HP flesh out its e-Services product framework.

Appserver Acquisitions: A Tale of Two Cities

The application server market provides good examples of what can go right, and wrong, in similar acquisitions. The track records of BEA and WebLogic on one hand, and Sun, Netscape, and NetDynamics on the other, provide black-and-white contrasts.

Prior to the WebLogic acquisition, BEA itself was a collection of formerly independent and orphaned middleware tools. It was best known for rescuing Tuxedo, the leading third party UNIX TP Monitor, from Ray Noorda’s Novell, where it had languished as part of the company’s failed effort to become the center of the UNIX Universe. (At the time, Novell owned the remains of UNIX Systems Labs, the former AT&T rival to the Open Software Foundation.)

Back to BEA: the company struggled to cobble together a middleware strategy during the client/server era against the wrath of Oracle. BEA’s battle was uphill for other reasons as well. The TP monitor notion was both behind and ahead of its time, and it was an invisible technology marketed at a time when most enterprise IT organizations were preoccupied fighting the ERP wars.

For BEA, WebLogic itself came along at an especially auspicious time. With client-side Java reeling from performance and Microsoft-induced marketing problems, WebLogic was the first to get its arms around Enterprise Java Beans (EJBs). It would sell EJB appservers before EJBs were even a spec. WebLogic had compelling technology, but no feet on the street. BEA was looking for a hot product that would make the rest of its strategy make sense.

The result became, in effect, a reverse acquisition. BEA bought WebLogic, and although the corporate name stayed, the WebLogic product line surged front and center. BEA used its existing middleware products, such as Tuxedo, to lend more weight to WebLogic, and in so doing won the perception battle. WebLogic Enterprise may be a group of products, but customers often regard Tuxedo as simply WebLogic’s back end. Today, WebLogic is the leading J2EE application server in market share.

On the other end, the Sun acquisitions, which are now part of the iPlanet family, have suffered lukewarm fates. Netscape originally claimed the mindshare for large-scale application servers, largely thanks to its companion webserver and browser product lines. NetDynamics was known as one of the earliest application servers to use Java. However, because both products predated J2EE and EJBs, their internals weren’t Java-based. That in effect made them proprietary software products that just happened to use Java. The result was that, when server-side Java standards emerged, neither could provide the degree of openness that was part of the Java promise.

When Sun bought NetDynamics, the latter was one of the top players in the low-end appserver space. After the acquisition, however, NetDynamics lost key personnel and, more importantly, market momentum, given Sun’s lack of experience in marketing software. By the time Sun acquired Netscape’s application software business, spinning it off as iPlanet, it became obvious that the NetDynamics product line’s days were numbered, given Sun’s strategy of tackling higher-end accounts. iPlanet’s execution, not its strategy, was flawed: it delayed getting the message out to NetDynamics and Netscape customers, and in so doing lost market share (in many cases, to BEA).

Since then, Sun has also folded Forte into iPlanet. Significantly, the management team responsible for Forte’s revival in fortunes did not survive the acquisition. Today, iPlanet’s software business continues to be run by the same team responsible for the slow NetDynamics/Netscape consolidation, a team that lacks anyone experienced in the software business.

Bluestone’s Opportunities

A major determinant of any acquisition’s success, especially one that looks good on paper, is cultural fit. On one level, there shouldn’t be a problem: both companies have traditionally been very engineering-driven. But in the e-business market, that is not necessarily an advantage.

Another concern is the question that arises anytime a large company swallows a smaller one: will the buyer stifle the speed and agility of the acquired company? The concern is especially acute for HP, which acted in characteristic, consensus-driven fashion while courting Bluestone. Obviously, acquisitions aren’t casual matters, but once the deal is consummated, the newly joined parties must be braced to jump ahead at Internet speed.

From recent history, it appears obvious what must happen for HP’s Bluestone acquisition to pay off: Steal a page from BEA and make it a reverse acquisition.

On this front, the early news is good: Bluestone will become the headquarters of HP Software’s Middleware Division, with Bluestone president Kevin Kilroy taking over the unit. And when it comes to e-business middleware, HP is almost a blank slate, with one obscure product and another that is still primarily high concept. Compared to other large recent acquisitions, Bluestone should have relatively few feet to step on. Let’s hope it still has the feet to do the stepping. HP must hang onto Bluestone’s development organization, which knows a thing or two about change.

The Middleware Group should call the shots when it comes to e-business platform product direction, sales, and marketing, rather than waiting for consensus. There are obvious synergies with OpenView, which can be used to manage the back-end systems that feed web commerce. And it shouldn’t shy away from rebranding: who knows what ChangeEngine and HP Service Delivery really mean anyway? All e-business products should share a common brand, Bluestone Total e-Business, HP e-Services, and ChangeWhatever alike. They need to mobilize all the feet on the street, from sales to service. The last part could be tricky, especially if HP’s crazy obsession with spending $20 billion on PwC diverts badly needed resources.

And, as long as we’re shaking things up, how about getting the web site up to date? Today, the web page containing the product description for HP Service Delivery lists Active Software’s ActiveWorks as the integration engine. For the record, WebMethods acquired Active Software back in May and subsequently renamed the product WebMethods Enterprise.

Systems management has traditionally been the domain of high-end corporate sales. A glance at BMC’s own product description for PATROL sums it best. “PATROL for Enterprise Management provides the business process, enterprise-wide view of your environment. It provides a central point of control for all applications, computers, LANS, WAN and communications devices throughout the enterprise.” These solutions are often overkill or unaffordable for SMEs.

Yet the emergence of dot-coms means that there are a lot of new, small companies with big-company computing needs. And in most cases, they probably lack the DBAs or network managers to meet them.

That’s in large part the explanation behind the emergence of application service providers. Yet ASPs alone only allow aspiring dot-coms to export their problems, not necessarily solve them. Emerging dot-coms and recently spun-off business units require products and solutions, not just services, to manage their computing requirements.

Software vendors like BMC need opportunities like this to distract from their post-Y2K blues. For BMC, as for much of the enterprise software industry, 1999 and 2000 have been a tale of two cities. The company, which grew at a 30% clip last year, just reported its third straight disappointing quarter this year. The September 30 figures showed revenues 7% below the same quarter of last year, with per-share profits coming in at only half of Wall Street’s expectations.

At Oracle OpenWorld, BMC announced the formation of a new business unit, DataOne, that will have over 300 professionals spanning sales and marketing, product development, customer service, and consulting. The new unit’s mission is to come out with simpler, cheaper, almost shrink-wrapped versions of BMC’s PATROL (distributed systems management) products, like WebDBA, a $495/seat product released in the summer that lets DBAs manage Oracle databases remotely through any HTML web browser.

Aside from WebDBA, DataOne does not yet have any specific products, nor has it firmed up product plans or timelines. So why announce now? According to Anthony Brown, director of marketing for DataOne, “This is a good time to give our customers an understanding of where we’re going with our technology,” adding that the Oracle OpenWorld event happened to fall at the time they were ready to go public.

(At the show, BMC also announced the acquisition of Sylvain Faust, a Canadian software firm that developed tools for tuning SQL statements, which will also be folded into the new business unit.)

The new DataOne business unit inherits roughly 30 PATROL products covering SQL development, database change management, performance tuning and optimization, maintenance optimization, backup and recovery, and business information management. With the exception of some front-end beautification, these products are currently the same PATROL offerings aimed at Global 2000 organizations.

The long-term goal is to come out with integrated solutions aimed at companies scraping by with one or two inexperienced DBAs, who require easy to use solutions with liberal use of preconfigured templates. In most cases, WebDBA will serve as the front end to these new solution sets.

To accelerate products out the door, BMC has begun adopting larger-scale beta testing programs. A program associated with the initial WebDBA release attracted 100 customers, and BMC claims that over 2,000 customers have volunteered for the next beta phases, aimed at adding new utilities to the WebDBA palette. Although DataOne has not yet announced product plans, BMC promises to begin offering incremental integration between WebDBA and some of these utilities within the next 60 days. Among the early targets, BMC will likely address change management first, followed by performance monitoring and maintenance optimization.

BMC’s moves are not surprising. Given that its existing enterprise markets have flattened, it has to figure out some way of penetrating smaller organizations, which are largely virgin markets. The task is daunting: database management is full of arcane concepts, such as disk defragmentation and tablespace reorganization, that typically require years of experience to master. In fact, the need for simpler, preconfigured tools extends beyond small organizations, because even large enterprises are having a hard time finding experienced DBAs in tight labor markets.

BMC’s moves are hardly unique. On the database side, the recent Sun/Oracle/Veritas alliance to certify specific database/hardware configurations reflects the fact that fast-growing enterprises need databases-to-go. In the ERP space, giants like SAP have introduced fast-track ASAP programs that make liberal use of preconfigured templates. The difficulty of the task is reflected in the fact that ASPs have struggled to keep customers happy with plain-vanilla configurations of enterprise software. Although databases are, at a certain level, more generic, in a distributed computing world there are always bound to be variations in the way functions or tables are deployed.

Although Sun et al. have so far resisted expanding their database/hardware pre-certification programs, such efforts are the logical place for ventures like BMC’s DataOne to gain traction.

Suppose they gave a war and nobody came? Windows 2000’s coming-out party, held just after Valentine’s Day in San Francisco, came off almost as an afterthought. Although over 20,000 attendees were expected, actual turnout was barely a fraction of that, and there were few if any of the vendor bashes that usually accompany breakout events. The headline was Michael Dell backtracking on his previous week’s lukewarm comments on W2K, which had triggered a run on Microsoft stock.

Like any Microsoft operating system, Windows 2000 has huge promise to fulfill. If W2K ever delivers on the promises of scalability, reliability, IntelliMirror desktop control, Active Directory, hot plug and play, and other features, it would do wonders for TCO. But getting there will be most of the “fun.” Gartner Group’s prediction that 25% of existing Windows applications will have problems making the transition is probably conservative, as previous experience going from 16- to 32-bit Windows demonstrated.

Take the scalability issue. Dell says it’s comfortable running its web site on clusters, while Data General, IBM, and others believe the high end will be best served by grander parallel architectures. Significantly, Compaq, which otherwise boasts a highly diverse line of platforms, was caught flat-footed, lacking an answer for an environment (Windows 2000 Data Center edition) that doesn’t yet exist. Although Compaq promises a new high-end architecture redolent of the old Thinking Machines design by 2001, it felt compelled to OEM 32-way machines from Unisys as a bridge strategy, just in case W2K Data Center comes out before then.

No wonder W2K show turnout was so light. IT users are scared of the future, while the major PC platform vendors are sending mixed signals on where W2K will initially play strongest, when sales will outpace NT 4, and, in the long run, what the most cost-effective way to scale will be. Better to let the experts duke it out before committing real IT bucks.

Unlike Windows 95, Windows 2000 promises real features that could generate TCO breakthroughs. But for now, they’re just promises.

It’s become fashionable lately to deride business-to-consumer (B2C) e-commerce web ventures as victims of hype. The newfound caution about the growth of B2C is certainly warranted, given the nearer-term opportunities for business-to-business (B2B) web commerce.

Although Gartner’s recent prediction of trillions of dollars in B2B commerce within the next 2-3 years smacks of the same hype that previously afflicted B2C, the prognosis that B2B has greater near-term potential remains sound. Businesses already have the right bandwidth connections, and competitive pressures are forcing more and more of them to at least dip their toes into real online transactions.

But don’t totally dismiss B2C. The issue is adjusting expectations, especially when it comes to creating new brand identities. Remember, the attraction of going online is to add convenience and create value by letting consumers do more of the business they must transact anyway. Although the emergence of new brands like Yahoo or Amazon makes it tempting to predict that pure online plays will revolutionize consumer buying, recent history has shown that when familiar brands deliver real value and convenience, consumers will go back to the familiar every time. Remembering every new dot-com will, in the long term, prove more effort than it’s worth.

Therefore, while new online pure plays like the First Internet Bank or E*Trade may draw the media’s attention, the Citibanks of the world have golden opportunities to enhance their brand identities by adding new capabilities that give consumers more control over, and visibility into, their bank or investment accounts. We recently had the chance to visit yodlee.com, part of a new generation of online service providers whose products (in this case, consolidation of consumer financial statements) could provide exactly the type of service that helps click-and-mortar brands, like Wells Fargo Bank, make their offerings even more compelling. It may not be as exciting as inventing the next Yahoo, but in the long run, the flowering of the B2C market will more likely be driven by new under-the-hood technology that reinforces the names we already know.

Several years ago, open source theorist Eric Raymond released a well-known essay extolling the virtues of open source, comparing traditional, vendor-led proprietary software development (the “cathedral”) against the more populist open source model (the “bazaar”). Raymond argued that what looked on the surface to be a more chaotic approach to development was actually more efficient than its supposedly more orderly, vendor-based proprietary counterpart.

Until now, the IT world could easily dismiss these arguments as leftovers of the ’60s generation. Open source did have a sort of free-love, utopian air to it, with its ideal that anybody could make software as long as they contributed it to the communal good.

What IT manager who wants to keep his or her job could afford to take the notion of ownerless software seriously? When it’s 3 a.m. and your server blows up, the warm embrace of a proprietary vendor sounds a lot more comforting, because at least somebody takes responsibility for the code.

So why are household names like IBM, Intel, Compaq, and Dell making big noises about their Linux strategies? To be blunt: It’s the reliability, stupid. An interesting survey conducted by IBM and Oracle found that 55% of all Linux resellers came out of the NT world, with another 18% coming out of the NetWare community.

Although Linux true believers still talk about clients, for now it’s the back end that’s making waves. The experiences of Linux pioneers, who generally locked their servers away in back-room cabinets, rebooting them at most once or twice a year, are beginning to make a real impression on vendors pressed to deliver lower cost-of-ownership solutions. Not surprisingly, the idea sounds quite enticing to weary LAN administrators tired of constant firefighting.

At the LinuxWorld conference, the announcement of the 2.4 kernel’s timetable (maybe within the next six months) and the latest dot releases from the major distributors were the least of the news. Instead, pay attention to the expanding array of 24 x 7 support programs from IBM, Compaq, and HP, which are aimed at changing the misconception that nobody’s responsible for open source software.

Maybe the open source process encourages more frequent updates to the kernel. Maybe talk from the Linus Torvaldses of the world about “the good fragmentation” that comes with Linux sounds overly chaotic to administrators worried about avoiding another crash from undocumented or unsupported code. But if you stick with a major Linux distributor and platform provider, you won’t be worrying about which dot-dot kernel release is the absolute latest. Your updates will be controlled by the vendor’s schedule. It will become, in effect, like choosing whether you want the latest SAP R/3 update or whether you’ll wait for the next major dot release. Maybe the bazaar won’t seem so bizarre after all.

Give yourself a pat on the back for this one. Midnight came on December 31, 1999, and chances are, the worst thing that happened was that your IT systems displayed a few errant report screens that rolled over from 1999 to 19100. Otherwise, your organization stayed open, or reopened for business as usual on Monday, January 3, 2000. So much for Y2K. In the end, it looked like somebody called a war, but nobody came.

By the middle of last year, most of us became pretty bullish on surviving the date change. Before that, however, it looked like Chicken Little was calling the shots. Remember the fears about the lights going out and chaos breaking loose? The myths about long-lost COBOL programmers commanding $100,000 salaries, or the dire warnings that you’d better book Y2K consultant time early, because their rates would skyrocket the longer you waited?

Fast forward to the present. Your systems rolled over with little incident. You didn’t have to worry about finding COBOL programmers or consulting help. Your company probably relied on your services, rather than hiring consultants, because people like you knew best where the date problems were, and knew the business well enough to know which date fixes had to be done, and which ones could pass. Maybe your company deferred some important IT projects, or maybe it accelerated a few long-awaited system replacements.

Congratulations. Your company survived the disease. Now, the question is, will it survive the cure?

For most organizations, Y2K was the moral equivalent of ERP. Y2K projects forced us to take stock of our IT assets, in many cases, for the first time. It drove organizations to clear the cobwebs, identifying what systems were still in use, how they were used, and which ones could be retired. So far, no pain, all gain.

However, the headaches began when your company started scouring those 15- to 20-year-old COBOL programs. If your company was lucky, it still had a few of the original developers around. If not, it was time for sleuth work.

Once the assessments turned to the bedrock systems your company used every day, it was time to haul out the change management process, because those systems were too big to fix in one shot. It was time for a reality check with users. Which parts of the system were most used, and which were least valuable? What workarounds were possible? In what sequence should the parts be fixed to minimize disruptions?

With the change management process updated, it was time to dig in. Renovate the code, followed by the never-ending testing. Again, the upside was the opportunity to engage users, whose acceptance was the final test.

The final step was dusting off those contingency plans. Of course, with Y2K work going so well, it was easy to grow complacent, but Y2K was a potential disaster too universal to ignore. Even if your company held up its end of the bargain, what would happen if the power went out, the phones failed, or your business partners began sending corrupt data?

Even the best-prepared organizations had to revisit their contingency plans. A South Florida company, which had plenty of experience planning for hurricane disruptions to headquarters, had to revise its plans because they didn’t adequately factor in local processing problems at its regional distribution centers.

In the end, many of us gained a clearer picture of our IT assets. We updated our change management processes, beefed up testing procedures, and hopefully improved overall software quality. We engaged users from planning through final acceptance testing stages. If we were lucky, we had the chance to replace older desktops with new, standardized, more maintainable configurations and accelerate some new enterprise transaction system projects.

According to Gartner Group, we spent $300 billion to $600 billion worldwide to fix Y2K problems, and if we improved internal practices to boot, the investment should prove worthwhile. Significant pain, but plenty of gain.

However, it’s human nature to mobilize for emergencies, then rest on our laurels. If we relax too much, all those updated asset inventories and change management practices could easily fall out of date. Lacking strong follow-through, the Y2K cure could prove worse than the disease.

Those are certainly fighting words. Sun has enjoyed a nice five-year run thanks to two factors: Its smart acquisition of the “right” portion of Cray, giving it the E10000 Starfires, and its invention of Java, the de facto language of e-Business.

Sun is still doing quite nicely, thank you. Q2 results beat the Street by a penny, as order streams avoided Y2K slowdowns. By comparison, HP’s Q4 was weak.

However, a few warning signs have recently emerged. Sun’s once-cozy relationship with Oracle has grown more arm’s-length, now that HP is threatening to become first among equals. HP also won a high-profile deal at Amazon, although it was debatable whether HP clinched the deal with faster clock speeds or fatter financing.

HP has recently made some shrewd moves in the promising ASP space, especially with its $500 million investment in building SAP hosting centers with Qwest. By comparison, what did Sun do with its $500 million? It bought Forte. Question: Which $500 million investment will be more lucrative in the long run?

These events should serve as wake-up calls for Sun. While all good things eventually come to an end, Sun’s e-Business ride still has plenty of upside ahead. But it shouldn’t fool itself. Java may make Sun relevant to dot coms, but as more enterprises adopt dot com strategies, they will need ASPs to get them there—and that’s where the bucks will be. HP is making the right noises and investments, but the jury is still out on whether it can execute in Internet time. That’s never been Sun’s problem, but going forward, it needs to cop some ASP wins to ensure that McNealy and Co. remain the dot in dot com.

The recent announcement that Bill Gates was giving up the day-to-day running of Microsoft got more press attention than it deserved. There was nothing shocking about the succession; the handwriting has been on the wall since Gates first gave up the presidency last year. Although Gates is very much a technologist, Microsoft’s success came on his watch, so he must have done something right in managing, selling, and marketing. As chairman, Gates the technologist will get to devote more attention to The Road Ahead.

The Ballmer succession will mark a needed change of tone for the company. As Gartner Group analyst Michael Gartenberg put it, Ballmer will give Microsoft more of a human face. At the Gartner Group Symposium last fall, just about the only newsworthy event was when Ballmer promised to take as an action item an audience request to loosen licensing restrictions on customers sharing beta-release results for BackOffice and Exchange. Although trivial in the overall scheme of things, the episode signals a new tone for the software giant. Expect more humility from Redmond, a quality that will become necessary when Windows 2000 makes its play for the data center. When that happens, Microsoft customers will need better hand-holding than Redmond has ever delivered before.