IBM's decline seen as sign of better model for growth

Largest computer maker forced into role of follower

December 16, 1992|By N.Y. Times News Service

The big cutback that IBM announced yesterday has underscored what some of the nation's experts on technology have been saying for several years: While the world's largest computer maker remains a huge enterprise, it can no longer be expected to set the pace for technological innovation in the United States.

IBM will slash its development spending by $1 billion, or 17 percent, next year, a reduction unprecedented in its history. The company spent about $6.6 billion in 1991, the last year for which figures are available. Analysts believe it will spend a similar amount this year.

"IBM as the IBM that defined computing is over; it's gone," says Nathan Myhrvold, Microsoft Corp.'s vice president of advanced technology and business development. "As a result of new technology developments during the past 10 years, IBM's role in the computer industry has fundamentally changed."

The consequence is that what was once one of the world's most vaunted high-tech companies has been reduced to the role of a follower, frequently responding slowly and ineffectively to the major technological forces reshaping the industry.

It has become increasingly clear that IBM will no longer be the technological bulwark against foreign economic competition, a role it often played in the past with such inventions as the Winchester disk drive for data storage, a seminal achievement in 1973.

But rather than being a cause for despair, IBM's decline may instead mark the emergence of a new model for computer and technology development in the United States that will become powerful in its own right, some technology experts say.

Alliances of smaller, more innovative companies -- sometimes working with government support and sometimes on their own -- will come together quickly to attack crucial problems.

"We have watched while a small number of very important development organizations in the United States, including IBM, Bell Laboratories and Xerox, have slowly taken apart the engines behind their innovation," says R. Andrew Heller, a former IBM engineer who now runs a small Silicon Valley start-up company. "But we're seeing new centers of innovation emerge. We have to be optimistic."

This year, for example, the United States will recapture the lead in the world semiconductor market after lagging behind Japan for several years.

And Intel Corp., in Santa Clara, Calif., is poised to reclaim its title as the world's largest and most successful chip maker.

Moreover, despite fears of overwhelming international competition, U.S. personal computer and disk drive companies have largely held the line against Asian and European competitors in fierce price-cutting battles.

A share of the credit for the semiconductor industry's remarkable turnaround is being given to Sematech, the industry-government research consortium created five years ago to revitalize the country's chip makers. The consortium, based in Austin, Texas, developed new chip-making techniques to counter strides by Japanese competitors.

Sematech's success could point the way for further alliances in areas where in the past giant multinational corporations like IBM and Digital Equipment Corp. would have carried most of the financial burden.

"It's fortuitous that the Clinton administration has arrived when it did," says Samuel H. Fuller, vice president of research at Digital Equipment. "It's important that we have these underlying technologies and when companies like IBM can no longer entirely fund development, it's important that the government step in."

This new alliance-based model of development and competition is one that IBM has had a large hand in creating.

Finding itself increasingly unable to compete while sustaining the burden of new technology development, the computer maker has reached out and built hundreds of alliances in recent years, ranging from one with Apple Computer to develop software to one with Supercomputer Systems Inc. in Eau Claire, Wis., to make supercomputers.

And while some of these alliances have been successful, they have not been enough to stem the far greater tide of competition and shifting market forces that has engulfed IBM.

But despite optimism about future research and development alliances, a deeper, more troubling question remains: Did IBM's technology spending translate into a U.S. advantage that will now be lost? The answer from many analysts and computer industry executives is no.

They now argue that the company's vast laboratories have frequently done as much to bottle up needed technologies as to push the nation forward.

"We need to invest differently, and what IBM didn't do is figure out how to invest in things that paid off," says David Ditzel, director of advanced systems at Sun Microsystems -- now a giant, but a few years ago a tiny company whose quick moves let it dominate the engineering workstation market that IBM was slow to enter.

IBM watchers also argue that by spinning off tens of thousands of engineers and programmers in the last five years as the company began shrinking its work force, the computer maker has actually increased the pace of innovation.

For example, while IBM invented a way of speeding up computer hardware, it was commercialized elsewhere first, in part by former IBM researchers.

The company developed SQL, a fundamental database technology, but it was Oracle Systems Corp. in Belmont, Calif., that took the technology and turned it into a company with sales of $1 billion a year, eclipsing IBM in the database market.

Underlying IBM's collapse is what the economist George Gilder has called "the law of the microcosm": every three years, the world's chip makers have been able to put four times as many transistors on a single piece of silicon, vastly increasing miniaturization.