'Next big thing' may be process, not a product

Peterborough, N.H.  IBM Research is celebrating its 60th anniversary in a year that may prove to be as transformational as 1945 was, for the company and for the world in general. Where the year IBM Corp.'s research arm was founded marked the start of the Cold War and the nuclear-arms race, from now on it may be intangibles  such as information, software and intellectual property, delivered via the Internet  that will be the center of global struggles for dominance, rather than physical products such as weapons systems, computers, cell phones or automobiles. That is how Paul Horn, the current director IBM Research, sees it.

"If you want to know what the next big thing is in the IT industry, it's not a device  which may have a $100 million impact," Horn said. "It's really the ability of IT technology to take inefficiencies out of the global GDP. Without any doubt, in my mind, that is the next big thing." IBM, he said, sees "a huge emerging opportunity in what we call BPTS  business process transformational services."

The current worldwide IT market represents about $1.2 trillion, and Horn sees that growing by at least another trillion dollars in the next few years.

Computers are getting better at representing physical systems for the user and translating those representations into bits that can be sent over the Internet. With modern automation, that information can in turn be converted into control data for automated assembly.

That is a model of efficiency, but it is also cause for concern across the corporate culture. Horn worries that the ability of worldwide ad hoc groups of software engineers to organize around open-source, not-for-profit enterprises is posing a challenge to major computer companies that are themselves redefining their business model in terms of delivery of services, rather than products.

"We've got this whole disruption around how you get value out of software," he said. "The open-source worldwide communities are generating software of great power and value, and the whole service business is undergoing tremendous disruption. Software is being delivered as a service, so there isn't a piece of our industry that isn't undergoing radical change."

The disruption of the computer business is being propelled by underlying technology that will be moving ahead on its own momentum. Whether or not they turn a profit, basic innovations in materials and circuits, computer architectures and algorithms will become pivots of future technological advances.

The overall policy at the largest private research organization in the world, which employs 2,400 scientists and engineers at nine locations worldwide, is to push innovation at all levels, from basic devices through to systems, algorithm design and software.

"There are some very big trends in all pieces of our business that are going to be disruptive technologies," said Horn. He cited "fundamentally new ways that you will have to build semiconductors that are not simple linear extensions of where we are today; fundamentally new compute architectures that are right now on the horizon; [and] compute architectures which are low power, what we would call scaled out vs. scaled up."

"Scaled-out" systems have become a fundamental necessity as ICs begin to run into the physical barrier of power saturation. While there seems to be no limit to the circuit density on chips  and advances in packaging amplify that by packing chips ever more densely in systems  getting the heat out of a piece of hardware is becoming fundamental.

Focus on science

When IBM's initial research lab was built at Columbia University in New York in 1945, the electronic computer was in its infancy. However, IBM was a major corporation, marketing a line of business information-processing machines based on mechanics, rather than electronics.

"IBM has always had an interest in applying information technology to science  that goes way back to the origins of the company," said Tom Theis, director of the physical-sciences division of IBM Research. Thomas J. Watson Sr., who formed IBM by merging several other business machine companies, was always interested in doing science as well as business, Theis explained.

"For example, the first PhD employed by the company was an astronomer, who used punched cards to do scientific computation," he said. "Back then, the lead technology developers were experienced machinists. They were from a tradition of artisanship and didn't have the connection to academia that we associate with research today."

But in the post-1945 world, fundamental physical science started replacing the foundation of machining and manufacturing that business machines were based on. The discovery of the transistor at Bell Laboratories in 1947, and the development of the integrated circuit in the '50s at Texas Instruments and Fairchild Semiconductor, suddenly shifted the business machine onto a new foundation that was governed by the science of solid-state physics and quantum mechanics.

IBM researchers did fundamental work on magnetic data storage that kicked off the disk drive industry. IBM researcher Bob Dennard invented a solid-state data storage device, the dynamic random-access memory, which Theis called the "most reproduced human-designed object in history."

In 1974, Dennard wrote the first study of how scaling would affect transistor performance. "Bob never thought that gate insulators would get so thin that quantum-mechanical tunneling would put a limit on transistor performance," said Theis.
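For context, the constant-field scaling rules from that study can be paraphrased as follows (a standard textbook restatement, not a quotation from Dennard's paper): shrink all device dimensions and the supply voltage by a factor $1/\kappa$ and raise the doping by $\kappa$; gate delay then improves by $1/\kappa$, power per transistor falls by $1/\kappa^{2}$, and power density across the chip stays constant. The tunneling limit Theis refers to appears when the gate insulator, scaled by $1/\kappa$ along with everything else, thins to a few atomic layers and electrons simply leak through it.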

IBM researchers plunged headlong into the world of atoms during the 1980s. One innovation, the scanning-tunneling microscope, invented at IBM's Zurich research center by Gerd Binnig and Heinrich Rohrer in 1981, revolutionized physics and won the pair a Nobel Prize in 1986.

By the 1990s, IBM researchers had produced a demonstration experiment showing the letters "IBM" written in strings of single atoms on a conducting substrate, positioned one by one with a scanning-tunneling microscope. The technique has opened up a new world of imaging at the resolution of single atoms, while also creating a new atomic-level fabrication method.

This innovation might be compared with the laser, which was a fundamental development that affected science and technology widely, but did not immediately result in successful products.

Likewise, scanning-probe instruments have been developed for a wide range of research purposes, and might one day result in ultradense data storage devices; IBM is working on that angle as well. But there has been no immediate payoff in terms of products, despite the large influence the technique carries in research.

On the other hand, another discovery in basic physics, giant magnetoresistance, very rapidly went into products in the form of magnetic spin valves for ultrasensitive disk drive read heads. By allowing the density of bits on a disk to skyrocket, the innovation resulted in tiny drives that could store gigabytes of data.

"You don't know, in the more exploratory areas, if [a research breakthrough] will ever have an impact on your products," said Horn. "Sometimes you find that it has enormous impact, like the natural-language work we did, which led to a whole series of products, or the work on interfaces to semiconductors that led to the copper back end of the line."

Over the years, Horn said, IBM has found that, for the markets in which it plays, "we get value out of the fundamental work that we [at IBM Research] do. We get a return on it, and so we are not going to stop, because it is helping us grow."

Showcase system

The design of IBM Research's showcase system, the Blue Gene supercomputer, was driven by the trade-off between power and performance. Blue Gene/L, with a sustained performance of 136.8 teraflops, topped the list of the world's fastest computers announced last summer. It was developed in a joint project between IBM Research and the Department of Energy's National Nuclear Security Administration, and is being installed at Lawrence Livermore National Laboratory. The second-fastest computer in the world, at 91.29 Tflops, is a privately owned Blue Gene system at IBM's T.J. Watson Research Center.

"Frequency is slowing down because of fundamental power limitations. Density continues to increase, but a transistor is not free anymore, like it used to be. When a transistor is there, it is basically dissipating power and you have to take that into account," said Tilak Agerwala, who heads the Systems Division at IBM Research. "The interesting point is that the relationship between performance and power is nonlinear in the following sense: If I reduce power by a factor of 10, I may only reduce performance by a factor of three."

In the Blue Gene project, that insight was combined with a total-systems view of computing, which made scalability a basic property of the architecture.

"This is a fine example of how you leverage power efficiency, how you leverage intelligent system design, how you look across the entire stack to get something that is fundamentally more cost-effective and has the highest level of performance that has ever been seen," Agerwala said.

But from the larger perspective of commercial and scientific applications, performance is only one parameter. "There are all kinds of new applications that really require very high performance and are data intensive: digital media games, content management, financial risk analysis, medical and life sciences," Agerwala said. "These applications have significant performance and data requirements, and yet we are in an environment where we are not going to get that performance just out of the frequency of the basic devices."

Blue Gene is an attempt to address both the diversity of applications and the increasing demand for high-throughput data processing in one architecture. It will compete against many other server and workstation products and, as Horn pointed out, against the ability of software gurus to organize diverse computing resources over the Internet at virtually no cost.