A collection of observations, news and resources on the changing nature of innovation, technology, leadership, and other subjects.

March 12, 2007

Beyond the Ivory Tower

The March 3rd issue of The Economist has an excellent article, “The Rise and Fall of Corporate R&D - Out of the Dusty Labs,” on the evolution of corporate R&D since the end of the Second World War. The article starts by describing the vision laid out by presidential science advisor Vannevar Bush in his seminal 1945 report, Science, the Endless Frontier. "New knowledge can be obtained only through basic scientific research" conducted in universities and research labs, he wrote; that knowledge is then applied by the private sector to develop new products, and by the defense sector to develop new and improved weapons.

Vannevar Bush's report was truly influential, and served as the blueprint for R&D in post-war America. "The Government should accept new responsibilities for promoting the flow of new scientific knowledge and the development of scientific talent in our youth," he added. This led to the creation of the National Science Foundation a few years later. Large enterprises also embraced his recommendations and organized corporate research labs, such as Xerox PARC and IBM Research.

But today the picture is very different. "Technology firms have left the big corporate R&D laboratory behind, shifting the emphasis from research to development," the article says, before posing the key question: "Does it matter?"

The Economist article truly resonated with me, because my own career has pretty much followed the arc it describes. I started out doing research in computational atomic and molecular physics in the 1960s as a graduate student at the University of Chicago. In 1970 I joined the Computer Sciences Department in IBM's Thomas J. Watson Research Center. While it took me a while to get over the guilt of leaving academia for industry, the reality is that, at the time, IBM Research was primarily focused on the kind of long-term, fundamental research advocated by Vannevar Bush. During my early years at IBM Research, I must say that I knew little about IBM's products and what our commercial customers did with them, and neither did many of my research colleagues. That was what the development and sales divisions worried about, while those of us in our beautiful ivory tower in Yorktown Heights - designed by the famous architect Eero Saarinen - concentrated on great intellectual pursuits disconnected from the practical concerns of the marketplace.

John Seely Brown, former director of Xerox PARC, succinctly captured the prevailing attitude of many of us in research labs in those days, with this quote in The Economist article: "When I started out running PARC, I thought 99% of the work was creating the innovation, and then throwing it over the transom for dumb marketers to figure out how to market it." Needless to say, the "dumb" marketers and product developers in IBM were usually unimpressed when we lectured them about our brilliant ideas. Since we did not really understand what it was that they actually did, let alone what their problems were, we could not tell them how our innovative ideas could be of help to them in their work. The gap between "R" and "D" was pretty wide in those days, not only at IBM but between the research and development communities in the US, UK and other countries.

But then our culture began to change. IBM Research began its transformation in the late 1970s under the leadership of Ralph Gomory, who is not only a great research manager but a world-class mathematician, and who later went on to become president of the Sloan Foundation. We started to embrace the notion of being vital to IBM as well as to science. We complemented our contributions to science with contributions to IBM's future products and long term strategies. It was now our responsibility to understand how the technical innovations from our research labs could improve future products and their application in the real world. There were no more “dumb” marketers or developers. They were now our colleagues and collaborators.

To my surprise, I discovered that I not only enjoyed working on technology and its implications, but that I also enjoyed, perhaps even more, the challenges of getting new technologies to the marketplace. Once I started viewing the world out there as a source of inspiration and stimulation for what problems to work on, my attitude totally changed. So did my professional career, and in 1985 I left IBM Research, where I was then director of Computer Sciences, and started working in our development organizations, where I have been ever since.

This change in culture proved to be very healthy for our research labs and for IBM. I am convinced that had we not become more relevant to IBM, we would have eventually disappeared or become shadows of our former selves, as happened to so many other corporate labs when the companies they were part of got into serious financial trouble. In fact, when IBM went through its own near-death experience in the early 1990s as a result of a technology shift - from bipolar to the much less expensive CMOS - that almost killed our mainframe business and IBM itself, what saved us was that the technical community in our research and development labs had been anticipating this shift for several years. They had been designing and prototyping the microprocessors and parallel architectures needed to transition the mainframes, along with plans for migrating all the software and the installed base of customer applications. While the transition was still exceedingly painful, we survived it while many of our competitors did not.

The Economist’s title, “The Rise and Fall of Corporate R&D,” could leave readers with the impression that corporate research labs are a thing of the past. While the nature of research has indeed changed since the days of Vannevar Bush, its importance to the enterprise is perhaps even greater. Because our researchers in IBM are now better integrated with the rest of the company, not to mention the world at large, they have been active participants in organizing our most important and sophisticated initiatives, including parallel supercomputing, the Internet, Linux and open source, Grid computing, services sciences, and the 3D Internet. Thus, our research labs have been critical not just to IBM’s very survival but to our long-term market strategies.

Getting back to the key question posed in The Economist article: Does it matter that the research community, not just in corporations but also in universities and government labs, is now more involved in problems in development and even in the marketplace itself? Is it bad that "the new model of R&D turns researchers into the shock troops of innovation," or, as John Seely Brown further comments, that "there is at least as much creativity in finding ways to take the idea to market as coming up with the idea in the first place"?

I don’t think so. I am on John's side, for a variety of reasons. First of all, as a result of the incredible science and technology advances that have come out of universities, government and corporate labs - many supported by the kind of wise government funding advocated by Vannevar Bush - we are now able to tackle problems of almost unimaginable complexity. Think of modeling the human brain with advanced supercomputers to help us find cures for major brain disorders like Alzheimer’s, schizophrenia or autism. Think of having the ability to design, simulate and operate a globally integrated enterprise by focusing not just on the back-end processes that are fairly well understood, but also on the more numerous and complex, people-oriented, market-facing processes. There is no way that such problems can be addressed without the researchers getting personally involved, not just in the science and technology innovations, but also in the market, business and societal ones.

Then there is the question of time-to-market. "The idea devalues itself over time if you don't get it to market quickly," said Paul Horn, who heads IBM Research. "Everything we do is aimed at avoiding a 'handoff.'" This is very different from the older, slower-moving days, when you had time to transfer the technology from the research teams to the development and manufacturing teams to the sales and services teams. Today, the marketplace, along with your faster-moving competitors, will simply speed past and leave you behind.

Finally, there is the increasingly global, competitive marketplace out there. Long-term, fundamental research is a major driver of innovation, and thus more important than ever. However, given the huge competitive pressures every company now faces, the vast majority don't have the wherewithal to do much beyond bringing new products and services to market. And even the few companies like IBM and Microsoft that continue to have world-class research labs have to be more careful with their long-term research investments, because they, too, face intense competitive pressures from companies around the world. Government thus has the ultimate responsibility for supporting the majority of long-term research.

I strongly believe that no company, no matter how big and powerful, can afford to ignore the forces of innovation if it is to keep up with, and hopefully stay ahead of, its competitors. Research labs, whether supported by government or corporations, served us well for many years as a source of new technologies, ideas and knowledge. But today that is not enough. In today's world, we have to step out of our ivory towers into the global marketplace - which has essentially become our 21st century lab.

Comments

For further evidence of the value of eschewing Vannevar Bush's model of separation between 'pure, untainted' research and research influenced by considerations of use, see Donald Stokes' excellent book "Pasteur's Quadrant: Basic Science and Technological Innovation." In it he argues that in the history of science, the distinct separation proposed by Bush is the exception rather than the rule, and uses Pasteur's work to show how scientific advancement can in fact be aided by mixing investigation for fundamental understanding with an exploration of how such understanding might be put to use. That certainly has been our experience at IBM Research, where many projects (such as our work on new approaches to supercomputing systems like BlueGene) have been informed by considering a range of use constraints and requirements as part of our inquiry into novel computer science problems.