Some forms of technology -- think, for example, of computer chips -- are on a fast track of constant improvement, while others evolve much more slowly. Now, a new study by researchers at MIT and other institutions shows that it may be possible to predict which technologies are likeliest to advance rapidly, and which may therefore merit greater investment of research funding and resources.

In a nutshell, the researchers found that the greater a technology's complexity, the more slowly it changes and improves over time. They devised a way of mathematically modeling complexity, breaking a system down into its individual components and then mapping all the interconnections between these components.
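The article doesn't reproduce the paper's mathematics, but the basic object it describes (a design broken into components, with the interconnections between them mapped out) is easy to sketch. Below is a minimal illustration in Python; the component names and dependency structure are hypothetical, invented purely to show the bookkeeping, and are not taken from the study.

    # Minimal sketch (not the authors' actual formalism): represent a
    # technology's design as a directed graph, where an edge i -> j means
    # that changing component i forces a redesign of component j. The
    # components and links below are hypothetical, for illustration only.

    dependencies = {
        "cell":        ["encapsulant", "wiring"],  # changing the cell touches these
        "encapsulant": ["frame"],
        "wiring":      ["inverter"],
        "frame":       [],
        "inverter":    [],
    }

    def out_degrees(deps):
        """How many other components each component's design touches."""
        return {c: len(links) for c, links in deps.items()}

    degrees = out_degrees(dependencies)
    # One crude connectivity measure: the most entangled component,
    # counting itself plus everything it affects.
    max_cluster = 1 + max(degrees.values())
    print(degrees)      # {'cell': 2, 'encapsulant': 1, 'wiring': 1, ...}
    print(max_cluster)  # 3

In the study's terms, the denser this web of dependencies, the more slowly the technology should improve.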

"It gives you a way to think about how the structure of the technology affects the rate of improvement," says Jessika Trancik, assistant professor of engineering systems at MIT. Trancik wrote the paper with James McNerney, a graduate student at Boston University (BU); Santa Fe Institute Professor Doyne Farmer; and BU physics professor Sid Redner. It appears online this week in the Proceedings of the National Academy of Sciences.

The team was inspired by the complexity of energy-related technologies ranging from tiny transistors to huge coal-fired power plants. The researchers have tracked how these technologies improve over time, either through reduced cost or better performance, and in this paper they develop a model that compares that progress to the complexity of a technology's design and the degree of connectivity among its components.
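Tracking improvement of this kind usually starts from a learning curve, the well-known empirical pattern (discussed further below) in which unit cost falls as a power law of cumulative production. As a rough sketch of that starting point, the improvement exponent can be estimated with a log-log fit; the cost figures here are invented for illustration and are not data from the study.

    # Hedged sketch of the usual empirical starting point: a learning curve
    # of the form C(x) = C0 * x**(-gamma), where x is cumulative production.
    # The numbers below are made up; they are not data from the study.
    import numpy as np

    cumulative_units = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
    unit_cost        = np.array([95.0, 60.0, 38.0, 24.0, 15.0])  # invented

    # Fit log C = log C0 - gamma * log x by least squares.
    slope, log_c0 = np.polyfit(np.log(cumulative_units), np.log(unit_cost), 1)
    gamma = -slope

    print(f"estimated improvement exponent gamma = {gamma:.2f}")
    # Faster-improving (per the study, typically simpler) technologies
    # show a steeper descent, i.e. a larger gamma.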

The authors say the approach they devised for comparing technologies could, for example, help policymakers mitigate climate change: By predicting which low-carbon technologies are likeliest to improve rapidly, their strategy could help identify the most effective areas to concentrate research funding. The analysis makes it possible to pick technologies "not just so they will work well today, but ones that will be subject to rapid development in the future," Trancik says.

Besides the importance of overall design complexity in slowing the rate of improvement, the researchers also found that certain patterns of interconnection can create bottlenecks, causing the pace of improvements to come in fits and starts rather than at a steady rate.
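The article gives no equations for this mechanism, but it can be caricatured in a toy simulation (an assumption-laden sketch, not a reimplementation of the authors' model): suppose a change to one component survives only if it lowers the combined cost of that component and everything wired to it. Denser wiring then means more constraints per change, so accepted improvements become rare and total cost descends in long plateaus punctuated by sudden drops.

    # Toy simulation, loosely inspired by the mechanism described above
    # (an assumption, not the authors' model). Each component has a cost,
    # and a proposed change "sticks" only if it lowers the combined cost
    # of that component and every component it is wired to.
    import random

    def simulate(n_components, n_links_each, steps, seed=0):
        rng = random.Random(seed)
        # Each component affects n_links_each randomly chosen others.
        affects = [rng.sample([j for j in range(n_components) if j != i],
                              n_links_each) for i in range(n_components)]
        cost = [1.0] * n_components
        history = []
        for _ in range(steps):
            i = rng.randrange(n_components)
            cluster = [i] + affects[i]
            # Propose fresh random costs for the whole affected cluster.
            proposal = {j: rng.random() for j in cluster}
            if sum(proposal.values()) < sum(cost[j] for j in cluster):
                for j, c in proposal.items():
                    cost[j] = c
            history.append(sum(cost))
        return history

    sparse = simulate(n_components=20, n_links_each=1, steps=5000)
    dense  = simulate(n_components=20, n_links_each=5, steps=5000)
    print(f"final total cost, sparse design: {sparse[-1]:.2f}")
    print(f"final total cost, dense design:  {dense[-1]:.2f}")
    # The sparser design typically ends up cheaper, and its cost history
    # descends more smoothly; the dense one stalls for long stretches.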

"In this paper, we develop a theory that shows why we see the rates of improvement that we see," Trancik says. Now that they have developed the theory, she and her colleagues are moving on to do empirical analysis of many different technologies to gauge how effective the model is in practice. "We're doing a lot of work on analyzing large data sets" on different products and processes, she says.

For now, she suggests, the method is most useful for comparing two different technologies "whose components are similar, but whose design complexity is different." For example, the analysis could be used to compare different approaches to next-generation solar photovoltaic cells, she says. The method can also be applied to processes, such as improving the design of supply chains or infrastructure systems. "It can be applied at many different scales," she says.

Koen Frenken, professor of economics of innovation and technological change at Eindhoven University of Technology in the Netherlands, says this paper "provides a long-awaited theory" for the well-known phenomenon of learning curves. "It has remained a puzzle why the rates at which humans learn differ so markedly among technologies. This paper provides an explanation by looking at the complexity of technology, using a clever way to model design complexity."

Frenken adds, "The paper opens up new avenues for research. For example, one can verify their theory experimentally by having human subjects solve problems with different degrees of complexity." In addition, he says, "The implications for firms and policymakers [are] that R&D should not only be spent on invention of new technologies, but also on simplifying existing technologies so that humans will learn faster how to improve these technologies."

Ultimately, the kind of analysis developed in this paper could become part of the design process -- allowing engineers to "design for rapid innovation," Trancik says, by using these principles to determine "how you set up the architecture of your system."
