19th Century Economist Reveals Surefire Investment Strategy!

Intel just finished the “best year in [its] history,” and expects 2011 to be even better. This news raises a few important questions: for how much longer are we going to keep buying more, and more powerful, microchips? Will 2012 be still better for Intel and other hardware suppliers? 2020? 2050? How much longer can the demand for computation keep expanding?

I first started asking myself these questions after I drew graphs (using US BEA data) of changes over time in computer cost and aggregate US corporate computer spending. They reveal a deeply weird pattern: as computers get cheaper, companies spend more and more on them. See for yourself:

I can see how this pattern would hold for a while. There’s some level of demand among companies for computational power, and until that demand is met, investment will continue. Total investment could well increase for a while even as prices drop, because there’s so much thirst for computers; but after some finite period it should level off and maybe even start to decrease. After everybody’s got an Internet-connected PC and a smartphone, and after datacenters have centralized and moved into the cloud, total US corporate spending on computing gear will taper off, right?

Wrong, if an obscure 19th century English economist’s thinking is any guide to our current situation. And I think it is.

In 1865 (when he was only twenty-nine years old) William Stanley Jevons published The Coal Question, a book about the consequences of greater energy efficiency. In it, he advanced an idea that’s still counterintuitive and controversial: that greater energy efficiency leads not to lower total energy consumption, but instead to exactly the opposite outcome: higher aggregate consumption.

The standard (and correct) line about greater efficiency is that it lets us do the same with less consumption. Therefore, standard thinking goes, we’ll consume less. But an equally correct statement about greater efficiency is that it lets us do more with less, and Jevons’s great insight was that we’ll take advantage of this fact to do more.

As coal-burning furnaces became more efficient, for example, British manufacturers would build more of them, increasing total iron production while keeping their total coal bill the same. This greater supply would lower the price of iron, which would stimulate new uses for the metal, which would stimulate demand for more furnaces, which would mean a need for more coal. The end result, according to Jevons, would be that “the greater number of furnaces will more than make up for the diminished consumption of each.”
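Jevons’s mechanism is, in modern terms, a claim about price elasticity of demand: an efficiency gain acts like a cut in the effective price of coal, and if demand is elastic enough, total spending on coal rises rather than falls. Here’s a toy sketch of that logic — the demand curve and all the numbers are hypothetical, chosen purely for illustration:

```python
# Toy model of the Jevons rebound (illustrative only; none of these
# numbers come from The Coal Question). An efficiency gain is modeled
# as a fall in the effective price of the resource.

def total_spend(price, elasticity, base_price=1.0, base_quantity=100.0):
    """Total spending under a constant-elasticity demand curve."""
    quantity = base_quantity * (price / base_price) ** (-elasticity)
    return price * quantity

# Halve the effective price via an efficiency gain. Baseline spend is 100.
inelastic = total_spend(0.5, elasticity=0.5)  # inelastic demand: spend falls
elastic = total_spend(0.5, elasticity=1.5)    # elastic demand: spend rises

print(round(inelastic, 1))  # spending drops below the baseline of 100
print(round(elastic, 1))    # spending rises above the baseline of 100
```

With elasticity below 1, halving the price shrinks total spending; with elasticity above 1, the induced extra demand more than makes up for the lower price — which is exactly the furnace story above.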

In a paper published in 1998, the Yale economist William D. Nordhaus estimated the cost of lighting throughout human history. An ancient Babylonian, he calculated, needed to work more than forty-one hours to acquire enough lamp oil to provide a thousand lumen-hours of light—the equivalent of a seventy-five-watt incandescent bulb burning for about an hour. Thirty-five hundred years later, a contemporary of Thomas Jefferson’s could buy the same amount of illumination, in the form of tallow candles, by working for about five hours and twenty minutes. By 1992, an average American, with access to compact fluorescents, could do the same in less than half a second. Increasing the energy efficiency of illumination is nothing new; improved lighting has been “a lunch you’re paid to eat” ever since humans upgraded from cave fires (fifty-eight hours of labor for our early Stone Age ancestors).
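It’s worth pausing on just how large those gains are. A quick back-of-the-envelope calculation using only the figures quoted above (labor time per thousand lumen-hours of light) puts the Babylonian-to-1992 improvement at roughly three hundred thousand-fold:

```python
# Back-of-the-envelope ratios from the Nordhaus labor-cost figures
# quoted above: labor time needed to buy 1,000 lumen-hours of light.

SECONDS_PER_HOUR = 3600

labor_seconds = {
    "Stone Age fire": 58 * SECONDS_PER_HOUR,
    "Babylonian lamp oil": 41 * SECONDS_PER_HOUR,
    "Tallow candles (ca. 1800)": 5 * SECONDS_PER_HOUR + 20 * 60,
    "Compact fluorescent (1992)": 0.5,
}

babylon = labor_seconds["Babylonian lamp oil"]
for era, seconds in labor_seconds.items():
    print(f"{era}: {babylon / seconds:,.0f}x cheaper than Babylon")

# The Babylonian-to-1992 ratio: 41 hours vs. half a second.
print(babylon / labor_seconds["Compact fluorescent (1992)"])  # 295200.0
```

A 295,200-fold drop in the labor price of light — and yet, as the next paragraph notes, we spend more on illumination than ever.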

Yet our efficiency gains haven’t reduced the energy we expend on illumination or shrunk our energy consumption overall. On the contrary, we now generate light so extravagantly that darkness itself is spoken of as an endangered natural resource.

Not everyone agrees that the “rebound” effect hypothesized by Jevons is still a big deal in today’s energy consumption patterns, and many smart people believe that greater energy efficiency is in fact the way to reduce energy use in the future. Longtime efficiency advocate Amory Lovins of the Rocky Mountain Institute, for example, sent a detailed reply to David Owen’s recent New Yorker article making the Jevons case.

Rather than wading into the energy debate, though, I want to wonder out loud if computation is like energy. Both are necessary inputs to many productive activities. Both are consumed by every company in every industry. Both are amplifiers of human ability: energy amplifies or replaces our muscles; computation does the same for our brains and senses. Both come from physical things, yet are themselves ethereal; you can hold a lump of coal or a transistor in your hand, but not a joule or a megaFLOP. And the devices that generate both are getting more efficient over time.

If the analogy is a tight one, if Moore’s Law continues to hold true, and if Jevons was right about energy-intensive processes, then one conclusion seems inescapable: the trends visible in both of the graphs above are going to continue. Computers are going to keep getting cheaper, and aggregate demand for them is going to continue to rise.

Innovators will keep figuring out new things to do with computers as they keep getting cheaper (and smaller and lighter), and this innovation in use cases will more than offset the steadily falling cost of a unit of computation. Some of these innovations are visible now — tablets, smartphones, RFID tags and other smart sensors, computer pills, labs-on-a-chip, and so on — and some of them aren’t. To (badly) paraphrase David Mamet, nobody can predict innovation; that’s why they call it innovation.

One final implication of this logic is that the computation industries are going to keep growing for a long time to come, just like the lighting and other energy industries have done since Stone Age and Babylonian times. So if you’ll excuse me, I’m going to go find a good hardware manufacturer index fund to invest in…

What do you think? Do Jevons’s insights apply to demand and supply for computation today? Would now be a good or bad time to invest in hardware makers? Leave a comment, please, and let us know what you think.