The first computers took up entire rooms and spent hours on simple calculations, yet they had less computing power than a typical smartphone has today. In less than 50 years, we’ve created devices that make the science fiction of the past look shortsighted. How did computers change so quickly?

Moore, Better, Faster

In the 1960s, an engineer named Gordon Moore noticed that microchips were improving at a phenomenal rate. He predicted that the number of transistors that engineers could fit on a silicon chip would double every two years—an exponential function! By fitting more processing power onto each chip, engineers could create faster, smaller, more versatile computers every two years. Moore’s law predicted that these exponential increases would continue.

As technology has improved, computers have become not only faster but also less expensive. In 1977, an Apple II computer cost $2,638. It had 48 kilobytes (kB) of RAM, and its processor ran at a speed of 1 megahertz (MHz). Today, the iPhone 5 sells for $150 with a cell phone plan. It has 1016 megabytes (MB) of RAM and a processor that runs at 1300 MHz, which makes it 1,300 times faster than the early Apple computer and capable of storing more than 20,000 times the memory. It also costs less than 6% of the original price, even before adjusting for inflation. You have more computing power in your mobile device than scientists of the 1980s had in their labs.
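The comparisons above can be checked with a few lines of arithmetic. This sketch uses only the figures quoted in the text (prices, RAM, and clock speeds as stated; 1 MB = 1024 kB):

```python
# Figures as quoted in the text.
apple_ii_price = 2638      # US dollars, 1977
apple_ii_ram_kb = 48       # kilobytes of RAM
apple_ii_clock_mhz = 1     # clock speed in megahertz

iphone5_price = 150        # US dollars, with a cell phone plan
iphone5_ram_mb = 1016      # megabytes of RAM
iphone5_clock_mhz = 1300   # clock speed in megahertz

# Clock speed: 1300 / 1 = 1,300 times faster.
speed_ratio = iphone5_clock_mhz / apple_ii_clock_mhz

# Memory: convert MB to kB before comparing (1 MB = 1024 kB).
memory_ratio = (iphone5_ram_mb * 1024) / apple_ii_ram_kb

# Price: what fraction of the 1977 cost, ignoring inflation?
price_fraction = iphone5_price / apple_ii_price

# Moore's law: doubling every two years means growth by 2**(years/2),
# so two decades of doubling multiplies transistor counts by 1024.
growth_over_20_years = 2 ** (20 / 2)

print(speed_ratio)                    # 1300.0
print(round(memory_ratio))            # ~21675 — more than 20,000 times
print(round(price_fraction * 100, 1)) # ~5.7 — less than 6% of the price
print(growth_over_20_years)           # 1024.0
```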

Engineers predict that we’ll soon reach the limits of Moore’s law. Right now, engineers can create transistors that measure just 14 to 22 nanometers (nm) across. By 2020, they expect to reach the 5 to 7 nm range. Beyond that, it will likely become too expensive to develop smaller transistors, and the exponential increases will stop.
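The shrink from 14 nm toward 5-7 nm follows from the doubling rule itself. A rough sketch, under the simplifying assumption (for illustration only, not an actual fab roadmap) that doubling the transistor count on a fixed chip area shrinks the linear feature size by a factor of √2 every two years:

```python
import math

# Illustrative assumption: transistor count on a fixed chip area doubles
# every two years, so linear feature size shrinks by sqrt(2) each step.
size_nm = 14.0  # starting from the smallest transistor size quoted above
for years in (2, 4, 6):
    size_nm /= math.sqrt(2)
    print(f"after {years} years: ~{size_nm:.1f} nm")
```

Two doublings take 14 nm to about 7 nm, which is why the text puts the 5-7 nm range only a few years away.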