Chicken Little – No Electricity for Your Computer

Are experts right that you will no longer have enough electricity to run your computer after 2040?

Computers are my life – literally. As an engineer with more than two decades in the software industry, I’ve grown up alongside the computer as a pivotal tool throughout my career. So you can imagine my dismay at seeing news lighting up the Internet from The Sun, The Daily Mail, Science Alert and even Reddit, reporting industry experts’ predictions that computing will outstrip the world’s supply of electricity by 2040 – with or without “the singularity”. With this many outlets picking up on the story, there had to be something to it – especially after hearing it from Rush himself.

What is in the SIA model to predict this outcome? As any engineer knows, the assumptions used in any model can dramatically impact its validity. The engineering challenge for me was now officially on: What factors went into the SIA model, were they logical, and could they be corroborated?

With a bit of research, I traced the story back to this report from the Semiconductor Industry Association (SIA). Published late last year to highlight current challenges and potential solutions identified by experts across the semiconductor industry, the report is garnering attention now due to the publication of the organization’s annual roadmap. Citing trends like the Internet of Things (IoT) and Big Data, the report lays out nascent computing trends and investment approaches to overcome currently identified roadblocks in a variety of areas such as hardware architecture.

The controversial conclusion comes from an appendix in the report detailing total effective computing capacity, in which several variables, such as the per-bit energy consumption of computing hardware, are compared in the figure below.

From this chart, the report concludes, “For this benchmark energy per bit, computing will not be sustainable by 2040, when the energy required for computing will exceed the estimated world’s energy production. Thus, radical improvement in the energy efficiency of computing is needed.” But this conclusion rests on two assumptions: that the world’s energy production stays flat, and that target system energy consumption continually rises.
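The shape of this kind of extrapolation is easy to reproduce. Here is a minimal sketch, using purely illustrative numbers rather than the report’s actual figures: if world energy production is held flat while computing’s energy demand compounds annually, the crossover year falls out of a simple loop.

```python
# Back-of-envelope sketch of the SIA-style extrapolation. All numbers
# below are assumptions for illustration, NOT the report's figures.

WORLD_ENERGY_J = 1e21       # assumed flat annual world energy production (J)
COMPUTING_ENERGY_J = 1e19   # assumed computing energy demand in the base year (J)
GROWTH_RATE = 0.20          # assumed annual growth in computing energy demand
BASE_YEAR = 2016

def crossover_year(world=WORLD_ENERGY_J, computing=COMPUTING_ENERGY_J,
                   rate=GROWTH_RATE, year=BASE_YEAR):
    """Return the first year computing demand exceeds flat production."""
    while computing <= world:
        computing *= 1 + rate
        year += 1
    return year

print(crossover_year())  # with these assumed inputs: 2042
```

Note how sensitive the answer is to the inputs: nudge the growth rate or let the production line rise instead of staying flat, and the crossover year moves by decades, which is exactly why the model’s assumptions deserve scrutiny.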

So, is it valid to assume world energy production remains flat? According to the U.S. Energy Information Administration (EIA), world net electricity generation increases 69% by 2040, as published in its report, International Energy Outlook 2016. With other advancements in the energy sector such as fracking, solar, and wind generation, this assumption does not appear valid.
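To put that 69% figure in annual terms: assuming the projection runs from a 2012 baseline out to 2040 (my assumption about the window, not a figure quoted above), the implied compound annual growth rate is modest but decidedly not flat.

```python
# Annualizing the EIA's projected 69% rise in world net electricity
# generation, assuming a 2012-2040 projection window (an assumption).

total_growth = 1.69
years = 2040 - 2012  # assumed 28-year window
cagr = total_growth ** (1 / years) - 1
print(f"{cagr:.1%}")  # roughly 1.9% per year
```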

Is it reasonable to assume system-level energy consumption will continually rise? I must admit that I’m not completely current on the latest developments in computing chip architecture, but engineers are always applying the latest developments in science to overcome the world’s most pressing challenges. For example, in the video below from Intel, Mark Bohr explains how the company’s Tri-Gate transistor increases performance by 37% while lowering power consumption by 50%.
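Figures like these compound in a useful way. Treating the two gains as simultaneous (a simplifying assumption on my part; the quoted numbers may apply to different operating points), energy per operation is power multiplied by time, so the combined effect is:

```python
# Rough energy-per-operation arithmetic from the figures quoted above,
# assuming both improvements apply at once (a simplifying assumption).

power_factor = 0.50        # power consumption cut in half
time_factor = 1 / 1.37     # 37% higher performance -> each operation finishes sooner
energy_factor = power_factor * time_factor
print(f"energy per operation: {energy_factor:.0%} of the previous design")
```

Under these assumptions a single transistor generation cuts energy per operation by roughly a third of its former value, which is the kind of efficiency trend the flat-line extrapolation ignores.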

While this analysis is far from exhaustive, it seems to me that two of the assumptions used in the model are incorrect. With the importance of computers in our day-to-day lives, and knowing that a new generation of engineers is hard at work overcoming evolving challenges, I’m certain there’ll be enough energy to power all my computing devices – at least for my lifetime. And as long as there are wildly popular apps such as Pokémon Go, I am certain the next generation will be sufficiently motivated.

What are your thoughts on the outlook for energy consumption of computers? How do you go about choosing the assumptions for your models? Tell us about your quest for unconventional knowledge and what it could mean for the future of your products or companies. Share your thoughts in the comments section below and don’t forget to follow us on your favorite social media channel.