The semiconducting silicon chip brought about a wave of electronic transformation that propelled technology forward and forever changed the way society functions. We now live in a digital world, where almost everything we encounter on a daily basis is built from masses of silicon integrated circuits (ICs) and transistors. But with the materials used to develop and improve these devices being pushed to their limits, the question of the future of electronics arises.

The Beginnings

The move towards a digital age really took flight late in 1947 at Bell Labs, when a little device known as the transistor was invented. Gordon Moore went on to become a pioneering researcher in the field of electronics and coined Moore's law in 1965, which predicted that transistor density would double roughly every two years.
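To get a feel for how aggressive that doubling schedule is, here is a minimal sketch of the projection. The starting figure of roughly 2,300 transistors for a 1971-era chip is a commonly cited number used here purely for illustration; the function and its parameters are hypothetical names, not from any source.

```python
def projected_transistors(start_count, start_year, target_year, doubling_period=2):
    """Project a transistor count assuming it doubles every `doubling_period` years."""
    doublings = (target_year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Fifty years of doubling every two years is 25 doublings:
# 2,300 * 2**25 is roughly 77 billion transistors.
count = projected_transistors(2_300, 1971, 2021)
print(f"Projected transistor count in 2021: {count:,.0f}")
```

The exponential term is the whole story: each two-year period multiplies the count by two, so fifty years compounds into a factor of over 33 million.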

Just over 50 years after that prediction, Moore's law still largely holds true. However, researchers and engineers are beginning to hit a roadblock. Current circuit features measure as little as 2 nm wide—roughly the width of a single strand of DNA, and far smaller than a red blood cell. Because integrated circuits are approaching their physical size limits, it is becoming much more difficult to sustain the growth that Moore's law projects.

The question then arises: how do we combat this problem? Or do we move toward finding an alternative to silicon itself? What are the true limits of technology?