This year marks the 40th anniversary of the release of the Intel 4004, the first commercially available microprocessor. That single development helped kick off the entire digital revolution.

"We wouldn't have had a digital era without the microprocessor," said Zeus Kerravala, principal analyst with ZK Research. "Before that, computers used a combination of tubes, resistors and diodes. In fact, I remember having a personal calculator made with tubes. It was about the size of a laptop and all it did was basic math."

Microprocessors are basically the "brains" inside computers and other electronic devices. By advancing from large, cumbersome vacuum tubes to small microprocessors, computers not only got smarter, they got faster and smaller. And smaller has been a very big deal.

Consider the advent of the portable laptop and the smartphone. Computer chips also run inside our cars, as well as in many refrigerators and other appliances.

"The microprocessor revolutionized the way we work, live, learn and play," said Kerravala. "It's made it possible to embed computing into most everything in our lives - cars, phones, personal computers, gaming systems. It's allowed computing to be in the palm of your hand. It was a huge milestone."

Kerravala also noted that the first commercially available microprocessor was a key part of the perfect storm that gave rise to the Internet.

"The microprocessor allowed us to miniaturize compute capabilities," he added. "But there was this great combination of events that included the first processor, the birth of Windows, connectivity, and cheaper storage. Together it was huge."

And computer processors have made dramatic advances in the past 40 years.

According to Intel, compared to that first Intel 4004, today's second-generation Intel Core processors have more than 350,000 times the performance, and each transistor uses about 5,000 times less energy.

"The sheer number of advances in the next 40 years will equal or surpass all of the innovative activity that has taken place over the last 10,000 years of human history," said Justin Rattner, Intel chief technology officer, in a written statement.