The Rise In Computing Power: Why Ubiquitous Artificial Intelligence Is Now A Reality

If you used a speech-to-text program a decade ago, you probably don’t have fond memories of the experience. Right out of the box, you needed to spend hours talking to your device and correcting its mistakes before the computer could recognize your voice. Then, when you went to actually use the software, sometimes, though not always, it might function. It was inaccurate and frustrating, but it was a start.

Today, you can just say, “Alexa, play The Beatles,” and within a few seconds “Here Comes the Sun” is playing from your speakers. You can say, “Phone home,” and your device will look up and dial the number in your contacts list. Artificial intelligence (AI) is now all around us, so common that we take it for granted.

So, what happened? Why did speech-to-text work so poorly a decade ago, yet work so accurately today? Why was it unthinkable just a few years ago to walk through a market, pick items off the shelves and walk out of the store without passing a cashier, when today that kind of shopping is a reality? On a bigger scale, why are algorithms that were basically invented in the 1950s through the 1980s only now causing such a transformation of business and society?

The answer is simple: Technology has finally become powerful enough to enable the promise of AI.

Brute-Force Breakthroughs

AI is, essentially, a massive series of math problems that learn to correct their own mistakes. In the case of speech recognition, a computer must perform millions of math problems (or calculations) per second for a system to “learn” and “recognize” the patterns that are fed through it. This requires pure brute-force computational power that, until quite recently, simply wasn’t available.
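The "math that corrects its own mistakes" idea can be sketched in a few lines. The toy model below, a single-weight learner fit with gradient descent, is purely illustrative and not from the article: it repeatedly measures how wrong its guess is and nudges a number to shrink the error. Real speech-recognition models do essentially this, just with millions of weights, which is why raw computational throughput matters so much.

```python
# Illustrative sketch: a one-weight model "learns" the rule y = 3x
# by measuring its own error on each example and correcting itself.
# All names and numbers here are hypothetical, chosen for clarity.

def train(samples, steps=1000, lr=0.01):
    w = 0.0  # initial (wrong) guess for the weight
    for _ in range(steps):
        for x, y in samples:
            error = w * x - y      # how wrong the current guess is
            w -= lr * error * x    # nudge w to reduce that error
    return w

data = [(1, 3), (2, 6), (3, 9)]    # points on the line y = 3x
w = train(data)
print(round(w, 3))                  # converges toward 3.0
```

Each pass over the data is a handful of multiplications and subtractions; scale the weight count and dataset up by a factor of millions and the need for brute-force hardware becomes obvious.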

“Marvin Minsky [a well-known cognitive scientist at MIT] did AI in 1957. Computers at that time were a billion times slower than they are now,” said Richard Mark Soley, Ph.D., executive director of the Industrial Internet Consortium and chairman and CEO of Object Management Group. In addition to having only a fraction of the performance, machines of yesteryear also came with a hefty price tag.

“I worked on speech recognition 35 years ago,” Soley continued. “Computers then did an OK job and cost a couple million dollars. Now, what used to be thought of as supercomputers are inside smartphones. They cost a million times less, are a million times faster and have a million times as much memory.”

AI Hardware Today

Today, the latest chips from Intel can run over 10 trillion calculations per second. To give you an idea of the scale: If you had a dollar bill for each calculation and then stacked those dollar bills on top of one another, the stack could go to the moon and back, and then get halfway back to the moon again.
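The dollar-bill analogy checks out on a back-of-envelope basis. In the sketch below, the 10-trillion-calculations figure comes from the text, while the bill thickness (about 0.0043 inches) and the average Earth-to-Moon distance (about 384,400 km) are outside figures assumed for the illustration.

```python
# Back-of-envelope check of the dollar-bill stack analogy.
# Assumed inputs: bill thickness ~0.0043 in, Moon distance ~384,400 km.

calculations = 10e12                               # 10 trillion, one bill each
bill_thickness_km = 0.0043 * 2.54 / 100 / 1000     # inches -> cm -> m -> km
stack_km = calculations * bill_thickness_km

moon_km = 384_400
print(f"stack height: {stack_km:,.0f} km")         # ~1.09 million km
print(f"lunar distances: {stack_km / moon_km:.1f}")
```

The stack works out to roughly 2.8 lunar distances, in the same ballpark as "to the moon and back, and then halfway back again" (2.5 lunar distances).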

Aside from raw power, modern chips have gained other capabilities that make them well suited to AI. Because many AI “math problems” share a similar structure, some chips are being optimized to carry out those calculations more quickly. Additionally, in a typical computer or server, every time the chip has to retrieve a bit of data from memory (typically a separate device), the whole process slows down. That’s why more memory is being built into the processor itself, so there’s less need to move data back and forth between the processor and memory chips. Chips built this way, like the new Xeons being developed by Intel, lend themselves naturally to dense image recognition, deep learning on unstructured data such as voice and text, and a variety of AI jobs or “workloads.”

Another key hardware development is the ability to reprogram chips on the fly so they are more useful for a variety of applications. Devices called field-programmable gate arrays (FPGAs) pair with traditional chips to provide not only this flexibility but also a drastic boost in processing speed.

Moreover, as AI proliferates to our cars, our wrists and the little devices in our homes, it’s critical that the processors run on as little energy as possible; otherwise, you’d need to recharge your batteries every hour or so. While FPGAs provide much-needed energy efficiency, there are also specialized, low-power processors designed for devices in our homes and anywhere else that isn’t a data center.

The Overlooked Accelerant: Speed Of Communication

In the past two years, we created nine times more data than we did between the dawn of man and the year 2015, according to research by Domo. This data is the fuel for artificial intelligence applications. But just as oil in the ground powers no cars, if data can’t move to where it’s needed, it’s useless. This is not a trivial problem and was one of the key roadblocks to the growth of AI.

"Now, what used to be thought of as supercomputers are inside smartphones. They cost a million times less, are a million times faster and have a million times as much memory."

The invention of fiber-optic cables and 3G and 4G wireless was important for enabling large quantities of data to fly back and forth at rapid speeds. For instance, none of the video streaming services we now enjoy would be possible without these, and the iPhone would still be only a PalmPilot with a touchscreen and an improved web browser. In a very real sense, it was these communications technologies, paired with cloud computing, that enabled the “app economy” and the wave of digital transformation that’s still sweeping through business.

5G (or fifth-generation wireless systems) promises to be the next great accelerator, especially for AI.

Stepping On The Gas With 5G

Gartner estimates that 20.4 billion devices will be connected to the internet by 2020, many of them using AI by tapping into the cloud, by processing data on the device itself or, more commonly, by doing a bit of both. The communications bandwidth required for these devices to send data back and forth and function as promised is staggering.
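To get a rough sense of that scale, consider a back-of-envelope estimate. The 20.4 billion device count is Gartner's figure from the text; the average per-device data rate (10 kilobits per second) is an assumption invented purely for illustration.

```python
# Rough scale of aggregate IoT bandwidth demand.
# devices: Gartner's 2020 estimate (from the text).
# avg_bits_per_sec: an assumed, illustrative average per device.

devices = 20.4e9
avg_bits_per_sec = 10_000           # 10 kbit/s, hypothetical average

total_bps = devices * avg_bits_per_sec
print(f"aggregate demand: {total_bps / 1e12:.0f} Tbit/s")
```

Even at that modest assumed rate, the aggregate works out to hundreds of terabits per second, which gives a feel for why existing networks were a roadblock.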

This requires a whole new approach to wireless communication. Here’s where 5G comes in. More a bundle of different communications technologies than it is “one thing,” 5G can scale up to function like a 20-lane freeway. With today’s traffic, there’s practically unrestricted movement, enabling extremely high speeds compared with the old standards. We can’t even begin to predict the new types of use cases this will enable.

AI: A Hardware Story

Even though the steam engine was invented in 1698, it took over 60 years for the first industrial revolution to really get going. In our time, it’s taken roughly as long for a set of inventive algorithms to give just an inkling of their full potential.

We may not know where the road is leading, but the future of AI, like its past, is likely to be decided by the hardware that powers it.