Semiconductors, AI and IP

Who would have thought that Google, Tesla, Facebook and Cisco would someday be considered semiconductor leaders?

Well, I did. Having launched Micrel, a semiconductor company I ran for 37 years, I frequently saw the industry shifting. That non-chip companies now design their own chips was not surprising. What is surprising is that other semiconductor companies are themselves somehow surprised.

It has always been about applications

My industry has always understood that applications drive adoption of semiconductors. Yes, general purpose devices – from the earliest transistors to Intel’s latest powerhouse CPUs – might have handled many product needs, but they always required engineers to extract the value out of raw capabilities. But starting in the 1980s, the industry saw value in Application Specific Integrated Circuits (ASICs). Offloading specific workloads onto chips designed especially for those tasks was a logical step in product development.

The problem was that the chip makers led the parade, often running years behind what customers wanted and needed. At the time, there were few alternatives. The cost of design engineering was exorbitant, and the pool of chip designers was relatively small. On top of that, manufacturing the chips, once designed, represented another huge outlay of cash. So, semiconductor companies waited until a critical mass of customers showed sufficient demand for an ASIC solution before they would build one.

What happened next was that we in the semiconductor industry designed the technologies and economies necessary to allow Google, Tesla, Facebook and Cisco to cut us out of the pipeline.

Software, internet, and Asia

With our own self-interest on the line, we sought to streamline design and manufacturing while reducing capital investment risk. The industry developed software for creating chips that, over time, became sophisticated enough that even people without a master’s degree in electrical engineering could “design” a chip. These junior engineers needed only to create a circuit schematic and let automated tools do the rest.

Since fabrication – the making of chips – was still an expensive process, the semiconductor industry looked to shave costs. Asia was ready for the task. They built companies and factories that took our chip design computer files and made the actual products (which, in turn, accelerated the downfall of the American manufacturing sector). When this was paired with Asia’s lower labor costs and looser environmental regulations, the cost to produce chips sank even further and made shifting manufacturing risk overseas very desirable.

Then came the internet. This shattered the last barrier. Designers from any place on the planet could work for any company and create chip designs that were then uploaded to Asia.

All Google, Tesla, Facebook and Cisco had to do was document what they wanted and recruit the right people for the design work. Google and Facebook, knowing the value of Artificial Intelligence, are designing their own AI-specific chips. Tesla has crafted chips to reach Level 5 in autonomous vehicles. And Cisco, seeing the need to route more and more data at faster speeds, started dabbling in silicon.

Necessary and dangerous

It has now become desirable for non-chip companies to design their own chips. Legacy semiconductor companies cannot put extensive resources into very narrow product categories. Google’s AI needs may initially be completely unique to Google, so the likes of Intel, AMD and even NVIDIA could not justify the time and resources to meet them.

The benefits are obvious and long-reaching. Google and Facebook understand their customers better. Tesla can get grandma safely to the grocery store and back. And Cisco will tie all these pieces together for billions of humans around the globe.

But, with each chip file that gets uploaded to China, design IP leaks, and nations with abjectly horrid human rights records obtain new tools to continue to suppress and abuse humanity.

It also leaves major semiconductor companies in the unenviable position of relying more and more on commodity products. The days of exotic ASIC design for other industries are slowly fading to historical footnote status. This, in turn, means industry icons will fall into the pattern of designing core chips with ever more horsepower and ever lower profit margins. Some of yesterday’s premiere chip companies will devolve to shoveling silicon for a living.

The first exciting question is, “who is next?” Facebook seemed an unlikely candidate, but they have a demonstrable need to be in the chip design business. This raises the question: what other industries are ripe for in-house ASICs?

The second exciting question is, “what can the semiconductor industry do to compensate?” They are caught in a trap where the cost to design niche products is high and the volume is low, which is not their natural business model. This leaves two fundamental approaches: innovate or fast-follow a trend.

Semis could look into the future, make long-range guesses about the changing needs of the information field, and hyper-innovate beyond what the customers can do on their own. Some claim Nvidia has done this by aggressively pushing ASICs into video, AI and other fields. Their stock price has nearly tripled since 2016 as a result.

The other mode would involve becoming a “fast follower”. Silicon companies should be actively watching what any non-chip company is even thinking about doing with wafers. Two or more such players dabbling in one area of chip technology is an indication of an unmet need that they are pioneering to fill. Pioneers are the first to arrive in the Promised Land. Savvy semiconductor companies can and should watch, understand the new circuits being contemplated, and rapidly prototype both the silicon and open-source-compatible software as fast followers behind the pioneers. The idea is that what Google and Facebook are monkeying with today is what everyone will want tomorrow, and what should be in the pipeline now.