Wake Up, Semi Industry: System OEMs Might Not Need You

System OEMs are hiring chip designers to design chips, leaving chip vendors with a smaller talent pool. It's time for the semiconductor industry to wake up and adapt, or die.

The rules are changing for the semiconductor industry and traditional vendors had better find ways to be more competitive or they will find themselves missing out on some of the most exciting, high-growth markets.

System OEMs in the hottest areas of technology are hiring their own chip designers in droves. What began as a way to gain leverage over suppliers has evolved relentlessly: OEMs are now gearing up to produce their own chips.

The stakes are high and game-changing dynamics are in play. In a bygone era, chip companies controlled the feature definitions of the devices they produced. Now it is consumer mobility OEMs and box vendors that play a bigger role in calling the shots. Companies like Apple, Amazon, Microsoft, Google, Facebook, and others are increasing their leverage in this game by hiring teams of SoC design experts to shape the next category-defining device like the iPhone, iPad, Xbox, or Google Glass. Even companies such as Lenovo are hiring semiconductor staff to raise their game against the smartphone competition.

Chip design is no longer "magic"

The industry has evolved to the point where chip design has been demystified. An era is dawning where OEMs license semiconductor IP themselves, use their own teams to lay out the design in RTL, and then deliver the GDSII files directly to a leading foundry. The trend has already taken hold in the server space, where one industry insider believes that Facebook and Google will eventually bypass traditional chip vendors.

Chip companies that want to play in this market had better learn the new rules quickly and provide the cutting-edge feature sets that define these new markets. That means a greater ability to adapt, turn new designs, and deliver new system-on-chip devices with the specific features that enable OEMs to innovate.

Times have changed dramatically since semiconductor companies ruled the roost. Chip vendors used to control 99 percent of the chip design talent in the industry, but those days are past. In today's world, perhaps 75 percent of the top silicon design engineers work for chip vendors and the remaining 25 percent are being snapped up by OEMs. That's because the SoC design determines competitive advantage now more than ever, and software differentiation is not enough.

Apple, Microsoft, and Google started this trend

This trend raised eyebrows when Apple Inc. acquired P.A. Semi in 2008 and Intrinsity in 2010, acquisitions that helped it bypass Samsung as a vendor and develop its own ARM-based A5 and A6 application processors. Apple started down this road when it demanded touch-screen capability, low power dissipation, and small form factors that the merchant silicon industry would not provide. Instead of waiting around for silicon suppliers to come around, Apple created its own SoCs for iPhone and iPad devices, and the OEM-chip vendor relationship was changed forever.

Another example of this trend is when Microsoft dictated the terms of its processor features for its next-generation Xbox gaming system. AMD stepped forward and acquiesced to Microsoft's requirements, producing exactly what the software giant needed.

Google, of course, acquired Motorola Mobility for many of the same reasons. It wanted SoC devices that could execute Android without compromise. It also didn't want silicon vendors to bluff it by claiming the devices it demanded were not possible. Because it has teams of chip architects on staff, Google has the leverage to tell vendors to deliver the SoCs it wants in the time frame it wants them, or else say goodbye to a potential high-volume production socket. This capability will be especially important as Google works to meet the new form factor and power dissipation requirements of wearable computing, with Google Glass being the first announced product.

Any semiconductor company not disturbed by this emerging trend is not paying attention. Now that this trend is firmly established in the consumer mobility market, it will soon take over other markets. OEMs will start making their own chips for the enterprise market because the rules have changed with the emergence of low-power webscale computing. Traditional device makers tuned to data-centric processing haven't adapted to the dynamic workload and power-consumption requirements of the webscale era. Mega datacenter operators like Facebook, Amazon, and Google have to seek out alternatives, and it starts with the devices that go inside every server box they operate.

Semiconductor and labor markets have evolved

The industry has reached this point because of a few major factors. The market for semiconductor IP is robust, as is the labor market for chip designers who used to work at the traditional, and sometimes struggling, semiconductor companies. Just about any large company with a big enough stake at the table can design its own chips. That means these companies either get what they want from established chip vendors or they develop their own designs that meet their needs.

Incumbent chip vendors have advantages, for now

But chip vendors are not out of the picture yet. For years, they have forged closer ties with system OEMs to better anticipate the feature definitions demanded by next-generation systems. However, even greater flexibility and faster time to market are required in this new era. Companies that want to compete had better spin a new design in nine to 12 months rather than the 18 to 24 months they used to take.

While advanced silicon IP is now available to just about everyone, chip makers have the advantage of experience. They also need IP that helps them incorporate new feature sets more rapidly than ever before. Combining that experience with advanced IP, silicon vendors have all the means at their disposal to turn designs around faster and meet the most demanding requirements in the new era of OEM-driven SoC feature definition.

— Kurt Shuler is vice president of marketing at Arteris and has extensive IP, semiconductor, and software marketing experience in the mobile, consumer, and enterprise segments working for Intel, Texas Instruments, and three startups. Prior to his entry into technology, he served in the US Air Force Special Operations Forces.

Hey Kurt - thanks for the reply. Indeed we DO agree - for the small slice of high value and high volume semiconductors. I think we can further agree that we can not determine the shape of the iceberg from the visible tip. Trends at the top end do make a huge difference to semiconductor companies, especially those over-exposed to a small number of sockets, but the majority of big names are well-diversified. The decades-long trend of disaggregation has indeed allowed more systems companies to conveniently re-aggregate (per your examples), but this is, in my opinion, an exception that proves the rule rather than an industry-wide trend. So I agree with more of your article than I disagree with, Kurt, if you do not mind putting your conclusion and article title in the minority! Finally, I appreciate the willingness to put your views out there and in so doing to suffer the slings and arrows of... different viewpoints.

Jim, my point was to bring awareness to this trend. I'm sorry if you thought it was false advertising of my views.

I thought I was nuanced in explaining that this is occurring with big companies who are in growing markets with high value semiconductor content, and have a need to differentiate. Right now this is happening in mobile and servers. Although these markets may be a small slice of semiconductor volume, they do encompass a huge amount of semiconductor industry monetary value. Semiconductor companies making SoCs for these markets need to keep an eye on this.

I will leave it to analysts like Will Strauss, Jim McGregor, and Nathan Brookwood and firms like Semico, Linley Group, Gartner and IHS iSuppli to research this and supply the next level of quantitative detail and facts. They get paid to do that. I was stating my observations in my little part of the industry.

When I design a hardware/software system -- typically using an FPGA for the hardware -- I have limited resources (logic cells) for hardware and plenty of memory for software. Thus I put into hardware only what needs to be there because it has to have high performance and/or low latency. Everything else goes into the software. This keeps the hardware relatively simple, and throws the complexity into software. Software is a cheap way to perform complex functions, but it's really hard to design for and test all the odd conditions that can occur, and recover from errors gracefully.
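The partitioning rule described above can be sketched as a simple cost model: a function goes into FPGA fabric only if it is latency- or throughput-critical and fits in the remaining logic-cell budget; everything else falls through to software. This is a hypothetical illustration, with made-up function names, cell counts, and budget, not something taken from the comment itself:

```python
# Hypothetical sketch of the HW/SW partitioning heuristic: scarce logic
# cells go only to latency-critical functions that fit; all remaining
# complexity lands in software. Numbers and names are illustrative.

def partition(functions, cell_budget):
    """Greedily assign each function to 'hw' (FPGA fabric) or 'sw'."""
    hw, sw = [], []
    cells_left = cell_budget
    # Latency-critical functions first, cheapest fit first.
    order = sorted(functions,
                   key=lambda f: (not f["latency_critical"], f["cells"]))
    for f in order:
        if f["latency_critical"] and f["cells"] <= cells_left:
            hw.append(f["name"])
            cells_left -= f["cells"]
        else:
            sw.append(f["name"])  # complexity lives in software
    return hw, sw

funcs = [
    {"name": "packet_parser",  "cells": 4000, "latency_critical": True},
    {"name": "crc_check",      "cells": 1500, "latency_critical": True},
    {"name": "config_ui",      "cells": 9000, "latency_critical": False},
    {"name": "error_recovery", "cells": 7000, "latency_critical": False},
]
hw, sw = partition(funcs, cell_budget=6000)
print(hw)  # ['crc_check', 'packet_parser']
print(sw)  # ['error_recovery', 'config_ui']
```

In a real design, "latency-critical" would come from profiling and timing requirements rather than a boolean flag, but the asymmetry is the point: hardware resources are budgeted, software is where everything else goes.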

With an SRAM-based FPGA, I have the luxury of being able to fix hardware bugs in later releases of the software. Even so, hardware bugs rarely survive long; most releases are software-only updates that keep the same FPGA configuration.

Regarding software quality in general: it's pretty rare for any large program to work perfectly, and users have long ago set their expectations accordingly.

I like the idea. The smart-a$$ in me thinks the reason why software is so buggy compared to hardware is because the software folks always say, "Just ship it now! We can fix the bugs with a patch later." If the hardware folks tried this, it would be bad. (Understatement intended.)

Kurt, it seems your background would make for a far more nuanced read of the situation. You cite Apple, Microsoft, and Google and draw conclusions from a small slice of the market. Systems companies are going to spend $3 or $100 a part and buy from semis if they can avoid the millions, the time, and the risk of designing their own chips. But yes, they will design their own if that is what they need to do to differentiate. There is no new trend here. No new threat to semi companies. You have to look at each industry and the volumes and the differentiation strategies and the newness of their markets to determine where the integration (and integration expense/risk) will occur. A wake up call is not needed - the integration pendulum swings at different rates for different markets and there are hundreds of these markets, each with their own answer.

Am I being overly dismissive of this story if I call it fluff with a hard-hitting tag line to get readers' attention? I guess it worked on me, so at least the second part of that is true.

For a future story, I am interested in seeing a discussion of why 70% of software projects fail or are plagued by endless bugs, while VLSI designs, which share many similarities with large software projects, can tape out successfully. What can the software guys learn from the hardware guys?

I recently heard David Patterson (computer architecture researcher) was teaching software engineering this year. Perhaps the hardware guys are already starting to teach the software guys a thing or two. :-)

There's no pay for play with EE Times. I don't advertise with them or pay any fees. I just love to write and love our industry. So they invited me to contribute on a monthly basis.

The coffee choking was probably due to the fact that almost all of my company's customers are semiconductor vendors rather than OEMs or systems houses. The customer base is changing though, hence this article.

I'm glad you liked the article. I need to think of a topic for next month. Any ideas?

I should have guessed this article would cause some good discussion when my CEO read a draft and choked on his coffee!

This "OEMs making their own chips" trend seemed innocuous to me, and it's been very obvious to me and my company's sales team that this is occurring.

I think there is sufficient evidence in the marketplace to claim that some of the most innovative consumer product companies are "re-verticalizing", at least for their most important products that require differentiation. What I don't have a clear answer for is, "Why?"

I have a hypothesis that it is actually the software that is driving systems companies to design their own chips. When I was at TI, we offered operating system board support packages and driver software along with our OMAP phone chips. Software was not a core competence (buzzword alert!) of TI, and it took many years and lots of money to do it sufficiently well. TI's engineers were experts on the chip, but not on software.

When we look at a company like Google, Facebook, or Microsoft, these companies are experts at software, but are looking to create innovative battery-operated devices. If they buy merchant silicon, they have to buy a chip that was designed with no particular OS, tool chain, application, or form factor in mind. If they design their own chip, they have total control over all these things as well as exclusive access to the end product.

I don't think every OEM will choose to design their own chips, only the ones that can get an advantage through innovation (higher pricing) or significantly lower costs. I imagine the economic hurdle rate to design one's own chip is quite high. Apple and Microsoft have determined that some of their product lines meet this hurdle rate, and Google and Facebook may have, too.
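The hurdle-rate intuition above reduces to simple break-even arithmetic: the NRE (non-recurring engineering) cost of a custom SoC must be recovered through per-unit savings versus merchant silicon, or through the premium the differentiated product commands. A back-of-the-envelope sketch, with all dollar figures being made-up placeholders rather than numbers from the discussion:

```python
# Break-even volume for a custom chip: NRE cost divided by per-unit
# saving versus merchant silicon. All figures are illustrative only.

def breakeven_units(nre_cost, merchant_price, custom_unit_cost):
    """Units needed before a custom chip beats buying merchant silicon."""
    saving_per_unit = merchant_price - custom_unit_cost
    if saving_per_unit <= 0:
        return None  # custom chip never pays off on unit cost alone
    return nre_cost / saving_per_unit

# Example: $50M NRE replaces a $20 merchant part with a $12 custom part.
units = breakeven_units(50_000_000, 20.0, 12.0)
print(f"{units:,.0f} units to break even")  # 6,250,000 units to break even
```

Only OEMs shipping well past that volume, or capturing extra pricing power from the differentiation, clear the hurdle, which is consistent with the observation that Apple and Microsoft do this for some product lines while most OEMs do not.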

In any case, we're lucky to be in an industry that innovates not only with technology, but also with new business models. It keeps all of us from being replaced with computers ;-)