Google Ramps Up Chip Design

For years, speculation has swirled around this topic. It started in 2007 when Google bought PeakStream, a startup with tools for programming multicore processors. It came to a head in 2010 when Google bought Agnilux, a startup that included former employees of PA Semi, the company Apple bought to launch its work on the A-series SoCs inside its iPhones and iPads.

A few years back, Google made a few disclosures about its efforts in specifying board-level servers, calming speculation about any chip-level ambitions. But it seems times are changing.

Google built its own system to enable software-defined networking between its datacenters, although it said the system used off-the-shelf chips. Last year it joined the OpenPower Consortium IBM created to drive its Power processor architecture forward, but it's not clear what will come of that effort.

"We heard a couple of years ago that Google was working on its own network switch chips," said Linley Gwennap, principal of market watcher The Linley Group in Mountain View, Calif. "I think they've been doing that, and now the question is whether they will do their own processors," he said.

The rise of ARM's server initiative and its 64-bit cores enables such a push, but "with so many other companies building ARM server processors, you would think they could get what they need" without designing their own chips, Gwennap said.

As far as I can tell, ARM (and its ecosystem) is coming up with generic chips. I see nothing about AI, nothing specific to Google Glass, and no highly customized chips (for example, chips customized for memcached, with a worldwide market of less than a million units and Google perhaps needing 100K of them).
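To see why a memcached-customized chip is even plausible, it helps to look at how narrow the workload is: the hot path is essentially a hash-table lookup repeated billions of times, which is exactly the kind of fixed-function loop custom silicon can beat a general-purpose CPU at. Below is a toy sketch of that core GET/SET path (illustrative only; real memcached adds slab allocation, LRU eviction, expiry, and a binary wire protocol, and the class here is hypothetical):

```python
class TinyCache:
    """Toy stand-in for memcached's core key-value store."""

    def __init__(self, capacity=1024):
        self.capacity = capacity
        self.store = {}

    def set(self, key, value):
        # Naive eviction stand-in for memcached's LRU policy.
        if len(self.store) >= self.capacity and key not in self.store:
            self.store.pop(next(iter(self.store)))
        self.store[key] = value

    def get(self, key):
        # This hash lookup is the hot path a fixed-function chip would target.
        return self.store.get(key)


cache = TinyCache()
cache.set("user:42", b"profile-blob")
assert cache.get("user:42") == b"profile-blob"
assert cache.get("missing") is None
```

The point of the sketch is that there is almost no control-flow diversity here, which is what makes the "small market, very custom chip" argument above credible even at Google-only volumes.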

There has been speculation about all the internet powerhouses with large data centers developing their own chips. While it is a possibility, it is unlikely. Unlike the mobile market, where Apple forged ahead with its own chip design because it was not satisfied with the solutions in the market, there are many options for custom server chips. In fact, this is a key part of AMD's new strategy. I would expect companies like Google and Facebook to partner with those companies that have the necessary IP and expertise to develop silicon solutions that meet their specific requirements. However, it is still in their best interest to have some expertise in-house to drive the effort.

While all the speculation in the posts here is good and thoughtful, I'd bet it only scratches the surface of what Google is thinking about using its own chips for. Addressing power consumption in their data centers seems like the sort of low-hanging fruit that justifies hiring hardware designers to begin with. And Google has vast troves of its own data on just which processes and applications consume the most power and time today, I'm sure.

But after the largest of those are addressed, what's next? They just noted they need solutions to network latency to handle the surge in connections from IoT, wearables, etc. One way they might attack that is by pushing specialized processors closer to their network fringes, so the appropriate traffic can be handled where it matters most. Then there are the new robotics initiatives with the likes of Foxconn, aided by acquisitions like Boston Dynamics. This area alone could be quite high-profile and high-margin.

The sky's the limit, it's only the tip of the iceberg: pick whichever cliché you like, they all fit here.

Google has demonstrated its interest in AI by acquiring DeepMind recently. I think any of those options makes sense for Google, as far as what they'd like to do with silicon. But it's still strange that they think silicon is an area where they need to expand, when ARM is coming out right now and offering all of these new options in the datacenter.

This isn't a project yet. Google will just buy up a company when they are ready to do something real. I agree the most likely possibilities are the areas where they have very specific needs: specialized search engines (where they don't want to give out their algorithms to another company) seem like the best bet...

Google currently has big power barriers in Google Glass and robotics that prevent it from building the products it wants. So some sort of low-power, custom-fit image processor is one possibility. Another possibility is a power-optimized implementation of deep learning (a hot new class of AI algorithms) that they could use in Glass, in robots, and even in the data center.

The other possibility is of course accelerators, but do they have enough volume to justify a design in 28nm, considering that they can request customization from big suppliers (probably contingent on big orders)? My guess is that for accelerators they're more likely to use eASIC (via its structured ASICs) or something similar.