Not so fast. That's the gist of a report that says Intel's future graphics chip will face a grueling battle to gain ground against entrenched and very capable competitors.

When Larrabee was disclosed in August, it sent shockwaves through the chip industry. After all, this was Intel, the world's largest chipmaker, announcing its intention to assail one of the last major PC chip markets it has yet to dominate: standalone "discrete" graphics processors--or GPUs. Larrabee is due to ship in the 2009-2010 time frame.

Intel already dominates the market for integrated graphics silicon: graphics functions integrated into Intel chipsets come virtually free on tens of millions of PCs shipped worldwide every year, an offer that many PC vendors find hard to refuse. The resulting less expensive PCs are, in turn, welcomed by consumers.

But the discrete graphics market is a different creature. It is dominated by Nvidia and AMD's ATI graphics chip unit. Both companies supply chips that easily rival--or best--any Intel chip in complexity. Nvidia's latest chip, the GTX 280, boasts 1.4 billion transistors and 240 stream processors. In short, it is an extremely complex parallel-computing engine.

"Intel claims Larrabee is a graphics engine intended to outperform Nvidia's GPU offerings. The audacity of this claim is startling," according to a report issued by Ashok Kumar, an analyst at investment bank Collins Stewart. "Nvidia has had over 10 years to optimize the 3D graphics pipeline, the necessary drivers, the platform connections needed to supply the memory bandwidth required, and to work with the software and apps developers," he writes. (Note: Kumar started coverage of Intel at Collins Stewart on September 4 with a "buy" rating.)

Intel doesn't necessarily disagree with the spirit of the statement. "I certainly expect Nvidia and ATI to carry on being successful," Sean Maloney, Intel executive vice president and chief sales and marketing officer, said in an interview this week. "They both have considerable expertise and customer base and all of that," he said.

But he contends that Intel's approach is different. "I do think the changes going forward are a little bit different for the industry. It's not just multicore but how easy that multicore is to program," Maloney said, referring to the fact that Larrabee will have many processing cores. "The Nvidia guys got their experience base and their views on that. We're trying a different approach. This is something that will unfold over a few years. I don't think anything's going to happen overnight."

Pat Gelsinger, senior vice president and general manager of Intel's Digital Enterprise Group, said the chipmaker isn't simply diving headlong into the market. "We've had this architecture work going on for years. Design teams built up," he said in an interview at the Intel Developer Forum in August.

Kumar claims that the task is extremely daunting. "Two of Intel's main challenges at present are the GPU threat by Nvidia and the heavy lifting required for software to make use of multicore processors. With the upcoming Larrabee chip, Intel has chosen to undertake a frontal assault on both of these problems simultaneously," Kumar writes.

Drawing on the Pentium heritage
The report also claims that one of Larrabee's purported strengths--its x86 heritage that taps into a large existing software infrastructure--is a weakness, too, because it is based on the Pentium, a design Intel launched in 1993. "Larrabee proposes to compete by fielding a couple of dozen x86 cores on a single chip. Each core, by dint of its (Pentium) heritage...is carrying all of the baggage of the full x86 instruction set," Kumar writes, adding that the Pentium design is "antiquated."

CNET Blog Network contributor Peter Glaskowsky wrote in August that "the power consumption of a 32-core design with all the extra overhead required by x86 processing would be very high."

Code would not necessarily execute efficiently either, according to Kumar. "The vast majority of x86 code in the world today was not compiled to optimize execution on a Pentium-class core...and will suffer the same old slowdowns," he said.

But Intel's Larry Seiler, principal engineer at Intel's Visual Computing Group, says there's a method behind the madness. At a session in August, he described the original Larrabee experiment. "They replaced the modern out-of-order core with a core derived from the Pentium design. So it's a much smaller core. It's in-order. It doesn't have the advanced capabilities (of more modern processors). But what they found is that they could fit ten cores...in the space of two (modern cores)," he said.

So, Larrabee's Pentium-derived design "has five times as many cores; each core has a vector (processor) unit that is four times as wide. So for throughput computing, potentially, it can run 20 times faster," Seiler said.
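Seiler's back-of-the-envelope arithmetic can be sketched as follows. The figures (ten small cores in the area of two big ones, a vector unit four times as wide) come from his remarks; the idealized peak-throughput model, which assumes perfect parallel scaling, is a simplifying assumption for illustration only:

```python
# Rough throughput comparison based on Seiler's figures: a Pentium-derived
# in-order core is small enough that ten fit in the die area of two modern
# out-of-order cores, and each small core carries a vector unit four times
# as wide. Under ideal scaling, peak throughput multiplies.

def relative_throughput(core_ratio, vector_width_ratio):
    """Idealized peak-throughput multiple, assuming perfect scaling."""
    return core_ratio * vector_width_ratio

cores_per_big_core = 10 / 2   # ten small cores in the space of two big ones
vector_width_factor = 4       # each vector unit is four times as wide

print(relative_throughput(cores_per_big_core, vector_width_factor))  # 20.0
```

Real workloads rarely scale this cleanly, which is why Seiler frames the 20x figure as a potential for throughput computing rather than a guarantee.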

This insight, Seiler said, is what convinced the group that originated Larrabee that it could design an architecture built out of CPU components yet still achieve the kind of performance and parallelism that had previously been the domain of graphics processors.

Kumar will be surprised if Intel pulls it off. "If Larrabee ends up knocking out Nvidia, it will be a shocking upset, considering how much inertia in the software industry is present, the degree of difficulty many-core chips present, and the high efficiency of Nvidia's existing designs," Kumar writes.

Intel's Gelsinger relishes the challenge. "We've not been bashful about saying we want to win with Nehalem," Gelsinger said. Nehalem is Intel's next-generation chip architecture that will roll out over the next 12 months. "(Larrabee) will plug into Nehalem, and into Westmere, and into Sandy Bridge," he said, referring to future Intel chip platforms. "And volume consumer applications as well," he said.

"It's cool to watch somebody step up to the plate and point to the fences like Babe Ruth. Just remember: Babe Ruth also struck out a lot," Kumar concludes in his report.

About the author

Brooke Crothers writes about mobile computer systems, including laptops, tablets, smartphones: how they define the computing experience and the hardware that makes them tick. He has served as an editor at large at CNET News and a contributing reporter to The New York Times' Bits and Technology sections. His interest in things small began when living in Tokyo in a very small apartment for a very long time.