Binning Renderer: Pipeline Dream or Reality?

Larrabee's use of a binning renderer leads to some interesting results compared with the alternative, an immediate-mode renderer. The graph below shows the bandwidth (in MB) required per frame across a sample of 25 frames, sorted in order of the immediate-mode bandwidth. As you can clearly see, bandwidth varies hugely from frame to frame with immediate-mode rendering, and even where it doesn't, as with Half-Life 2, binning still generally requires less bandwidth.
Because each frame uses less bandwidth and frame sizes are more consistent, a Larrabee using a binning renderer should be able to push out more frames in a given amount of time than if it used immediate-mode rendering. The card can only move a finite amount of data per unit of time, so the less each frame consumes, the better.
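Intel hasn't disclosed the details of Larrabee's binning implementation, but the general idea behind any binning (tile-based) renderer can be sketched quite simply: before rasterising, sort each triangle into the screen tiles its bounding box overlaps, then process one tile at a time so colour and depth traffic stays in on-chip memory instead of going out to the framebuffer. The sketch below is illustrative only; the tile size, the `Tri` struct and the `binTriangles` function are our own assumptions, not Larrabee's actual pipeline.

```cpp
#include <array>
#include <vector>
#include <algorithm>
#include <cassert>

// Hypothetical screen: 256x256 pixels split into 64x64-pixel tiles (bins).
constexpr int SCREEN = 256;
constexpr int TILE   = 64;
constexpr int TILES  = SCREEN / TILE;   // 4 tiles per axis, 16 bins total

struct Tri { int minX, minY, maxX, maxY; };  // screen-space bounding box

// Phase 1 of a binning renderer: record, for every tile, which triangles
// touch it. Phase 2 (not shown) rasterises one tile at a time, which is
// what keeps per-frame bandwidth low and consistent.
std::array<std::vector<int>, TILES * TILES>
binTriangles(const std::vector<Tri>& tris)
{
    std::array<std::vector<int>, TILES * TILES> bins;
    for (int i = 0; i < static_cast<int>(tris.size()); ++i) {
        const Tri& t = tris[i];
        // Clamp the bounding box to the screen, in tile units.
        int tx0 = std::max(t.minX / TILE, 0);
        int ty0 = std::max(t.minY / TILE, 0);
        int tx1 = std::min(t.maxX / TILE, TILES - 1);
        int ty1 = std::min(t.maxY / TILE, TILES - 1);
        for (int ty = ty0; ty <= ty1; ++ty)
            for (int tx = tx0; tx <= tx1; ++tx)
                bins[ty * TILES + tx].push_back(i);
    }
    return bins;
}
```

A small triangle lands in a single bin, while a full-screen triangle lands in all sixteen; an immediate-mode renderer, by contrast, would touch framebuffer memory in whatever order the triangles arrive.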

Perhaps a more interesting aspect of Larrabee's architecture is that it is incredibly scalable. Testing across three games and measuring relative performance, Larrabee exhibits almost entirely linear scaling. This is especially encouraging considering that Gears of War is based on Unreal Engine 3 and Half-Life 2 uses the Source engine, both of which are used by a large number of game developers.
It should be mentioned that this test uses simulated hardware rather than actual 8-, 16- and n-core Larrabee chips, but Intel says there's no reason these results shouldn't translate directly into silicon.

While most of the focus at the moment is on Larrabee in its GPU implementation, the fact that it has x86 cores means it is also capable of executing any code a developer wants it to. Much like nVidia's GTX 280, Larrabee can work either in a graphics mode or in a general computing mode. Unlike nVidia's hardware, though, Larrabee doesn't require a separate toolchain such as CUDA to have code run on it; instead, it will run C/C++ code natively.

Obviously, programs written for a parallel, throughput-orientated environment are going to run faster on Larrabee, and Intel will doubtless start pushing Ct (C for Throughput Computing) at developers as Larrabee's launch draws closer.
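Intel hasn't published Ct itself, but the shape of a throughput-orientated workload is easy to illustrate in plain C++: apply the same operation to every element of a large array, split across however many cores are available. That embarrassingly parallel structure is exactly what lets performance scale near-linearly with core count. The sketch below is a generic illustration, not Ct code; `scaleAll` is a hypothetical function, and it uses standard C++ threads for clarity rather than anything Larrabee-specific.

```cpp
#include <vector>
#include <thread>
#include <algorithm>
#include <cstddef>

// Scale every element of `data` by `k`, splitting the array into one
// contiguous chunk per core. Each chunk is independent, so adding
// cores divides the work with essentially no coordination overhead.
void scaleAll(std::vector<float>& data, float k, unsigned cores)
{
    std::vector<std::thread> pool;
    std::size_t chunk = (data.size() + cores - 1) / cores;
    for (unsigned c = 0; c < cores; ++c) {
        std::size_t lo = c * chunk;
        std::size_t hi = std::min(lo + chunk, data.size());
        if (lo >= hi) break;                 // more cores than chunks
        pool.emplace_back([&data, k, lo, hi] {
            for (std::size_t i = lo; i < hi; ++i)
                data[i] *= k;
        });
    }
    for (auto& t : pool) t.join();
}
```

Code with this structure maps well onto 8, 16 or n cores without modification, which is presumably why Intel's simulated scaling figures come out so close to linear.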

Interestingly, while Intel has only talked about C and C++ on Larrabee thus far, there seems to be no reason why any language that runs on x86, such as Python, shouldn't be able to run on Larrabee too. Presumably all Intel needs to provide is a common interface for directing a program to run on Larrabee when it's present.

Where to now?

Intel has given us an interesting taste of what Larrabee is set to provide, whenever it does finally ship as a real product. As with everything, we can’t form an opinion on it until we’ve actually seen real chips for ourselves. However, based on the information Intel has released today, the signs are definitely positive.

Personally, I just can’t wait to see what Intel has to show at IDF later this month!