Is deferred rendering the future of real-time 3D rendering on PC hardware (at least until ray tracing becomes feasible)? I'm aware of the benefits (lots of lights, fewer state changes) and also the trouble spots, such as anti-aliasing, translucent objects, and higher memory bandwidth.

But will the next-gen engines all be using deferred? Is forward rendering a thing of the past?

2 Answers

Deferred rendering hits GPU bandwidth hard. The lighting pass fetches anywhere from 3 to 5+ G-buffer textures, for every pixel on the screen, for every light. That adds up to a lot of bandwidth.
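To put a rough number on that, here's a back-of-the-envelope sketch in C. The figures (1080p, four render targets at 8 bytes per texel, 20 full-screen lights) are illustrative assumptions of mine, not numbers from this answer; real engines cull lights and pack their G-buffers far more aggressively.

```c
#include <stdio.h>

int main(void)
{
    /* Illustrative assumptions, not real engine numbers:
       a 1080p G-buffer with four render targets at 8 bytes
       per texel (e.g. RGBA16F), lit by 20 full-screen lights. */
    const double pixels  = 1920.0 * 1080.0;
    const double targets = 4.0;  /* e.g. albedo, normal, depth, specular */
    const double bpt     = 8.0;  /* bytes per texel per target */
    const double lights  = 20.0;

    double per_frame = pixels * targets * bpt * lights;
    printf("G-buffer reads: %.2f GB/frame, %.1f GB/s at 60 fps\n",
           per_frame / 1e9, per_frame * 60.0 / 1e9);
    return 0;
}
```

That comes out to roughly 1.3 GB of reads per frame, or about 80 GB/s at 60 fps. Even if light culling cuts that by an order of magnitude, it's a big slice of the memory bandwidth available to a mobile or integrated GPU.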

This hurts mobile GPUs, an increasingly important segment, more than others. Yes, they're low-power chips, but they still have some shader power behind them. Deferred rendering on such platforms is going to hurt. This is particularly true for PowerVR-based platforms (currently the most popular mobile GPUs), as their tile-based rendering already provides many of the advantages of deferred rendering.

But more important than that is the current trend toward putting GPUs on the same die as CPUs. Now, you may think this would only matter for low-end games, but AMD is talking about integrated chips with up to 400 shader processors or so; that's some decent horsepower. Yes, you won't run the highest of the high-end games at full settings, but it would be a serviceable GPU.

For these GPUs, the lighting pass is going to suffer. These chips share memory bandwidth with the CPU, so every G-buffer fetch competes with regular CPU traffic. Indeed, on such platforms you may want to start using fast noise functions to compute colors or normals procedurally instead of fetching them from textures.
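As a purely illustrative example of that idea, here is a minimal hash-based 2D value-noise function in C. An actual renderer would implement something like this in a shading language; the hash constants and fade curve here are my own arbitrary choices, not anything from this answer.

```c
#include <math.h>

/* Integer-coordinate hash mapped to [0, 1).
   Constants are arbitrary illustrative choices. */
static float hash2(int x, int y)
{
    unsigned int h = (unsigned int)x * 374761393u
                   + (unsigned int)y * 668265263u;
    h = (h ^ (h >> 13)) * 1274126177u;
    h ^= h >> 16;
    return (float)(h & 0x00FFFFFFu) / 16777216.0f;
}

static float lerpf(float a, float b, float t)
{
    return a + (b - a) * t;
}

/* Smoothly interpolated 2D value noise at point (x, y):
   no texture fetch, only ALU work. */
float value_noise(float x, float y)
{
    int   xi = (int)floorf(x), yi = (int)floorf(y);
    float xf = x - (float)xi,  yf = y - (float)yi;

    /* Smoothstep fade curve for continuous interpolation. */
    float u = xf * xf * (3.0f - 2.0f * xf);
    float v = yf * yf * (3.0f - 2.0f * yf);

    /* Bilinear blend of the four surrounding lattice hashes. */
    return lerpf(lerpf(hash2(xi,     yi    ), hash2(xi + 1, yi    ), u),
                 lerpf(hash2(xi,     yi + 1), hash2(xi + 1, yi + 1), u),
                 v);
}
```

The point is the trade: this costs a handful of ALU instructions per sample, which shader cores have in abundance, instead of a texture fetch that eats the shared memory bandwidth.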

Deferred rendering is really tailored toward higher-end hardware. With the current "race to the bottom" in CPU/GPU chip design (lower power, lower cost), deferred rendering may be the future only for those who still have discrete GPUs.