

Intel Makes More Driver Improvements For Valve's L4D2

Phoronix: Intel Makes More Driver Improvements For Valve's L4D2

Developers at Intel's Open-Source Technology Center have made more improvements to their open-source Linux graphics driver to benefit Valve's upcoming native Linux release of their Left 4 Dead 2 game, which is powered by the Source Engine...

Do Intel Ivy Bridge and Sandy Bridge run Source Engine games at good quality and with good performance?

If so, what would I need an AMD or Nvidia card for, and why would I want one?

You get borderline performance at average quality, at best. Think of it this way: it's about the same as an onboard AMD or NVIDIA graphics chip. You want a dedicated card for high-resolution output and high quality settings. Intel Ivy/Sandy graphics don't support a lot of graphical features, and there is an obvious drop in image quality that becomes really apparent once you compare them side by side. At best, these driver enhancements are simply cutting corners to improve performance; Ivy Bridge in particular uses a feature just for this.

Do Intel Ivy Bridge and Sandy Bridge run Source Engine games at good quality and with good performance?

If so, what would I need an AMD or Nvidia card for, and why would I want one?

No, they don't. Even if they did, Source is hardly a cutting-edge engine, and L4D2 is already three years old (and wasn't even close to cutting edge when it came out). If _all_ you care about is Source, then maybe the Intel GPUs will be passable. If you wanted to play a game with modern high-end graphics, like a theoretical Arkham City Linux port, you would find that even the very best Intel HD 4000 Ivy Bridge GPU can't hold a minimum 30 fps unless you drop everything to the absolute lowest quality and a low resolution. Running at beautiful maximum quality 1080p with D3D11/GL4 features (and stereoscopic 3D even, if you're into that crap) would be right out of the question. Meanwhile, a Radeon HD 6870 manages maximum settings at 1200p in D3D11 mode at 30 fps just fine, an NVIDIA GTX 680 handles it at a full 60 fps no problem, and a CrossFire/SLI system (including the "one card" dual-GPU solutions like the GTX 690 and presumably the 7990 when it's available) can handle maximum settings at 1200p in D3D11 mode with stereoscopic 3D, with room to breathe (granted, you'll hear it breathing, because cards like that are loud as hell).

No matter what Intel or AMD might ever say, integrated graphics will ALWAYS suck compared with even an average dedicated graphics card.

...and if you think otherwise, you are clueless about PC hardware design and serious gaming.

Or maybe you're the one who is clueless? Historically, IGPs sucked because DDR1/DDR2 did not have enough bandwidth to satisfy both the CPU and the GPU. This is already better with DDR3: the GPU part of the AMD Trinity APU gets pretty close to the performance of a mid-range dedicated GPU, and it's mostly the CPU part that limits your frame rate on any AMD APU. And with DDR4 memory, IGPs will actually get a chance to challenge the high-end dedicated GPUs.
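To put rough numbers on that bandwidth point, here's a back-of-the-envelope sketch in Python (assuming a typical DDR3-1600 dual-channel setup for the IGP and the stock GDDR5 clocks usually listed for the HD 6870 and GTX 680; the exact figures will vary with the memory actually fitted):

# Rough theoretical peak memory bandwidth (assumed typical specs, not measured).
def bandwidth_gb_s(bus_width_bits, transfer_rate_mt_s):
    # bus width (bits) * transfers per second / 8 bits per byte, in GB/s
    return bus_width_bits * transfer_rate_mt_s * 1e6 / 8 / 1e9

# An IGP shares system memory with the CPU: 2 x 64-bit channels of DDR3-1600.
ddr3_dual = bandwidth_gb_s(128, 1600)   # ~25.6 GB/s, and the CPU eats into it
# Dedicated cards get their own GDDR5 on a wide bus.
hd6870 = bandwidth_gb_s(256, 4200)      # ~134 GB/s
gtx680 = bandwidth_gb_s(256, 6000)      # ~192 GB/s

print(f"DDR3-1600 dual channel: {ddr3_dual:.1f} GB/s (shared with the CPU)")
print(f"Radeon HD 6870:         {hd6870:.1f} GB/s")
print(f"GeForce GTX 680:        {gtx680:.1f} GB/s")

Even before the CPU takes its share, the IGP is working with roughly a fifth to an eighth of the bandwidth of those cards, which is why faster system memory matters so much for integrated graphics.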


But DDR4 is still some years in the future.
By then, there may well be GDDR6.