
Well, I have an i5+HD4000 laptop and I do not miss Windows 8's bloated UI at all. But it has been known that Intel's driver for Windows performs better. The gap has been closing steadily but isn't there yet... The open source Intel driver is alright as it stands; it plays CS:S and DoD:S fine (with a workaround) and it has played well with Wine. It isn't my 1100T + GTX 670 with NVIDIA's blob, but the OSS driver does the job fine for my little 12-inch.

I find that really hard to believe. If Intel is already developing drivers for their hardware, aren't they already giving away all the details?

The "cheating" is a bit different (and I don't know that Intel has ever been found to do it, though both AMD and NVIDIA have). The cheating is that the drivers can detect popular benchmarks or games and make changes to the rendered scene to improve performance.

A famous example is when one driver cheated hard at LightsMark and did some special culling of the submitted geometry. Users who modified the benchmark found that if you turned the camera around while using said driver, you'd see broken geometry. On other drivers, you'd see the bits of the scene you'd expect.

This is a popular tactic for both common benchmarks and newer games. It lets driver authors publish much higher benchmark numbers for a stock configuration of the application, tricking users into buying their hardware/driver even though it is no faster in the general case or in app configurations they didn't cheat at, and games might break if the user does something out of the ordinary in a cheated configuration.
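
To make the mechanism concrete, here is a minimal, purely hypothetical sketch in C of what per-application special-casing could look like inside a driver. Nothing here is taken from any real driver; the process names, struct, and function names are invented for illustration only.

/* Hypothetical sketch: a driver recognizes the running executable and
 * switches to a faster but less correct rendering path.
 * All names below are invented for illustration only. */
#include <stdio.h>
#include <string.h>

struct render_options {
    int aggressive_culling;  /* drop geometry the stock camera path never shows */
    int reduced_filtering;   /* cheaper, lower-quality texture filtering */
};

static int is_known_benchmark(const char *exe_name)
{
    /* Purely illustrative target list. */
    const char *targets[] = { "benchmark_a", "popular_game_b" };
    for (size_t i = 0; i < sizeof(targets) / sizeof(targets[0]); i++) {
        if (strcmp(exe_name, targets[i]) == 0)
            return 1;
    }
    return 0;
}

static void configure_for_process(const char *exe_name, struct render_options *opts)
{
    if (is_known_benchmark(exe_name)) {
        /* Higher FPS on the stock benchmark run; broken output if the
         * user moves the camera somewhere the profile did not expect. */
        opts->aggressive_culling = 1;
        opts->reduced_filtering = 1;
    } else {
        opts->aggressive_culling = 0;
        opts->reduced_filtering = 0;
    }
}

int main(void)
{
    struct render_options opts;
    configure_for_process("benchmark_a", &opts);
    printf("culling=%d filtering=%d\n", opts.aggressive_culling, opts.reduced_filtering);
    return 0;
}

The point is the shape rather than the specifics: a lookup keyed on the application, plus shortcuts that only hold up under the exact camera paths and settings the stock benchmark run uses.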

That's unlikely in FOSS drivers because all that code is a huge maintenance burden that only helps in marketing products, something that FOSS projects are not generally interested in. Also, spilling the beans on why a particular benchmark is so good kind of defeats the purpose of cheating at the benchmark in the first place.

Again, there's no evidence I know of that Intel's Windows drivers are guilty of this at all.

The "cheating" is a bit different (and I don't know that Intel has ever been found to do it, though both AMD and NVIDIA have). The cheating is that the drivers can detect popular benchmarks or games and make changes to the rendered scene to improve performance.

A famous example is when one driver cheated hard at LightsMark and did some special culling of the submitted geometry. Users who modified the benchmark found that if you turned the camera around while using said driver, you'd see broken geometry. On other drivers, you'd see the bits of the scene you'd expect.

This is a popular tactic for both common benchmarks and newer games. It lets the driver authors publish much higher benchmark numbers for a stock configuration of the application, tricking users into buying their hardware/driver despite the fact that it is no faster in the general case or on app configurations they didn't cheat at, or that games might break if the user does something out of the ordinary on a cheated configuration.

That's unlikely in FOSS drivers because all that code is a huge maintenance burden that only helps in marketing products, something that FOSS projects are not generally interested in. Also, spilling the beans on why a particular benchmark is so good kind of defeats the purpose of cheating at the benchmark in the first place.

Again, there's no evidence I know of that Intel's Windows drivers are guilty of this at all.

This is very interesting info for me! Could you provide a link to the LightsMark broken geometry issue? Thanks

Earlier Intel Windows drivers had terrible anisotropic filtering (AF) quality, but with driver tweaks they got it to filter properly (and more slowly). So it was either a genuine oversight, or an attempted cheat that was removed when detected.

...

The title of this disgusting brown-nosing article is very misleading. It should be "Windows 8 CLOSED SOURCE DRIVERS perform better than LINUX OPEN SOURCE DRIVERS, but Linux CLOSED SOURCE DRIVERS PERFORM THE SAME AS WINDOWS 8 DRIVERS with OpenGL". That should be the title, clueless shill. BTW, Valve games run much better for me than on my wife's shitty Windows 8 computer.

That remark
a) is insulting beyond bounds
b) suggests adjusting the title for nuance

There is no reason to adjust the title. Windows GPU drivers are closed source. Linux GPU drivers are either open source or closed source. In Intel's case, the Linux driver is open source only. So there is no need to
a) insult the author of the article
b) suggest adjustment to the title

I hope you don't ever reproduce...

+1

Lots of people do not like data, so they try to downplay its importance.

The "cheating" is a bit different (and I don't know that Intel has ever been found to do it, though both AMD and NVIDIA have). The cheating is that the drivers can detect popular benchmarks or games and make changes to the rendered scene to improve performance.

A famous example is when one driver cheated hard at LightsMark and did some special culling of the submitted geometry. Users who modified the benchmark found that if you turned the camera around while using said driver, you'd see broken geometry. On other drivers, you'd see the bits of the scene you'd expect.

This is a popular tactic for both common benchmarks and newer games. It lets the driver authors publish much higher benchmark numbers for a stock configuration of the application, tricking users into buying their hardware/driver despite the fact that it is no faster in the general case or on app configurations they didn't cheat at, or that games might break if the user does something out of the ordinary on a cheated configuration.

That's unlikely in FOSS drivers because all that code is a huge maintenance burden that only helps in marketing products, something that FOSS projects are not generally interested in. Also, spilling the beans on why a particular benchmark is so good kind of defeats the purpose of cheating at the benchmark in the first place.

Again, there's no evidence I know of that Intel's Windows drivers are guilty of this at all.

I'm really glad that someone understands. Games and benchmarks speak to the driver, not directly to the hardware. If the driver wants to cheat, it will cheat; there is no technology available to measure the quality of the picture. In fact, when you have 2x GPUs you only get +50% FPS; that's because the driver goes into quality and precision mode, and the same goes with double the shaders. My opinion is this:

Also, there is no exact way to compare open source drivers with closed ones, because the closed ones cheat. If you ask me, Intel's open and Intel's closed drivers are equals. Also, they share the same OpenGL code. How the hell did some of you figure out that they are different? Make your brain think!

The title of the article mentioned testing the drivers. Not the cards. So what's the point?

Furthermore, FPS is only one of many metrics defining 'performance'. So a 50% increase in FPS will not increase performance by 50%.