I'm not into promoting other sites. Many people here (who usually keep quiet, for reasons unknown) know what I'm talking about, and Google does too, for anyone who wants to search for "ATI Image Quality Optimizations".

Just thought I should say a couple of words about the "performance" numbers from ATI lately, that's all. No need to start a flame war.

PS: They make great cards, but I really hate it when either company does something below the radar to gain performance, beyond the usual game optimizations that both companies do.

The only problem I have with that comment is that the improvements are normally listed as game-specific, and many of them apply to games that aren't benchmarked here or on most sites, or in some cases aren't even played much anymore.

What I take from this article is that there are improvements between driver versions, but on average the improvements are small enough, and game-specific enough, that it isn't worth anyone complaining about which driver version is used in a review.

You are right... Don't mind my comment so much; I have just been really frustrated with ATI's drivers and malfunctions as of late.

And my comment was just more of a compliment to Wizz and a suggestion at the same time.

Again just voicing frustration!

But still, you are right: it does show that driver improvements are mostly game-specific, and complaining about which driver is used is not worth it.

Truthfully, both ATI and NVIDIA have strengths and weaknesses in regard to image quality. Looking at the past several years, I personally have to admit ATI won more video card IQ tests than NVIDIA, but by a small margin. What I've learned is that NVIDIA tends to produce better contrast, which in turn results in a cleaner gaming experience; contrast is, or was, ATI's weak point, and images were slightly darker in game. Overall, though, ATI did lean toward better IQ in other video quality aspects.
Today it's rare to see any difference between the two. Some games look better on ATI's Radeons and others on NVIDIA's GeForce. It all depends on the game.

Actually, I tend to disagree on the game part. It's true that in video quality some tests put ATI in front by a small margin; others, however, do not.

In games, however, the truth is far from that, and the comment made by Steevo above is science fiction, to say the least. To my knowledge the new GTX 400/500 series outperform everything ATI has to date in terms of in-game IQ, but of course if Steevo or anyone else has a specific test with the latest drivers showing ATI ahead in D3D/OpenGL IQ, I would love to see it.

Unless, of course, you go into CCC and manually set the image quality sliders; omg, so hard. That said, having used an 8800 GTS 640, a 9800 GT, a GTX 470, and ATI's 4870 X2, 5850, and 6970, image quality in games is at the point where you will not notice it, so my only reaction is that it's kind of childish to cry that a vendor's quality is "suxxors" because their drivers drop quality. It's a game both camps have been playing since the dawn of their creation, so let it go: buy what works for you. I'd have bought 2x GTX 570s if NVIDIA had decent motherboards on the AMD side, but two choices isn't really good enough, so I went ATI again.

It's not about whether it looks better or not, I think. The trilinear and anisotropic "optimizations" in the latest ATI drivers are real and are very well documented by some sites. The default image quality has been degraded to improve performance, and that's a fact, even if it's not noticeable in games at all, or only on some occasions and by some people. The improvements that the latest drivers (since the HD 68xx release) bring to the HD 5xxx series are mostly due to this optimization alone.

Now, no one is discussing whether these optimizations are legit, whether they should be made, or whether they degrade picture quality in games. As long as most people find them useful or don't care at all, it's all well. But just to call things what they are: the performance improvements in the latest drivers come from lowering IQ.

It's like Crysis 2: it will most probably run better, and most people will think it looks the same or better, but the fact is that technically it will most probably be inferior to Crysis 1. Most of what will make Crysis 2 run better on our machines will be lowered IQ, not real optimizations.

* Can anyone honestly believe that, after 14 months on the shelves, it's now, with Catalyst 10.11 and 10.12, that the HD 5870 gets improvements, all while the newly released cards don't improve?

Benetanegia, whatever you say won't change a thing for people who don't like to listen, because they think that what they own is always the best, and that's a fact. There are always other reasons as well, but that's another story.

Steevo, did you even bother to read the entire review? Obviously not, since in the end they admit NVIDIA has better overall IQ when applying the right settings in their drivers. For some "weird" reason, though, we don't see them mentioning the IQ cost / performance issue...

"Image Quality Conclusion

Both companies offer good default image quality, which we believe is directly comparable. Performance and Quality testing and comparisons should be performed at like to like settings. Currently that default is 'Quality' for both companies, which is probably not the optimal solution for high performance enthusiast products. Perhaps there ought to be different default levels for driver settings, depending on the graphics product. While that sounds like a problematic scheme to propose, it's not too far away from what is already in place; different GPU cores have different codepaths, which perform different optimizations for the same external driver control panel settings."

I see you've edited the post, but I was going to say that, regardless of it being Rage3D, you can just look at the pictures and come to your own conclusions; it's what I'm doing right now, yay!

*edit* Seems you didn't read it all either: they don't state that one is better, they say AMD has a few glitches in its filtering technique and could do with more options. (That's the only thing that could possibly be misconstrued as NVIDIA having better quality; if anything, the review implies it comes down to what the user's eyes find best.)

Panther, pictures represent the settings you use, and we have no way of knowing the real settings used in them. We also have to ask why they "forgot" to mention the IQ trick ATI has been pulling of late to get up to a 10% performance increase, which matters to me, since it feels like they think of the end user as ignorant. Of course it's not the first time ATI has cut IQ details here and there to gain performance, but again, that's another story.

Finally, no, I don't trust results from an ATI fan forum that carries the name of one of ATI's first 3D cards (which was a fail, by the way; black water in Tomb Raider, for all those who used them back then), just as I would not trust results from a forum named Riva128. Makes sense, I think, correct?

Speedo, stop cutting out pieces and go to where they say that ATI could do better.

We also have to ask why they "forgot" to mention the IQ trick ATI has been pulling of late to get up to a 10% performance increase, which matters to me, since it feels like they think of the end user as ignorant.

I thought it was common knowledge that both companies do this fairly often.

Normally it happens when the competitor releases a new card.

As for not knowing the real settings used: fair enough, paranoid, but fair enough. You have a 580 and a 5970, though; download the various tests they used and run them yourself, that way you know for sure.
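If you do run the tests yourself, a minimal sketch of how the IQ-comparison articles quantify driver differences (assuming you've captured two screenshots of the same frame and exported them as flat lists of 0-255 grayscale values; the function names and the visibility threshold here are made up for illustration) could look like this:

```python
def mean_abs_diff(pixels_a, pixels_b):
    """Average per-pixel absolute difference between two same-size
    grayscale images given as flat lists of 0-255 values."""
    if len(pixels_a) != len(pixels_b):
        raise ValueError("screenshots must have the same dimensions")
    total = sum(abs(a - b) for a, b in zip(pixels_a, pixels_b))
    return total / len(pixels_a)

def percent_changed(pixels_a, pixels_b, threshold=8):
    """Percentage of pixels whose difference exceeds an assumed
    visibility threshold (8 levels here, purely illustrative)."""
    changed = sum(1 for a, b in zip(pixels_a, pixels_b)
                  if abs(a - b) > threshold)
    return 100.0 * changed / len(pixels_a)
```

In practice you'd load real screenshots with an image library and compare driver versions at identical in-game and control panel settings; a non-zero difference at "identical" settings is exactly what the filtering-optimization articles are measuring.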

By the by, I just read the 570 review on that website; they don't seem to have any bias that affects the review. None of the tests are skewed in any way, they even give the 570 five stars, and they were impressed by it and by the 580's performance.

No, shipping the low quality setting as Quality, and Quality as the high quality setting, is not something both companies do (true, you can hardly see the differences, but that doesn't change the fact that they are there).

Now, 10% is not huge, I agree with some people on that, but imagine what would happen if, in every test we have seen lately of the 6870/6970, and even of the less powerful 6850/6950, people were to deduct 5-10%. What then?
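To put that in numbers, here's a quick sketch of what deducting an assumed 5-10% IQ-optimization gain does to a benchmark result (the FPS figures are made up, not taken from any review):

```python
def deduct_iq_gain(fps, cut=0.10):
    """Benchmark FPS with an assumed IQ-optimization gain removed.
    `cut` is the fraction attributed to lowered IQ (5-10% in the
    discussion above)."""
    return fps * (1.0 - cut)

# A hypothetical card scoring 60 FPS drops to ~54 with a 10% deduction,
# or to 57 with a 5% deduction -- enough to reorder close matchups.
```

That shift is small per game, but applied across a whole review suite it can flip which card "wins" a tight comparison, which is the point being argued here.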

The entire world was impressed with the 570/580; who could say otherwise, and why?

Steevo, no: what they are describing are typical optimizations, not the IQ trick; or, how should I put it, they are not really giving the right "emphasis" to that. Better now?