New Drivers From NVIDIA Change The Landscape

Today, NVIDIA will release its new 185 series driver. This driver not only enables support for the GTX 275, but also affects performance across NVIDIA's lineup in a good number of games. We retested our NVIDIA cards with the 185 driver and saw some very interesting results. For example, take a look at before and after performance with Race Driver: GRID.

As we can clearly see, in the cards we tested, performance decreased at lower resolutions and increased at 2560x1600. This was the most pronounced example, but we saw flattened resolution scaling in most of the games we tested. This could definitely affect the competitiveness of the part depending on whether we are looking at low or high resolutions.

A trade-off was made to improve performance at ultra high resolutions at the expense of performance at lower resolutions. The cause could be something as simple as added driver overhead (and thus more CPU limitation) or something much more complex; we haven't been told exactly what creates this situation. With higher end hardware, this decision makes sense, as resolutions lower than 2560x1600 tend to perform fine anyway, while 2560x1600 is more GPU limited and could benefit from a boost in most games.

Significantly different resolution scaling characteristics can appeal to different users. An AMD card might look better at one resolution, while the NVIDIA card could come out on top at another. In general, we think these changes make sense, but it would be nicer if the driver automatically figured out the best approach based on the hardware and resolution in use (and thus didn't degrade performance at lower resolutions).

In addition to the performance changes, we see the addition of a new feature. In the past we've seen filtering techniques, optimizations, and even dynamic manipulation of geometry added to the driver. Some features have stuck and some just faded away. One of the most popular additions was the ability to force Full Screen Antialiasing (FSAA), enabling smoother edges in games. This feature was more important at a time when most games didn't offer an in-game way to enable AA: the driver took over and implemented AA even in games that didn't offer an option to adjust it. Today the opposite is true, and most games allow us to enable and adjust AA.

Now we have the ability to enable a feature that isn't available natively in many games and that could either be loved or hated. You tell us which.

Introducing driver enabled Ambient Occlusion.

What is Ambient Occlusion, you ask? Look into a corner, around trim, or anywhere that looks concave in general. These areas will be a bit darker than their surroundings (depending on the depth and other factors), and NVIDIA has included a way to simulate this effect in its 185 series driver. Here is an example of what AO can do:

Here's an example of what AO generally looks like in games:

This, as with other driver enabled features, significantly impacts performance and might not work in all games or at all resolutions. Ambient Occlusion may be something some gamers like and others don't, depending on the visual impact it has in a specific game and on whether performance remains acceptable. There are already games that make use of ambient occlusion, and some games on which NVIDIA hasn't been able to implement AO.
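Conceptually, AO estimates how much nearby geometry blocks ambient light from reaching a point, so concave areas come out darker. A toy sketch of that idea on a simple heightfield (our own illustration for intuition only, not NVIDIA's actual shader):

```python
def ambient_occlusion(height, x, y, radius=2):
    """Toy AO: fraction of nearby cells that do NOT rise above this one.

    height: 2D grid of terrain heights. A cell hemmed in by taller
    neighbors (a corner or trench) gets a lower value, i.e. renders darker.
    """
    h = height[y][x]
    total = blocked = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx == 0 and dy == 0:
                continue  # skip the cell itself
            ny, nx = y + dy, x + dx
            if 0 <= ny < len(height) and 0 <= nx < len(height[0]):
                total += 1
                if height[ny][nx] > h:
                    blocked += 1
    return 1.0 - blocked / total  # 1.0 = fully lit, lower = darker

# A floor tile tucked into the corner of two walls vs. one out in the open
terrain = [
    [5, 5, 5, 5],
    [5, 0, 0, 0],
    [5, 0, 0, 0],
    [5, 0, 0, 0],
]
```

Running this, the corner cell at (1, 1) scores lower (darker) than the open cell at (3, 3), which is exactly the corner-darkening effect described above.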

There are different methods of rendering an ambient occlusion effect, and NVIDIA implements a technique called Horizon Based Ambient Occlusion (HBAO for short). The advantage is that this method is likely highly optimized to run well on NVIDIA hardware; on the down side, developers limit the ultimate quality and technique used for AO if they leave it to NVIDIA to handle. On top of that, if a developer wants to guarantee that the feature works for everyone, they would need to implement it themselves, as AMD doesn't offer a parallel solution in their drivers (in spite of the fact that their cards are easily capable of running AO shaders).
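The horizon-based idea can be sketched in one dimension: march away from a point along the depth data, track the highest angle that neighboring samples subtend above it (the "horizon"), and darken the point by the sine of that angle. A minimal sketch of that concept (our own simplification; the real HBAO shader works on a full 2D screen-space depth buffer with many sample directions):

```python
import math

def hbao_1d(height, i, max_steps=4):
    """Toy horizon-based AO along one scanline.

    For each of the two directions, march outward and record the
    highest elevation angle neighboring samples subtend above the
    current point. Occlusion grows with the sine of that horizon
    angle, so deep trenches come out darkest.
    """
    occlusion = 0.0
    for step_dir in (-1, 1):
        horizon = 0.0  # flat horizon -> no occlusion from this side
        for s in range(1, max_steps + 1):
            j = i + step_dir * s
            if 0 <= j < len(height):
                angle = math.atan2(height[j] - height[i], s)
                horizon = max(horizon, angle)
        occlusion += math.sin(horizon)
    return occlusion / 2.0  # 0 = fully open, 1 = fully walled in

scanline = [3, 3, 0, 3, 3]  # a narrow trench at index 2
```

The trench sample at index 2 sees high horizons on both sides and receives strong occlusion, while the flat sample at index 0 receives none.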

We haven't done extensive testing with this feature yet, either for quality or for performance. Only time will tell whether this addition ends up being gimmicky or really hits home with gamers. And if more developers create games that natively support the feature, we wouldn't even need the option. But it is always nice to have something new and unique to play around with, and we are happy to see NVIDIA pushing effects in games forward by all means possible, even to the point of including effects like this in their driver.

In our opinion, lighting effects like this belong in engine and game code rather than the driver, but until that happens it's always great to have an alternative. We wouldn't think it a bad idea if AMD picked up on this and did it too, but whether it is more worth it to do this or to spend that energy encouraging developers to adopt this and comparable techniques for more complex lighting is totally up to AMD. And we wouldn't fault them either way.


294 Comments

Hey, you're the one with the lies and the cover-ups for ATI, and now the anti-semitic conspiracy theories.
Even with all the spewing you've got going there, you couldn't just say "ATI is really the one who lost money, not NVIDIA with the GT200".
Oh well, it's more important to spread FUD and now, conspiracy against "Jews".
Amazing. I had no idea the rabbit hole goes that deeply. rofl

Check for yourself. It's not a conspiracy; these are facts.
In fact, the 4800 series is the most successful generation of cards ATI ever produced. The 4890, which measures about half the size of the GTX 285, beats the latter in most games at full HD resolutions.

Now tell me about employment or jobs? Is that in the communist inflation reprint economy that costs us taxpayers trillions - the fantasy world where CONSTANT billion dollar losses at just a billion dollar company are "sustainable"?
AMD/ATI IS IN DEADLY SERIOUS TROUBLE AND HAS BEEN
NVIDIA IS ALREADY RECOVERING AND HAS BEEN POSTING A PROFIT.
___________

But in your imaginary world filled with HATRED and LIES, it's just the opposite... isn't it.
How pathetic.

You cite "the last quarter", but of course only a fool would use that as a future indicator of the quality and viability of the company. It's another pathetic attempt, fella. Global downturn means nothing to you, and you FAILED to cite the ATI numbers for the two quarters in question, so you really have no point. You must have been afraid to tell the truth?
If Nvidia has one low quarter in the midst of massive global downturn, while ati had at least 9 quarters where they suffered losses in a row, who is really in danger of playing on the "competitors" chip ?
You see, that's WHY the ati red roosters had to SCREAM endlessly about nvidia's GT200 die size - because THE WHOLE TIME BEHIND THE SCENES OF THEIR FLAPPING RED ROOSTER BEAKS - THEIR BELOVED ATI WAS LOSING BILLIONS....
See bub, that's what has been going on for far too long.
It's really sad and sick, that people can't be HONEST.
All the red roosters had to do was say " hey buy ati, they're in financial trouble and have been, we all want competition to continue so let's pitch in, because the brands are about equivalent. "
See, that would have been honest and respectable and manly.
Instead the raging red roosters lied and covered up and FALSELY ACCUSED their competition of imaginary losses while their little toy was bleeding half to death - like little lying brats, they couldn't help but spew in the midst of IMMENSE BILLION DOLLAR losses for ati, how the gt200 was "hurting nvidia" and how "ati could crush them" with PRICE DROPS - lol - man alive, I'm telling you - all those know-it-all red rooster jerks - it was and is still amazing.
That's fine, just be aware that it --- has been pathetic behavior.

Actually, you were the one throwing around $billion losses and FAILED to mention Nvidia's own horrible financial situation. Did you say anything about the global downturn while ranting like a fanatic on AMD's losses?

What was it you were saying about HONESTY again? Yes, in caps.

Nvidia hasn't had one low quarter - they've lost 2/3rds of their share value in a year. That doesn't happen in one quarter, same as it didn't happen to AMD in one quarter either.

Nvidia are a horrible little company who hold back progress, and more and more people are wising up to their methods. Articles like this on Anand show what they are like. Nvidia CANNOT COMPETE with ATI on performance so instead they bribe with more cash than ATI use on R&D, and those that don't accept the bribes get cajoled or threatened instead.

All the while sad sycophants like you are banging on about PhysX and cuda as if they make a difference to anyone. What does make a difference is their pathetic rebadging of ancient tech, catching out the people who don't know any better.

That just proves how far ahead the r700 is vs the g200b. ATI put money into research in order to improve the experience, while Nvidia put money into bribes in an attempt to hold onto whatever slender lead they have. It's only a financial lead; in tech terms ATI are a country mile ahead and only the worst Nvidia fanboi can't see that.

" That just proves how far ahead the r700 is vs the g200b."
That rv700 can't compete AT ALL with the GT200 UNLESS it has DDR5 on it.
That is a FACT. That is REALITY.
Without DDR5 it is the full core 4850 that competes with the "old technology" at the DDR3 level on both cards, the 4850 and the 9800 series and flavors.
That's the truth, YOU LIAR.
Case closed, no takebacks, no uncrossing your fingers, no removing your red raging horns - like - forever.
The r700 CANNOT COMPETE WITH THE GT200 - unless DDR5 is added as an advantage for the r700, which actually competes with the g80/g92/g92b.
NOW, if you screamed and screeched DDR5 is awesome and ati rocks because they used it, I wouldn't disagree or call you the liar you are.
Got it, son?
Figure it out, or go take a college class in logic, and skip the communist training if you possibly can. Might get an estrogen emotion reduction as well while you're at it.

Check the five year stock charts before you keep lying, and then as far as your idiotic rant about nvidia, it just goes to show there is no such thing as a fair performance comparison from you people, you will lie your red rooster rooter butts off because you have a big twisted insane hatred for Nvidia, based upon some communist like rage that profit is a sin, and money in the industry is BAD, except PEOPLE get paid with all that money you claim NV throws around. lol
Dude, you're a red rooster rager, look in the mirror and face it, since you can't face the facts otherwise. Embrace it, and own it.
Don't be a liar, instead - or rather if all you're going to do is lie, at least admit it - you're body painted red, no matter what.
The really serious issue is ati has a really bad continuous loss, and might go under.
However, I can understand you communist like raging red roosters screaming for more price drops as you declare the much better off financially NVidia the one "to be destroyed", and demand more price drops, as you scream "profiteering".
Well, the basic fact is plain and apparent, ati had to lose 2 billion dollars to provide their competitive price, and ati purchasers are sitting on that loss, their gain, huh.
Like I said, if Obama and co. give ati/amd a couple billion in everyones taxes, it might work out ok, otherwise bankruptcy is looming - or some massive new investor relations are required.
Either way, you people don't tell the truth, and that of course is the point, over and over again.

Well, having trawled through all 16 pages of comments, I have to say that as much as power & temp benches, I really want NOISE benchmarks. Yes, power usually comes at the expense of noise, and although I'm primarily a gamer, I hate fan noise too.

I happen to have an 8800GT, which was great value when it first came out, but it becomes a whirlwind in most games and drives me crazy; it breaks the immersion, and only in-ear headphones help.

When the scores are this close, I err on the side of silence, and (from other sites) it sounds like the GTX275 is noticeably quieter than the 4890 under load.

Also, the GTX275 may suck up more juice under load, but it is correspondingly more economical when idle, and as I spend way less than 50% of my computer time gaming, that is much more useful to me...

Agree that PhysX is overhyped promises at the moment. So, for sound and power efficiency, I think the GTX275 just sways the vote *for me*. And it can overclock a bit even if the impression I'm getting is not quite as much as the 4890.

Then again, here in the UK the prices are different. The new parts are £200+, and that's 33% more than the GTX260 55nm core 216, which can be had for only £150 now, is only a little less powerful than the GTX275, and will surely last fine till the DX11 parts come out... choices... choices...

You can edit your VGA BIOS using Radeon BIOS Editor v1.12, which is the one I'm using now on my 4870, and adjust the frequencies in different modes. By downclocking your Radeon card in idle mode, you can get it to operate properly without sucking so much power. You can also use ATI Tray Tools for the same purpose.

As for the noise, I definitely recommend you wait a little bit until the non-reference cards get released.

According to some sites, AMD is going to release its DX11 cards in Q3 this year, so if you're planning to upgrade, you'd better consider waiting a little bit and getting a far better card than the currently available ones.

From my personal experience, a 4870 1GB is more than enough to play most current games at 24" resolutions with all the settings at their highest, including the eye candy... except for Crysis and Stalker: Clear Sky. If you have a smaller monitor than mine, then you might as well consider a 4850 or a GTS 250.