All this talk of tuning video cards for max flexibility and performance brings to mind a great idea: why not have a write-up on all the neat things you can do with RivaTuner? I only know how to do two things, set up overclocking and fan profiles, but I know there is more neat stuff in there.

What I found very interesting (besides the overclockability of the 4890) was just how close the 4890 is to the 285 at resolutions below those of a 30" monitor. Close enough to be within the realm of "margin of error".

This is all good to know, as I have a 24" monitor I cannot see giving up for a few years yet :-)

Thanks again. Oh, by the way... those who can, do... those who cannot, criticize. Reply

1. How would the 4890 compare in price and performance to the 4870x2 or 4850x2? Would those cards give similar performance at a lower price?

2. When is DX11 coming, how will it be implemented, and will it be any more efficient hardware-wise than DX10? Even now, most games take a serious performance hit with DX10. Will DX11 require even better hardware? If so, I will either have to do a serious upgrade or run two-generation-old DX9. I have played Company of Heroes and World in Conflict, and I ran both in DX9 mode. The games looked fine and performance was so much better. DX10 has been a big disappointment to me in that it is so resource intensive without much visual improvement.

" Guru3d has OC reports of both. The GTX275 seems to be clearly in the lead. "
--
This is red roosterville. Compare the special manufacturer's edition ATI 4890 to the stock or mildly OC'd GTX275, then spin it up more for the red roosters with special custom game profiles, resolutions and game settings, and leave PhysX ON even if the game doesn't use it.
Post the massively biased results and proclaim ATI the overall superking winner.
Allow your red rooster fanbase and anyone else to enjoy the fantasy, while lying as much as possible, and apologizing every time the bias is pointed out by a poster, claiming you didn't know, you'll fix it, that's a good question, you'll get to it, you didn't mean to compare an OC'd ATI to a stock GTX275, etc. etc. etc.
--
That's why other reviews so often show the GTX275 way ahead. Not every single one - there are more red rooster stations about - but hey, some people like Derek just can't help themselves. Reply

... you could have presented the graphs in a much more readable manner. The only thing you were really doing was finding the Amdahl's Law curves for the part.

Watch out for sentence fragments in the future too :)

As an aside, usually mainstream hardware reviewers are absolutely horrible and incompetent at overclocking, so it's refreshing to see some results that actually match what end-users have been reaching on various enthusiast forums (~1GHz core clocks on stock air). Reply
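The Amdahl's Law curves mentioned above are easy to sketch. A minimal example, where the 80% GPU-clock-bound fraction is purely an illustrative assumption, not a measured figure from the article:

```python
def amdahl_speedup(parallel_fraction: float, n: float) -> float:
    """Amdahl's Law: overall speedup when a fraction of the workload
    is accelerated by a factor of n and the remainder stays fixed."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n)

# E.g. if a game is 80% GPU-clock-bound, a 20% core overclock
# (n = 1.2) yields only about a 15% frame-rate gain.
print(round(amdahl_speedup(0.8, 1.2), 3))
```

This is why the benchmark gains in overclocking articles flatten out below the raw clock increase: the CPU-bound portion of each frame doesn't scale with the GPU clock.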

Quite a few folks in here are talking about messing with voltages - how do you do this? I have a 4870 with a Thermalright cooler and I'd love to mess with voltages to get some extra performance out of it, but I've had no luck searching for a utility that'll let me do this. Reply

The ASUS card we tested actually comes with a utility to tweak its voltage. There are some hardware mods out there as well... if you're really hardcore, you can go check out mvktech.net and download some BIOS readers/writers and editors and really go crazy (and possibly destroy your card).

I suppose I just thought the 4770 article was straightforward enough to be stripped down -- that I said the 4770 was the part to buy, and that the numbers backed that up enough that I didn't need to dwell on it.

But I do appreciate all the feedback I've been getting, and I'll certainly keep that in mind in the future. Going more in depth, and being more enthusiastic when something is a clear leader, are on my agenda for similar situations going forward. Reply

Sorry for AMD, but even with a super powerful card in DirectX, their OpenGL implementation is still bad, and Nvidia rocks in professional applications running on Linux. We saw the truth when we put a Radeon 4870 in front of a GTX 280. The GTX 280 rocks in redraw mode, in interactive rendering, and in OpenGL composition. Nvidia is a clear winner in OpenGL apps. Maybe it's because of the extra transistor count, which allows the hardware to outperform any Radeon in OpenGL, whereas AMD still has driver problems (a bunch of them) on both Linux and Mac.
But Windows gamers are the market niche AMD cards are targeting... Reply

WTF? Windows gamers aren't a niche market, they're the majority market for high end graphics cards.

Professional OpenGL users are buying Quadro and FireGL cards, not Geforces and Radeons. Hobbyists and students using professional GL applications on non-certified Geforce and Radeon cards are a tiny niche, and it's doubtful anyone targets that market. Nvidia's advantage in that niche is probably an extension of their advantage in professional GL cards (Quadro vs. FireGL), essentially a side effect of Nvidia putting more money/effort into their professional GL cards than AMD does. Reply

For example, in Call of Duty 4, the 8800GT performs significantly worse in OS X than in Windows. And you can tell the problem is specific to nVidia's OS X drivers rather than the Mac port since ATI's HD3870 performs similarly whether in OS X or Windows.

Another example is Core Image GPU acceleration. The HD3870 is still noticeably faster than the 8800GT with the latest 10.5.6 drivers, even though the 8800GT is theoretically more powerful. The situation was even worse when the 8800GT was first introduced with the drivers in 10.5.4, where even the HD2600XT outperformed the 8800GT in Core Image apps.

Supposedly, nVidia has been doing a lot of work on new Mac drivers coming in 10.5.7, now that nVidia GPUs are standard on the iMac and Mac Pro too. So perhaps the situation will change. But right now, nVidia's OpenGL drivers on OS X aren't all they are made out to be. Reply

Better remember this then:
" We absolutely must caution our readers once again that these are not off-the-shelf retail parts. These are parts sent directly to us from manufacturers and could very likely have a higher overclocking potential than retail parts " Reply

Interesting read, but I was a bit disappointed not to see the 4850x2 included in the benchmarks. The 1GB model is currently at very rough price parity with the 4890 and the 2GB model still cheaper than a 285. As such, it would've been nice to be able to more easily note the advantages of a multiple GPU card over a single GPU card cranked up to ludicrous speed. Reply

The numbers used for the 4850 in this article correlate to those in the Multi-GPU article from February. So all you really need to do is compare the 4850 X2 results from that article to the 4890 results in this article. The two are closely matched, although I'd be willing to bet that the perceived performance and fps range is tighter and more consistent on the 4890 than the 4850 X2, as single GPU solutions usually are. Reply

That is a good point. I have set up a 2D profile in ATT which undervolts the card (you can undervolt, but not overvolt, with ATT) and that does help in 2D. I haven't tried undervolting at stock speed (although mine's an OC version, so it may need the full voltage). Reply
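The payoff from a 2D undervolt follows the standard dynamic-power rule of thumb, P ≈ C·V²·f. A quick sketch, where the 1.30 V and 1.05 V figures are hypothetical example voltages, not the card's actual VID values:

```python
def dynamic_power_ratio(v_new: float, v_stock: float,
                        f_new: float = 1.0, f_stock: float = 1.0) -> float:
    """Relative dynamic power under the rule of thumb P ~ C * V^2 * f.
    Returns new power as a fraction of stock power."""
    return (v_new / v_stock) ** 2 * (f_new / f_stock)

# E.g. dropping a hypothetical 2D profile from 1.30 V to 1.05 V at
# the same clock cuts dynamic power to roughly 65% of stock.
print(round(dynamic_power_ratio(1.05, 1.30), 2))
```

Because voltage enters squared, an undervolt saves much more heat per step than an equivalent underclock, which is why 2D profiles that lower both are so effective at idle.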

I have an HD4890 and it does overclock well. But it's just too damn loud. I've actually been underclocking it for most of my games, as it's fast enough to run them fine while underclocked, and it keeps the noise down. It's just not fun running it overclocked and having your game drowned out by a hairdryer. Headphones help. Reply

The title is a little bit stupid if you ask me; a 4890 can go way further than 1GHz on the core and 1.2GHz on the RAM. Why didn't AT go further than normally achievable clocks? The title says "to the max"; too bad it cannot live up to that promise.
Interesting read though; it must have cost a tremendous amount of work. Reply

Well ... it is to the maximum value the built-in overclocking features of the driver will let you set the card. So that's where "to the max" came from.

Yes, with 3rd party tools you can get higher on some hardware, but we didn't want to go into aftermarket cooling, and not everyone can even make it to 1GHz... We wanted this grounded somewhere in achievability.

Why didn't you guys play with the fan profiles, add some VGPU with 3rd party software, and just go to the real maximum of the card? It would be amazing to see the results of that. Using 3rd party software shouldn't be any limitation to overclocking if you ask me. Reply

We did set the fan on 100% manually all the time. Sure, it's a leafblower at that speed, but we wanted to keep it cool. Sorry I didn't mention that.

It is possible to get some 4890 cards higher, but this wasn't an article for the hardcore overclocker. Rather we wanted to talk about what the average user can do out of the box by clicking a few buttons in the driver. Which, in my opinion, is more impressive than if we'd gotten an extra couple percent increase in core and memory clock speeds.

Not that I wouldn't be interested in knowing how fast you could get one of these beasts... it's just a different article than the one I wanted to write here. Reply

It will be a nice day when you don't have to apologize ten times in every ATI vs. Nvidia article for your massive ATI-biased BS, Derek.
--
" We did set the fan on 100% manually all the time. Sure, it's a leafblower at that speed, but we wanted to keep it cool. Sorry I didn't mention that. "
---
All you red roosters aren't sorry - the enthusiasts see your charts and head off to make a redland purchase while your lying CRAP misdirects them! And you know it - and are sickeningly proud of it, no doubt!
It's been going on forever by you here!
PERIOD! Reply

The title just implies different things than this review, that's all; I enjoyed reading it and seeing how the card responds to core and memory OCs. But if you use the words 'extravaganza' (very special) and 'to the max', people start thinking about dry ice/LN2, voltage modifications, etc.
The article is awesome; the title is a little bit wrong. Reply

These tests and the article took a while for me to put together. I am planning on doing some GTX 275 overclocking and seeing if anything interesting happens. If I get good results I'll put together an article on the subject.

I tried pretty hard to keep this article on the subject of the 4890 itself, and where I made comparisons it was to hardware outside its price class with the GTX 285.

No matter how well the GTX 275 does, the 4890 still did as well as it did versus the GTX 285 (and relative to itself).

If I screwed up and made direct comparisons between overclocked 4890 and GTX 275 or something, let me know and I'll fix it 'cause I didn't mean to. Otherwise I hope that this article is as appropriate as I wanted it to be.

Once I've also overclocked the GTX 275, I'll have something to say about its relative value versus the 4890 in that regard. Reply

" If I screwed up and made direct comparisons between overclocked 4890 and GTX 275 or something, let me know and I'll fix it 'cause I didn't mean to. Otherwise I hope that this article is as appropriate as I wanted it to be.

Once I've also overclocked the GTX 275, I'll have something to say about its relative value versus the 4890 in that regard. "

First you say IF you used a non-overclocked GTX275 and screwed up doing it, someone should point it out (because you obviously can't see it in your own charts - right, red rooster?) - and if someone points out your massive idiotic bias, you'll change it, otherwise you won't... THEN YOU ADMIT YOU HAVEN'T OVERCLOCKED THE GTX275 YET! PROVING ABSOLUTELY THAT YOU DID IN FACT COMPARE IT NON-OVERCLOCKED TO THE SPECIAL MANUFACTURER'S EDITION 4890 SENT TO YOU!
-
SO WHAT THE HELL DID YOU MEAN BY " If I screwed up and made direct comparisons between overclocked 4890 and GTX 275 or something, let me know and I'll fix it 'cause I didn't mean to." ?!!???
--
YOU DID IT DEREK - YOU KNEW YOU DID IT, YOU DID IT "ON PURPOSE" THEN EVEN AS YOU DENY IT AFTER BEING QUESTIONED, YOU ADMIT IT INCRIMINATING YOURSELF !
YOU EVEN ASK THE QUESTIONER TO POINT IT OUT, WHEN YOU DID THE CHARTS AND THE TESTS HERE!
MY GAWD!
--
Dude, if you think your BS is even passable, THINK AGAIN !
Reply

Yes, our great, masterful, "fair" and "unbiased" reviewer Derek
.
DOESN'T EVEN KNOW WHEN HE IS COMPARING AN OC'D SPECIAL MANUFACTURER'S EDITION TO STOCK!
.
.
WHAT a genius! Keep that boy hired! Must be worth a whole 50 cents an hour, if that! Reply

It's my impression that AMD cards are more responsive to clock frequencies, while NV cards are rather dependent on units/clusters. We've seen it with the GTX 260 vs. GTX 280, as well as the 8800 GTS vs. 8800 GTX. There were situations where the GTX 260 / 8800 GTS couldn't overcome the performance of their bigger brothers, even with massive overclocking (and their clock generators don't scale clocks linearly).
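A back-of-the-envelope way to see the clocks-versus-units tradeoff described above. The unit counts, clocks, and ops-per-clock below are made-up illustrative numbers, not the specifications of any real card:

```python
def peak_throughput(units: int, clock_mhz: float, ops_per_unit: int = 2) -> float:
    """Rough peak shader throughput in GFLOPS: units * clock * ops per clock."""
    return units * clock_mhz * ops_per_unit / 1000.0

# A wide-but-slow part and a narrow-but-fast part can land at the
# same peak throughput; overclocking moves the narrow part up
# linearly with clock, while the wide part relies on unit count.
wide = peak_throughput(units=240, clock_mhz=600)    # hypothetical wide chip
narrow = peak_throughput(units=160, clock_mhz=900)  # hypothetical narrow chip
print(wide, narrow)  # both 288.0
```

On this simple model, a part with fewer units but higher clock headroom gains more from overclocking, while a part with more units at a lower clock has less to gain per MHz, which matches the impression above.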

ATI HAS LOST TWO BILLION DOLLARS SELLING THEIR GPUs! A BILLION A YEAR, EASY - YET DEREK HAS THE UNMITIGATED GALL TO SAY THIS:
.
" In the meantime, NVIDIA's margins are much tighter on their larger GPUs and now their single GPU performance advantage has started to erode. It seems the wonders of the RV7xx series have yet to exhaust themselves. "
ROFLMAO !
THAT MIGHT MAKE SENSE IF ATI HADN'T COST AMD A BILLION-PLUS A YEAR ON BARELY 2 BILLION IN SALES! ATI IS LOSING 33%.
THEY SELL $100 OF ATI CARDS, THEY LOSE $33!
---
THE TRUTH REALLY, REALLY SUCKS, HUH, RED ROOSTERS! Reply

depeche: " Doesn't AMD make their cards a whole different way than Nvidia? Don't they have like really high clocks and memory while Nvidia's are a lot lower, with more of other parts?

Also has Nvidia made a card that's simply overclocked like the 4870/4890?
"

THE ARTICLE HERE " We absolutely must caution our readers once again that these are not off-the-shelf retail parts. These are parts sent directly to us from manufacturers and could very likely have a higher overclocking potential than retail parts. "

I had an NVidia Ti4200 that overclocked well - good enough to basically double the card's value. The original Radeon LE also overclocked well, and you could unlock extra pipes. I also still have an X800GTO2 that could be BIOS-flashed to an X850XT. Reply

The most Overclockable card that I can remember from nVidia was the Geforce FX 5700LE it's stock core speed was 250mhz, I managed to crank my core speed on that card all the way to 640mhz on the stock Heatsink and fan, however thats not saying much as the 5700LE was basically a regular 5700 but with it's clockspeeds reduced to 5200 speeds, before that I had a Geforce 2 MX100 which was incredibly overclockable, and the TNT Vanta I had before it was as well, however I don't think I have seen a high-end part with that much overclocking margin. Reply