It's not often we get to take two brand-new GPUs and pit them against each other in a single launch article, but that's exactly what we're doing with ATI's HD 4890 and NVIDIA's GTX 275. Both cards are priced at $249, and both happen to offer great performance and insane overclocking ability. With those and other factors in the mix, who comes out on top?

You can read our full write-up on the two new cards here and discuss them here!

Very good review Rob. I know how much work, time and effort you put into it and the final result was a good, complete review.

To be honest, the 4890 looks to be a very solid product, and I don't see it hanging around much above the $220 price point for much longer. That's just me though. The performance is there, but it's just below NVIDIA's 275 in most areas.

__________________"It is far better to grasp the universe as it really is than to persist in delusion, however satisfying and reassuring."
- Carl Sagan

Just a quick note: there isn't any mention of the actual test settings used on the overclock chart/page! That one chart makes for a pretty good snapshot of where things stand, and it's great to see it in there. Nice review.

Quote:

Originally Posted by Greg King

To be honest, the 4890 looks to be a very solid product and I don't see it hanging around that much above the $220 price point that much longer.

I'm thinking along the same lines, and it's too bad. I didn't expect the GTX 275 to come along and obliterate the HD 4890 in pretty well every test. It's still a terrific card, but the price will undoubtedly have to drop a bit.

Quote:

Originally Posted by Kougar

Just a quick note there isn't any mention of the actual test settings used in the overclock chart / page!

Fixed... I never think to include that information. It will always be the highest settings possible, though, which in this case is 2560x1600 (the same configurations we used throughout our regular testing). All of the stock results included on that graph were swiped from our 2560x1600 graphs throughout the article.

Was talking to Nate last night about this... it's a strange issue. Honestly, I never noticed it back when we did use that particular game (we don't now, thanks to DRM), though I admit I didn't pay that close attention. If I ever get the game working again, I'll have to test this out as well, since that's the same area I used for benchmarking.

I really have to wonder if there are other games that experience this sort of thing. Is this what we're coming to? Will benchmarkers really have to analyze each and every scene we use to see if issues like this exist?

Nice review! Loved the red/green bar chart colors! For some reason I got more interested in the 40nm lineup, lol!
The 40nm parts are gonna be awesome! Though I am really aching to spend some o' me $650 on summat, lol! Parents won't agree with the new setup for a 24" LCD, so I'll have to go for a 22" one now. Oh, I'll just wait for the 40nm GPUs and buy the 22" LCD and a new GPU then!
The innovation vs. rebranding thing was really nice!

Quote:

nice review! loved the red/green bar chart colors! for some reason i got more interested in the 40nm lineup,lol!!

I don't personally care about the 40nm chips, to be honest, but I am blown away that they are able to do it.

Think about it. Who else is at that level? ATI will beat NVIDIA to 40nm. Intel has working 32nm silicon, I believe, but it won't be to market before ATI's 40nm parts. Even AMD doesn't have CPUs at that point. This is a HUGE technological achievement, and ATI deserves to be lauded for it.


Well, the thing about 40nm is... we won't be seeing high-end cards built on that process for a little while. The launch cards will be the HD 4700 series, specifically the HD 4770. That card is somewhat similar to the HD 4830, but interestingly enough, it will use GDDR5 rather than GDDR3, alongside a narrower 128-bit bus. It seems a bit odd to narrow the bus and then use faster memory, but AMD said they expect the performance of that configuration to be quite similar to a 256-bit bus with GDDR3. I want to see it before I believe it, though.

Quote:

Originally Posted by Greg King

Intel has working 32nm silicon I believe but it wont be to market before ATI.

You're right, and AMD is first. The CPU guys are skipping over 40nm entirely and going straight to 32nm, and we should be seeing the first units there this fall. AMD's first 40nm GPUs will be out next month though, so they are really on a roll.

Quote:

Originally Posted by Rob

You're right, AMD is first. The CPU guys are skipping over 40nm entirely and going straight to 32nm, and we should be seeing the first units there this fall. AMD's first 40nm will be out next month though, so they are really on a roll.

Wohoo! Damn, I knew I should not have put so much money into the 780i and should have gone for the cheaper Asus P5Q mobo. A 3-way SLI mobo, sheesh! And I'm never even gonna 2-way SLI on it, lol!

If you're going to come onto the boards and trash our review (which varies in results only slightly from the two that you linked to), please do yourself a favor and attempt to string together at least a readable sentence. That's barely English.

But to be perfectly fair, you're entitled to your own opinion. If you don't care for the site, perhaps you should visit one of the ones you do like.


Also, I wouldn't say that the GTX 275 dominates the HD 4890... not everyone plays (or benchmarks) at 1680x1050.

If you look at other reviews, you will see that the 4890 actually pulls ahead in many tests at lower resolutions (1440x900, etc.).

To be honest, there is no reason for someone to purchase a $250 graphics card if their resolution is 1680x1050 or lower, unless they are trying to future-proof or plan on upgrading their monitor in the near future. I think back to when we used to review 9600 GT cards... despite costing only $100, they could handle games like Call of Duty at 2560x1600 WITH anti-aliasing. It of course wasn't as smooth as on a beefier card, but that's at a huge 4.1 megapixel resolution... and it only got better going downward.

That's the great thing about GPUs nowadays (and CPUs even)... you really don't have to spend much money to get stellar performance, especially if you are running more modest resolutions.
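To put the resolutions being thrown around in this thread into perspective, here's a quick sketch of the pixel-count math behind that "4.1 megapixel" figure. The resolution list is just illustrative, not tied to any particular benchmark suite:

```python
# Compare common display resolutions by raw pixel count, relative to
# 2560x1600 (the highest resolution used in the review's testing).
resolutions = {
    "1440x900":  (1440, 900),
    "1680x1050": (1680, 1050),
    "1920x1200": (1920, 1200),
    "2560x1600": (2560, 1600),
}

for name, (w, h) in resolutions.items():
    megapixels = w * h / 1_000_000
    ratio = (w * h) / (2560 * 1600)
    print(f"{name}: {megapixels:.2f} MP ({ratio:.0%} of 2560x1600)")
```

The takeaway: 1680x1050 pushes only about 43% as many pixels per frame as 2560x1600, and 1440x900 only about 32%, which is why a card that holds up at 30" resolutions has so much headroom on smaller monitors.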

Quote:

Originally Posted by Rob

To be honest, there is no reason for someone to purchase a $250 graphics card if their resolution is 1680x1050 or lower, unless they are trying to somehow future-proof or if they plan on upgrading their monitor in the near-future. I think back to when we used to review 9600 GT cards... despite them costing only $100, they could handle games like Call of Duty at 2560x1600 WITH anti-aliasing. It of course wasn't as smooth as a beefier card, but that's at a huge 4.1 megapixel resolution... it only got better going downward.

Are you joking? You really think someone with a 19" monitor (1440x900) wouldn't have bought a 4870 when they came out ($300)?

It isn't "future-proofing" (no such thing in PC gaming); it's getting the performance you want for a price you can pay. Some people like playing games at (big shock here) high frame rates, not just slightly above 30. Also, Call of Duty is not a graphics-intensive game. Try playing Empire: Total War with max settings at 1440x900 on a 9600 GT... (gl)

As for your numbers: in every other review I have read, the 4890 and the 275 have been neck and neck, with the 4890 leading in some benches while the 275 leads in others. Yet somehow in your review the 275 is ahead in every single benchmark... (curious)