Let's Get It Out of the Way: Radeon HD 3870 vs. GeForce 8800 GT

The question on everyone's mind is: how well does the 3870 stack up against the recently launched GeForce 8800 GT? If you haven't picked up on our hints throughout the review, AMD doesn't win this one. But since the 3870 is supposed to be cheaper, a performance disadvantage is fine so long as it is justified by the price.

Does the 3870 deliver competitive performance given its price point? Let's find out.

Honestly, the Radeon HD 3870 stays very close to the 8800 GT, much closer than any of AMD's previous attempts to challenge the 8800 series. But is the price low enough to justify the performance difference? For that we must do a little numerical analysis; the table below shows what percentage of the 8800 GT's performance the Radeon HD 3870 delivers:

3870: % of GeForce 8800 GT Performance

| Game                    | 1280 x 1024 | 1600 x 1200 | 1920 x 1200 | 2560 x 1600 |
|-------------------------|-------------|-------------|-------------|-------------|
| Bioshock                | 84.4%       | 82.4%       | 87.9%       | 93.9%       |
| Unreal Tournament 3     | 87.8%       | 85.8%       | 89.6%       | 91.6%       |
| ET: Quake Wars          | 80.5%       | 95.9%       | 96.8%       | 103%        |
| Oblivion                | 66.7%       | 74.1%       | 74.4%       | 71.5%       |
| Oblivion (4X AA)        | 70.5%       | 77.7%       | 80.2%       | 82.6%       |
| Half Life 2: Episode 2  | 101%        | 95%         | 91%         | 86.7%       |
| World in Conflict       | 81.5%       | 85.7%       | 84.9%       | 89.2%       |
| Call of Duty 4          | 103%        | 98.3%       | 92.3%       | 82.1%       |
| Crysis                  | 72.4%       | 73.3%       | 75.5%       | -           |
| Average                 | 83.1%       | 85.3%       | 85.8%       | 87.6%       |
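The Average row follows directly from the per-game numbers; a quick sketch of the calculation (note that Crysis contributes no 2560 x 1600 result, so that column averages eight games rather than nine, and the printed averages match the table to within rounding):

```python
# Recompute the per-resolution averages from the per-game percentages above.
# A value of None marks the missing Crysis result at 2560 x 1600.
results = {
    "Bioshock":               [84.4, 82.4, 87.9, 93.9],
    "Unreal Tournament 3":    [87.8, 85.8, 89.6, 91.6],
    "ET: Quake Wars":         [80.5, 95.9, 96.8, 103.0],
    "Oblivion":               [66.7, 74.1, 74.4, 71.5],
    "Oblivion (4X AA)":       [70.5, 77.7, 80.2, 82.6],
    "Half Life 2: Episode 2": [101.0, 95.0, 91.0, 86.7],
    "World in Conflict":      [81.5, 85.7, 84.9, 89.2],
    "Call of Duty 4":         [103.0, 98.3, 92.3, 82.1],
    "Crysis":                 [72.4, 73.3, 75.5, None],  # no 2560 x 1600 run
}

resolutions = ["1280 x 1024", "1600 x 1200", "1920 x 1200", "2560 x 1600"]
for col, res in enumerate(resolutions):
    # Skip games with no result at this resolution instead of counting them as 0.
    vals = [row[col] for row in results.values() if row[col] is not None]
    print(f"{res}: {sum(vals) / len(vals):.1f}%")
```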

Here's what's really interesting: on average, the Radeon HD 3870 offers around 85% of the performance of the 8800 GT, and if we assume you can purchase an 8800 GT 512MB at $250, the 3870 manages that at roughly 87% of the 8800 GT's price. The Radeon HD 3870 becomes more attractive the more expensive the 8800 GT gets, and less attractive the cheaper it gets; if the 8800 GT 512MB were available at $219, the 3870 wouldn't stand a chance.
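The price comparison is simple arithmetic. As an illustrative sketch, assuming a 3870 price of roughly $219 (the value implied by "87% of the price" against a $250 8800 GT; neither figure is a confirmed street price):

```python
# Illustrative price/performance arithmetic using the article's figures.
# $219 for the 3870 and $250 for the 8800 GT are assumptions from the text.
hd3870_price = 219.0
gt8800_price = 250.0
perf_ratio = 85.0  # 3870 delivers ~85% of 8800 GT performance on average

price_ratio = hd3870_price / gt8800_price * 100
print(f"3870 price ratio: {price_ratio:.1f}%")  # 87.6% of the 8800 GT's price

# Break-even point: the 8800 GT price at which both cards deliver equal
# performance per dollar, given the ~85% average performance ratio.
break_even = hd3870_price / (perf_ratio / 100)
print(f"Break-even 8800 GT price: ${break_even:.2f}")  # $257.65
```

Above that break-even price the 3870 is the better value; at or below it (for example, an 8800 GT at $219), the 8800 GT wins outright, which is the point the paragraph above makes.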

If AMD can actually meet its price target, then the 3870 looks competitive. It's slower than the 8800 GT, but the price compensates.

114 Comments

But new Catalyst drivers have now been released, so an updated benchmark needs to be run; the new drivers provide better support for the hardware and thus better performance.

Also, you used a non-AMD motherboard and chipset... if you had gone with CrossFire + AMD 790 chipset + Phenom X3/X4 processor (the Spider platform) you would have seen better performance as well. There are other benchmarks that were done with these (Spider) components and the results weren't nearly as mediocre. Just a little tip...

I can't see how every review I have read differs from your charts; the 2900 XT can't be faster than the 3850. I mean, I spent a month researching cards and the winner was the 3850, overclocked to 3870 speeds. To think that AMD spent all that time to make a new 2900 XT and name it the 3850/3870 is just foolish. From the benchmarks you provided, only an idiot would buy the new-generation cards for 60-100 bucks more when the 2900 XT is on par. Could you please explain to me how this happened? I feel like ordering a 3850 was a waste of money because the old 2900 is better anyway.

It looked to me that the biggest complaint on the HD Video Decode article was that the 2600/2900 options did not provide an off switch for the Noise Reduction. Did you notice if this option appeared to be present in the newer drivers of this card (3850)?

First Nvidia with its 8800 GT... I clearly recall seeing those at about $200, and now they're $300 or more. At least these may come bundled with a game... they also "hold the crown".

Now the HD 3870 has gone up to $269.99 (at Newegg), and availability is every bit as bad as the 8800 GT's.

This review assumes that AMD/ATI was going to deliver in volume at a fixed price, and it hasn't delivered on either count. It would be really nice if you could slap their wrists... as individual consumers we are being tossed about, and we don't have the "pull" to do anything other than "take it".

I would like to ask what exactly the results in the individual games represent. Are those average FPS, or something like (min + max + avg)/3 FPS? One Czech website showed results similar to what was presented here, but they were reporting (min + max + avg)/3 FPS, which is complete nonsense, as it favors cards with more volatile results. When they compared average FPS, the Radeon had the same results as the GT card. I would also like to ask whether you used the same demo for both cards, or whether you were playing the game and therefore testing it in different situations.


If I'm not mistaken the 8800 GT is DX10 only, right? Is DX10.1 so insignificant that it doesn't count in favor of the 3800s over the GTs? Don't get me wrong, I'm not trying to defend AMD; I just want to know whether it's a good idea to sell my 8800 GTS 320MB now, while it still fetches a good price (I live in Brazil and they're still pricey here), and buy a 3870 or an 8800 GT with 512MB. I recently bought a 22" monitor and the GTS is somewhat disappointing at 1680x1050. Nah, it's just that crappy game World in Conflict. It runs like the Crysis demo at max! I have to play at medium, and the textures are really crappy for an 8-month-old high-end PC :(
Who knows, maybe I'm already CPU- or memory-bound with a Core 2 Duo 6400 @ 24xxMHz and dual OCZ Platinum 2 1GB 800MHz (2GB total)...
Thanks in advance for any more input on the qualities of DX10.1 :)