AMD pairs the 700 million transistor GPU with 512MB of GDDR3 video memory. The 1.65 GHz GDDR3 memory communicates with the GPU via a 512-bit memory interface, delivering 106 GB/sec of bandwidth. The new ATI Radeon HD 2900 XT is still built on an 80nm fabrication process and consumes approximately 215 watts of power altogether.
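As a sanity check, the quoted 106 GB/sec figure follows directly from the effective memory clock and the bus width (a quick sketch; 1.65 GHz here is the effective double-data-rate speed, i.e. transfers per second):

```python
# Memory bandwidth = effective transfer rate * bus width in bytes.
effective_clock_hz = 1.65e9          # 1.65 GT/s effective GDDR3 speed
bus_width_bits = 512                 # 512-bit memory interface
bandwidth_gb_s = effective_clock_hz * (bus_width_bits / 8) / 1e9

print(round(bandwidth_gb_s, 1))      # 105.6, rounded up to the quoted 106 GB/sec
```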

Although images have shown the ATI Radeon HD 2900 XT with dual dual-link DVI ports, the card does in fact support HDMI output. An included adapter allows users to experience high-definition video and 5.1-channel surround sound over HDMI. Also bundled with the ATI Radeon HD 2900 XT are keys for Valve's upcoming Half-Life 2: Episode Two and Team Fortress 2; the keys allow owners to download the games over Steam when the titles are released.

Taking on NVIDIA's recently launched GeForce 8600 family is the new ATI Radeon HD 2600 series. AMD plans to paper launch the ATI Radeon HD 2600 in PRO and XT guises. The ATI Radeon HD 2600 series features 120 stream processors, with the GPU clocked anywhere from 600 to 800 MHz depending on the flavor. AMD backs the 120 stream processors with eight texture units and four render backends.

All that processing power brings the ATI Radeon HD 2600's transistor count to 390 million. Nevertheless, thanks to its 65nm manufacturing process, AMD rates power consumption at approximately 45 watts.

At the bottom of the new ATI Radeon 2000-series lineup is the ATI Radeon HD 2400 series with PRO and XT models. The new ATI Radeon HD 2400 features 40 stream processors with four texture units and four render backends. GPU clocks vary from 525 MHz to 700 MHz.

The ATI Radeon HD 2400 series features 180 million transistors, fewer than half as many as the HD 2600. AMD has the ATI Radeon HD 2400 series manufactured on the same 65nm process as the HD 2600. Power consumption of the ATI Radeon HD 2400 hovers around 25 watts.

Despite an announcement for the complete ATI Radeon HD 2000 series, the ATI Radeon HD 2900 XT will be the only card available for purchase tomorrow. The ATI Radeon HD 2600 and HD 2400 series will hit retail in late June 2007 with an accompanying benchmark NDA lift.

Expect to pay $399 for an ATI Radeon HD 2900 XT tomorrow from the usual add-in board partners, including Diamond Multimedia, HIS, PowerColor and Sapphire Technology. AMD targets the ATI Radeon HD 2600 series at the $99 to $199 market segment and the ATI Radeon HD 2400 series at the sub-$99 segment.

Comments


DX10 is bullshit. Call of Juarez isn't shipping yet, and the bundle that comes with the review kits is actually just a demo. It won't even run on NVIDIA cards, so the fact that CHW put numbers up should be a testament to their validity.

And I would guess the reason those two didn't take their reviews down is because they only broke the NDA by a few hours. TweakTown broke it by a day -- and believe me, when one of those guys breaks the NDAs, every editor calls ATI until ATI makes the offending site take it down.

Regarding reviews, I posted this below, but I wanted to make sure people actually read this, so here it is again:

-------------------------------------------------

A few weeks ago, [H]ard|OCP gave a glowing review of the 8600 GTS, in which they concluded that it soundly beats the X1950 PRO (for now, we'll ignore the fact that they were comparing the $150 X1950 PRO to $220 OC'ed versions of the GTS, which cost roughly 50% more). Of course, every other review on the web concluded that in the best cases the 8600 GTS keeps up with ATI's X1950 PRO, and in the worst cases is completely embarrassed by it; in several cases, even a stock 7900 GS can cream the 8600 GTS.

So when they stated in the intro of their HD 2900 XT review that the 8600 offers "the best performance in [its] class," I was reminded of their deceptively positive review of the 8600, and thus was not at all surprised when they showed the 8800 GTS outperforming the 2900 XT in virtually ALL cases. Most reviews I've seen so far have had the 2900 keeping up with, and in some cases soundly beating, the 8800 GTS in at least a few significant games (see, e.g., Rainbow Six: Vegas in AnandTech's review).

I won't go as far as to say that [H]ard|OCP massaged or fabricated data, but I do think they deliberately chose their tests in both their 8600 and 2900 reviews to make the ATI parts look bad.

In any case, I don't think any of us can ever trust that website again.

I'm trying to understand your point and it still seems to be missing me. I accept that you do not like [H]. That's fine; everyone has likes and dislikes. I actually like the way [H] does their reviews because I am more interested in what IQ options I can use with each card than in reviews that look at framerates at some insane resolution (really, how many people run a native resolution of 2560x1600?) without any concern for IQ.

You appear to be comparing the 8600GTS with the 2900XT. Why would you choose those two cards to compare? Just looking at Newegg, the 8600GTS (the one reviewed by [H], actually) can be had for $175+S/H. The 8800GTS 640MB can be had for $330+S/H-MIR, and the HD 2900 XT is running $430+S/H. Clearly, the $430 for the 2900 will fall in the next few months, but it will never be less expensive than the 8800GTS. That being said, the 8800GTS is the closest competitor to the 2900XT.

Ok, we've established that the 8800GTS and the 2900XT are competitors, rather than the 8600GTS vs. the 2900XT. Now let's look at the reviews you linked. The "sound beating" the 2900XT delivers to the 8800GTS is in a game with no AA or AF included. It is true that the 2900XT clearly demonstrates greater framerates in that particular graph, but it's not the full story. You may argue that AA/AF is unnecessary in R6: Vegas, but that is subjective. You should consider such data when describing how one card "soundly beats" another.

Look back at the review Derek did of the 8800 Ultra, specifically the R6: Vegas graphs with and without AA. (Notice that the graphs aren't at the same resolution; you have to look down into the tables to compare with and without AA at the same resolution.) Notice that turning on 4xAA in R6: Vegas hammers the framerates (about a 45% drop in FPS). Without data on how AA affects the performance of the 2900XT in R6: Vegas, we have to look to other games where data with and without AA is presented.

Let's look over at [H] a second at Oblivion. There, they show that the 8800GTS posts basically the same framerates as the 2900XT, but with increased grass draw distance. In Derek's review, he doesn't say what the grass settings are, but you can see that when AA/AF is enabled, the 2900XT goes from clearly leading the 8800GTS to slightly behind. Also, look at the S.T.A.L.K.E.R. results. Both Derek's and the [H] review show the 8800GTS clearly leading the 2900XT, even if you don't include AA/AF, as Derek chose not to do. Adding the IQ factors, the [H] review shows similar performance between the 2900XT and the 8800GTS, but with substantial increases in IQ for the 8800GTS (1600x1200 for the 8800 vs. 1280x1024 for the 2900, full dynamic lighting for the 8800 vs. object dynamic lighting for the 2900, and enabled sun and grass shadows for the 8800).

Is the 2900XT a total dog? Absolutely not. It has features that could potentially add a lot of value as they are implemented down the line. Based on pricing today, however, the 2900XT clearly lags behind the 8800GTS, which delivers AT LEAST as good performance and in many situations is substantially better. The 8600GTS is not a competitor to the 2900XT in either performance or price. Derek clearly points out that where AMD really has a chance to shine is if it can bring a product to market to compete with the 8600 in price and performance, but that product isn't here yet. For now, you have to concede that, based on price/performance in current applications, the 2900XT is a lesser value than the 8800GTS.

quote: As said before, I had results from both the Catalyst 8.36 and Catalyst 8.37 drivers. How did performance improve across this small 0.01 driver version jump?

quote: Taking a setting of 1600x1200 with 16xAF, I saw a major increase in performance, particularly in Company of Heroes and Quake 4. Performance went up by 11% in COH and 42% in Quake 4! This shows that the drivers are still very raw on this card; with just a minor driver revision boosting performance that much, it gives us quite a lot of hope for a fair bit of improvement to come. Let's hope for that!
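For reference, the percentage gains quoted above are just the relative change in frame rate between the two driver versions. A quick sketch (the frame rates below are hypothetical illustrations, not the review's actual numbers):

```python
# Relative performance gain between two driver versions.
def percent_gain(old_fps: float, new_fps: float) -> float:
    return (new_fps - old_fps) / old_fps * 100

# Hypothetical example: 50 fps on 8.36 rising to 71 fps on 8.37
# would be the kind of ~42% jump quoted for Quake 4.
print(round(percent_gain(50.0, 71.0)))  # prints 42
```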

quote: In many non-anti-aliased, high-definition game settings, you have seen the X2900XT push ahead of its closest competitor, the GeForce 8800GTS 640MB, sometimes by quite a large margin, sometimes falling behind or ahead by a small percentage. In a select few games, the GTS is slightly faster, and vice versa. When anti-aliasing is turned on, the X2900XT showed that it carries it off with great efficiency in games the drivers are optimized for, performing significantly better than the GTS, while AA efficiency is piss-poor in some games due to the raw driver, which has not fully blossomed to take advantage of ATi's new GPU technology. Just take a look at how performance has boosted from drivers 8.36 to 8.37; that shows the potential in performance growth... a whole lot of it to reap.

I don't know about Quake 4, but the performance boost for Company of Heroes on X1000-class cards has existed since Catalyst 7.1, or maybe 7.2. Even though the new-generation parts are complicated products with DX10, audio, and maybe physics handling, immature drivers after a 6-7 month delay are not acceptable.

quote: Performance went up by 11% in COH and 42% in Quake 4! This shows that the drivers are still very raw on this card

That's a hell of a conclusion to pull. What I suspect happened (this happened with us until we figured it out) was that you had to put two cards into Crossfire first, then back down to single-card. Doing so would increase single card operation by like 20%. I'm guessing the 8.37 addressed this.

quote: ... you had to put two cards into Crossfire first, then back down to single-card. Doing so would increase single card operation by like 20%. I'm guessing the 8.37 addressed this.

~_~

Sorry, but I am a bit confused here. Are you implying that prior to 8.37, in a two-card setup, even if you chose to test a single card, the second card would still influence the test? Or are you saying that the single-card test was 20% lower than it should have been, and adding another card corrected the mistake?

After seeing the benchmarks all over the web, I can honestly say that this HD 2900 XT is a fucking joke; whoever developed it should be fired immediately, because such low gains in performance over the old X1950 XTX are unacceptable for a card with 700 million transistors.

They should have known that they had to deliver at least 2x the performance of the old X1950 XTX; at least that's what NVIDIA does with every new chip, developing it to be at least 2x faster than the old model.
