A Prelude about Availability

Just two weeks ago, ATI introduced its brand new line of X850 and X800 GPUs, a month before the end of the year. It is highly unusual for a graphics company to introduce so many new products just before the year's end, and many have speculated about the true reasoning behind ATI's decision. Had this been an October or November launch, it would have been a non-issue, as that's still (barely) enough time to capitalize on the holiday selling season, but a month before the end of the year?

If you ask ATI, the launch was their chance to resolve some availability problems that they had with their higher end cards. ATI promised that within a week of its launch, the Radeon X850 line would be available for purchase online and by January, the new X800 GPUs would also be available. If you ask NVIDIA, they'll say that ATI's year end launch was nothing more than an attempt to delay the purchase of NVIDIA cards and that the availability of the X850 before 2005 was a myth.

If you ask us, we don't know who or what to believe anymore. This past year has been filled with missed releases, missed launches and misinformation. From NVIDIA's Video Processor to ATI's disappearing X700 XT, we're tired of all of it. ATI's promise that X850 GPUs would be sold a week after launch turned out to be empty, as we still can't find any X850 based cards available for sale. Even after firing off multiple emails to ATI asking about availability, we were met with nothing but silence. ATI had no problem answering other questions, but all of our queries about X850 availability have been left untouched. Maybe radio silence will make the problem go away...

We don't think so. The credibility of both GPU vendors has been hurt tremendously this year. Neither side can claim innocence of the very things it accuses the other of doing. It's really a shame because, in the end, it hurts their consumers and our readers, and makes it much more difficult for anyone, including us, to put faith and trust in what ATI and NVIDIA tell us.

Hands on with the Radeon X800 XL

Despite all of the availability issues, about a week ago ATI fired off an email saying that we should expect a Radeon X800 XL on our doorstep. You'll remember from our review of the new X850 and X800 GPUs that the X800 XL is a lower-clocked version of the 16-pipe/256-bit memory bus X800 XT, built on a 0.11-micron process (as opposed to a 0.13-micron low-k process). Originally priced at $349, the X800 XL would inevitably be ATI's overdue answer to the GeForce 6800GT. We mention its "original" price because less than 24 hours before the publication of this article, ATI informed us that the new price of the X800 XL would be $299, a full $100 less than the GeForce 6800GT. ATI also adjusted the price of the vanilla X800 down to $199.

ATI R480/R430 Product Lineup

GPU                 Core Clock (MHz)   Mem Clock (GHz)   Pipes   Dual DVI   Mem Bus (bit)   Fab (micron)   Price
Radeon X850 XT PE   540                1.18              16      yes        256             0.13           $549
Radeon X850 XT      520                1.08              16      yes        256             0.13           $499
Radeon X850 Pro*    520                1.08              12      no         256             0.13           $399
Radeon X800 XL      400                1.00              16      no         256             0.11           $299
Radeon X800         400                0.70              12      no         128             0.11           $199

*Radeon X850 Pro clock speeds are not yet final

The overclocking community was also quite interested in the X800 XL because of its 0.11-micron manufacturing process. The card comes clocked at 400/500 (core/memory) by default, but a 0.11-micron process should mean that it can reach higher clock speeds than the slower, more power-hungry 0.13-micron transistors in the X850, right? The problem is that although the X800 XL's 0.11-micron transistors do consume less power and can switch faster than those in ATI's 0.13-micron GPUs, that doesn't mean that GPUs based on them will run faster. If we were talking about an isolated situation with just one transistor, there wouldn't be much of an argument, but with the X800 core, we're dealing with around 160 million transistors, which greatly complicates things.

The problem with running smaller transistors at high frequencies is that there ends up being a great degree of interference, or crosstalk, between adjacent wires connecting those transistors on the chip. The amount of crosstalk goes up as the operating frequency increases (and as the transistor size decreases), so it becomes a problem as you try to increase the clock of a GPU. Crosstalk is caused by the inherent capacitance of the material surrounding the wires, or put more plainly, the tendency of the material next to a wire to hold onto the charge placed on that wire. So, in order to ramp up clock speed, you have to reduce that capacitance, and currently, that's done by insulating the wires with materials that have lower "k-values". The 0.13-micron process used on GPUs like the Radeon X850 XT Platinum Edition employs just such a low-k dielectric. The k-value of the dielectric material used in the X850 XT PE is lower than that used in the 0.11-micron X800 XL, making the X850 more resistant to crosstalk. Keeping in mind that the 0.13-micron process is also more mature, it's not too surprising that when ATI needed to hit a 540MHz clock speed, they chose their 0.13-micron low-k process rather than the 0.11-micron process on which the X800 XL is based.
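The capacitance argument can be made concrete with a quick sketch. This is illustrative only: the k values below are generic textbook figures (plain SiO2 sits around k = 3.9, typical low-k films around k = 3.0), not ATI's actual process parameters.

```python
# Illustrative only: the coupling capacitance between adjacent wires scales
# roughly linearly with the dielectric constant k of the insulator between
# them, and the RC delay of an interconnect scales with that capacitance.
# The k values used here are generic textbook figures, not ATI's numbers.

def relative_wire_delay(k, k_ref=3.9):
    """RC delay of a wire relative to one insulated by plain SiO2 (k ~= 3.9)."""
    return k / k_ref

# A typical low-k film sits around k = 3.0; standard oxide is ~3.9.
print(f"low-k film:     {relative_wire_delay(3.0):.2f}x baseline delay")
print(f"standard oxide: {relative_wire_delay(3.9):.2f}x baseline delay")
```

The roughly 20% reduction in wire delay and coupling is the sort of headroom that lets a low-k process reach clock targets that a standard-dielectric process of the same (or even smaller) feature size cannot.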

This isn't to say that the X800 XL won't be a good overclocker, but you shouldn't expect it to hit clock speeds as high as the X850 series. The future is in smaller transistor feature sizes; there's no doubt about that. For now, though, the more mature, higher clock speed process is the 0.13-micron low-k process that ATI has used on other members of its product line.

Because of its "low" 400MHz core clock and low-power 0.11-micron transistors, the X800 XL gets all the power it needs from the PCI Express slot; no extra power connector is required.
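As a rough sanity check: a PCI Express x16 graphics slot is specified to deliver up to 75 W on its own, so any card that stays under that budget can omit the external connector. The card wattage below is a hypothetical estimate for illustration, not a measured figure.

```python
# Back-of-the-envelope check. The 75 W figure is the slot power budget from
# the PCI Express x16 graphics slot specification; the card's draw below is
# a hypothetical estimate, not a measurement of the X800 XL.
PCIE_X16_SLOT_WATTS = 75

def needs_external_power(card_watts):
    """True if the card's draw exceeds what the slot alone can supply."""
    return card_watts > PCIE_X16_SLOT_WATTS

# Hypothetical load draw for a 0.11-micron, 400MHz-class card.
estimated_draw = 55
print(needs_external_power(estimated_draw))  # False: slot power suffices
```

By contrast, a card drawing more than 75 W under load has no choice but to add a 6-pin connector, which is exactly what the higher-clocked X850 series does.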

Unfortunately, we could not test the overclockability of our X800 XL sample, as none of the available tools would recognize the GPU, much less allow us to adjust its clock speed. As soon as the X800 XL is shipping, we should be able to provide you with accurate overclocking expectations for retail cards, which will be much more useful to you than how well our press review sample happened to overclock. That said, we would have liked to satisfy our curiosity if it had been possible.

So today, we have ATI's first 0.11-micron high end GPU. It won't be able to outrun an X850, even if we could overclock it, but it's priced $100 lower than NVIDIA's 6800GT, which makes it interesting. We've already looked at the performance of the X800 XL in our X850 article, but in this review, we can focus on power consumption as well as the comparison to NVIDIA's 6800GT. So, let's get to it.


44 Comments

I'm satisfied with the X800 XL card for playing America's Army: very smooth graphics, even with the 3D settings cranked up to the highest level at 1024x768 resolution. With my previous ATI All-In-Wonder 9600 XT AGP card, the game was unplayable; I was "stuck in the matrix", so to speak, even when no network lag was present. I replaced it with the ATI All-In-Wonder X800 XL PCI-E, and now I get amazing results, great fun!

I'm disappointed with the TV tuner, though; ATI screwed things up big time. With my old 9600 AGP card, I could repeatedly press the up or down arrow keys on my keyboard to switch channels instantly with no problem. Now with my new X800 XL PCI-E card, I press an arrow key and it takes a second before changing to the next channel, and once it has switched, I lose sound for a second before it comes back to normal. The problem persists whether I use onboard audio or a PCI Sound Blaster card.

With the X800 XL card, I have to reboot my computer every time I want to open the TV tuner; otherwise, I get dangerously loud white noise in my headphones, around 110 dB I think. Considering the hearing damage limit is 85 dB, are ATI trying to make my ears bleed or something?

While watching TV, the image freezes frequently and the TV tuner stops responding, so I have to reboot my computer. I should also point out that I'm not dealing with a cheap no-name motherboard here. I'm using the ASUS A8R-MVP with the ATI CrossFire chip integrated onboard; this motherboard was supposedly designed specifically for ATI video cards, so one would think they'd have figured out how to build stable drivers on their own hardware.

Don't bother with the X800 XL if you plan on watching TV on your computer; this card is pure crap for that. Sure, it works fine for gaming, but then the All-In-Wonder makes no sense: for a comparable price, why not buy a better standalone graphics card just for gaming?

Perhaps the $299 price is correct. It may be a matter of supply and demand. Resellers could be making a huge profit on the cards they do have, and within a few months the prices will deflate to the correct MSRP. That's how it always goes. Or, maybe ATI changed their mind...

When are these flagrant B/S articles going to stop? I am tired of reading reviews on both Anandtech and Tom's Hardware based on misinformation and hearsay. Let's get the research done properly before going to press!!...$299 my butt!!...at a minimum, be more objective in your expectations concerning pre-release misinformation that these graphics card developers always love to pump you guys up with just to get you to hype their products....grrrrr

How did you guys measure the wattage of these cards?
You guys did a review before: http://www.anandtech.com/cpuchipsets/showdoc.aspx?... and the wattage numbers in the two reviews don't look right at all... after all, the AMD processor case could only dissipate just below 200 watts.

That HL2 runs slowly on FX cards doesn't necessarily mean that Valve intentionally wanted it to. I think it has to do with the failures in the FX design. I see no reason why Valve should optimize the game for DX8.1 graphics processors.

Half-Life 2 should not be used for benchmarking. There is growing evidence that Valve crippled NVIDIA cards to make ATI cards run faster[1]. Although this affects GeForce FX cards more, the extra bandwidth incurred by using 32-bit shaders vs. 16-bit could make a difference in frame rates. Regardless of who is at fault, unless the situation is resolved, Half-Life 2 is unsuitable for benchmarking purposes.

Good review. I'm particularly grateful for the inclusion of non-standard games like Bloodlines and Pirates. Most of the games I play do not have Doom or Half-Life or Unreal in their name, and it's nice to be able to gauge the performance of a card in games that NVIDIA and ATI have not bothered to optimize their drivers for.