ATI's new family of graphics processors is finally born. It took the company a couple of years to develop its Shader Model 3.0 graphics architecture and several months of re-spinning the chips to deliver products running at extremely high clock speeds. Let's find out what the RADEON X1800 XT, X1800 XL, X1600 XT and X1300 PRO are capable of!

Power Consumption: 100W Is Not the Limit

We couldn’t disregard such an important topic as the power consumption of the new graphics card family, so we performed the corresponding tests for all four ATI newcomers.

We used a special testbed based on a modified mainboard that allowed us to connect measuring devices to the 12V and 3.3V power lines leading to the PCI Express x16 slot. To measure how much power the graphics accelerator consumes through the external connector, we used an adapter equipped with a special shunt and connectors. Our testbed was configured as follows:

Intel Pentium 4 560 CPU (3.60GHz, 1MB L2);

D925XCV Intel Desktop Board mainboard;

PC-4300 DDR2 SDRAM (2x512MB);

Samsung SpinPoint SP1213C HDD (Serial ATA-150, 8MB buffer);

Microsoft Windows XP Pro SP2, DirectX 9.0c.
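As a rough illustration of how the shunt measurements described above turn into the power figures quoted below, here is a minimal sketch. The shunt resistance and voltage-drop readings are illustrative assumptions, not the actual values from our hardware:

```python
# Sketch of shunt-based power measurement for a graphics card: one shunt
# per supply rail (slot 12V, slot 3.3V, external connector 12V).
# All numeric values below are illustrative assumptions.

SHUNT_OHMS = 0.01  # assumed low-resistance series shunt on each rail

def rail_power(rail_volts: float, shunt_drop_volts: float,
               shunt_ohms: float = SHUNT_OHMS) -> float:
    """Power on one rail: I = V_shunt / R_shunt, then P = V_rail * I."""
    current_amps = shunt_drop_volts / shunt_ohms
    return rail_volts * current_amps

# Hypothetical voltage drops measured across each shunt under 3D load
readings = {
    "slot_12V":  (12.0, 0.035),   # (rail voltage, shunt drop in volts)
    "slot_3.3V": (3.3,  0.015),
    "ext_12V":   (12.0, 0.045),
}

total_watts = sum(rail_power(v, drop) for v, drop in readings.values())
print(f"Total board power: {total_watts:.1f} W")
```

Summing the per-rail products gives the total board consumption; the external 12V connector typically carries the largest share on high-end cards.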

To create realistic testing conditions, we used the Game 3 benchmark from the 3DMark05 suite, run at 1600x1200 with 4x FSAA and 16x AF enabled. This test loads the pixel processors very heavily, so the graphics accelerator works under conditions close to those of many contemporary games, and the results of our measurements are correspondingly close to real life. Here is what we obtained during this test session:

The RADEON X1800 XT consumes almost twice as much power as the RADEON X1800 XL. Moreover, this graphics adapter beat all records for single graphics cards and even exceeded the power consumption of the 24-pipeline GeForce 7800 GTX! Of course, the higher clock frequency and the complexity of the new chip played an important role here. But I can also suppose that the main contributor to the sky-high power consumption of the new RADEON X1800 XT is the 512MB of memory working at 750 (1500) MHz at 2.0V.

It could also be the case that the memory voltage was raised even beyond that value to ensure maximum stability, but unfortunately we cannot verify this supposition, because there are no utilities that can monitor the voltage levels of the RADEON X1000 family. Two RADEON X1800 XT graphics cards working in CrossFire mode can consume up to 225W, so a 400-450W PSU will hardly be enough in this case. You will most likely need a power supply unit that can deliver 550-600W in continuous mode, because a top-end gaming system with two RADEON X1800 XT graphics cards will also contain a bunch of other power-hungry components, such as a powerful CPU and a couple of hard drives.
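The PSU sizing argument above can be turned into a back-of-the-envelope budget. The 225W CrossFire figure comes from our measurements; every other component estimate below is an assumption for illustration only:

```python
# Rough PSU budget for a CrossFire gaming system, per the reasoning above.
# Only the CrossFire pair figure (225W) is measured; the rest are
# illustrative assumptions.
components_watts = {
    "2x RADEON X1800 XT (CrossFire)": 225,
    "high-end CPU":                   115,  # assumed
    "mainboard + memory":              50,  # assumed
    "two hard drives":                 25,  # assumed
    "fans, optical drive, misc":       25,  # assumed
}

peak_draw = sum(components_watts.values())
# Keep continuous load at no more than ~75% of the PSU rating for headroom.
recommended_psu = peak_draw / 0.75
print(f"Estimated peak draw: {peak_draw} W")
print(f"Recommended PSU rating: ~{recommended_psu:.0f} W")
```

Under these assumptions the estimate lands in the 550-600W range, which matches the recommendation in the text; a different CPU or drive count would shift it accordingly.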

The RADEON X1800 XL demonstrated a considerably lower power consumption of only 59W, which is about the level of the GeForce 7800 GT. The latter uses a 0.11-micron GPU with 20 pixel processors onboard, but the higher working frequency and greater chip complexity of ATI's solution easily make up for that. Unlike on the RADEON X1800 XT, the graphics memory of the RADEON X1800 XL works at a 1.5 times lower frequency and at only 1.8V. Besides, there are only 256MB of memory against 512MB on the top-end model. This is another argument that one of the major power consumers on the RADEON X1800 XT is the graphics memory. We could theoretically check this by overclocking the RADEON X1800 XL GPU to the XT's level; however, we failed to do so this time, because neither RivaTuner nor PowerStrip can work with the RADEON X1000 family at this point. RivaTuner 15.7 simply doesn't recognize the graphics cards at all, while PowerStrip 3.61 displays the memory frequency incorrectly and doesn't allow adjusting the GPU frequency at all.

More predictable results were obtained for the mainstream representatives of the RADEON X1000 family: