As the owner of a TI-4600 128DDR, I find the card to run what I would call unnecessarily hot all the time, whether in 3D or in generic 2D windows. I would not call the typical 2D graphics of Windows all that stressful on the card.

So why must it run at full power, so hot that it must be fan-cooled, when all I am doing is sitting here reading static, nonmoving article pages on the SPCR forums? Is redrawing a static image of the desktop, without the mouse even moving, really all that taxing on the graphics processor?

If nothing onscreen changes, the graphics processor really has nothing to do other than refresh the monitor from the already-rendered image in video memory. Oh, and update about 25 pixels once a minute as the Windows clock slowly advances. (I have several old 16-bit ISA 2 MB SVGA cards that can handle this same job and barely get warm to the touch, compared to the heat rolling off the Ti 4600.)
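To put a rough number on how cheap scan-out really is, here is a back-of-envelope sketch. The resolution and refresh rate are assumptions chosen for illustration; the 10.4 GB/s figure is the Ti 4600's rated memory bandwidth.

```python
# Rough numbers behind the point above: refreshing the monitor from the
# framebuffer (scan-out) is a trivial fraction of what the card can do.
# 1024x768 at 85 Hz with a 32-bit desktop is an assumed, typical setup.

def scanout_mb_per_s(width, height, bytes_per_pixel, refresh_hz):
    """Bytes the RAMDAC must read from video memory per second, in MB/s."""
    return width * height * bytes_per_pixel * refresh_hz / 1e6

scanout = scanout_mb_per_s(1024, 768, 4, 85)
ti4600_bw = 10_400  # Ti 4600 rated memory bandwidth, in MB/s (10.4 GB/s)
print(f"{scanout:.0f} MB/s scan-out, "
      f"{scanout / ti4600_bw:.1%} of available memory bandwidth")
```

Even at a high resolution and refresh rate, scan-out uses only a few percent of the card's memory bandwidth, which is why those old passive SVGA cards could do the same job without breaking a sweat.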

I would bet that video card manufacturers do not consider power consumption an issue, and if you have the money to purchase a high-power card you are expected to need a huge cooling system. And besides, designing their high-power chips for efficiency when graphics load is minimal would likely cut into their elbowing for top place in the all-important FPS score charts.

Who cares if you need a leaf-blower fan to keep the thing from melting to slag when editing a Word document? The GPU is just sitting there, spinning along at 9,000 RPM with the clutch pushed in because there's nothing for it to do but generate useless heat.

They are not under the same pressure as Intel to design chips that can squeeze the maximum life out of a laptop battery, for example. Instead they simply design a weak chip for laptops and save the energy-sucking good stuff for the desktop, rather than designing a powerful but "SpeedStep-capable" video chip that uses only as much power as is needed at a given moment, whether drawing complex 3D or simple, boring 2D.

I am aware of the CoolBits hack that lets you adjust the core/memory speed. And yes, my system still does fine in Windows with the sliders set to minimum. The question is: why isn't this throttling an automatic part of the system when graphics load is low?

Methinks we need an SPCR.com initiative to get some concrete information on video card power consumption and dissipation. Much like the page we have for PC chips, I'd love to see one for video cards:

As long as I can get the video cards, I have the means to test and calculate their power draw.

I have a dead 9800pro 256MB that will still power up, so I could get an idle power draw figure, but I don't think anyone would be interested, and the credibility of such a measurement is questionable when the card doesn't even POST.

I also have a softmodded 9500 128MB non-pro running at 9700 stock speeds that I could test. This one works. ;)

I have updated my 3rd post in this thread. Now you can compare the complete range of ATI and Nvidia cards on the same system. Remember that the figures are complete-system AC draw.
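Since the figures are whole-system AC draw, comparing two cards on the same rig really gives you the *difference* between them, scaled by PSU efficiency. A minimal sketch, assuming a flat 80% PSU efficiency (an assumption, not a measured value for this test system):

```python
# Hedged sketch: the difference between two whole-system AC readings on the
# same rig approximates the DC draw difference between the two cards,
# after accounting for PSU losses. The 80% efficiency is an assumption.

def card_delta_watts(ac_watts_card_a, ac_watts_card_b, psu_efficiency=0.80):
    """Estimate the extra DC power card A draws over card B, given
    whole-system AC wall readings taken with each card installed."""
    ac_delta = ac_watts_card_a - ac_watts_card_b
    return ac_delta * psu_efficiency

# Example with figures from the list below: 106 W at the wall with a
# 9700pro vs 83 W with a Radeon 7500 on the same system.
print(round(card_delta_watts(106, 83), 1))
```

In reality PSU efficiency varies with load, so this is only a first-order estimate, but it is good enough to rank cards measured on the same system.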

Analysis at the Windows desktop:
Nvidia is very good since their cards clock back; they all sit in a pretty tight range.
The price/heat/performance king is the FX 5900(XT) at 90W.
If you want less power draw, the GF4 MX 440 offers 86W.
The old GF4 series is quite bad at 96 and 104W.

ATI should include mobile power-saving features in their theoretically superior GPUs.
The price/heat/performance king is the 9600pro at 86W.
The lowest is the 7500 at 83W.
More performance costs disproportionately more heat; a 9700pro at 106W is a good choice here.

As a reference: a Radeon 7500 with its very tiny stock heatsink and the fan unplugged gets only a little hot at the Windows desktop, whereas a 9700 with a Zalman HP80 is almost too hot to touch.

Btw, if you go for a 5900 Ultra instead of a 9800 Pro, you save 30W of heat while not gaming! Here in Germany that works out to roughly 45 Euro/year if your PC runs 24/7.
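The ~45 Euro/year figure checks out with a back-of-envelope calculation, assuming a German electricity price of about 0.17 Euro/kWh (an assumption; the post does not state the tariff used):

```python
# Hedged sanity check of the ~45 Euro/year claim for a constant 30 W
# saving. The 0.17 Euro/kWh tariff is an assumed period-typical price.

def annual_cost_eur(extra_watts, eur_per_kwh=0.17, hours_per_year=24 * 365):
    """Yearly electricity cost of a constant extra load, in Euro."""
    kwh_per_year = extra_watts / 1000 * hours_per_year
    return kwh_per_year * eur_per_kwh

print(round(annual_cost_eur(30), 2))  # roughly 45 Euro/year
```

30 W around the clock is about 263 kWh/year, so at that tariff the claim is in the right ballpark.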

Yes, ATI should definitely add throttling, but it's not *that* big of a deal (Nvidia only added it because, well, they *had* to, given the combustion engines they're putting out masquerading as GPUs). Use PowerStrip and set up two profiles, one minimum and one maximum. You can switch between them in 2-3 clicks depending on whether you're on the desktop or gaming.

Old post, but in case anyone else happens to wonder: I'm *quite sure* (but wouldn't stake my life on it) that anything newer than the 4200-4600 will automatically underclock in 2D. Not sure about the 4800.
I also remember ATI cards never doing this (not that they particularly needed to), but the X800s must do something, considering how low their idle power consumption is.
