It's still a lot of extra heat being dumped into the room. And it probably wouldn't be as big of a deal if the gap with Nvidia weren't as large as it's ever been, historically.

Having had a W3520 @ 4.2 GHz and a tri-SLI GTX 470 system back in the day, I know full well about heat dump. However, I doubt I'd be able to perceive the difference if I shut off a GPU, since the real issue is circulation.

I open a door if it gets too warm. Even my laptop can heat up the room after some time.
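On the heat-dump point, essentially every watt a PC draws from the wall ends up as heat in the room, so wattage converts directly to heating load. A minimal sketch (the system figures are illustrative guesses for a setup like the one above, not measurements):

```python
# Rough sketch: all electrical power a PC draws is eventually dissipated
# as heat into the room, so watts convert directly to heating power.
# The wattage figures below are illustrative, not measured values.

def watts_to_btu_per_hour(watts: float) -> float:
    """1 W = 3.412 BTU/h, the standard conversion for heating/cooling loads."""
    return watts * 3.412

# Hypothetical: overclocked CPU + platform (~250 W) plus three GTX 470s (~215 W each)
system_draw_w = 250 + 3 * 215
print(f"{system_draw_w} W ≈ {watts_to_btu_per_hour(system_draw_w):.0f} BTU/h")
```

For scale, a small space heater is around 1,500 W (≈5,100 BTU/h), so a loaded tri-SLI rig genuinely does heat a room.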

5870s were never 220 W. They averaged 180-190 W under load. Meanwhile, GTX 480s would go as high as 350 W, and if you even touched the voltage they'd skyrocket into 400 W territory. Everyone basically agrees that the GPU power-draw records are a tie between an overclocked Vega 64 and the GTX 480/580.

228 W TGP is what AMD listed for the 5870 2GB and the Eyefinity Edition. In practice they didn't draw much more than the 1GB card.

The 5870 and 6970 competed in the same price and performance tier as the 470 and 570, not the 480 and 580.

The 480 didn't draw that much by itself; you're probably thinking of total system draw. (Unthrottled FurMark might be a different story, but that's a bad example.)

Nvidia's listed TDP was on the conservative side for Fermi (especially for the 480, where real draw was closer to 300 W than the rated 250 W), but the difference vs. the AMD competitors wasn't that big.
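For reference, the commonly cited spec-sheet board-power ratings for the cards in this thread, side by side (vendor ratings, not measured draw):

```python
# Commonly cited vendor spec-sheet TDP/board-power ratings (W) for the cards
# discussed above. These are the numbers on the box, not measured load draw;
# as noted, Fermi cards (esp. the 480) tended to exceed theirs.
listed_tdp_w = {
    "HD 5870": 188,
    "HD 6970": 250,
    "GTX 470": 215,
    "GTX 480": 250,
    "GTX 570": 219,
    "GTX 580": 244,
}

for card, tdp in listed_tdp_w.items():
    print(f"{card}: {tdp} W")
```

Going by the spec sheets alone, the 5870 vs. 470 gap is about 27 W, which is why the "Fermi was a furnace" reputation rests more on measured draw and the reference cooler than on the ratings themselves.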

Edit: Don't forget that the 480's draw was exacerbated by its poor reference cooler, which made it run extremely hot (and hotter silicon leaks more power), something Nvidia fixed with the 580.