I'm guessing he's talking about the fact that there is a measurable change in power consumption between hot and cool VRMs. SPCR did an article about this, I think -- can't seem to find it right now. The question seems to be whether that extends to the CPU temp as well. My thought is probably not as much as for the VRMs, and only if you have a really hot CPU under very heavy load. Even then it's probably not much. But that's just my guess.

Yes. The hotter it is, the higher the resistance and the more power it consumes. I think the difference is very small, though -- maybe a couple of watts over the usual temperature range of a CPU or GPU. I'd swear I remember an Xbitlabs (I think) article where they actually measured this on a GPU, but of course I can't find it now.

Power consumption is affected not only by the temp of the CPU but also by peripheral components such as the VRM and the northbridge chip. Just how much it varies depends on the particular parts. With the hottest CPUs running at full artificial load (Prime95, etc.), the difference between the heatsink fan running at 12V vs 5V (assuming a Nexus 120 or similar) can be >10W at the AC outlet. However, if we're comparing two HSFs at full speed, one a bit better than the other, then the difference would be much smaller -- as others have said, no more than a few watts.

Given that the temps on the VRM and NB stay the same, one can wonder how variations in CPU fan speed affect the combined power consumption of the CPU and the fan...
Any savings gained by a cooler CPU might be cancelled out by more power fed to the fan, and vice versa.


Cheers,
Olle

I don't think such a scenario is realistic. When the CPU temp goes up, the VRM temp almost invariably goes up too -- my observation after nearly a decade of CPU HSF testing. Also, the power draw of a 1500rpm fan is usually under 2W at full tilt, and running it at half speed probably saves no more than a watt. I think that's less than the difference in the CPU's power draw running hotter vs. cooler -- though that depends on how big the CPU temp difference is.
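The tradeoff above is easy to put into rough numbers. A minimal back-of-envelope sketch -- the fan figures come from the post, but the leakage coefficient (~1-2% more static power per degree C) is my assumption, not a measurement:

```python
# Fan numbers per the post above; leakage behaviour is an assumption.
FAN_FULL_W = 2.0   # ~1500 rpm fan at full tilt
FAN_HALF_W = 1.0   # at roughly half speed (the post: saves no more than a watt)

def extra_leakage(temp_rise_c, leakage_w=10.0, pct_per_c=0.015):
    """Assume static (leakage) power grows ~1.5% per deg C of CPU temp rise.
    leakage_w is a guessed baseline leakage share of total CPU power."""
    return leakage_w * pct_per_c * temp_rise_c

fan_saving = FAN_FULL_W - FAN_HALF_W
print(f"fan saving: {fan_saving:.1f} W")            # 1.0 W
print(f"extra leakage at +10C: {extra_leakage(10):.1f} W")  # 1.5 W
```

With these (assumed) numbers, slowing the fan saves about a watt while the hotter CPU leaks about a watt and a half more -- consistent with the point that the fan savings are probably smaller than the CPU-side difference, though the balance shifts with how big the temp swing actually is.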

It's not obvious to me that temperature would change CPU efficiency, as long as the CPU is running at a constant voltage. If it were just a matter of increased resistance, the power loss should go down as the temperature goes up (until the CPU stops working because appropriate gate voltage levels aren't achieved).

On the other hand, if using a better cooler allows you to run the CPU at a lower voltage, then clearly, you'll get some savings.
