Interesting article... until the conclusion. It's got to be one of the strangest bits of writing I've read in a long time: they post numbers but make no real analysis of the data, despite what the title claims. Even an amateur analysis would be far better than this... I don't know WHAT this is:

Quote:

So, there is nothing frightening about the numbers. Of course, 500 watts is quite a lot. It is about one quarter of an electric iron, but PSUs that can deliver it are widely available for reasonable money, especially if you compare it with the cost of the other components of such a power-hungry configuration. If you want to have a 50% reserve of wattage, a 750W power supply will be sufficient for a system with a Core i7-920 and a GeForce GTX 295.
The other configurations are much more economical. If the graphics card is replaced with a single-chip one, a 500-550W power supply can be used (and it will have a reserve of wattage, too). And an inexpensive 400W PSU will do for midrange gaming PCs.
Note also that this is the power consumption under very heavy tests. No real game can load the computer as heavily as FurMark. It means that a 750W PSU will offer an even larger reserve of power for the most advanced of the tested configurations.
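To put the quoted sizing math in concrete terms: take the measured worst-case draw (~500W for the i7-920 + GTX 295 under Prime95 + FurMark, per the article), add the suggested 50% reserve, and round up to a retail wattage. A minimal Python sketch (the helper name and the list of retail sizes are my own assumptions, not from the article):

Code:

def recommended_psu(measured_draw_w, headroom=0.5):
    """Measured draw plus a safety margin, rounded up to a common retail size."""
    target = measured_draw_w * (1 + headroom)
    sizes = [400, 450, 500, 550, 600, 650, 700, 750, 850, 1000]
    return next((s for s in sizes if s >= target), target)

# i7-920 + GTX 295 under Prime95 + FurMark: ~500W measured
print(recommended_psu(500))  # -> 750, matching the article's recommendation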

I wonder how many 750W PSUs I could find that would fail running an i7-920 and a GTX 295, simply because they were wired up wrong and cross-loaded.

It's definitely an interesting article, device, and software. However, I don't think the assumptions hold that people are purchasing far larger (and more expensive) power supplies than they require, or that manufacturers "declare overstated specifications". Most people I know get the general power requirements from the product manufacturers. Minimum requirements from EVGA:

GTX 285 - 550W
GTX 295 - 680W

The power requirements listed by the card producers fall right in line with the measurements that Mr. Artamonov ultimately produces, allowing for a few extra drives, some overclocking, and my USB-powered coffee plate. These manufacturer-listed PSU requirements seem to be about right.
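A quick sanity check on those vendor numbers against the article's measured worst case for the GTX 295 rig (a rough sketch; only the 680W and ~500W figures come from the posts above, and PSU efficiency is ignored here):

Code:

# Headroom implied by EVGA's listed minimum over the measured worst-case draw.
evga_min_w = 680    # EVGA's minimum PSU spec for the GTX 295
measured_w = 500    # article's Prime95 + FurMark figure for the i7-920/GTX 295 rig

headroom = (evga_min_w - measured_w) / measured_w
print(f"{headroom:.0%} headroom")  # -> 36% headroom

That's in the same ballpark as the article's own 50% reserve suggestion, so the vendor specs hardly look overstated.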

He seems to take a slight swipe at other reviewers (I could be misinterpreting his intent here), stating that "the measurement of real-life consumption, even though mastered by most computer-related media, is often deficient." From what I've read, most reviewers try to explain their power measurements in detail, along with efficiency caveats and the like, and note that they should be considered in a relative context.

Oleg seems to forget that games load the CPU AND the GPU. Run FurMark at the same settings as the games and check your CPU usage. I've done it, and guess what? 5-12% CPU usage.
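If anyone wants to reproduce that check, here's a minimal sketch (assumes Python with the psutil package installed; start FurMark first, then run this alongside it):

Code:

# Sample overall CPU utilization once a second while the GPU stress test runs.
import psutil

samples = []
for _ in range(30):  # ~30 seconds of sampling
    samples.append(psutil.cpu_percent(interval=1))  # blocks for 1s, returns %

print(f"min {min(samples):.0f}%  max {max(samples):.0f}%  "
      f"avg {sum(samples)/len(samples):.0f}%")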

If a FurMark run causes my system to pull 600W at the wall, Crysis or Far Cry 2 in DX10 at 1920 resolution will pull 750-775W at certain points.

It is blanket statements like that which really tick me off.
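For what it's worth, here's the crude arithmetic behind my 600W-to-750W point (every input except my 600W measurement and the 5-12% CPU figure is an assumption for illustration, and it ignores that FurMark typically loads the GPU harder than games do, which cuts the other way):

Code:

# Rough estimate of gaming wall draw from a FurMark-only measurement.
furmark_wall_w = 600   # my measured FurMark wall draw
cpu_tdp_w      = 130   # Core i7-920 rated TDP
furmark_cpu    = 0.10  # ~5-12% CPU load during FurMark (my test above)
game_cpu       = 1.00  # assume a game can peg the CPU
psu_efficiency = 0.85  # assumed PSU efficiency at this load

extra_cpu_dc_w = cpu_tdp_w * (game_cpu - furmark_cpu)
extra_wall_w   = extra_cpu_dc_w / psu_efficiency
print(f"estimated gaming wall draw: {furmark_wall_w + extra_wall_w:.0f} W")
# -> ~738 W, in the same ballpark as the 750-775W I see in games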

Yeah, I have to wonder about that point myself. The 500W figure they were getting was a Prime95 + FurMark number, so the tester is obviously aware of that fact on some level. Now, whether he just misspoke, or whether he actually believed that FurMark's consumption would equal combined CPU+GPU draw in a gaming situation... meh, dunno.

Aside from the obvious, I think this article is inadvertently doing a great job of illustrating what overclocking does to your power consumption.