Nearly four months after its previous version, the chaps at oZone3D have released FurMark 1.7.0. This release packs a host of nifty new features and a number of bug fixes. For starters, FurMark can work along with GPU-Z to provide real-time readings of the graphics card's temperatures, voltages, and VDDC current (for cards that support it). An experimental feature lets you tweet your score to your Twitter account. While the stability test or benchmark is running, the main GUI stays minimized, so you don't have to start another instance to run several tests.

With multiple GPUs doing the rendering, each GPU gets its own temperature graph. You can start or stop the rendering by hitting the space key without having to close the window. A number of new resolutions have been added, and the application is now also available in Castilian, Bulgarian, Polish, Slovak, and Spanish, thanks to new translations. Issues with temperature updates in the graph and with the application's multithreading management have been resolved. Give your graphics cards a sunbath.

I don't think the power draw is right. 66 watts at full load for a GTX 260?


For a 55 nm card that wouldn't be a problem. Remember that this figure accounts for nothing but the GPU itself. There are also 14 memory chips on board, each munching away something like 2 W.
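The arithmetic behind that estimate can be sketched quickly. The 66 W GPU reading and the ~2 W per-chip figure are the ballpark numbers from this thread, not measured values:

```python
# Rough board-power estimate for a GTX 260 under FurMark load.
# The 66 W GPU reading and ~2 W per memory chip are the ballpark
# figures from the discussion above, not measured values.
gpu_power_w = 66.0          # GPU core only, as reported by the sensor
mem_chips = 14              # GDDR3 chips on a GTX 260 board
mem_power_per_chip_w = 2.0  # rough per-chip draw

board_power_w = gpu_power_w + mem_chips * mem_power_per_chip_w
print(f"Estimated power before VRM losses: {board_power_w:.0f} W")
# → Estimated power before VRM losses: 94 W
```

So even with the GPU-only reading looking low, the whole board is pulling noticeably more.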

Here's a shot of an HD 4890 getting busy:
The reported wattage figure for this card is even less relevant, as these cards have secondary core power circuitry whose output is not included in the figure. And of course there's the memory on top of that.

Then why do some graphics cards need one 6-pin and one 8-pin connector? The PCIe slot supplies 75 W, a 6-pin connector 75 W, and an 8-pin connector 150 W, for a maximum of 300 W per card. But none of these cards actually draws that much.
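The budget being described adds up like this (these are the sanctioned delivery limits, not what a card actually draws):

```python
# PCIe power-delivery budget for a card with one 6-pin and one 8-pin plug.
# These are the per-source limits quoted in the post above.
SLOT_W = 75       # PCIe slot
SIX_PIN_W = 75    # 6-pin auxiliary connector
EIGHT_PIN_W = 150 # 8-pin auxiliary connector

max_board_power_w = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(f"Maximum sanctioned board power: {max_board_power_w} W")
# → Maximum sanctioned board power: 300 W
```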


Then why do some graphics cards need one 6-pin and one 8-pin connector? The PCIe slot supplies 75 W, a 6-pin connector 75 W, and an 8-pin connector 150 W, for a maximum of 300 W per card. But none of these cards actually draws that much.


Because if the card has an onboard PCIe power plug, slot power cannot be used to feed the same load the 6-pin plug feeds. Otherwise the current load would be shared between the slot and the PCIe plug, and that's something you don't want to happen, for a number of reasons.

Also keep in mind that quite a bit of the converted wattage going through the VRMs is wasted as heat.


Volterra chips are around 90-95% efficient. They seem to be more efficient than more conventional VRMs, which is evident from the increased power consumption of the GTX 295 when it went from two PCBs to a single PCB that no longer uses Volterra VRMs.
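To put those efficiency figures in perspective, here's a quick sketch of what a 90% vs 95% efficient VRM stage means for input power and waste heat. The 66 W load and the efficiency range are the figures quoted in this thread; treat them as illustrative:

```python
# How VRM efficiency turns GPU load into input-side power and waste heat.
# The 66 W load and the 90-95% range are the figures from this thread.
def vrm_input_and_loss(output_w, efficiency):
    """Return (input power, heat dissipated) for a given VRM efficiency."""
    input_w = output_w / efficiency
    return input_w, input_w - output_w

for eff in (0.90, 0.95):
    inp, loss = vrm_input_and_loss(66.0, eff)
    print(f"{eff:.0%} efficient: {inp:.1f} W in, {loss:.1f} W lost as heat")
# → 90% efficient: 73.3 W in, 7.3 W lost as heat
# → 95% efficient: 69.5 W in, 3.5 W lost as heat
```

A few percentage points of efficiency translate to several watts of heat the VRMs themselves have to shed, which is why it matters at all under a FurMark-class load.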

Because if the card has an onboard PCIe power plug, slot power cannot be used to feed the same load the 6-pin plug feeds. Otherwise the current load would be shared between the slot and the PCIe plug, and that's something you don't want to happen, for a number of reasons.


That doesn't make sense. Why would one use a 75 W power connector when one could simply use the 75 W from the slot, given that slot power becomes unavailable when an external power connector is present? And PCIe 2.0 slots supply 150 W... why would anyone put a single 75 W external connector (à la 8800 GTS G92) on a card that already gets 150 W from the slot, if using an external power source makes slot power unavailable? I must have misunderstood somehow...