Huh? 1 Sapphire was $542, the GD70 was $160, the PSU was $330. It cost me about $2 to mine 1 BTC, at 1.4 BTC/day. It will take forever (if ever) to get my money back at current BTC rates. Maybe in 500 days I'll break even.

That's what I was wondering. $2 to mine 1 BTC is at current difficulty with your electricity rates. I was more curious about the hashes as that is less specific to time and place.

All I am trying to measure is the additional power consumption of the cards. This is a machine that is always on, and has been for the last 3 years. Thus, the idle power consumption is the baseline, as it would be consuming that regardless. Determining the incremental energy consumption is all I am concerned about, and I now have a good approximation of that number.

Now if I were building a dedicated mining rig from the ground up, of course the overall system wattage would be the important number.

And this is precisely my point! This makes your numbers meaningful only to you, and meaningless to all of us on the forum (because most people here favor more efficient systems -- and a dedicated 2 x 7970 miner idles at 90W or so). You should have disclosed your unusually high baseline idle load in your first post.

Meaningless? Are you serious? Being able to compare, apples to apples, the power consumption of an overclocked/undervolted 7970 to a 5970, or 5870, or any number of other cards is meaningless? What about those with existing farms of dedicated miners? Might they find some meaning in the stand-alone efficiency of cards? Might they find some meaningful cost savings if they can upgrade to more efficient GPUs?

3.74 Mh/J is meaningless to us, because it is inflated by your inefficient baseline idle power of 150W (perhaps you have a high-power CPU, or multiple hardware components not necessary for mining, or your PSU's efficiency sharply drops at low loads, or you are running a graphics-intensive Windows Aero desktop, etc.).

2.62 Mh/J is meaningful.

I disagree for exactly the same reasons.

2.62 Mh/J is meaningless, as it's not comparing like-for-like: you're adding in the unknown variable of the rest of the system, which (as you noted) will be vastly different from system to system.

3.74 Mh/J is meaningful, as other people who wish to compare their values can use this number by factoring out their own baseline system power.
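For what it's worth, that "factoring out" step is simple arithmetic. Here is a minimal sketch using the 1340 Mh/s and 3.74 Mh/J figures from this thread; the 90 W baseline is just an assumed example, not anyone's measurement:

```python
# Convert a card-only efficiency figure into a whole-system one by
# adding back your own baseline idle power.

HASHRATE_MH = 1340      # Mh/s, the rig discussed in this thread
CARD_ONLY_MHJ = 3.74    # Mh/J from the (load - idle) wall-power delta

card_power_w = HASHRATE_MH / CARD_ONLY_MHJ   # ~358 W drawn by the cards
my_baseline_w = 90                           # assumed: a lean dedicated rig

system_mhj = HASHRATE_MH / (card_power_w + my_baseline_w)
print(round(system_mhj, 2))  # 2.99 Mh/J for that hypothetical system
```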

I think both are useful.

One is nice for comparing cards, the other is nice for comparing systems.

I wanted to compare my 1.8GH rig to this 7970 rig, so overall power consumption was what I wanted.

Now if I wanted to look into swapping my 5970s for 7970s, then the per card consumption is more useful.

2.62 Mh/J is meaningless, as it's not comparing like-for-like: you're adding in the unknown variable of the rest of the system, which (as you noted) will be vastly different from system to system.

3.74 Mh/J is meaningful, as other people who wish to compare their values can use this number by factoring out their own baseline system power.

Your argument is invalid, because 3.74 Mh/J is also influenced by unknown variables, such as the efficiency of the power supply, which varies with load: http://www.anandtech.com/show/2624/3

Here is a thought experiment: yochdog's load/idle power draw is 512/154 Watt. He replaces his power supply with one that is just as efficient at high loads but more efficient at low loads, changing his measurements to 512/130 Watt. Suddenly his mining efficiency went down from 1340/(512-154) = 3.74 Mh/J to 1340/(512-130) = 3.51 Mh/J! Explain to me how a formula in which efficiency gets worse when you use better hardware components is useful.
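The arithmetic in the thought experiment checks out; a minimal sketch, using only the numbers quoted above:

```python
# Efficiency computed from the (load - idle) wall-power delta "punishes"
# a PSU that is more efficient at idle, even though nothing about the
# card or its hashrate changed.

HASHRATE_MH = 1340  # Mh/s

def delta_efficiency(load_w, idle_w):
    """Mh per joule, based on the wall-power difference."""
    return HASHRATE_MH / (load_w - idle_w)

old_psu = delta_efficiency(512, 154)  # original measurements
new_psu = delta_efficiency(512, 130)  # better idle efficiency, same load

print(f"{old_psu:.2f} -> {new_psu:.2f}")  # 3.74 -> 3.51, "worse" with better hardware
```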

Of course, if everybody had clamp meters, the ultimate way to measure the efficiency of a card would be to measure current at the PCIe power connectors and PCIe slot, like I demonstrated a while ago: http://blog.zorinaq.com/?e=42

Holy bleeping shit.

I try to put some useful (evidently not!) information out on the forum, and MRB comes along to piss all over it, because anything but his preferred stats is meaningless.

Apologies to the forum. I wasted everyone's time with a useless post.

From now on I will only post the total power consumption of my systems, and you are all on your own deciphering what is being used where.

Here is a fictitious system in idle/load conditions demonstrating my point that cards can draw more than what the kill-a-watt difference shows:

Idle: 130W at the wall; the power supply outputs 95W (73% efficient) = 15W to the video card + 80W to the rest of the system
Load: 200W at the wall; the power supply outputs 176W (88% efficient) = 96W to the video card + 80W to the rest of the system

yochdog would conclude from the kill-a-watt readings that the card under load is drawing at most an extra 70W (200-130), but in fact, it is drawing an extra 81W (96-15). This is why subtracting the idle from the load wattage at the wall is not as accurate as you all seem to think.

At the very least, if a kill-a-watt is all you have, I suggest you:
- publish your power supply specs, to look up its energy efficiency curve at different loads
- measure the "idle" condition with the card physically removed from the system
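A short sketch of the fictitious example above; the 73%/88% efficiencies and the constant 80 W rest-of-system draw are the assumptions stated in the example:

```python
# The wall-power delta understates the card's true extra draw when the
# PSU is more efficient under load than at idle.

REST_OF_SYSTEM_W = 80  # assumed constant DC draw of everything but the card

def card_dc_watts(wall_w, psu_efficiency):
    """DC watts left for the card after PSU losses and the rest of the system."""
    return wall_w * psu_efficiency - REST_OF_SYSTEM_W

idle_card = card_dc_watts(130, 0.73)   # ~15 W
load_card = card_dc_watts(200, 0.88)   # ~96 W

kill_a_watt_delta = 200 - 130              # 70 W implied by the wall meter
true_card_delta = load_card - idle_card    # ~81 W actually added by the card

print(kill_a_watt_delta, round(true_card_delta))  # 70 81
```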

My first 7970 had a number of issues while booting, so I had it RMA'd. The replacement card that just arrived reaches 750 MH/s (1225 MHz core, 1375 MHz RAM, 1.07 V, 168 A), and the sensor readings (HWiNFO64) indicate I am pulling 193 watts (core + RAM). My first card would already have been over 220 watts at these settings, so either the software or the hardware improved. I used the same drivers from the end of Feb, so probably the hardware changed?

How accurate are the GPU/CPU sensor readings from GPU-Z or HWiNFO64? Would carefully using a multimeter to measure the PCIe and all other power connectors in the system be more accurate than the onboard sensors or a Kill A Watt?

While I get your desire to have 100% accurate info, I don't fully agree that it is more useful.

1- virtually everyone is going to be running mining rigs on computer power supplies, which are never 100% efficient

2- knowing the actual real-world power usage from the wall is very useful for calculating the cost of mining; knowing the exact power usage from the power supply is an interesting mental exercise, but it doesn't actually benefit anyone who is merely trying to do a cost/benefit analysis.

3- most miners use 80+ certified power supplies, if not 80+ Silver or greater. The difference between the measured and the exact power usage is going to be less than you indicate. For example, 80+ Silver guarantees between 85% and 88% efficiency; a 3% difference, even at a 1000W scale, is only 30W, and not really a huge inaccuracy.
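Point 3 can be sketched in a couple of lines; the 85-88% band is the 80+ Silver range cited above:

```python
# If PSU efficiency is only known to lie within the 80+ Silver band,
# the DC power inferred from a wall reading is uncertain by a bounded amount.

WALL_W = 1000
EFF_LOW, EFF_HIGH = 0.85, 0.88  # assumed 80+ Silver efficiency band

dc_min = WALL_W * EFF_LOW    # 850 W delivered in the worst case
dc_max = WALL_W * EFF_HIGH   # 880 W delivered in the best case

print(dc_max - dc_min)  # 30.0 W of uncertainty on a 1000 W wall draw
```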