Not sure why you guys are dicking around with trying to determine voltage by watching the power draw. Whip out a multimeter and dispel all uncertainty.

Probably because nobody feels like destroying their card by trying to probe a BGA mounted GPU.

I see you've never looked at a GPU before. The SMT caps on the back of the GPU supply the voltage you guys are all talking about. One side is ground, the other side is vddc. Not even sure why the GPU being a BGA package comes into play here.

If your card doesn't have a backplate you can also measure from the vddc output capacitors. Next weekend when I consolidate my 5870s I'll try to remember to probe some of them while playing w/ voltage settings.

I got some 5870's today, a HIS reference and a non-reference, and have been playing with the settings listed here. No crashes at 0.959 volts, but even though I have the core at 850 and get almost 400 Mhash/s reported, they appear to produce fewer shares than when I put them at 1.02V and the same speeds, which has been my go-to setting for 5970's. It shouldn't take more than 10-15 min to throw luck out of the equation as far as shares are concerned, right? Does anyone else notice the same thing?

This is going by the shares/minute average in cgminer; maybe that isn't even accurate, who knows?

Both rigs were pulling between 766-770 watts as measured through my Kill-A-Watt. I changed the voltage on the 7 cards that will allow changes to 1.100V, but noticed zero power consumption change on the Kill-A-Watt. I guess I am going to shoot for what I can get at 1.06V tomorrow when I have to change some cards around.

It shouldn't take more than 10-15min to throw luck out of the equation as far as shares are concerned right? Does anyone else notice the same thing?

Depends on how big of a difference. If it is 50% higher, then yes, 10 minutes is more than enough. However, if it is more like 3% higher, it could just be variance after 10 minutes. You really want 3-4 hours to reduce variance enough to make small changes comparable.
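To put rough numbers on that: assuming share arrivals are Poisson (so the standard deviation of a count is its square root), a quick Python sketch estimates how many minutes of mining you'd need before a rate difference stands out from variance. The rates and the 3-sigma threshold here are made-up illustrations, not measurements:

```python
def minutes_to_distinguish(rate_a, rate_b, sigmas=3.0):
    """Minutes of mining until the expected count gap |rate_a - rate_b| * t
    exceeds `sigmas` combined standard deviations, sqrt((rate_a + rate_b) * t),
    under a Poisson model. Rates are in shares/minute."""
    diff = abs(rate_a - rate_b)
    if diff == 0:
        return float("inf")  # identical rates are never distinguishable
    # solve diff * t > sigmas * sqrt((rate_a + rate_b) * t) for t
    return (sigmas ** 2) * (rate_a + rate_b) / diff ** 2

print(minutes_to_distinguish(6, 9))       # 50% gap: shows up within minutes
print(minutes_to_distinguish(6.0, 6.18))  # 3% gap: thousands of minutes
```

A 50% gap stands out in about a quarter of an hour, while a 3% gap takes many hours, which is why 10-15 minutes is nowhere near enough for small tuning changes.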

So if 0.959V isn't a valid voltage, the card will actually be running at the next higher valid step above 0.959V, whatever that is. Which according to johnyj (not verified by me) is 1.06V.

So 0.959V, 0.96V, 1.0V, 1.0123456789V: it doesn't matter, the card is running at the next higher valid step, which may be 1.06V.
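That rounding-up behavior can be sketched as a toy Python function. The step table below is invented for illustration (only the 1.06V value comes from johnyj's report above); real VID tables vary per card and VRM:

```python
# Hypothetical VID step table; only discrete values the VRM supports.
VALID_STEPS = [0.95, 1.06, 1.10, 1.163]

def effective_voltage(requested):
    """Return the voltage the card would actually run at: the first
    valid step at or above the requested value, clamped at the top."""
    for step in VALID_STEPS:
        if step >= requested:
            return step
    return VALID_STEPS[-1]

print(effective_voltage(0.959))         # -> 1.06
print(effective_voltage(1.0123456789))  # -> 1.06
```

Anything you request between two steps silently lands on the higher one, which is exactly why the wattage test below is needed to find the real steps.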

The only way to verify for sure is to pick a static clock (lower is better) and connect the rig to a watt meter (like a Kill-A-Watt). Change the voltage and look for a change in wattage. No change in wattage = no actual change in voltage. Since the wattage reading is going to have some variance anyway, it may require measuring energy (kWh) over a known time and dividing to get average wattage.

So something like

1. Set clocks to a static 700 MHz.
2. Set voltage to stock.
3. Measure energy for ~10 minutes.
4. Divide energy by the exact time to get avg wattage.
5. Lower the voltage and try again.

You will notice something like this

Voltage: 1.05V, 1.04V, 1.03V, 1.02V, 1.01V = same wattage. Then at some point the wattage will drop. THAT IS THE DISCRETE STEP.
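The whole procedure can be sketched in Python. All the readings below are invented numbers just to show the shape of the detection, not measurements from a real card:

```python
def avg_watts(kwh, hours):
    """Convert a metered energy reading (kWh) over a window to average watts."""
    return kwh * 1000.0 / hours

def find_steps(readings, threshold=5.0):
    """readings: list of (voltage_setting, kwh, hours), highest voltage first.
    Returns the voltage settings where average wattage dropped by more than
    `threshold` watts vs the previous setting, i.e. the discrete steps."""
    steps = []
    prev = None
    for volts, kwh, hours in readings:
        watts = avg_watts(kwh, hours)
        if prev is not None and prev - watts > threshold:
            steps.append(volts)
        prev = watts
    return steps

# invented 10-minute (1/6 hour) metering windows:
readings = [
    (1.05, 0.0420, 1/6), (1.04, 0.0419, 1/6), (1.03, 0.0421, 1/6),
    (1.02, 0.0420, 1/6), (1.01, 0.0388, 1/6),  # wattage finally drops here
]
print(find_steps(readings))  # -> [1.01]
```

The threshold exists because meter readings jitter by a few watts; only a drop well above that jitter marks a real step.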

That's exactly what I did. I measured the change with only one 5870 installed; the result is directly visible on the power meter after each voltage change. I do not remember the exact voltage step location, but my card can run at 1.06V as high as 880MHz, so I'm quite satisfied with this setting for now.

Not sure why you guys are dicking around with trying to determine voltage by watching the power draw. Whip out a multimeter and dispel all uncertainty.

Reflashing a card and checking for a change in total power draw takes me 5 minutes tops. If I wanted to check each type of card with a multimeter, I'd have to take apart my rigs completely; it's not very accessible when you run 6 cards tightly packed with extenders in ghetto custom builds. Duct tape, pieces of wire, cable ties... ye, you get the picture.

So the highest voltage setting is definitely to be avoided, but whether or not to use the lowest voltage setting is up to each person's preference. As long as the heat and noise are not a problem, I'm satisfied with the stage 2 voltage settings. Anyway, I do not want my hash rate to drop too much; time is also a cost that should be taken into consideration.

BTW, I saw some screen artifacts when I ran the lowest voltage settings; stability could be an issue.

I am getting artifacts on some of mine too at 0.95, but they still mine, no errors. It has been over a week.

Fixed the artifacts on the only card that was doing it for me by using a BIOS from another card. The card with artifacts was a release-day XFX 5870 that I used in my gaming rig before it was demoted to mining; the other BIOS came from a reference XFX BE that was manufactured at least 4 months after release. Might just have been a fluke, but who knows!

I also remember that the 5xxx series had a lot of problems with GSoD (grey screen of death) when idle at release (I was one of the victims :/), and that was fixed with later drivers. I think what they did to fix it was disable the lowest idle clocks the card had; IIRC they also released updated BIOS versions that supposedly fixed it later on, so it might be some low-voltage bug that is the culprit in all of this. Or it might just have been a fluke like I said :p I like rambling about random things!

I run my 5870's at ~215 mem and use 128 worksize... maybe 1.5 Mhash/s less.

Interesting. Today I set the voltage on my 5850 from 1.044V down to 1.012V; original was 1.088V. It has been stable for ~8 hrs at 850MHz, the same clock speed as before. GPU temps stay the same, but room temp rose to 23°C, +2°C.

Great info in this thread. I can get 310 Mh/s (760/300) on my Sapphire 5850s at 0.95V using BAMT (Phoenix2). I wish I could go to a lower voltage, but there doesn't seem to be a way to do it. Are any of you doing something special to get below 0.95? I can tell my cards to go lower, but they don't do it, and in atitweak 0.95 is listed as the lowest setting.