My GPU idles in the upper 40s/low 50s (°C) right now, and loads at 60°C in games and 70°C in FurMark (give or take a few), with an average room temp of 85-100°F, the lower end at night when the sun goes down and I have a window fan pulling in cooler air.
Jumping from 900MHz to 1000MHz only increased FurMark temps by 1°C, sometimes 2°C.
Jumping to 1GHz didn't give me any performance boost in FurMark, but I think it was V-synced at 60Hz; the 1GHz score was actually lower than the 900MHz one. (shrug)
In BF3 it helped a little, though nothing noticeable with the FPS counter off.
The 95-105°F figures are the outside temperatures for the area over the last 2 weeks (not PC/GPU temps), and since this old house has no AC, the indoor/room temps are usually only 1-2 degrees off.
But for 8 months of the year the card idles in the low-to-mid 20s °C (well, with Eyefinity it idles in the upper 20s/low 30s),
and loads in the upper 40s.
(Average room temp in the 60°F range, as low as 40°F in the middle of winter.)
So when it cools off outside, my room temps will go down enough for me to play with overclocking more.
Down the road I'll probably mount a closed-loop water cooler on it.

Yeah, I understand that. I know how the binning process works and all; I believe I just got a lower-quality chip with quite small tolerances. I have tried (in Overdrive) every voltage % from 0 to 20 in 5% steps.
The main issue here is that I can't request an RMA from my supplier just because the card doesn't overclock as well as I hoped. Gigabyte's customer service is shocking, as much as I like to defend the brands of products I own. I attempted an RMA with them for the coil whine the card produces, which in some circumstances, such as Crysis 1, sounds like a dying mouse. It seems their "customer support" is an outsourced call center instructed to avoid all responsibility and lay it on the supplier. Of course, the problem with any coil whine issue is that it can be hard to replicate; if my supplier cannot replicate it, their terms state the item will be returned to me at my cost, including labor. I have just submitted another RMA request, but I doubt it will be successful.

If you are only using CCC to overclock, then you are not increasing voltage at all. All PowerTune does is raise the TDP limit before throttling kicks in. You need to use Afterburner or Trixx to increase voltage. Also, coil whine isn't really a problem at all; it's just the nature of the product and will decrease with time. 1040MHz is a good OC on stock voltage, honestly. I'm sure you will get more out of it once you increase the voltage.
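To illustrate the point about PowerTune being a power ceiling rather than a voltage bump, here is a toy model of TDP-limit throttling. This is purely illustrative; the function name, numbers, and proportional-throttle rule are assumptions for the sketch, not AMD's actual algorithm.

```python
# Toy model of TDP-based throttling (illustrative only, not AMD's real logic):
# raising the PowerTune limit widens the power budget, so the GPU can hold
# its requested clock under heavier loads before the throttle engages.

def effective_clock_mhz(requested_mhz, power_draw_w, base_tdp_w, powertune_pct):
    """Return the clock this toy 'GPU' would actually sustain.

    powertune_pct models the PowerTune slider (-20..+20): it scales the
    TDP ceiling; it does not change voltage.
    """
    tdp_limit = base_tdp_w * (1 + powertune_pct / 100)
    if power_draw_w <= tdp_limit:
        return requested_mhz                      # within budget: full clock
    # over budget: cut the clock in proportion to the overshoot
    return requested_mhz * tdp_limit / power_draw_w

# A hypothetical 250 W card drawing 275 W throttles at +0%...
print(effective_clock_mhz(1040, 275, 250, 0))    # below 1040 MHz
# ...but at +20% the same load fits inside the raised 300 W ceiling.
print(effective_clock_mhz(1040, 275, 250, 20))   # full 1040 MHz
```

Note that in both cases the voltage never enters the model, which is why a card that is voltage-limited rather than power-limited sees no benefit from the slider.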

No, haven't tried 12.7 yet. D3 and the other games I play have all been working really well with 12.6, so I was in no hurry to move to 12.7. The only issue I've had with 12.6 on both of my rigs is that, randomly while playing any game, the system will switch to the desktop and I have to click the game in the taskbar to get back. Anyone else been seeing this with 12.6?

I've been seeing that jumping-to-desktop too, on 12.4. Found that Windows Update was kicking me out; once I updated, it went away. Was going to try 12.6, but after 2 hours of headaches with the 12.7 beta, I just went back to what works.

Why is it that anytime I uninstall the drivers, all of a sudden the screens just go black and stay there? They never come back on, so I restart by hitting the power button, and then the system boots to a black screen and hangs there too. Power down again and restart, and then it comes back. I love the power of these cards, but damn, they are a pain to get working sometimes, or all the time.

I even followed Thracks' instructions exactly, and nowhere in his instructions did it say I would see black screens and the system would do nothing.

Ah, I have central AC; my room stays 73-75 degrees. I also have a huge case with great airflow... Cosmos 2 FTW.
I'd say your next upgrade should be a window AC unit, lol.

I had a window AC for a few summers, and I kept my room at 65-70.

Aside from the cooler temps, the load on the line when the window unit's compressor kicked on was enough to cause damage (old wiring, too!).

Old house: the central air we had, unused since around 1993, was added by a previous owner and wired separately,
which is another reason it fell out of use. It kept blowing fuses in the breaker box, and 10,000 BTU was not enough to cool 2 stories, 5 bedrooms, 2 baths, living room/den, and kitchen, especially in an older house with crap insulation.

My next upgrade needs to be:

- Re-wire the entire house with heavy-gauge Romex cable,
- Re-run the thermostat and central A/C lines to the main box,
- Throw the old fuse box/power distribution box in the trash,
- Call Dominion and have them come out and install a new circuit breaker/distribution box,
- Re-insulate the walls,
- Put up new drywall,
- Have someone come clean the ancient vents,
- Then cut down the tree that grew through the external central unit,
- Tear out the old central unit; trash it or recycle the metal/radiator,
- Expand the mounting platform,
- Drop in a 40-50,000 BTU solar-panel-assisted unit,
- Install a WiFi advanced thermostat.

No, that is incorrect. PowerTune affects EVERY AMD graphics card that has the feature. It is NOT just an MSI thing; it's an AMD thing. The reason your PowerColor card wasn't affected to the same degree could simply be that it has a different default voltage or ASIC quality, or you weren't pushing the card hard enough.

Hmmm, maybe you were right, pioneerisloud, but my friend (he used to review VGAs in my country for the CHIP online forum) told me that the 7970 also isn't affected by changing the power limit; btw, he uses a few brands like Club3D, AFOX, etc.
Btw, I have tested my PowerColor 7970 v2 at 1200MHz ... same score whether the power limit is at "0" or "20/max" in Vantage.

But maybe it does have an effect in 3DMark 11; I saw a different GPU score with the MSI 7870 TF OC with the power limit set to 0 vs. max... and for the 7970 v2 I didn't see any difference in the Vantage GPU score, but I forgot to test in 3DMark 11.
Edited by neoroy - 7/12/12 at 11:49am