GHz?

I've looked around a bit here and I may have missed it, but why are we so locked down on GPU GHz? I know from the overclockers that heat is the enemy, and depending on how far you want to push the clocks, it can get very expensive to dissipate that heat. I also know that with a die shrink you can count on improved efficiency and less heat for the same performance as the last generation.
My question is this: would it be possible to die-shrink a GPU but keep the same number of transistors, spreading them out over the area of the last-generation GPU to offer a greater area for heat dissipation?
I'm not an engineer, and I'm sure my simple idea has been passed over for a good reason, but I still wonder about this.

Well, it has already been done, though not exactly the way you describe. The best example is the G92 core. Shrinking the photolithography process while keeping the die size the same doesn't improve heat dissipation much. A die shrink also means a smaller die (obviously), so more chips can be made from the same wafer, which keeps manufacturing costs in balance.
I'm sure I'm missing some key points here, but I guess it's covered.

Thank you for the replies, but I still wonder. Let's use a hypothetical: say we shrink the process to 28 nm but increase the size of the GPU die while keeping the same number of transistors, thereby increasing the overall area for heat dissipation. Wouldn't that allow for a GHz increase?

What you are proposing is counterproductive. You are saying we should shrink the individual transistors down but leave the die area the same. In practice, the smaller transistors will use less power and produce less heat. The key point you are missing is that within a confined space there is no place for the heat to dissipate. In order to spread the now-smaller transistors out over a larger surface area, they would have to be connected via metal pathways. Those pathways would be (a) carrying a current that produces heat, and (b) trapped in a confined space filled with other heat-producing sources. Overall, you have not reduced the number of heat-producing sources, nor have you reduced the amount of metal on the die that holds the heat you are trying to avoid.
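The "smaller transistors use less power" point can be sketched with the standard first-order CMOS dynamic-power model, P ≈ α·C·V²·f. All the numbers below are made-up illustrative assumptions, not measured GPU specs; the point is only that a shrink cuts switched capacitance and supply voltage, which buys more headroom than extra die area ever could:

```python
# First-order dynamic switching power: P ~= alpha * C * V^2 * f.
# alpha = activity factor, C = total switched capacitance,
# V = supply voltage, f = clock frequency.
# All values below are illustrative assumptions, not real chip specs.

def dynamic_power(alpha, c_farads, v_volts, f_hz):
    """Dynamic switching power of a CMOS circuit, in watts."""
    return alpha * c_farads * v_volts ** 2 * f_hz

# Hypothetical older node: more capacitance, higher supply voltage.
p_old = dynamic_power(alpha=0.2, c_farads=8.0e-7, v_volts=1.2, f_hz=900e6)

# Hypothetical shrunk node: C and V both drop, so total power falls
# even at a higher clock -- die area never enters the equation.
p_new = dynamic_power(alpha=0.2, c_farads=4.8e-7, v_volts=1.0, f_hz=1100e6)

print(f"old node: {p_old:.1f} W, shrunk node: {p_new:.1f} W")
```

Note that V enters squared, which is why lowering the supply voltage at a new node matters more than anything you do with layout.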

You can't think about a CPU or GPU as a complex component when it comes to heat production. As far as heat is concerned, the objective is to reduce the amount of metal and reduce electrical losses (the circuit equivalent of friction). Every die shrink is about reducing the metal; often this lets them increase the transistor count, and thus processing power, while still reducing the overall amount of metal. Every advancement in instruction sets, coding, new 3D gates, and manufacturing techniques is about reducing those losses, either by design or by reducing the number of transistors that have to switch to achieve a goal. If a new instruction set can shave one clock cycle off a calculation, that can be literally thousands of transistors and gates that don't have to switch.

What you want to do is decrease the size of the die/chip and increase the efficiency of the heat dissipation via better heatsink technology.
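The "better heatsink" route can be put in numbers with the usual thermal-resistance model, T_junction = T_ambient + P·θ_JA, where θ_JA is the junction-to-ambient thermal resistance in °C/W. The θ values below are rough illustrative assumptions, not real cooler specs:

```python
# Junction temperature from the simple thermal-resistance model:
#   T_j = T_ambient + P * theta_ja
# theta_ja (degC per watt) captures the whole path from die to air;
# a better heatsink means a lower theta_ja. Values are assumptions.

def junction_temp(t_ambient_c, power_w, theta_ja_c_per_w):
    """Steady-state junction temperature in degrees Celsius."""
    return t_ambient_c + power_w * theta_ja_c_per_w

# Same 200 W chip, two hypothetical coolers:
stock_cooler = junction_temp(t_ambient_c=25, power_w=200, theta_ja_c_per_w=0.30)
better_cooler = junction_temp(t_ambient_c=25, power_w=200, theta_ja_c_per_w=0.20)

print(f"stock: {stock_cooler:.0f} C, better: {better_cooler:.0f} C")
```

Cutting θ_JA from 0.30 to 0.20 °C/W drops the same chip by 20 °C without touching the die at all, which is the poster's point: cooler engineering is where the dissipation win actually lives.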

"The key point you are missing is that within a confined space there is no place for the heat to dissipate. In order to spread the now-smaller transistors out over a larger surface area, they would have to be connected via metal pathways. Those pathways would be (a) carrying a current that produces heat, and (b) trapped in a confined space filled with other heat-producing sources. Overall, you have not reduced the number of heat-producing sources, nor have you reduced the amount of metal on the die that holds the heat you are trying to avoid."

CPUs with SOI sell for far less $$$ than GPUs. There is a different reason, and expense is just not part of it. AMD Fusion APUs, with a GPU included, are using SOI and sell for far less than any current-gen 7-series GPU.

As the 7970 being very nearly sold out still shows...people will pay whatever is asked for something...you just need to have marketing that justifies the price....and specifically with GPUs, fanboys will buy anyway. nV had no issues selling $800 8800GTX cards.

And with that said, I guess I'm no longer an ATI/AMD fanboy, because I think I'm gonna buy nV cards this time.

It's the manufacturing process: GPUs use a (slightly) cheaper process. GPUs also have more transistors than CPUs anyway, due to the shader array, so they produce way more heat than CPUs. 1 GHz is doable and has been for a few years, and more than that is on the way, I'd imagine. AMD especially would have a lot to gain from equalising GPU and CPU speeds, since that would make it easier to integrate the CPU and GPU in APUs to share resources.

"CPUs with SOI sell for far less $$$ than GPUs. There is a different reason, and expense is just not part of it. AMD Fusion APUs with a GPU included are using SOI, and sell for far less than any current-gen 7-series GPU.

As the 7970 being very nearly sold out still shows...people will pay whatever is asked for something...you just need to have marketing that justifies the price....and specifically with GPUs, fanboys will buy anyway. nV had no issues selling $800 8800GTX cards.

And with that said, I guess I'm no longer an ATI/AMD fanboy, because I think I'm gonna buy nV cards this time."


A 7970 also has RAM, a board, a cooler, power controls, its own BIOS, and drivers they have to write, plus die-size cost and yield loss.

By the time Nvidia or AMD engineer the core, spin it a few times to work out bugs, mass-produce it, ship it, and then sell it in bulk to AIB partners, that extra cost is a lot.

You don't honestly believe they couldn't make a 2 GHz GPU with more money? Prescott.

Heat currently isn't the biggest issue; newer cards run under 80°C with good air cooling. And can you really think of a good reason we aren't using an included TEC if money isn't the issue? The fact is, money IS the issue: every time you add cost, you decrease profits. AMD doesn't make a dime more if a retailer or AIB maker raises the price of a finished card.

"Stuffing 225W ++ into a dual-slot space is stupid. nevermind 500W, 750W, 1000W with multiple cards. But people still do it..."

Thank you Dave between the Ca Ca. It's better than being DaveCaCaDave...that would be a sh#t sandwich.

Humbly speaking for the PC gamers: we stuff our slots with monster cards because (A) it's fun and (B) the corporate profit hoes that run EA (and some others) can't be bothered to optimize their f'ing games for PC, so we either play games that look like console-port caca, or we spend $$$ for something that looks better than caca because we can.

Yeah, I'm stuck in the middle of the crap that is my daily life. The most I can say I accomplish with my time is my reviews; nothing else really amounts to much. Well, taking care of my kids is important, but that has its own issues as well.

I am basically using a second card so I can enable Ultra in BF3. One card plays High just fine. You are very right. I was just talking to TheMailman78 the other day, and I mentioned just this:

Top-level "settings" in games, really, aren't there to be used...today. They are there to make you want to upgrade, in the future.