PC enthusiasts with Ivy Bridge engineering samples and reviewers at large have come to the consensus that Ivy Bridge is a slightly warmer chip than it should be. An investigation by Overclockers.com revealed a possible contributing factor. Upon carefully removing the integrated heatspreader (IHS) of an Ivy Bridge Core processor (the nickel-plated copper plate on top of the processor which makes contact with the cooler), the investigator found common thermal paste between the CPU die and the IHS, and along the sides of the die.

In comparison, Intel used flux-less solder to bond the IHS to the die on previous-generation Sandy Bridge Core processors in the LGA1155 package. Attempting to remove the IHS from a chip bonded with flux-less solder won't end well, as it could rip the die off the package. On the other hand, the idea behind using flux-less solder in CPU packages is to improve heat transfer between the die and the IHS. Using thermal paste for the job results in slightly inferior heat transfer, but makes IHS removal safer. One can be sure that safer IHS removal wasn't the motivation behind switching back to conventional thermal paste, as everything under the IHS isn't user-serviceable anyway, and is off limits to users. Perhaps Intel kept extreme overclockers in mind.
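To see why the interface material matters, a rough one-dimensional conduction sketch is enough: the temperature drop across the interface layer is ΔT = P·t / (k·A), so swapping solder (indium, k on the order of 80 W/m·K) for paste (k on the order of 5 W/m·K) at similar layer thickness multiplies that drop. All numbers below are illustrative assumptions, not measured Ivy Bridge data:

```python
# Back-of-the-envelope 1-D conduction across the die-to-IHS interface.
# dT = P * t / (k * A); every figure here is an assumption for illustration.

def interface_delta_t(power_w, thickness_m, k_w_per_mk, area_m2):
    """Temperature drop (in degrees C) across a thermal interface layer."""
    return power_w * thickness_m / (k_w_per_mk * area_m2)

P = 100.0    # watts flowing through the die under heavy load (assumed)
A = 160e-6   # roughly 160 mm^2 of die area, in m^2 (approximate)
t = 100e-6   # roughly 100 micron interface layer thickness (assumed)

dt_solder = interface_delta_t(P, t, 80.0, A)  # indium solder, k ~ 80 W/m.K
dt_paste = interface_delta_t(P, t, 5.0, A)    # decent paste, k ~ 5 W/m.K

print(f"solder: {dt_solder:.1f} C, paste: {dt_paste:.1f} C")
```

With these assumed numbers the solder interface costs well under 1 °C while the paste costs over 10 °C, which is consistent in spirit with the "slightly warmer chip" consensus, even though the real layer thicknesses and paste quality are unknown.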

douglatins said:I don't think we are in such numbers to do that. Most mainstream high-end PC owners just buy them prebuilt or don't OC that much

I think we are, because the only people that buy a K-series CPU are enthusiasts, since they cost more for the same stock performance. If the heat issue with IB is real on production models from 29th April, then you watch the 2700k sell out in no time flat. I said it here first! :toast:

You could take the tops off before with the solder. You just needed to put the proc on your electric stove and get to the right temp. I haven't done it in a very, very long time but I have performed the operation in my overclocking days.

HighEndToys said:You could take the tops off before with the solder. You just needed to put the proc on your electric stove and get to the right temp. I haven't done it in a very, very long time but I have performed the operation in my overclocking days.

Ouch. You have to use a very thin razor on the gray glue stuff first, then heat the top. When you do it right you just grab the green PCB while it is still on the stove and it lifts right off without any prying or fuss. Obviously you want to put the heat spreader on the stove and not the PCB.

qubit said:Unfortunately, this issue means that the 2700K will stay UP in price, not come down, since savvy enthusiasts will prefer this to IB. I now really don't want to overdrive mine too hard, to ensure that it has a long service life.

Aside from the fact that overclockers make up hardly ANY percentage of PC users, that would mean the 2700k and the 3770K would be priced the same. That makes no sense from a business/marketing perspective to me.

They didn't really shaft everyone else, the things still overclock pretty damn good, just not as good as expected.

And they pretty much shafted everyone else when they locked overclocking on every processor but the top two. Defeats the original purpose of overclocking. What is the point of taking a processor that is already super fast, way faster than anyone needs it to be, and making it faster? The idea of overclocking was to take a weaker processor and make it fast.

newtekie1 said:And they pretty much shafted everyone else when they locked overclocking on every processor but the top two. Defeats the original purpose of overclocking. What is the point of taking a processor that is already super fast, way faster than anyone needs it to be, and making it faster? The idea of overclocking was to take a weaker processor and make it fast.

While I agree that the holy grail of mainstream overclocking is to make the cheap CPU perform like the top one to save some pennies, I disagree that Intel's strategy totally defeats the purpose of overclocking.

One can still take one of those K CPUs and make it work significantly faster. For example, my 2700K has a stock speed of 3.5GHz, but I have overclocked it easily to 5GHz with no effort at all: just set the multiplier to 50 and reboot. Obviously, a hardcore overclocker that's really interested in seeing what it can do would probably get it to 5.5GHz or more at a guess (at the expense of heat, power and fan noise, obviously). That's a massive step up from the stock speed, so yes, it's still worth it.

In my particular case, I'm a very "casual" overclocker, so I backed it off a bit and run it at 4.8GHz, since the Asus mobo monitoring tool warned that the voltage was a bit on the high side.
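For readers unfamiliar with multiplier overclocking: on these K-series chips the core clock is simply the base clock (BCLK, nominally 100 MHz on this platform and best left alone) times the multiplier, which is why "set the multiplier to 50" lands at 5 GHz. A minimal sketch of the arithmetic:

```python
# Multiplier-based overclocking: core clock = BCLK * multiplier.
# BCLK value is the platform's nominal reference clock.

BCLK_MHZ = 100  # nominal Sandy Bridge base clock

def core_clock_ghz(multiplier, bclk_mhz=BCLK_MHZ):
    """Resulting core clock in GHz for a given multiplier."""
    return multiplier * bclk_mhz / 1000.0

stock = core_clock_ghz(35)       # 2700K stock multiplier of 35 -> 3.5 GHz
overclocked = core_clock_ghz(50) # multiplier raised to 50 -> 5.0 GHz
print(stock, overclocked)
```

This is also why non-K chips with locked multipliers are effectively stuck: with the multiplier capped, the only remaining knob is BCLK, which this platform tolerates barely moving.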

qubit said:While I agree that the holy grail of mainstream overclocking is to make the cheap CPU perform like the top one to save some pennies, I disagree that Intel's strategy totally defeats the purpose of overclocking.

One can still take one of those K CPUs and make it work significantly faster. For example, my 2700K has a stock speed of 3.5GHz, but I have overclocked it easily to 5GHz with no effort at all: just set the multiplier to 50 and reboot. Obviously, a hardcore overclocker that's really interested in seeing what it can do would probably get it to 5.5GHz or more at a guess (at the expense of heat, power and fan noise, obviously). That's a massive step up from the stock speed, so yes, it's still worth it.

In my particular case, I'm a very "casual" overclocker, so I backed it off a bit and run it at 4.8GHz, since the Asus mobo monitoring tool warned that the voltage was a bit on the high side.

Yes, but it defeats the purpose of overclocking because no one really needs anything beyond a 2700K. A 2700K at 3.5GHz is a beast of a CPU; it will handle everything without a problem, and the gains from overclocking it to 5.0GHz in real-world use are very marginal. However, you will see far better gains from taking an i3 or Pentium from 3GHz to 5GHz. That is what overclocking is for: to get performance gains in real-world use. But now, on the Intel side, it has just turned into a dick-wagging competition, with people overclocking their processors simply to say that they have done it.

So why complain now? It's not gonna change again...this is what we get!

Also, I remember some 10 years ago, when OC was far less popular than it is today, it was explained that OC was a way to get "tomorrow's performance today" (which still applies), not to OC the snot outta a poor performer to make it a good performer... that's the cry of a broke man trying to get top-level performance for pennies, and that just doesn't fly with businesses that make multiples of millions of dollars.

I mean, really, for most, a PC is still a luxury, never mind the machines that many of us have. There's literally no need for OC in the market right now. NONE. There's a little left over for the die-hards, but let me tell you... my personal rigs, more often than not, don't get any OC at all... there's just no point.

Reality Check! PC enthusiasts and OC'ers make for less than 10% of the market! Do more than 10% of chips OC?

newtekie1 said:Yes, but it defeats the purpose of overclocking because no one really needs anything beyond a 2700K. A 2700K at 3.5GHz is a beast of a CPU; it will handle everything without a problem, and the gains from overclocking it to 5.0GHz in real-world use are very marginal. However, you will see far better gains from taking an i3 or Pentium from 3GHz to 5GHz. That is what overclocking is for: to get performance gains in real-world use. But now, on the Intel side, it has just turned into a dick-wagging competition, with people overclocking their processors simply to say that they have done it.

It's a matter of horses for courses; this answer seriously isn't black and white or right or wrong. I agree that speeding up a weaker, cheaper processor is better, but you will still find people that gain real-world improvements from overclocking these CPUs, and it's not all dick wagging. Folding@Home is just one example of where it will make a real, tangible difference. Of course, gaming is another, where you can't have too much horsepower and want to prevent those frame drops at all costs, especially if you're aiming for 120Hz, like I do - high end graphics card required, of course. :) I replaced my old E8500 specifically for this reason.

I complain now just like I complained then. And I thought only the ones with turbo-boost got the 4 extra bins(beyond the turbo-boost bins), or do they all get 4?

And while I agree somewhat with getting tomorrow's performance today, it was still used to see real-world performance gains. Even the high-end processors saw real-world gains, because hardware was behind software. That isn't the case today; hardware (especially Intel's) has far outpaced the software. Whether it was overclocking a high-end processor or a low-end one, it was for a pretty noticeable performance gain. I still remember overclocking Athlon XP 3200+ chips to shave half an hour or more off a video render; now doing the same to an i7 shaves maybe 30 seconds.

And, yes, I completely agree that most people do not overclock. But I'm just talking about the people that would care that there is TIM under the heatspreader vs. solder. The idea was said to be to help extreme overclockers while shafting everyone else, which I assumed to mean everyone else that overclocks, not everyone in the world that will use the processor. Obviously, as you said, this doesn't affect 90% of the people that will end up using these processors.
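The "half an hour then, 30 seconds now" point comes down to simple arithmetic: for a purely CPU-bound job, runtime scales roughly inversely with clock speed, so the absolute time saved by the same kind of overclock depends on how long the job took at stock. The durations and clocks below are illustrative assumptions, not benchmarks:

```python
# Diminishing absolute returns from overclocking a CPU-bound job.
# Assumes runtime scales as 1/clock; all figures are illustrative.

def time_saved_min(baseline_min, stock_ghz, oc_ghz):
    """Minutes saved on a CPU-bound job when clock rises from stock to oc."""
    return baseline_min * (1 - stock_ghz / oc_ghz)

# Athlon XP era: a 2-hour render with a ~15% overclock (assumed clocks)
xp_saved = time_saved_min(120, 2.2, 2.53)
# Modern i7: the same job might finish in 8 minutes at stock (assumed)
i7_saved = time_saved_min(8, 3.5, 5.0)
print(f"Athlon XP era: ~{xp_saved:.0f} min saved; modern i7: ~{i7_saved:.1f} min saved")
```

The percentage gain can be even larger on the modern chip, yet the wall-clock time recovered shrinks from tens of minutes to a couple, which is exactly the "hardware has outpaced software" argument in numbers.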

newtekie1 said:I complain now just like I complained then. And I thought only the ones with turbo-boost got the 4 extra bins(beyond the turbo-boost bins), or do they all get 4?

And while I agree somewhat with getting tomorrow's performance today, it was still used to see real-world performance gains. Even the high-end processors saw real-world gains, because hardware was behind software. That isn't the case today; hardware (especially Intel's) has far outpaced the software. Whether it was overclocking a high-end processor or a low-end one, it was for a pretty noticeable performance gain. I still remember overclocking Athlon XP 3200+ chips to shave half an hour or more off a video render; now doing the same to an i7 shaves maybe 30 seconds.

And, yes, I completely agree that most people do not overclock. But I'm just talking about the people that would care that there is TIM under the heatspreader vs. solder. The idea was said to be to help extreme overclockers while shafting everyone else, which I assumed to mean everyone else that overclocks, not everyone in the world that will use the processor. Obviously, as you said, this doesn't affect 90% of the people that will end up using these processors.

Crap, you might be right about the turbo thing... that slipped my mind.

And yeah, it's a slippery slope, right? Your comment about hardware being behind software is quite accurate, to me, and you are also right that that situation has greatly changed today.

I see no way that this helps the extreme guys. ZERO. You have to keep the heatspreader on so that the CPU actually makes proper contact in the socket, and many won't run without it due to the socket's plastic height.

I said it before... this was a move to sell new coolers, and that is all. I haven't seen this actually affect extreme clocking... those guys are doing just fine with paste under the IHS, and this will give people a reason to buy the new chips that will come out about 6 months from now.

I have a 3960X with 3x 6950's and 32 GB of ram that i use to play BF3 pretty much exclusively. Why would i need to OC? :wtf: Got guys playing on FX-8150 and much lesser CPUs just fine...heck, i don't even need the third card. Why would i want to increase my power bill?

cadaveca said:I have a 3960X with 3x 6950's that i use to play BF3 pretty much exclusively. Why would i need to OC? Got guys playing on FX-8150 and much lesser CPUs just fine...heck, i don't even need the third card. Why would i want to increase my power bill?

Sure, I'm just messin' with ya and doing a bit of PC enthusiast nerdraging. :p

The cost of cooling to OC just doesn't justify the gains in performance any more. Sure, there was a time when that was all I'd do, but I was chasing decent performance on a 2560x1600 monitor. Today I have 3 monitors in Eyefinity, and it's hardware bugs that prevent me from enjoying that, not a lack of performance. It's my monitors losing picture when I start apps or exit them, flicker on the side monitors, apps not supporting the resolution... those are the current issues that affect high-end users, not a lack of performance. It's crappy drivers and poorly-designed hardware that need to be addressed, not overclocking potential!

cadaveca said:The cost of cooling to OC just doesn't justify the gains in performance any more. Sure, there was a time when that was all I'd do, but I was chasing decent performance on a 2560x1600 monitor. Today I have 3 monitors in Eyefinity, and it's hardware bugs that prevent me from enjoying that, not a lack of performance. It's my monitors losing picture when I start apps or exit them, flicker on the side monitors, apps not supporting the resolution... those are the current issues that affect high-end users, not a lack of performance. It's crappy drivers and poorly-designed hardware that need to be addressed, not overclocking potential!

Well, my answer in post 42 applies here too. I don't see what else I can say.

I'm sorry that Eyefinity is giving you headaches and I hope AMD pull their fingers out and fix it.

NHKS said:found this... not sure of the credibility, but it definitely implies SB is better for OC with air cooling (in case you do OC your CPU)
also, up to 4.6GHz a lower vcore is needed to reach the same freq with IB..

Meh, not every CPU is like what OBR has, and just a few days ago he was proclaiming that all the heat was due to the 3D transistors drawing more current. WTF do you even pay attention to him and his postings for? I could grab two CPUs and show the exact opposite.

PopcornMachine said:The number of people who wish to overclock really is moot here.

The point is, if the use of TIM instead of solder is the cause of the heat problems, why did Intel do this? What did they possibly gain? A half cent on each chip?

Why not use the same tried and true method they always have?

If they manufacture the same way, then it doesn't matter who overclocks and who doesn't.

This is the real point. Why would they mess with something they didn't need to mess with?

Duh yes, exactly. I just want to see reports from regular users like us of their overclocking experiences with retail models. I'm holding out a tiny sliver of hope that production CPUs will use solder. Holding my breath! (slightly)