A report out of Expreview says that users should expect Intel's 8700K six-core processor to easily clock up to 4.8 GHz with conventional cooling methods. Apparently, the chip doesn't even need that much voltage to achieve this feat; however, thermal constraints are quickly hit when pushing Intel's latest (upcoming) leader for the mainstream desktop. Expreview says that due to the much higher temperatures, users who want to eke out the most performance from their CPU purchase will likely have to resort to delidding their 8700K. While that likely wouldn't have been necessary with Intel's 7700K processors, remember that here we have two extra CPU cores drawing power and producing waste heat, so it makes sense that thermals will be a bigger problem.
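For a sense of why two extra cores push thermals so hard, recall that dynamic CPU power scales roughly with frequency times voltage squared. The sketch below uses assumed ballpark figures (95 W at 4.3 GHz / 1.20 V as the baseline, 1.30 V for the overclock), not measured 8700K numbers:

```python
# Back-of-envelope CPU power scaling: dynamic power is roughly
# P ~ C * V^2 * f, so scaling from a known operating point gives
# a rough estimate of package power at an overclock.
# Baseline values here are assumptions, not Intel specifications.

def scaled_power(p0, f0, v0, f1, v1):
    """Estimate power at (f1 GHz, v1 V) from a known point (f0 GHz, v0 V)."""
    return p0 * (f1 / f0) * (v1 / v0) ** 2

# Assumed baseline: 95 W at 4.3 GHz / 1.20 V; overclock: 4.8 GHz / 1.30 V.
p_oc = scaled_power(95.0, 4.3, 1.20, 4.8, 1.30)
print(f"Estimated package power at 4.8 GHz / 1.30 V: {p_oc:.0f} W")
```

Even this crude estimate lands around 25-30% more heat than stock, all of which has to cross the die-to-IHS interface.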

This is understandable: Intel is still using its much-maligned (and divisive) TIM as a heat conductor between the CPU die and the CPU's IHS (Integrated Heat Spreader), which has been proven to be a less than adequate way of conducting said heat. However, we all knew this would be the case; remember that Intel's HEDT HCC processors also feature this TIM, and in that case, we're talking about up-to-18-core processors that can cost up to $1,999 - if Intel couldn't be bothered to spend the extra cents on actual solder as an interface material there, it certainly wouldn't do so here. As with almost all peeks at as-yet-unreleased products, take this report (particularly when it comes to frequencies, as each CPU overclocks differently) with a grain of salt.
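To put the TIM-versus-solder gap in rough numbers: the temperature drop across a flat interface follows 1-D conduction, ΔT = P · t / (k · A). Every value below is a ballpark literature figure (polymer TIM around 5 W/mK, indium solder around 80 W/mK, ~100 µm bond line, ~150 mm² die), not Intel's actual specs:

```python
# Rough thermal-resistance comparison of the die-to-IHS interface.
# R = thickness / (conductivity * area), then delta-T = power * R.
# All material numbers are illustrative ballpark values.

def interface_dt(power_w, thickness_m, k_w_per_mk, area_m2):
    """Temperature drop across a flat thermal interface (1-D conduction)."""
    resistance = thickness_m / (k_w_per_mk * area_m2)  # K/W
    return power_w * resistance

area = 150e-6   # ~150 mm^2 die, in m^2
bond = 100e-6   # ~100 um bond line, in m
power = 125     # W of heat crossing the interface (assumed)

dt_tim = interface_dt(power, bond, 5.0, area)     # polymer TIM, ~5 W/mK
dt_solder = interface_dt(power, bond, 80.0, area)  # indium solder, ~80 W/mK
print(f"TIM:    {dt_tim:.1f} K across the interface")
print(f"Solder: {dt_solder:.1f} K across the interface")
```

With these assumptions the paste costs over ten times the temperature drop of solder at the same heat load, which is consistent with the double-digit gains delidders report.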

It's not really valid to suggest that Intel is using TIM on processors just to save money. Soldering chips to heat spreaders is a great way to get rid of heat, but the solder itself will degrade faster than TIM and Intel is in the business of supplying chips for a lot of long-duration, intensive tasks, such as enterprise servers. If Intel can hit thermal targets with TIM and have the chips last longer, I'm sure they don't care that enthusiasts have to delid their chips for a scant 200 MHz of extra speed.

danbert2000 said: It's not really valid to suggest that Intel is using TIM on processors just to save money. Soldering chips to heat spreaders is a great way to get rid of heat, but the solder itself will degrade faster than TIM and Intel is in the business of supplying chips for a lot of long-duration, intensive tasks, such as enterprise servers. If Intel can hit thermal targets with TIM and have the chips last longer, I'm sure they don't care that enthusiasts have to delid their chips for a scant 200 MHz of extra speed.

Seems like Intel's Bulldozer to me, and that's a light jab: way back I fully understood the physics of the task and got an 8350 and plenty of cooling regardless, then clocked it to 5 GHz. It's at 4.8 now due to degradation, having been loaded its whole life (5+ years of Folding@home and a bit of WCG). I would love to hear how stock, untouched i7 7700Ks or this new beastie hold clocks at 4.8 after similar abuse, as I have experience with this TIM stuff myself and it isn't capable of miracles.

R-T-B said: Yeah, there is no way TIM outlasts solder. Even if it did, all the first gen i7s running around prove the longevity issues are nonexistent.

I agree they can last, don't get me wrong, but I'm talking about overclocking to this 4.(high) GHz on all cores, putting adequate cooling on it, and setting it to task for five years. I'd like to see some proof of such a thing without a delid.
I'd expect much reduced thermal transfer over time and glitchy stability myself.

theoneandonlymrk said: I agree they can last, don't get me wrong, but I'm talking about overclocking to this 4.(high) GHz on all cores, putting adequate cooling on it, and setting it to task for five years. I'd like to see some proof of such a thing without a delid.
I'd expect much reduced thermal transfer over time and glitchy stability myself.

I'm running a first gen i7 in my brother's rig at 4 GHz, no delid. Then again, first gen i7s are soldered. I think the silicon will go before the solder does at this rate.

"Micro cracks in solder preforms can damage the CPU permanently after a certain amount of thermal cycles and time. Conventional thermal paste doesn’t perform as good as the solder preform but it should have a longer durability – especially for small size DIE CPUs."

Paste doesn't crack when it's sealed in a heatspreader. Solder is well known to crack under successive heat loads. For example, this is what caused the 30% failure rate of Xbox 360s. Sorry guys, your feels don't affect materials science.

danbert2000 said: "Micro cracks in solder preforms can damage the CPU permanently after a certain amount of thermal cycles and time. Conventional thermal paste doesn’t perform as good as the solder preform but it should have a longer durability – especially for small size DIE CPUs."

Paste doesn't crack when it's sealed in a heatspreader. Solder is well known to crack under successive heat loads. For example, this is what caused the 30% failure rate of Xbox 360s. Sorry guys, your feels don't affect materials science.

I thought it was because the BGA solder that held the CPU to the motherboard actually melted from the heat... that's why people could bake their Xbox 360s back into working condition...

AFAIK the Xbox CPU didn't even have a heatspreader.

EDIT: Yeah, just checked... it was the solder that held the chip to the board, not thermal solder holding the heatspreader to a CPU.

danbert2000 said: Paste doesn't crack when it's sealed in a heatspreader. Solder is well known to crack under successive heat loads. For example, this is what caused the 30% failure rate of Xbox 360s. Sorry guys, your feels don't affect materials science.

First off, the solder on the Xbox 360 wasn't attaching the CPU to the heatsink but the CPU to the board. The thermal stresses are WAY different. Second, it probably wouldn't have been a problem anyway if Microsoft had used the correct solder for the BGA balls. BTW, the CPU chip itself was poorly designed and had a low yield. Lesson learned: don't let a software company design hardware in-house!

danbert2000 said: "Micro cracks in solder preforms can damage the CPU permanently after a certain amount of thermal cycles and time. Conventional thermal paste doesn’t perform as good as the solder preform but it should have a longer durability – especially for small size DIE CPUs."

Paste doesn't crack when it's sealed in a heatspreader. Solder is well known to crack under successive heat loads. For example, this is what caused the 30% failure rate of Xbox 360s. Sorry guys, your feels don't affect materials science.

The issue with the Xbox 360 was the solder attaching the chip itself to the motherboard, not the solder used for the heatsink. They didn't even use solder for that.

Yes, I know the Xbox did not fail due to soldered heatsink failure. It did fail due to cracking solder, which everyone else is saying isn't a problem. Clearly it is. Solder can crack anywhere due to heat/cool cycles, including between a $2000 server CPU and its heat spreader.

Intel did the math and figured they would save warranty money and have a more reliable product if it didn't have the risk of cracking solder at the major thermal interface.

You guys really suck at reading comprehension by the way, I never said the Xbox failed due to solder in the heatspreader failing. I pointed to it as one of the most famous examples of solder failure messing up a product's reliability. Now put on your synthesis hats and apply that experience, of solder failing, to a different application of solder. I know you can.
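The thermal-cycling argument can at least be framed quantitatively. Solder fatigue life is commonly modeled with a Coffin-Manson-type relation, where cycles to failure scale as a negative power of the temperature swing. The exponent n ≈ 2 below is a typical textbook value for solder joints, used here purely for illustration:

```python
# Relative solder fatigue life under thermal cycling, sketched with the
# Coffin-Manson relation: N_f ~ (delta_T)^(-n). The exponent n = 2 is a
# commonly cited ballpark for solder joints, not a measured value for
# any CPU's die-attach solder.

def relative_life(delta_t_ref, delta_t_new, exponent=2.0):
    """Cycles-to-failure at delta_t_new, relative to delta_t_ref."""
    return (delta_t_ref / delta_t_new) ** exponent

# Halving the temperature swing (e.g. 60 K cycles down to 30 K cycles)
# roughly quadruples the expected cycles to failure with n = 2.
print(relative_life(60, 30))
```

The takeaway is only directional: bigger idle-to-load swings wear solder faster, which is the mechanism being argued about here, not proof of how long any particular CPU's solder lasts.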

So you either read the article and disagree with it but don't have the intelligence to give a reason, or you didn't read the article and just wanted to be pedantic for what turned out to be your mistake in reading comprehension. Gotcha. I guess I'm the best kind of troll, the one that discusses the actual post and gives evidence as to why Intel may not use solder even if it has better thermal performance.

I don't care about the article at all; my comment had nothing to do with it. I reckon you should take a better look at what I said and tone back the insults.

You claimed the Xbox 360's high failure rate was due to those "micro-cracks", which is simply false. The low-temperature solder Microsoft used would easily reach its melting point due to poor cooling, causing the solder points to deform and ultimately break the connection. I have no idea how you made the connection between that and the micro-cracks caused by thermal cycles that the article was talking about. Actually, I do: you don't know what you're talking about. Talking about comprehension, oh boy...

Vya Domus said: You claimed the Xbox 360's high failure rate was due to those "micro-cracks", which is simply false. The low-temperature solder Microsoft used would easily reach its melting point due to poor cooling, causing the solder points to deform and ultimately break the connection.

"German computer magazine c't blamed the problem primarily on the use of the wrong type of lead-free solder, a type that when exposed to elevated temperatures for extended periods of time becomes brittle and can develop hair-line cracks that are almost irreparable"

danbert2000 said: Yes, I know the Xbox did not fail due to soldered heatsink failure. It did fail due to cracking solder, which everyone else is saying isn't a problem. Clearly it is. Solder can crack anywhere due to heat/cool cycles, including between a $2000 server CPU and its heat spreader.

Intel did the math and figured they would save warranty money and have a more reliable product if it didn't have the risk of cracking solder at the major thermal interface.

You guys really suck at reading comprehension by the way, I never said the Xbox failed due to solder in the heatspreader failing. I pointed to it as one of the most famous examples of solder failure messing up a product's reliability. Now put on your synthesis hats and apply that experience, of solder failing, to a different application of solder. I know you can.

I am no expert, but I am pretty sure that they are different alloys of indium, complete with different pre-prep, thermal performance, and soldering processes. Can you use indium wire for soldering your heatsink? Sure... is that what Intel/AMD uses? Probably not. Do they have soldered CPUs that have been running for decades? Yes...

To say that "solder can crack and damage your core" takes a bit of assuming. I think Intel definitely did the math and realized their operating profit would be much higher, with minimal impact to failure rates, if they opted for cheaper TIM.

danbert2000 said: "German computer magazine c't blamed the problem primarily on the use of the wrong type of lead-free solder, a type that when exposed to elevated temperatures for extended periods of time becomes brittle and can develop hair-line cracks that are almost irreparable"

phanbuey, did you read the article I posted? I'm not the one making the case for solder as the TIM being more prone to failure; the nice people at overclocking.guide are. I was just relaying what I read, as it's a counterpoint to the author's position that Intel is only doing this for the money. Also, I did not use the Xbox 360 RROD fiasco as proof of why solder TIM is a bad idea; I gave it as an example of solder failing under thermal load. Before that post, the mood of this thread was derision at the mere thought that solder instead of paste could cause problems. You need to respect the difference between an example of general solder failure and a link in my argument. I know why everyone jumped on the 360 stuff, though: it's a lot easier to argue against than the analysis in the article I posted, which is pretty solid.

To your point about thermal paste drying out, it won't dry out in a sealed environment like a heatspreader, or at least not for decades. Where is the moisture going to go?

Thanks for being civil while arguing against me, it's a lot easier to have a discussion when people aren't throwing "troll" and "ignore list" around.

R-T-B said: I'm running a first gen i7 in my brother's rig at 4 GHz, no delid. Then again, first gen i7s are soldered. I think the silicon will go before the solder does at this rate.

I agree

danbert2000 said: "German computer magazine c't blamed the problem primarily on the use of the wrong type of lead-free solder, a type that when exposed to elevated temperatures for extended periods of time becomes brittle and can develop hair-line cracks that are almost irreparable"

Oh man, in fighting against trolls you've become one. I have all the evidence on my side and all you have is a bad attitude.

That's solder used in a different use case, and solder dries out, all of it does, eventually. Poorly soldered electrical connections can deteriorate fast, but that's sometimes fixable via reflow; I've had jobs doing just that.
A soldered heatsink is purely for thermal conductivity, not electrical, and you are so wrong it hurts.

theoneandonlymrk said: That's solder used in a different use case, and solder dries out, all of it does, eventually. Poorly soldered electrical connections can deteriorate fast, but that's sometimes fixable via reflow; I've had jobs doing just that. A soldered heatsink is purely for thermal conductivity, not electrical, and you are so wrong it hurts.

Here, I think you missed the article I posted that we're all discussing. Maybe you want to actually read it this time. Why are you even bringing up electrical conductivity? The whole point is that cracking solder between the heatspreader and the processor will lead to hotspots on the processor and could kill it. I'd like to see you reflow the solder on a processor after the solder has begun to degrade.

An Intel director officially stated that solder is never coming back; they see no need for it!
Intel as a company is degrading:
Savings on the substrate (textolite), which became thinner
Savings on the thermal interface
Savings on the cooling system (previously a cooler was always included in the kit)
Savings on quality conductors, less gold