45 Comments

I just did a build with an E8500. The temp always shows 30 degrees no matter how high I overclock it or what speed I have my Vantec Tornado at. As an overclocker, it stinks that I bought a CPU with a temp sensor that doesn't work. I guess it's a common problem with this CPU, and I hear Intel won't RMA a CPU with a bad sensor. I'm gonna be giving them a call. Reply

I do not agree with much of mindless1's critique on page 3, but we arrive at a somewhat similar conclusion: the section "The Truth About Processor 'Degradation'" is lacking. Rather than addressing my issues with mindless1's post, I'll just explain my point.

Showing the influence of temperature on reliability is all well and good, but you neglect the factor that is by far the most important: voltage. Its effect on reliability / expected lifetime / MTTF is much greater than temperature's (within sane limits).

How did you generate the curves in the first plot on that page? Is it just a guess, or do you have exact data? Since you mention the 8500 specifically, I can imagine that you got the data (or formula) from some insider. If so, I'd be curious what these curves look like if you apply e.g. 1.45 V. There should be a drastic reduction in lifetime.

If you don't think voltage is that important and you have no way to adjust the calculations, you could PM dmens here at AT. I'd say he's expert enough in this field.
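For what it's worth, the voltage/temperature trade-off being described here can be sketched with a simple semiconductor reliability model: an inverse-power-law term for voltage combined with an Arrhenius term for temperature. This is only an illustration - the exponent, activation energy, and reference operating point below are hypothetical placeholders, not Intel's actual data or the article's formula:

```python
import math

def mttf_scale(v, t_celsius, v_ref=1.25, t_ref_celsius=60.0,
               n=3.0, ea_ev=0.7):
    """Relative MTTF versus a reference operating point.

    Combines an inverse-power-law voltage term (exponent n) with an
    Arrhenius temperature term (activation energy ea_ev, in eV).
    All parameter values are illustrative guesses, not vendor data.
    """
    k_b = 8.617e-5  # Boltzmann constant in eV/K
    t = t_celsius + 273.15
    t_ref = t_ref_celsius + 273.15
    voltage_term = (v_ref / v) ** n
    temp_term = math.exp((ea_ev / k_b) * (1.0 / t - 1.0 / t_ref))
    return voltage_term * temp_term

# Raising vcore from 1.25 V to 1.45 V at a constant 60 C
print(mttf_scale(1.45, 60.0))  # well below 1.0 (shorter relative life)

# Raising temperature from 60 C to 75 C at stock voltage
print(mttf_scale(1.25, 75.0))  # also below 1.0
```

Under a model like this the two effects multiply, so a voltage bump plus the extra heat it produces compounds the lifetime reduction - which is the poster's point about 1.45 V.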

Where are the sub-$200 Q6600s? I know Micro Center had some for $200, but they are nowhere near me. Any other ones? Stating it in the article like that makes me think they are available at almost any retailer for that price. Maybe it should be rephrased to something like "they have been known to be priced as low as $200." Then again, maybe I'm not in the know and am just not looking hard enough. Reply

Yet another example of lies. The cheapest Q6600 on pricewatch is $243. And that doesn't come with a 3yr warranty OR a heatsink. So really the cheapest is $253 for retail box with heatsink/fan and 3yr. That's a FAR cry from $200. Cheapest on Cnet.com is $255. Where did they search to find these magical $200 Q6600 chips? I want one. I suspect pricegrabber etc would show the same. I'm too lazy to check now...LOL Reply

150 pounds is about $250-300 American, which is nowhere near what the article's author is claiming. One Micro Center deal doesn't really justify claiming you can bag one from retailer(s) for under $200. Also, another poster pointed to what he called a Q6700 for $80. That is not true; it was an E6700, which is dual core, not quad. Reply

quote:Intel has also worked hard to make all of this performance affordable. Many US retailers now stock the 65nm Q6600 quad-core CPU at less than $200, which places it squarely in the 45nm dual-core price range - something to think about as you make your next purchasing decision. However, if it comes down to the choice between a 65nm and 45nm CPU we would pick the latter every time - they are just that good. The only question now is exactly when Intel will decide to start shipping in volume.

While this may be true for those building a new system around a new CPU, this might not hold true for those looking to overclock using an existing board. These 45nm CPUs with their higher stock 1333 FSB will by necessity use lower multipliers which essentially eats into potential FSB overclocks on FSB-limited chipsets. Considering many 6-series NV chipsets and boards will not support Penryn *at all* this is a very real consideration for those looking to upgrade to something faster.

Given my experiences with NV 6-series chipsets compared to reviews, I'm not overly optimistic about Penryn results on the 7-series either. Curious which board you tested these 45nm chips with? I haven't kept up with P35/X38 capabilities, but the screenshot you showed had you dropping the multiplier, which is a feature I thought was limited to NV chipsets. I might have missed it in the article, but clarification would be appreciated.

I have a problem with this part of that statement: "Intel has also worked hard to make all of this performance affordable." They forget it was AMD who forced Intel to cut margins on CPUs from 62% (I think that was their high a few years back) to a meager 48% (if memory serves), and their profits to tank 60% in some quarters, all while driving AMD into the ground. Do they think they were doing it for our sakes? NOT. It was to kill AMD (and it worked). WE just got LUCKY.

How much INTEL butt kissing can you do in one article? Notice that since AMD started sucking, Intel is making an ABOUT FACE on pricing. Server chips saw an increase of 20% about a month ago (it was written about everywhere). Now the E8400, which was $209 on Newegg and IN STOCK just a few weeks ago, is said to go for $250 if you believe AnandTech. Even Newegg has its future chips, when they come back in stock, now priced at $239. That's a $30 increase - what is that, 14% or so? I missed the first E8400s and thought the price would go down; instead Intel restricts volume and causes a price hike to soak us since AMD sucks now. I hope AMD puts out a decent chip shortly (a 3GHz Phenom would be a good start) so we can stop the price hikes.

What's funny to me is the reviewers let Intel get away with pricing in a review that comes nowhere near what the product ACTUALLY debuts for. They've done the same for Nvidia too (not just Anand, but others as well). The cards always end up being $50 more than MSRP, which screws competitors, because we all wait for said cheap CPU/GPU and end up not buying a very good alternative at the time (on the basis of false pricing from Intel/Nvidia). They should just start saying in reviews, "but expect $50 more upon debut than they say, because they always LIE to get you to not buy the competitor's product." That would at least be more accurate. For the record, I just bought an 8800GT and will buy a Wolfdale in a few weeks. :) My problem here is the reviewers not calling them out on this crap. Why is that? Reply

Yes, you are right that the higher default FSB works against them. It would be better if a Pentium or Celeron 45nm started with a lower FSB so the chipsets had enough headroom for a good overclock.

NV is not the only one that can drop the multiplier. I've not tried it on X38 but have on P35, and I see no reason why a motherboard manufacturer would drop such a desirable feature unless the board was simply barren of overclocking features - something with an OEM-limited BIOS, perhaps. Reply

Agreed. I haven't had a CPU that hasn't been heavily overclocked since like 1992 or so. All of these chips, clear back to a 486 100MHz, ran for others for years after I sold them. My sister is still running my Athlon 1700+ in one machine. It's on all day; she doesn't even use standby...LOL (except for the monitor). It's like 6 years old, probably older than that, and still running fine. I think it runs at 1400MHz if memory serves (but it's a 1700+, you know AMD's numbering scheme). Every time I upgrade I sell the used chip at like half off to a customer/relative, and they all run for years. I usually don't keep them for more than a year, but they run well past the 3-year warranty for everyone else, and I drive them hard while I have them. I'd venture to guess that most Intel/AMD chips would last 5+ years on average. The processes are just that good. After a few years in a large company with 600+ PCs I realized they just don't die. We sent them to schools (4-5 year upgrade schedule) before they died, or sold them to employees for $50-100 as full PCs! I think I saw 3 CPU deaths in 2.5 years, and those were dead weeks after purchase (P4s...Presshots...LOL). Don't even get me started on the hard drive deaths though...ROFL. 40+/year. Those damn Dell SFFs get hot and kill drives quick (not to mention the stinking plastic smells burnt all day). You're lucky to make it to warranty on those. I digress... Reply

Yes, the author is completely wrong about overclocking. Overclocking (within sane bounds, like not letting the processor get hotter than you'd want even if it weren't overclocked) INCREASES the usable lifespan; it doesn't decrease it.

The author has obviously not had much experience overclocking. For example, there are still plenty of Celeron 300MHz processors that ran at 450+MHz for almost ten years and were then retired, not because they failed, but because they were beyond their "usable" lifespan - just slow by then-modern standards. Same for the Coppermine P3 and Celeron, or the Athlon XP - take your pick. There are almost never examples of a processor failing prematurely after running stable for a couple of years, unless it was due to some external influence like the heatsink falling off or a motherboard power-circuit failure.

Overclocking really isn't a gamble - unless you don't use common sense. 2-3 years is the lifespan you'd get if you were doing something extreme, not applying a modest voltage increase with a heatsink that keeps it cool enough.

I suggest the article page about "The Truth About Processor Degradation" should just be deleted; it's not just misleading but mostly incorrect. Here's the core of the problem:

"As soon as you concede that overclocking by definition reduces the useful lifetime of any CPU, it becomes easier to justify its more extreme application."

Absolutely backwards. Overclocking does not, by definition or by any other standard, reduce the useful lifetime of CPUs. It increases the useful lifetime by providing more performance, so the processor remains at the required performance levels (which escalate) for a longer period and is eventually retired before failing, in most cases. It is wrong to think that if a processor would last 18 years without overclocking and 12 with modest overclocking, this suddenly means "it becomes easier to justify its more extreme application." It means you can do something sanely and have zero problems, or use random guesses, do something "extreme," and then find a problem. The author has it completely backwards.

"Too many people believe overclocking is "safe" as long as they don't increase their processor core voltage - not true."

There is no evidence of this. Show us even one processor that failed from an increase in clock speed within its default voltage and within its otherwise viable lifespan.

Wrong. While it is true that a higher frequency will increase temps, it is not true that a higher temp (so long as it's not excessive) will cause the processor to fail within its "useful life". On the contrary, you have extended the useful life by increasing the performance. Millions upon millions of overclockers know this: a moderate overclock (or even a large one, provided the vcore isn't increased significantly) has no effect; it's always some other part of the system that fails first from old age, generally the motherboard or PSU. It might be fair to say that overclocking, through its use of more current, is more likely to reduce the viable lifespan of the motherboard or PSU - or both - long before the processor would fail.

Intel doesn't warranty overclocking because it is definitely possible to make a mistake through ignorance or ineptitude, and because their pricing model is based on speed/performance. It is not based upon evidence that experienced overclockers using good judgment will end up with a processor that fails within 8 years, let alone 3!

"It also goes a long way to understanding why Intel has a strict "no overclocking" policy when it comes to retaining the product warranty. Too many people believe overclocking is "safe" as long as they don't increase their processor core voltage - not true. Frequency increases drive higher load temperatures, which reduces useful life."

AMD has recently proved this, and even Intel to some extent with the P4. AMD's recent chips have shipped near their max, with almost no overclocking room (same for quite a few models of P4), and they lived long lives - proving you can run at almost max at default voltages with no worries.

Where does the author get his data? Just as you said: prove it. I think Intel is tired of us overclocking the crap out of their great cores. With AMD not providing ANY competition, they end up with all chips being able to hit 4GHz but having to mark them at 3.16GHz. What do we do? Overclock to near max, and that pisses them off. :) Make a few phone calls to some people and tell them to write a "10 minutes of overclocking and your CPU blows up" article, or you won't get our next engineering samples to test. :) Maybe I'm wrong, but it sure is suspicious. Recommending anything with Intel IGP for HTPC applications is mighty suspicious also. Yes, I read Tech Report too. Also, the same is on Tom's Hardware! Check out this FIRST SENTENCE of their 3/4/08 article on the 780G chipset from AMD:
"With today's introduction of its new 780G chipset, AMD is finally enabling users to build an HTPC or multimedia computer for HDTV, HD-DVD or Blu-ray playback that doesn't require an add-in graphics card. (AMD already included HDCP support and an HDMI interface in its predecessor chipset, the 690G.) The northbridge chip of the new 780G chipset also features an integrated Radeon HD3200 graphics unit that can decode any current high-definition video codec. As a result, CPU load is decreased to such a degree that even a humble AMD Sempron 3200+ is sufficient for HD video playback. Also, while Intel's chipsets get more power-hungry with every generation, AMD's newest design was designed with the goal of reducing power consumption."
http://www.tomshardware.com/2008/03/04/amd_780g_ch...

OK, so for the first time we can build an HTPC without an add-in graphics card. Translation - IT can't be done on Intel! OK, even a LOWLY SEMPRON 3200+ cuts it with this chipset! Translation - no need for an Intel Core 2 2.66GHz-2.83GHz dual core! No need for a dual core at all. Before this chipset it took an A64 6400 DUAL CORE (on AMD's old 690G chipset, and that chipset smokes Intel's IGP) and playback was still choppy. Now they say a 1.8GHz SINGLE CORE SEMPRON only shows 63% CPU utilization WITHOUT choppiness on the 780G! On top of that, it will save you money while running. Even the chipset is the BEST EVER in power use. They openly tell you how BAD Intel's chipsets are at 90nm. But AnandTech wants us to buy this crap? BLU-RAY finally hit its limit on a 1.6GHz SEMPRON at Tom's Hardware; they hit 100% CPU in a few spots. I hope AnandTech's 780G chipset review sets this record straight. They'd better say you should AVOID INTEL like the plague, or something is FISHY in the HTPC/Intel/AnandTech world.

Don't get me wrong, Intel has had the greatest chips for a year now. I'm personally waiting on the E8400 so I can hand down my E4300 to my dad (it runs at 3.2GHz with ease). But call a spade a spade, man. Intel sucks in HTPC. SERIOUSLY SUCKS as of early this week! Reply

Kris exposes his personal preferences when he writes... "While there is no doubt that the E8500 will excel when subjected to even the most intense processing loads, underclocked and undervolted it's hard to find a better suited Home Theater PC (HTPC) processor. For this reason alone we predict the E8200 (2.66GHz) and E8300 (2.83GHz) processors will become some of the most popular choices ever when it comes to building your next HTPC."

But what are you going to plug that CPU into? An Intel motherboard with Intel integrated graphics? Look at the full picture and you'll see that if you're building an HTPC, the CPU just has to be decent enough to get the job done... the really important thing is the integrated graphics performance of your chipset.

"Between our integrated graphics platforms, the 780G exhibits much lower CPU utilization than the G35 Express. More importantly, the AMD chipset's playback was buttery smooth throughout. The same can't be said for the G35 Express, whose playback of MPEG2 and AVC movies was choppy enough to be unwatchable."

That's the problem with Intel's platform, at least without an add-in card. I thought the new nVidia chipset would change all that; then I found out it only uses a single channel of RAM. How shortsighted is that?

For now, having the ability to run the add-in card for games but shut it down afterward when you don't need it is sweet. I would wait for the 8200 chipset, though, since I know it will be easier to get working in Linux, but I may still go for the 780G for Windows Vista. Reply

I'll finally upgrade my X2 3800+ (@2.5GHz) very soon. The question is, for mostly gaming, will a high-clocked dual core suffice, or will a lower-clocked quad be faster as games become more multi-threaded? Reply

I think it really depends on how long you plan on keeping the new system. Since your current rig is a couple years old, you fall into the category of 90% of us. We don't throw out the mobo and cpu every time a new chip comes out, we wait out a couple generations and then rebuild. I'm running at home right now on an A64 3200+ (OC'd to 2.5GHz) so I don't even have dual-core right now.

My plan is that even though the duals offer potentially better gaming performance right now (gpu obviously still being the caveat), since I only rebuild every 3-4 years I need something to be more futureproof than someone who upgrades every year. It would be great to say I'll get a fast dual-core today and next year get a quad, but 4 out of 5 times the upgrade would require a new mobo anyway so I'd rather wait another month or two, get a 45nm quad and the 9800 when it comes out.

My biggest disappointment with my last build was NOT jumping on the "new" slot and instead getting an AGP mobo. That is what has really hampered my gaming the last year or so. Once the main manufacturers stopped producing AGP graphics cards, my upgrade path stopped cold. If I could go back to Jan '05, I would have spent the extra $50-100 on a mobo supporting PCI Express, which would have allowed me to upgrade past my current 6800GT and keep on gaming. Right now I have a box of games I've never played (gifts from Christmas) because my system can't even load them.

So in short, the duals are the better buy for gaming right NOW, but I'd hedge my bets and splurge on a 45nm quad when they come out. In all honesty, unless you play RTSes heavily, or we see some crazy change of mindset by game producers (not likely), the GPU will continue to be the bottleneck at anything above 17-19" LCD resolutions. I actually just got a really nice 19" LCD this past Christmas to replace my aging 19" CRT, and I did it for a very good reason. All it takes is to see a game like Crysis to realize that we may not be ready yet for the resolutions that 20/22/24" displays run, unless we have the cash to burn on top-of-the-line components and upgrade at a much more frequent rate.

You can buy a Radeon 3850 and triple your 6800 performance (assuming it's a GT; with an Ultra it would be just under triple). Check tomshardware.com and compare cards. You'd probably end up closer to double the performance because of a weaker CPU, but still, once you hit your FPS limit due to the CPU you can crank the hell out of the card for better looks in the game. $225 vs. probably $650-700 for a new board+CPU+memory+vidcard+probably a PSU to handle it all. If you have Socket 939 you can still get a dual-core Opty for $97 on Pricewatch. :) Overclock the crap out of it, you might hit 2.6-2.8GHz, and it's a dual core. So around $325 for a lot longer life and easy changes. It will continue to get better as dual-core games become dominant. While I would always tell someone to spend the extra money on the Intel currently (jeez, the OCing is amazing...run at default until it's slow, then bump up the clocks - that's awesome), if you're on a budget a dual-core Opty and a 3850 look pretty good at less than half the cost, and both are easy to change out. Just a chip and a card; that's like a 15-minute upgrade. Just a thought, in case you didn't know they had an excellent AGP card out there for you. :) Reply

I'm in the same boat with the X2 3800+. Anyway, when it comes to dual vs quad, the same rules apply back when the debate was single versus dual. Very few games support quad core but a quad will be more future proof and give better multitasking. The ultimate question is how much you want to spend, how long you intend to keep the processor, and what the future road maps for games and CPU tech are within that period.

I'm a long time AMD/nVidia man but I'm liking what Intel and ATI are putting out. I'm definitely considering these Wolfdales, especially that sub $200 one. I'm going to wait for the prices and benchmarks for the triple core Phenoms though before I begin planning an upgrade. Reply

This has been mentioned in a couple of articles now: what these processors will run at with no more than 1.45v core voltage applied is what really matters for most people buying one of these 45nm chips. So it begs the question: what are the results at this voltage?

While the section on processor failure was somewhat interesting, I think that it should have been a separate article. Reply

"what these processors will run (safely) at with no more than 1.45v core voltage applied is what really matters for most people buying one of these 45nm chips. So, it begs the question, what are the results at this voltage"

Very good point. Since these CPUs are deemed safe up to 1.45 volts, let's see how far they clock at 1.45 volts. 4.5GHz at 1.6 volts is nice for a suicide run, but let's see it at 1.45. Reply

"We could argue that when it came to winning the admiration and approval of overclockers, enthusiasts, and power users alike, no other single common product change could have garnered the same overwhelming success."

Except that it was not. It was a knee-jerk reaction to the K8 release way back in 2003, and it was too expensive to matter to anyone except the filthy rich. The FX around that time was more successful. In recent years they have just polished the concept a bit, but gaining admiration and overwhelming success because of it? I think not. The Conroe architecture was the catalyst, not some expensive niche product.

"Our love affair with the quad-core began not too long ago, starting with the release of Intel's QX6700 Extreme Edition processor. Ever since then Intel has been aggressive in their campaign to promote these processors to users that demand unrivaled performance and the absolute maximum amount of jaw-dropping, raw processing power possible from a single-socket desktop solution. Quickly following their 2.66GHz quad-core offering was the QX6800 processor, a revolutionary release in its own right in that it marked the first time users could purchase a processor with four cores that operated at the same frequency as the current top dual-core bin - at the time the 2.93GHz X6800."

Speed bump revolutionary? Oh well ;)

No beef with the rest of the article; those two paragraphs just stand out as being overly enthusiastic, more so than informative. Reply

Well, it's being updated from time to time. I think it is relevant since Cubase 4 is still the latest version of Cubase in use, and the performance is the same today. What is important is that they compare two equally clocked processors where the difference is the number of cores. Yes, the quad is better at higher latencies, but it loses the advantage at lower latencies and even gets beaten by the dual core.
It's more a reminder of the limitations of current-day quad cores in some situations. This will probably change when Nehalem is introduced with its on-die memory controller, a higher FSB, and faster DDR3 memory. Reply

Yeah, you talk about games, where maximum CPU frequency on a dual core is important, but there are other areas that are much more interesting. Performance for sequencers where you make music (on DAW-based computers) is seldom mentioned. It is very important to be able to cram out every ounce of performance in real time with a lot of software synthesizers and effects at a low latency setting (not memory latency, but the delay from when you press a key on the synth until it is processed in the computer and put out from the soundcard, for example).
Here's an interesting benchmark:

http://www.adkproaudio.com/benchmarks.cfm (Sorry, using the linking button didn't work; you have to copy the link manually)
If you scroll down to the Cubase4/Nuendo3 test you can compare the QX6850 (4 cores) with the E6850 (2 cores). They both run at 3GHz. Look at what happens when the latency is lowered: yes, the dual core actually beats the quad core, even though these applications use all available cores. The reason could be that all 4 cores compete for the FSB and memory access when the latency is really low. Very interesting indeed, as DAW work is an area far more demanding of the CPU than gaming... Reply

For that matter, when am I going to be able to BUY an E8xxx part? Newegg has the E8400 listed but always out of stock.

I tend to be the guy people come to when they want to build a new rig, and right now I am telling them all to hold off and get an E8000-series or Q9000-series CPU and a GF9000-series GPU. But right now none of these parts are REALLY available.