The fastest client computing platform, Sandy Bridge-E, won't see a successor for quite a while, according to a Bright Side of News report. The next-generation "Ivy Bridge-E" lineup, which will carry Core i7-49xx series model numbers, isn't due until the third quarter of 2013. Ivy Bridge-E will build on the existing Sandy Bridge-E HEDT platform with the Intel X79 Express chipset, and existing LGA2011 motherboards will be able to run the i7-4900 series chips after a BIOS update.

Meanwhile, the nearest addition to Intel's socket LGA2011 Core i7-3900 series is the Core i7-3970X, which displaces the Core i7-3960X from the top spot and will hit the shelves in a few weeks' time. According to a leaked spec sheet, the i7-3970X ships with a nominal clock speed of 3.50 GHz and a maximum Turbo Boost speed of 4.00 GHz. The sheet also confirms that the i7-3970X will be a six-core chip, with HyperThreading enabling 12 logical CPUs, dispelling rumors that Intel would unlock two additional cores and the full 20 MB of L3 cache on the Sandy Bridge-E silicon, making it an eight-core chip.

Only the E5-2687W...which is why it turbos a bin lower than most other Xeons. AFAIK, excepting the E5-2690 (135 W), all other high-performance CPUs are 130 W or lower, including the 8- and 10-core E7s. 150 W for a part with little to no OC potential doesn't seem that outrageous, tbh, considering the power envelope of an unlocked-multiplier i7.
My guess is that most 2687Ws reside in wind-tunnel server racks far, far away from their users' ears, or in e-peen benchmark rigs under water. Neither should have a major cooling concern.
I'd assume the 3970X will supplant the 3960X in pricing, as is usual for Intel at the $1k price point (990X > 980X > 975XE > 965XE, for example), so I rather doubt prospective buyers would be looking at Intel's stock AIO liquid cooler for cooling duties.

We're talking about a mere 20 W increase, just 15-16% above the 130 W TDP rating. I think that should be within the margin of "error" for most motherboard power-delivery and cooling assembly designs.
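A back-of-the-envelope check of that percentage, using only the wattages quoted in the discussion above:

```python
# Sanity check of the TDP increase discussed above.
base_tdp = 130  # W, the existing 130 W TDP rating
new_tdp = 150   # W, the rumored higher rating

increase = new_tdp - base_tdp        # absolute increase in watts
pct = increase / base_tdp * 100      # relative increase over the old rating

print(f"{increase} W increase = {pct:.1f}% over {base_tdp} W")
# prints: 20 W increase = 15.4% over 130 W
```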

As for the CPU cooler, SB-E doesn't come with one in the box, so you're expected to buy a decent one. The X79 platform isn't for just anybody.

Agreed, but if the chip outputs 131 W of heat you can no longer label it a 130 W TDP chip; it falls into the 150 W category.


Nope. Sry. It's rated for 150 W: to keep the warranty, you need a 150 W cooler. Use a 130 W cooler and the warranty is void.

To me, that's the wrong way to think about it. I don't cut corners, or agonize over whether a $40 cooler is sufficient when another costs $140 and is "certified". Chances are I'd buy the $140 cooler.

I get what you're saying, though; my 3770K uses roughly 50 W in Prime95 but is rated at 95 W TDP. I use a cooler capable of about 200 W.

Notice I often say that IVB doesn't have heat issues. That's because I picked the right components and had realistic expectations.

I'd love to get one of these 3970X chips, though. I wonder...maybe another 200 MHz of OC headroom? The chip I'm currently using does 5 GHz on air; does that mean I might get 5.2 GHz?


Yeah, I was only giving a heads-up about how the ratings "work".

About the coolers, warranty aside, IMO manufacturers probably specify the absolute maximum TDP a cooler can handle (just logical thinking on my part, since marketing teams always like big numbers in the specs), so using that cooler on a CPU that outputs more heat will result in overheating, yes, even if the CPU outputs only 2 W more than the cooler can handle.
And like you said, cutting corners here is bad.
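That "rated maximum" reasoning can be sketched as a toy check. Purely illustrative: the function name is mine, and treating the cooler's spec as a hard ceiling is this post's assumption, since real coolers usually have some margin beyond the label.

```python
# Toy model of the argument above: treat a cooler's rated TDP as a hard
# ceiling (an assumption; real coolers have some margin beyond the spec).
def cooler_sufficient(cpu_heat_w: float, cooler_rating_w: float) -> bool:
    """Return True if the cooler's rated capacity covers the CPU's heat output."""
    return cpu_heat_w <= cooler_rating_w

print(cooler_sufficient(152, 150))  # 2 W over the rating -> False
print(cooler_sufficient(50, 200))   # the 3770K example above -> True
```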

Correct me if I'm wrong, but that crazy 8-core Xeon (16 threads, 20 MB cache) has two QPI links so it can talk to one or more other 8-core Xeons in a network. Any "desktop" part derived from a Xeon has one of those QPI links cut out, but you know what I notice? Anything Xeon, even with identical specs, carries at least a $200-500 price jump, due among other things to that second QPI link. So a crazy 8-core i7 might run about $1,200-1,500; I still have sticker shock from that. Finally, I can't see a use for 8 cores in a "desktop"; most rigs with 4 cores (HyperThreading be damned) do just fine.


Yes, you're right; that's just the way the server market is. It's similar to workstation GPUs.
The Xeons also support ECC RAM (then again, AMD supports that on desktop parts).

There can be a use for these CPUs (one obvious reason: I buy it just because I can). Someone who does heavy rendering work and the like would rather pay $1,500 for a desktop chip than $4,000 for a Xeon.


Desktop chips aren't really built to take 24/7 heavy loads, even though many users run them that way. Xeon chips are, and hence typically come from higher-quality wafers. I understand why there is a price difference, but at times the gap between the two does seem rather large.


I don't know about the quality of the silicon itself (the Xeons do run at lower voltage) or how long it takes to degrade, but by the time you degrade a desktop chip it will be really old and slow anyway (would you render on a Core 2 Duo or on a quad-core IVB?).
The problem I see is the VRM; it needs to be built from quality components and cooled very well.
As long as you keep the CPU cool you won't see any major problems.
However, some people render on heavily overclocked CPUs, which IMO is a big no-no (stability is a serious issue and the VRM strain is huge). You can forget about using Prime95 for stability testing; it's child's play compared to long-term heavy loads.

I never should have had my heart set on an 8-core desktop CPU from Intel; by the looks of things, one will only become available around the time I'd intended to already be using it, before considering my next upgrade.

I know I should just settle for something more sensible, but I fell in love with the idea of an 8-core CPU before I knew how many years Intel would take to release a desktop version, and I can't justify the extra cost of a Xeon for desktop use.

This may not sound too relevant to the article, but it was another reminder of my broken dreams, so I couldn't resist sharing my sadness.


I would love an eight-core too. My i7-920 has aged well, but moving up to Ivy Bridge/Haswell doesn't seem like enough of a performance jump for me.