Except if the Tweaktown review is any indication, not really. There were some benchmarks where the 2500k was a frame or two quicker. It was pretty much a dead heat. And it was only 2 or 3 watts more efficient under load and idle than the 2500k. It's a COMPLETE disappointment if you're expecting anything over the 2500k in terms of performance/efficiency.

What are you talking about? In almost every benchmark in the TT review, we see the 3570K ahead of the 2500K (even the 2700K in certain cases) while consuming less energy. Sure, it's maybe 5-10% more performance, but for a guy like me who is building a rig for the first time, I don't see the benefit of choosing a 2500K over a 3570K, apart from the fact that IB chips may end up being bad overclockers.

The only things I give a hoot about: gaming and power. I don't care about synthetic benchmarks; they're irrelevant to me. It slightly nudged past the 2500k in 3DMark 2011, but was one frame slower in AVP. I'd call it a draw if ever there was one.

As for power, it was 88/303 vs 90/305. So it was a whopping two watts more efficient at both idle and load.
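For what it's worth, those deltas work out to well under a few percent either way. A quick sketch of the arithmetic, using the review numbers quoted above:

```python
# Idle/load power draw in watts, from the review numbers quoted above.
ivy_idle, ivy_load = 88, 303
sandy_idle, sandy_load = 90, 305

# Percentage reduction relative to the 2500k's draw.
idle_gain = (sandy_idle - ivy_idle) / sandy_idle * 100
load_gain = (sandy_load - ivy_load) / sandy_load * 100

print(f"idle: {idle_gain:.1f}% lower, load: {load_gain:.1f}% lower")
```

That's roughly a 2% improvement at idle and under 1% at load, which is why it reads as a rounding error rather than a new process node.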

That's a pile of you know what for something that's a new die process and had all the rhetoric about 3D transistors.

EDIT: I'm not telling you to choose the 2500k over the 3570K, just that it's an absolutely worthless update from a performance POV. It's basically all about the IGP and notebooks, it seems.

I'm not sure it's relevant to judge Ivy Bridge CPU performance in games based on a whopping one-frame difference. Anyway, I agree with you: when it comes to upgrading, it's not worth it if you're on SB. But let's see some reviews first, on retail chips, before making any final conclusion.

I'm not sure what we're really arguing. It's not like Ivy Bridge is a different architecture. It's not going to perform worse than the thing it's replacing. It's just a colossal disappointment if anyone was holding off on a 2500k purchase for the past 6 months thinking it was going to give the 10 - 15 percent performance boost, 20 percent power reduction that was being claimed (due to die shrink and 3D transistors).

I'm in this position (all of my parts are ready minus the CPU) and I really hope IB won't be what it is now, i.e. not a very good performer compared to SB.

I'd wait until next time, there should be some actual performance improvements with Haswell.

Surprised at all the questions. IB was never about a big performance improvement in the first place; it's a die shrink (+ tri-gate), and it was known months ago that clock-for-clock performance would not significantly change. The poorer OCing and worse-than-expected power consumption are the real disappointments here IMO.

I am also one of the many who have been waiting with their Q6600; can't wait to build my new rig this winter.

I have two Core 2 Duo machines I've been looking to upgrade. With the new ATI and Nvidia cards plus this, I can figure out something I really want to build to bring my low-end stuff up to my more modern hardware.

I don't know, we're getting to the point where you can play games at decent resolutions and settings at playable framerates on IGPs.

I guess you could blame that on consoles lowering the bar for PC game requirements and it will all be moot when next-gen starts and requirements see a spike, but who knows?

Yeah, I agree. Though Intel's GPUs have made huge leaps with Sandy and Ivy Bridge, they're still in line with Intel's GPU slogan of "We suck, just enough." Heck, look at the news stories that came out last week saying that each Haswell GPU will include 64MB of VRAM. While this is great, and will obviously lead to the reported huge performance gains, 64MB is really not enough for serious gaming.
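To put 64MB in perspective: a single 32-bit 1080p framebuffer is only about 8MB, so the basic render targets fit, but texture budgets for modern games run into the hundreds of MB. My own back-of-the-envelope arithmetic, not from the news stories:

```python
# Rough sizing of common render targets against a 64 MB on-package buffer.
width, height = 1920, 1080
bytes_per_pixel = 4  # 32-bit RGBA

framebuffer_mb = width * height * bytes_per_pixel / 1024**2
print(f"one 1080p framebuffer: {framebuffer_mb:.1f} MB")

# Double-buffered color plus a depth buffer still fits easily...
basic_targets_mb = framebuffer_mb * 3
print(f"color x2 + depth: {basic_targets_mb:.1f} MB of 64 MB")
# ...but game texture sets are far larger, which is why 64 MB alone
# isn't enough for serious gaming.
```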

Personally I'm hoping that combined with the use of DDR4 system RAM running at the same speed as the processor, any downside of only having 64MB of VRAM will be minimized.

If things are slow now, sure. But you don't really need to wait for Ivy. Sandy gets 98% of the performance in theory, but with games it's going to be nearly impossible to tell the difference because most games are GPU limited.
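A toy model of why a small CPU advantage disappears in GPU-limited games: the delivered frame rate is roughly the minimum of what the CPU and GPU can each sustain. The numbers below are hypothetical, just to illustrate the point:

```python
# Toy model: frame rate is capped by the slower of the two stages.
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Each frame needs both a CPU pass and a GPU pass; the slower one dominates."""
    return min(cpu_fps, gpu_fps)

gpu_fps = 60.0               # hypothetical GPU-limited ceiling
sandy_cpu_fps = 120.0        # what the CPU could push on its own
ivy_cpu_fps = sandy_cpu_fps * 1.02  # Ivy a couple percent faster clock-for-clock

# Both CPUs land at exactly the same delivered frame rate.
print(effective_fps(sandy_cpu_fps, gpu_fps), effective_fps(ivy_cpu_fps, gpu_fps))
```

The CPU difference only shows up once the GPU ceiling rises above what the slower CPU can feed it.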

Originally Posted by 1-D_FTW

I'm not sure what we're really arguing. It's not like Ivy Bridge is a different architecture. It's not going to perform worse than the thing it's replacing. It's just a colossal disappointment if anyone was holding off on a 2500k purchase for the past 6 months thinking it was going to give the 10 - 15 percent performance boost, 20 percent power reduction that was being claimed (due to die shrink and 3D transistors).

And to these people I recommend just going ahead and getting Sandy. Why wait another 3 months for availability and for weird issues to be worked out like Sandy had?

With Sandy you can tell which motherboards are reliable, order it now, and find OC settings online. Won't have to worry about firmware updates, etc.

For several years now. Anyone that pays attention to the market (not attempting to be condescending, no shame in not being obsessed with this stuff outside of work) knows that Ivy was going to be Sandy + Die shrink + integrated GPU upgrade.

The die shrink's room for more transistors all went into the integrated GPU.

One of the "improvements" with Ivy is official DDR3-1600 support... but if you OC Sandy you can already get 2133 MHz, I think, so it's not really better.
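In bandwidth terms the gap is real but modest. A quick peak-bandwidth calculation for dual-channel DDR3 (standard 64-bit channels; my own arithmetic):

```python
# Peak theoretical bandwidth: transfers/s * 8 bytes per 64-bit channel * channels.
def ddr3_bandwidth_gbs(mt_per_s: int, channels: int = 2) -> float:
    bytes_per_transfer = 8  # 64-bit wide channel
    return mt_per_s * 1e6 * bytes_per_transfer * channels / 1e9

print(f"DDR3-1600: {ddr3_bandwidth_gbs(1600):.1f} GB/s")
print(f"DDR3-2133: {ddr3_bandwidth_gbs(2133):.1f} GB/s")
```

So 25.6 vs roughly 34 GB/s peak, and since Sandy can already be run at the higher speed, the official spec bump buys you nothing in practice.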

And Sandy Bridge was a massive improvement. I'll be thrilled if Haswell is as big of an upgrade. But really the future of computing is going to have to incorporate dozens of CPU cores + GPU design, or something that can do both.