Great review. I was surprised to see you using a GTX 580 as that card is PCI-E 2.0.

Any chance we'll get some comparative scores of a PCI-E 3.0 card like the GTX 680 on the new platform? It would give a real world indication as to whether it makes any difference on current high end tech.

We all know it won't make any difference for quite some time.
They covered that by stating it is for future generations of "extreme" gaming cards.

The inclusion of PCI Express Gen 3 is great, but what does that boil down to? Well, simply put, PCI Express Gen 3 provides a 2X faster transfer rate than the previous generation, which delivers capabilities for next-generation extreme gaming solutions.
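
To put rough numbers on that "2X" claim (just my own back-of-envelope maths, not from the review): PCIe 2.0 runs at 5 GT/s with 8b/10b encoding, while PCIe 3.0 runs at 8 GT/s with 128b/130b encoding, so a x16 slot goes from roughly 8 GB/s to roughly 15.8 GB/s each way:

```python
# Rough PCIe x16 bandwidth comparison (illustrative figures, not from the review).
def pcie_x16_bandwidth_gbps(transfer_rate_gt_s, encoding_efficiency, lanes=16):
    """Usable one-direction bandwidth in GB/s for a PCIe x16 link."""
    # Each transfer moves 1 bit per lane; encoding overhead eats part of it.
    return transfer_rate_gt_s * encoding_efficiency * lanes / 8  # bits -> bytes

gen2 = pcie_x16_bandwidth_gbps(5.0, 8 / 10)     # PCIe 2.0: 5 GT/s, 8b/10b encoding
gen3 = pcie_x16_bandwidth_gbps(8.0, 128 / 130)  # PCIe 3.0: 8 GT/s, 128b/130b encoding

print(f"PCIe 2.0 x16: {gen2:.1f} GB/s")  # ~8.0 GB/s
print(f"PCIe 3.0 x16: {gen3:.1f} GB/s")  # ~15.8 GB/s
print(f"Ratio: {gen3 / gen2:.2f}x")      # ~1.97x
```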

Intel is going from 32nm to 22nm. I think Ivy isn't that bad at all, as 95% of users aren't going to overclock. And as usual, people aren't going to be happy with anything new.

This new chip has big issues thanks to the new tri-gate transistor. They should have kept this tech in the labs and worked on it until Haswell, not released a problem chip. But I guess we (the guinea pigs) have to suffer in order to evolve.
After all, there's no significant performance gain.

Personally, I wouldn't say that Ivy Bridge is a problem and that we're suffering. Sandy Bridge and Ivy Bridge are still awesome. Ivy Bridge is just marginally better than Sandy Bridge, though, and we did expect more. At least it's not a setback like Bulldozer was. I certainly won't buy a motherboard ahead of time again.

Maybe you know that Ivy Bridge is right around the corner. Maybe you also know that it's a worse overclocker than Sandy Bridge. We know why ...

As you know, Intel is running late with 22nm production. Yields are not good, and there are big problems with the chips. The original "on paper" concept for a 22nm chip with tri-gate transistors calls for an extremely low supply voltage, but with the current revisions Intel cannot keep the voltage at the planned values. This is a problem: the chips run at a higher voltage than planned, broadly comparable to Sandy Bridge, and that's wrong.

A tri-gate transistor needs to switch at a lower voltage, but correctly registering its I/O state requires more current than a planar transistor. The three-gate area is larger than a single gate, and the current is several times higher than in Sandy Bridge chips. When Intel reaches the planned low voltage, everything will be fine: lower voltage means acceptable currents, less leakage and good power consumption. Unfortunately, the current "E1" revision does not meet that target.

Current 22nm chips run at a higher voltage than they should, with values similar to Sandy Bridge chips. The default voltage really ought to be below 1V, and right now it isn't. Ivy Bridge needs that lower voltage: at the same voltage as Sandy Bridge, its consumption and temperature are significantly higher due to the higher currents in the chip.

The basic Ivy Bridge idle voltage is above 1V, higher than Sandy Bridge. The load voltage is lower than Sandy Bridge's and consumption is lower, but temperatures are higher. If the Ivy Bridge voltage increases, consumption and temperatures jump up dramatically. This can only be solved by improving production, so maybe it's time for another revision. Indeed, it may be a potential problem in laptops with the highest third-generation Core i7 models.
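
To illustrate why the voltage matters so much (a rough model with made-up operating points, not measured data): dynamic CPU power scales roughly with V squared times frequency, so even a small Vcore increase costs a lot of watts, and overclocked points get expensive fast:

```python
# Rough dynamic-power model: P ~ C * V^2 * f (illustrative constants, not measurements).
def relative_dynamic_power(voltage, freq_ghz, ref_voltage=1.0, ref_freq_ghz=3.5):
    """Power relative to a reference operating point, using P proportional to V^2 * f."""
    return (voltage / ref_voltage) ** 2 * (freq_ghz / ref_freq_ghz)

# Hypothetical operating points to show the trend described above.
for v, f in [(0.95, 3.5), (1.05, 3.5), (1.25, 4.6), (1.35, 4.7)]:
    print(f"{f} GHz @ {v:.2f} V -> {relative_dynamic_power(v, f):.2f}x reference power")
```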

On the desktop, this shows up as lower overclocks than Sandy Bridge with significantly higher power consumption and temperatures. If you have a nice 5GHz+ Sandy Bridge, keep it for now. Ivy Bridge tops out on air somewhere around 4.6 to 4.7 GHz. But the slightly lower overclock compared to the Sandies is compensated by higher performance per clock, so it is not a major problem.

So are you guys saying the older 2600K and 2700K are a better deal than the 3770K?

Yes, because you can reach 4.8-5.0 GHz on air with lower temps. Also, the little to no difference in performance makes IB a really bad choice tbh. I wanted to get it, but now I think I'm gonna go with the 2700K for a year or so until a new revision of IB comes out.

If you can get 4.6GHz with an i7-3770K and 4.9GHz with an i7-2700K, saying the i7-2700K is better is a bit of a moot argument...

In the article, the difference depends on what you are doing, but one of the bigger differences can be seen in video transcoding.

55 seconds on the i7-2600K versus 47 seconds on the i5-3570. That is a 17 percent improvement! (I realise this isn't across all tests.) For Handbrake it's only a miserly 12.5 percent. That said, it's differences like these, and the others in the tests, that need to be weighed against the maximum achievable 24/7 overclock. If the IB can't clock as high, it's a bit moot whether the performance per GHz is faster... (which it apparently is). They're probably around even, with the IB maybe slightly in front if you look at it that way.
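
Just showing the working on those figures (transcode times from the review, overclocks from this thread, assuming the stock clocks in the transcode test were roughly comparable):

```python
# Arithmetic on the numbers quoted above -- nothing new, just the working.
sb_time, ib_time = 55.0, 47.0           # seconds for the transcode test
speedup = sb_time / ib_time - 1          # throughput gain at roughly similar clocks
print(f"Per-clock advantage from the transcode test: {speedup:.1%}")  # ~17.0%

sb_oc, ib_oc = 4.9, 4.6                  # GHz, typical air overclocks mentioned above
clock_gap = sb_oc / ib_oc - 1
print(f"Sandy Bridge clock advantage when overclocked: {clock_gap:.1%}")  # ~6.5%

# If the per-clock gain exceeds the clock deficit, the overclocked IB still comes out ahead.
net = (1 + speedup) * ib_oc / sb_oc - 1
print(f"Net difference at those overclocks: {net:+.1%}")  # ~+9.9% in IB's favour
```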

The really questionable thing is the i5-3570K versus the i7-3770K. Going by the tests, it seems the i7-3770K doesn't give any advantage. Actually, the results are quite dismal considering the i7-3770K is clocked higher.

Yeah, I'm interested to see this CPU OC'd on air with a Noctua C14. I want to know how high I can go with it before I reach 60-62C under full load. If it's 4.6GHz then it's OK and I may still consider it.

I'm still searching the web to see if I can find more results. I found that 4.6-4.7GHz is doable on air in the range of 60C under load. Over 60C kills the CPU, so it's not worth it.

Not really; constantly being over 75C probably is. Intel CPUs are quite robust. Don't forget some people at stock, with stock coolers and 25-30C ambient, easily hit the 60s, probably more, so I don't think you have to be worried about that. If a 60C load was killing your CPU, Intel would be bankrupt by now lol

Great review. I was surprised to see you using a GTX 580 as that card is PCI-E 2.0.

Any chance we'll get some comparative scores of a PCI-E 3.0 card like the GTX 680 on the new platform? It would give a real world indication as to whether it makes any difference on current high end tech.

NVIDIA's drivers do not support PCIe gen 3.0 just yet. But there wouldn't be much to measure anyway. PCIe gen 2.0 x16 is absolutely enough for the GTX 680.

Not really; constantly being over 75C probably is. Intel CPUs are quite robust. Don't forget some people at stock, with stock coolers and 25-30C ambient, easily hit the 60s, probably more, so I don't think you have to be worried about that. If a 60C load was killing your CPU, Intel would be bankrupt by now lol

CPU temp is different from core temp. A 75C CPU temp would be a little worrying, because the core temp would probably be in the 90s (just guessing about the 90C part, but it will certainly be higher).

I'm talking about the CPU DTS, which measures the temperature of each core directly. The "CPU temp" you're talking about is from a diode that's not directly attached to the CPU, but underneath or close to it.
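
If anyone wants to check which reading their monitoring tool is actually showing, here's a quick sketch for Linux (assuming the coretemp driver is loaded and the psutil package is installed; Windows tools like RealTemp read the same DTS registers):

```python
# Quick sketch: print the per-core DTS temperatures on Linux via psutil.
import psutil

temps = psutil.sensors_temperatures()
for chip, readings in temps.items():
    for r in readings:
        # 'coretemp' entries are the on-die DTS (per core / package);
        # entries from motherboard chips (e.g. 'acpitz' or a Super I/O sensor)
        # are the board-level "CPU temp" diode discussed above.
        print(f"{chip:10s} {r.label or 'sensor':15s} {r.current:.1f} C")
```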