Intel goes toe-to-toe with analysts, denies Haswell has power issues

This site may earn affiliate commissions from the links on this page. Terms of use.

This is why manufacturers hate early product reviews. A few weeks back, early Haswell performance figures and TDP targets hit the web at large. The chip was early silicon, but it gave us a decent idea of what to expect from the next-generation processor. It also created an enormous headache for Intel, as exemplified by JMP Securities analyst Alex Gauna's decision to cut the company's rating from Market Outperform to Market Perform. Gauna based his decision on reports that Intel had problems integrating a voltage regulator into Haswell's core design.

“The Intel effort to integrate a voltage regulation (VR) stage into the Haswell architecture appears to have fallen short of desired results, and the company is consequently reversing course and not pursuing this power management scheme with subsequent products,” Gauna wrote. Other analysts have hopped on the dog pile, with Seeking Alpha’s Tom Luongo pointing to Haswell’s higher TDP (84W as opposed to 77W) as reason to think that the company indeed has an issue.

Intel’s Chuck Mulloy, meanwhile, denies all such claims. “Haswell is healthy and on track for an announcement. There’s no problem with the Haswell part at all. Also, we would expect that with Haswell we will see the biggest improvement in battery life in the company’s history when you compare it to the previous generation,” Mulloy said. “It is expected to be a major milestone breakthrough. The key point is, it’s healthy and it’s on time.”

So who’s being truthful — Intel or the analysts?

Maybe both.

Haswell in context

Intel integrated a voltage regulator into Haswell to reduce overall power consumption and vastly improve clock gating. Having hardware on-package allows the chip to precisely control currents rather than relying on the motherboard to do so. This is a major step for Intel. It’s absolutely possible that the integration didn’t go as smoothly as the company would like. In point of fact, it’s even likely. Each major step forward in CPU integration has had hiccups along the way. Linking the Core i7-4770K’s 84W TDP to proposed “Haswell power problems,” however, is far too simplistic. It implies that because TDP rose in the desktop space, TDP must rise everywhere else. And that’s not true.

I discussed the situation with Michael Schuette over at Lost Circuits, who has a stronger background in chip design than I do. Integrating a voltage regulator on chip has three principal benefits: It gives Intel direct control over a critical piece of circuitry that’s typically stuck on the motherboard, it allows for better matching between CPU load and power consumption, and it means chip temperature and VR temperature will co-vary.

The downside? Moving the VR on-package also increases CPU power consumption and thermal dissipation. Note that total system power consumption hasn’t actually increased here. A component that was once part of the motherboard is now being counted as part of the CPU. It’s also possible that Haswell’s TDP rose slightly because Intel tuned the chip for lower power consumption as opposed to higher frequencies.

The point is, it’s entirely possible that Intel cut power consumption at the low end of the market by taking steps that slightly raised power consumption of its desktop parts. And let’s be clear — that would be the right decision. No one who would’ve bought an IVB chip at 77W is going to turn up their nose at 84W, especially if Haswell offers 8-10% better performance to compensate for the 9% increase in TDP. That’s in sharp contrast to the mobile market, where trimming 1-2W off CPU power consumption adds precious minutes of battery life.
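The arithmetic behind that trade-off is simple to check. A quick sketch, using the article's 77W and 84W TDP figures and its 8-10% performance estimate (the benchmark scores here are invented placeholders, not real results):

```python
# Hypothetical perf-per-watt comparison, Ivy Bridge vs. Haswell desktop parts.
# Scores are illustrative; only the TDP figures come from the article.

def perf_per_watt(score, tdp):
    """Performance points per watt of rated TDP."""
    return score / tdp

ivb = perf_per_watt(100.0, 77)     # Ivy Bridge baseline: 100 points at 77W
hsw = perf_per_watt(109.0, 84)     # Haswell: +9% performance at 84W

tdp_increase = (84 - 77) / 77          # ~9.1% higher TDP
efficiency_change = hsw / ivb - 1      # roughly flat perf/watt

print(f"TDP increase:     {tdp_increase:.1%}")
print(f"Perf/watt change: {efficiency_change:+.1%}")
```

In other words, if the performance gain roughly tracks the TDP increase, desktop efficiency is a wash, and any low-end power savings come essentially for free.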

It therefore follows that integrating a voltage regulator may not have cut power consumption as much as Intel hoped it would, but that it still delivered a large enough improvement to make the experiment worthwhile. The size of that improvement could be workload dependent, which is precisely why most publications incorporate multiple battery tests in different usage scenarios.

The simplest explanation, in this case, is that both sides are telling the truth. Balancing power consumption against load is the central challenge of microprocessor design; it’s absolutely possible that Intel delivered significant gains but had to make some trade-offs. Moving the VRM on-chip would explain the rise in TDP without automatically implying a decrease in battery life.

“It implies that because TDP rose in the desktop space, TDP must rise everywhere else.”
Furthermore, high TDP doesn’t mean high average power, which is the only meaningful metric for battery life. It only means high peak power.

That’s true. There is no standard benchmark for measuring actual power draw over time, because it depends so heavily on the load placed on the CPU over that period. We quote peak TDP because nothing else is available. Minimum running power arguably should be stated as well, since most of the time your processor won’t be operating at its highest frequency and load.
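The gap between peak and average power is easy to illustrate. A minimal sketch, using an invented usage trace for a mostly-idle laptop (the wattages and durations are hypothetical):

```python
# Peak power (what TDP bounds) vs. average power (what actually drains a battery).
# The trace below is invented: (power_draw_watts, seconds) over a 10-minute session.

trace = [
    (2.0, 480),   # idle / light browsing
    (15.0, 90),   # video playback
    (45.0, 30),   # short burst of heavy load near peak
]

total_energy = sum(p * t for p, t in trace)   # joules
total_time = sum(t for _, t in trace)         # seconds
average_power = total_energy / total_time
peak_power = max(p for p, _ in trace)

print(f"Peak power:    {peak_power:.1f} W")      # 45.0 W
print(f"Average power: {average_power:.1f} W")   # 6.1 W
```

A chip can carry a high TDP rating yet still sip power on average, which is why battery life and peak TDP can move in opposite directions.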

“…Haswell’s TDP rose slightly because Intel tuned the chip for lower power consumption as opposed to higher frequencies.”
Is that not backwards?

Joel Hruska

No. Higher frequencies often use transistor designs that allow for more leakage at idle.

A chip built on a low-power process may use less power at 1GHz than a chip built on a high-power / high frequency process running at 1GHz. At 2GHz, the situation often reverses — the low power / lower frequency chip is using more power than the high power / high frequency CPU.

It’s a question of what your needs are. If you *really* need low power at idle, and you don’t need to run above 1.4GHz, the low-power process is better. If you really need performance, and you want to control power consumption at high frequencies, you go with the higher power process.
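That crossover can be captured in a toy model. The coefficients below are invented for illustration (real curves come from silicon characterization data); the structure just reflects that dynamic power scales roughly with frequency times voltage squared, and that a slower, low-leakage process needs more voltage to reach a given clock:

```python
# Toy model of the low-power vs. high-frequency process trade-off.
# All coefficients are made up; only the qualitative crossover is the point.

def power(f_ghz, leakage_w, volts_per_ghz, base_v, cap_coeff):
    """Crude CPU power model: static leakage + C * V^2 * f."""
    v = base_v + volts_per_ghz * f_ghz
    return leakage_w + cap_coeff * v * v * f_ghz

for f in (1.0, 2.0):
    low_power = power(f, leakage_w=0.5, volts_per_ghz=0.35, base_v=0.7, cap_coeff=8.0)
    high_freq = power(f, leakage_w=3.0, volts_per_ghz=0.15, base_v=0.8, cap_coeff=8.0)
    print(f"{f:.1f} GHz: low-power process {low_power:.1f} W, "
          f"high-frequency process {high_freq:.1f} W")
```

With these (hypothetical) numbers, the low-power process wins at 1GHz, where its tiny leakage dominates, and loses at 2GHz, where its steeper voltage requirement blows up the dynamic term.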

I think this has been going on since the Ivy Bridge processors came out. The processor ships in a package with a thin metal heatspreader on top, and it seems Intel assembled the package in a way that makes the chips run hotter than they should. The public found out when someone bought a processor, pried the metal lid off, cleaned up the die, and reassembled it with new thermal paste — and it ran a lot cooler. So did Intel try to skimp on money when making Ivy Bridge? I wonder what changed between the Sandy Bridge and Ivy Bridge processors. Do the chips just run hotter, or is Intel deliberately making an inferior product? Are they in collusion with the CPU cooler manufacturers?

“A component that was once part of the motherboard is now being counted as part of the CPU.” This is not entirely true. The motherboard will still have a voltage regulator to supply power to the CPU. (I highly doubt that Intel will have their chips take power from the 12 V bus.) All that Intel is doing is adding another converter in the power delivery path, but this time it’s on chip.

