Ultimately the power measurements that matter are battery life in various use cases and how much heat/active cooling is required. 35W under full GPU load is probably acceptable as long as the idle power usage is competitive with Haswell. (I'm doubtful that it is, though.)

While I doubt anyone would dispute the fact that Intel allows its processors to turbo past their specified TDP (considering that, ya know, it's kinda part of Intel's specifications) your argument would be better served by providing actual numbers instead of inflating them into the realm of fantasy. Specifically, Notebookcheck's power numbers for systems using the i7-3667U show a delta of roughly 23W between maximum idle and maximum load. Meanwhile their review of the A10-4600M shows a delta of 44W between idle and load.

Regardless, going over specified TDP isn't really an issue; in fact, it's typically a good thing. But it definitely means that actual power draw must be considered when comparing performance.

Although the power usage can definitely exceed the TDP, it only does so for a short time. The effect on games or benchmarks taking >5 minutes is too small to make a difference. A processor with a TDP of 45W can be cooled by a heatsink rated at exactly 45W; it just might not sustain turbo mode for the full amount of time.
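That burst-then-sustain behavior can be sketched with a toy model. This is a hedged illustration only: the PL1/PL2 values and the 28-second averaging window are assumptions loosely modeled on Intel's published power-limit scheme, not numbers from this review.

```python
# Toy model of turbo power budgeting: the chip may draw above its TDP (PL1)
# while an exponentially weighted moving average of package power stays under
# PL1, then falls back to PL1. All numbers are illustrative assumptions.
PL1 = 45.0   # sustained power limit ("TDP"), watts
PL2 = 60.0   # short-term turbo power limit, watts
TAU = 28.0   # averaging time constant, seconds (assumed)

avg = 0.0    # EWMA of package power; starts from idle
dt = 1.0     # one-second time steps
history = []
for t in range(300):
    # Draw PL2 while the running average is below PL1, else settle at PL1.
    power = PL2 if avg < PL1 else PL1
    avg += (power - avg) * (dt / TAU)
    history.append(power)

burst = sum(1 for p in history if p > PL1)
mean = sum(history) / len(history)
print(f"seconds above TDP: {burst} of {len(history)}")
print(f"long-run average power: {mean:.1f} W")
```

The point of the sketch: the excursion above TDP lasts well under a minute, and the long-run average lands close to the 45W the heatsink was sized for, which is why a TDP-rated cooler still works.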

But that's not all: this review compares the graphics performance of a 17W TDP Intel CPU with a 35W AMD part. The Intel ULV parts are heavily power constrained. A 35W i5 part would probably be ~30% faster in games. See Notebookcheck, for example: StarCraft II on a 17W TDP Ivy Bridge: 31 fps; on a 45W part: 41 fps. StarCraft II does not scale above two cores. Check Dead Space or Hitman: Absolution for more numbers that tell the same story.

Bravo on finding a single example of a 17W SKU behaving abnormally in their review database. If you'll notice, I specified a particular part which outperforms the one included in your linked review, and all the reviews Notebookcheck has done with that part show a delta of roughly 23W between maximum idle and maximum load.

Just because there's an outlying result doesn't mean that it's the norm.

Are you seriously suggesting that Intel draws more power than TDP for long enough to soak the thermal mass of the CPU + recommended heatsink? TDP means "thermal design power"; it only needs to be taken into account by heatsink and laptop manufacturers.

If you want to know the maximum current an Intel chip will draw, pull the datasheet (not the stuff made for marketing or consumers): it should be a PDF several hundred pages long. (If you really want to build a motherboard, sign the NDA for the real one, bugs included.) Intel lists the requirements for the motherboard's power supply, and you shouldn't expect the chip to draw more than that. (Even then, the spec may include a certain capacitance and frequency limits for the power supply; Intel is still free to draw however much it wishes, as long as the average within one period of the switching frequency is within the limits and can be supplied by the capacitance.)

Breaking the specs is a huge sin in this industry. Breaking what consumers think the specs are is irrelevant.

This reminds me of the whining that occurred when Intel first added turbo support to its CPUs --- apparently it's "unfair" to use the laws of thermal physics to improve the performance of devices.

Look, Intel has done an INCREDIBLE job of allowing its devices to run at short high speed bursts, for responsiveness, while generally using extremely low power. This is a tradeoff that meets most people's needs very well, even if it's not an appropriate tradeoff for a server chip that's going to be running at 95% utilization 24/7. Others should be emulating Intel, not complaining that what they are doing is "unfair".

Or just post the Tomb Raider - Value number from that review: ~28 fps. This is really a chip in search of a market. Richland can't replace a discrete setup except at the margins, and has lost its DX11 leg over Intel as well. Battery life was not mentioned for a reason. I'm very curious, Dustin, what the performance of the MSI looked like before you populated the last RAM slot. It seems that most OEMs would rather save the few dollars than even deliver baseline performance with these chips. Also, is there any chance at all that the Richland ULV line will get a review from AnandTech sometime in the future?

The APU is a good niche for AMD. These gaming numbers are pretty impressive IMO given the cost versus the Intel competition. I hope they continue to improve to the point that they can offer midrange discrete graphics card performance in a single chip (say, Radeon 7790 levels of performance).

No, just knowledge of what can be done and what the limitations of computing are. (For a converse example, see GPUs, which increase performance through significant complexity; we all saw how that worked out with manufacturing and power consumption...)

Note: I know I am quite late, but too many people comment without understanding.

Wow, that IGP is much worse than I thought it would be. It seems to be roughly 30-50% faster than the ULV HD 4000. That means it's roughly as powerful as the mobile GT2 HD 4600 (the HD 4000 SV is about 30% faster than the ULV HD 4000, and the HD 4600 is about 20% faster than the HD 4000).
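The compounding in that parenthetical can be checked directly. Note the 30% and 20% are the commenter's rough estimates, not measured data:

```python
# Chaining the commenter's rough uplift estimates to compare the HD 4600
# against the ULV HD 4000 baseline. Figures are estimates, not measurements.
ulv_hd4000 = 1.00                 # baseline: 17W ULV HD 4000
sv_hd4000 = ulv_hd4000 * 1.30     # standard-voltage HD 4000, ~30% faster
hd4600 = sv_hd4000 * 1.20         # HD 4600, ~20% faster than the SV HD 4000
print(f"HD 4600 vs ULV HD 4000: +{(hd4600 - ulv_hd4000) * 100:.0f}%")  # +56%
```

So the chain implies the HD 4600 is about 56% ahead of the ULV HD 4000, which is at the top end of the 30-50% uplift estimated for Richland's IGP, consistent with calling them "roughly as powerful".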

To be fair, there are only i5 and i7 mobile Haswells released. The i5s only come in the 15W (GT2/GT3) and 28W (GT3) variety. The GT3 comparison would be very interesting, but I'm not sure Anand has any of these parts.

Richland appeals to those who want the best performance/value relationship in a laptop. Few people know or care if a laptop APU is properly rated at 35W or falsely advertised as 17W like Intel's. What most laptop buyers care about is running actual software and reasonable battery life. Richland delivers what the majority of consumers desire, and at a price that won't break the bank. When you compare performance based on retail price, Richland is the winner, not the $200+ more expensive Intel models.

In the many years that I've been reading AnandTech this may well be the first article of genuinely disappointing quality. It's clearly a conscious decision on the part of the author to omit the Haswell GT3 and GT3e benchmarks of previous reviews, apparently justified by the opinion expressed on the third page: "The only reprieve AMD seems to be getting on this front is the unusual rarity of GT3-enabled parts in the market." And since the Haswell SKUs with faster graphics were omitted from the benchmark results, leaving only the low-end 15W GT2 SKU to fight for the title against AMD's top-of-the-line 35W Richland, we get the bizarre conclusion that "AMD continues to offer superior mobile graphics." What's even worse is that the author then acknowledges the fact that AMD has markedly slower graphics in their A8 and A6 lines. (The commentary on desktop Haswell in a mobile review is also a tad bit grating.)

Anyway, I've always liked AnandTech as articles typically include all the relevant information along with informed commentary. Hopefully this review is merely a random anomaly.

I agree with you in the sense that Dustin can/should get higher-quality stuff out, but are you taking into account the price difference for GT3/GT3e-enabled solutions?

The last time I checked, an Iris Pro 5200-enabled SKU was around ~$600 USD. With that price difference you can get a current-gen discrete mobile GPU (from AMD or NVIDIA), and then Intel doesn't stand a chance (GPU-wise).

As far as results go, in the HD 4000 <> HD 5000 comparison article the only benchmark we can relate to is Futuremark 3DMark 11.

I think the MacBook Air was power constrained (15W part). Iris 5100 is a better comparison, not because of its +0.1GHz nominal frequency change, but because of its ability to operate at higher power (28W). This means it doesn't have to throttle down on power as frequently.

I still don't think the 5100 can do a lot against the AMD A10-5750M's GPU; I believe that at best it can offer equal performance, and AMD still wins on price/performance (take driver optimizations into account too). Also, VLIW is a more parallel architecture, so it should fare better in non-mainstream GPGPU stuff; however, who runs that kind of stuff on a mobile device is beyond me. The best use case I can think of (for a notebook) is Nebula 3 and the Volterra kernels on CUDA (professional audio), but that is on NVIDIA hardware.

I still believe the market for each is a bit different; I mean, Intel is focusing on getting into more mainstream devices, meanwhile AMD is kind of stuck in these kinds of devices (GX60).

Even if it's better than the Intel solution price/performance-wise, any design aimed at the mainstream will prefer Intel's lower TDP and power consumption.

Look at the 4770K IGP results from the Iris Pro article. It looks like the HD 4600 SV is pretty much on par with even the top range of AMD's APUs (the 4770K is around 10-15% faster than Trinity, and Richland is marginally faster than Trinity [sometimes slower], so it looks like the 8650G is barely faster than the HD 4600).

The only comparison point we have here for now is 3DMark 11 (sadly :/), and that puts Richland at a 29% advantage over the HD 5000 in the MBA.

Again, the peak theoretical performance difference between the HD 4600 and the HD 5000 is 63%. Between the HD 5000 and the HD 5100 there's only 18.3%. Even taking throttling into account, the best I can see the HD 5100 doing is being on par with Richland.
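A quick back-of-the-envelope on those figures (a hedged sketch using the commenter's own numbers, the 3DMark 11 lead over the MacBook Air's GPU and the claimed 18.3% peak-throughput gap, not independent benchmark data):

```python
# Estimate Richland's lead over HD 5100 by discounting its measured lead
# over HD 5000 by the HD 5100's claimed peak advantage. All inputs are the
# commenter's estimates, not official specs or measured results.
richland_vs_hd5000 = 1.29   # Richland's 3DMark 11 lead over the MBA's HD 5000
hd5100_vs_hd5000 = 1.183    # claimed peak-throughput uplift of HD 5100
richland_vs_hd5100 = richland_vs_hd5000 / hd5100_vs_hd5000
print(f"best case for Intel, Richland vs HD 5100: "
      f"+{(richland_vs_hd5100 - 1) * 100:.0f}%")  # about +9%
```

Under those assumptions Richland keeps a single-digit lead even in Intel's best case, which is what "on par at best" amounts to.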

Also take into account driver optimization, and how heavily games are optimized for AMD's and NVIDIA's GPU architectures.

Still, with Richland we're still talking about VLIW; with GCN (Kaveri), AMD will gain ground on Intel.

If done right, AMD can get this performance into the 15-20W targets and beat Intel at 28-35W with fewer problems in CPU-intensive games.

Pricing is definitely a legitimate concern... too bad we have no clue how much AMD's mobile parts even cost, isn't it? They don't even publish recommended pricing. If we compare similar models that differ only in their processor manufacturer (Lenovo Edge E531 vs. E535), then you're talking a $50 premium for the Intel chip on the base offering... though Intel's recommended pricing is the same for the upgraded i5-3230M as for the base i3-3110M, so let's figure that the i5-3230M actually costs that recommended $225. In that case we arrive at roughly $100 for the base AMD Trinity and then something around $200 for the A10.

Regardless it doesn't matter too much. For comparison, Iris Pro models start at a recommended price of $440 for the i7-4750HQ. And then all the non-i7 28W Iris parts have a recommended price of $342. But yeah, who knows what actual prices are? The i3-4158U (28W Iris 5100 part) may well be cheaper than AMD's A10-5750m. And as esgreat already stated, expect Iris 5100 to be quite a bit faster than HD 5000.

I'm pretty sure AMD is giving lower prices to OEMs (else they would get almost zero design wins), and many OEMs will try to get extra margins from the AMD models. Also, don't underestimate $50. In developing countries (being from one myself), what looks like "50 bucks" makes a huge difference for the average Joe.

It's the difference between the guy actually buying the AMD one or not.

Oh, I wouldn't be surprised if AMD is giving lower prices to OEMs for their low-end models. But they want to make more money on their 'high end' SKUs same as Intel. Heh, and if Intel's i3-4158U outperforms AMD's top end mobile offering then I'm sure that the AMD part will be priced a bit lower than it. But it's hard to claim that AMD has a massive lead in price/performance if that's the case. Regardless, it's all speculation since neither AMD nor Intel give real pricing information for mobile parts.

However, we can see it like this: maybe both sell their SKUs at the same price (say 5% either way, whoever gets it), and the OEMs pocket those "50 bucks" just because it says Intel (there is a lot of change changing hands); then it's not Intel's pricing problem :P

Apologies. I actually missed that 47W Haswells were considered "mobile" when I read about it the first time. Still, given the price and TDP disparity, I don't think comparing Iris Pro to Richland is terribly interesting.

We never needed Bulldozer to begin with. AMD shot itself in the face with Bulldozer, just after its former CEO hoodwinked the shareholders that stuck with AMD, and the company itself, by spinning off the manufacturing arm as GlobalFoundries just as it was becoming a viable, moneymaking business.

We needed Bulldozer gone a long time ago. AMD, and thus the rest of us, needs a new, much less power-hungry architecture that can fit into an SoC-like structure on the low end and make Intel stop piddling around on the high-TDP end. But unless both Kabini and the PS4/Xbone do very well this holiday season, AMD might not get a chance to produce that.

Still, maybe Qualcomm will be up for it. I could see them even buying out the GPU division if AMD goes under. And their updated Krait this year is perfectly competitive with ARM's new Cortex-A15 AND Apple's Swift. Maybe they can just keep getting better and start to put pressure on Intel from the bottom of the power scale up.

But where are the notebooks with *just* a Richland and no discrete graphics? If I get an A10 I'll get it for the 'good enough' performance at a low price using just the integrated GPU. Does anyone make such a notebook, preferably one that's not a piece of crap, build-quality-wise?

All the "we need AMD" shtick is getting very long in the tooth. Maybe it wouldn't be so offensive if authors were able to provide some scholarly references to back up the claims that consumers are going to really be hurt by AMD's failure to compete. There are certainly plenty of industries with many vendors where competition isn't a huge factor in pricing, even in the tech sector. I'm nowhere near convinced that AMD should somehow derive credit for Intel's progress, and the baseless whining in this article has done nothing to convince me otherwise.

Wow... somebody forgets the "good ol' days" of the '90s, when Intel basically owned the x86 market outright and routinely released its latest chips at around $900. For example, the original Pentium in the "cheaper" 60MHz variant was released in 1993 at $847. And no, that was not an "extreme edition" either. In today's dollars that's close to $1,400. That is the x86 world without a viable competitor to a greedy Intel.

That was also a world where $3000 desktops were in "reasonable high-end" space, not "if you don't have a serious business case where you're maxing out the resources on this thing -- and you probably don't -- only buy it if you've got more money than sense" space.

AMD was only a viable competitor to Intel from the trailing end of the P3 era to the Core 2 launch. If Intel was going to jack up their prices when AMD stopped being a viable competitor, they've certainly taken their time at it. They released a dominant product 7 years ago, have only increased and broadened their performance lead, and still aren't doing it.

I haven't forgotten those heady Socket 7 days in the least. As I recall, one could buy x86 chips from IBM, Cyrix, AMD, and others. The $2000+ machines you're talking about were perhaps not marketed as "extreme", but they certainly performed remarkably well compared to the nearly as expensive 486 machines from Intel and others that they slowly replaced. Fast-forward 20 years, and we're down to two manufacturers and CPU prices are pretty much at an all-time low. So, where's the correlation? Meanwhile, there are a dozen different motherboard manufacturers, and prices have been rising like mad during that same time period. Again, where's the correlation?

If having a large number of vendors automatically precluded ludicrous pricing, there'd be no such thing as price fixing.

I would have liked to have seen a system running dual-channel 1866 memory, since that would have offered an additional small boost to graphics performance. I'm surprised how much this evolutionary development over Trinity results in significant performance gains. Waiting for Kaveri now.

Graphics performance will at best be slightly above parity, while CPU performance takes a bath.

After all, Intel's HD graphics in this very article is roughly two times slower than AMD's APUs' (while the gap between the CPUs is about 1.5x).

This means that if you occasionally play games, you should avoid Intel notebooks without dedicated graphics cards, while you're fine with AMD's without one. And I have yet to find an app, besides games, that I would run on a notebook that would seriously benefit from a faster CPU.

OK, let me just say something: all these sites say the new CPU is the same as Trinity's, but it's not. Richland has improved the CPU and integrated GPU so much that it's comparable to a midrange desktop. I would know; I upgraded not too long ago and the speed increase is about 60+ fps in my games. P.S. I do not have a dedicated video card in my computer.

I just bought an HP AMD A10 laptop: 2.5GHz CPU, 8GB of RAM, 1TB hard drive, ATI Radeon 2500 with 768MB of memory, 8MB of L2 cache, Bluetooth, multi DVD writer, 2x USB 3.0, USB 2.0, glossy screen, 5-hour battery life, HDMI port, 10x card reader, loaded with Windows 8. I bought it at Future Shop; there were only 10 units available, for $399.00 + tax = $480.00. This laptop retails on the web between $650 and $700. How is that for a great bargain? It does not overheat, I leave it on all day, and I play the most demanding games at medium resolution. For this price it does not get any better. webcat62

AMD needs to bring something new to the mobile APU market ASAP. If this APU were compared to a portable with Intel's 35W Haswell processor with HD 4600 or even HD 4000 graphics, the massive lead of the APU in 3D games would disappear. I mean, the A10 may still be a little faster, but not by a truly significant margin. At best, it competes with Haswell i3, which will be priced aggressively, considering Haswell i5 portables can go for $600 or less.

No, we don't. Maybe you need it, but the majority of the computer market certainly doesn't. This is why the desktop/laptop/notebook market is dying rapidly: people only upgrade their machines either when the machines die (because a component fails) or when they miraculously find themselves in need of a much faster machine. Short of that, all they do is buy a new one when they need a new one, and the hardware is plenty fast enough for most needs most of the time. The big issue becomes price, because there's no need to spend $2,500 for a top-of-the-line laptop. Oh sorry, $1,000. I'm thinking of 2000 prices, 2005 maybe.

Phones. Tablets at most. That is where the market is. 35W laptops are an afterthought, especially running Windows 8, especially for gaming.

Currently I'm considering the MSI GX60 3CC Destroyer laptop that comes with an AMD A10-5750M and a Radeon R9 M290X... most reviews say it is not recommended for gaming because the A10-5750M bottlenecks the Radeon R9 M290X... some say it is only suitable for single-player games, not online gaming/online multiplayer... Any opinions on this laptop's specs?