Hopefully prices come down a bit; right now the only A10-4600M laptops I can find are going for over $700. They're decent chips overall, but I'm not convinced they're better than a dual-core Sandy Bridge with GT 540M. The Acer I used is clearly not the best representative of that market, as the 13.3" chassis is quite thin and just can't cool the CPU+GPU well enough to avoid throttling; pretty much any 15.6" chassis should do better.

It seemed to me that the Llano laptops were marketed the same way. The A8 always seemed to be considerably more expensive than the A6 model, when the chips could not be that much different in cost. However, the A8 was usually better equipped, with more RAM.

Pretty cool that the A10-4600M provides the same performance as a Sandy Bridge with GT 540M at half the energy consumption. It's been two weeks since Trinity's launch, and it's too bad these laptops still can't be found.

For people planning to play into Hell or Inferno difficulty: be aware that elite monster packs will have 3 magical abilities on Hell and 4 on Inferno. You could also run into two or more packs of monsters at once, so you could be looking at 6, 8, or more magical effects. Also, some of the monster abilities cause them to replicate. I've definitely seen 30 or more monsters at once with the entire screen covered with fire, poison, and lightning.

I don't play on a laptop or use an IGP, but I assume this could have a negative impact on performance. Normal mode and even Nightmare mode would probably not be too bad, though.

I'm in Nightmare Act 4, and in the Keep (the Tristram equivalent, the home base) I get over 100fps. Taking on average mobs outside the Arreat Gate drops fps to 80. And that's on a 6950 at 1920x1200 with everything maxed.

On my laptop with an 8600M GT I could play fine in Normal until I got to Act 4. I can still play, but it's annoyingly choppy when there are lots of mobs. The 8600M GT is listed as supported at 'Low' on the Blizzard site.

Problem is, to test on Hell I have to play through all of Normal, then all of Nightmare. I know people who have already done that, sure, but I only got the game two days ago, and I have a family and a life outside of playing games. Hence the disclaimer at the beginning. I'll update the text to mention slowdowns in later areas.

There is a basic reason why the game runs so well in Act 1 Normal. Play through Act 3 Hell, then come back and redo your review. Only the 650M has a chance of playable frame rates in those levels, and we haven't even covered multiplayer. My 7870, overclocked to 1100MHz, has some slowdowns in those levels under some high-stress scenarios, and the game basically becomes an absolute nut-fest in later difficulties. People will want to play through the later difficulties; it's part of the game's progression. Now, I get that it's hard to benchmark through the randomness, but you can make subjective comparisons or do several run-throughs. I can say with absolute certainty that none of the APUs have a chance at playable frame rates in the scenarios where it will matter. D3 is a very unforgiving game; it can take a split second to die, so smooth frame rates in non-Normal difficulties are essential.

Tell you what, guys: email me your account login and password and don't play the game for a day, and give me instructions on a good stressful area to play on Hell difficulty, and then I can test that area. Otherwise, I simply don't have the 40+ hours needed to get to that point in the game in less than a week.

And in case it's not clear, I'm mostly joking here. I've got several items I'm working on reviewing that are going to be higher priority than revisiting Diablo III performance in later acts. Perhaps this summer I'll have a chance to go back, but by then it won't really matter that much. So I'd suggest taking these figures as a way of getting relative performance from the various GPUs/IGPs, and then extrapolate from there. If you need to play on Hell difficulty on a laptop with maximum details enabled, you're probably going to want at least a GK107 dGPU (or perhaps Southern Islands).

hehe, but it's not that hard - you don't even have to be on higher difficulty levels - it's only ACT 1 OF NORMAL, which is more like a tutorial and considerably less populated (and Task Manager is claiming ~300MB RAM, which increases up to 1GB later, still on Normal)

all the things mentioned later, like freezing monsters, duplicates, or 100+ creeps on screen, are happening on Nightmare also, and even in late Normal, so it shouldn't be that much of a bother...

(joking of course, but you could give me YOUR user/password and authenticate it with one of those mobile apps while on chat, and I could level you up pretty fast - I've been playing since Diablo 1. Being on Normal, you don't have much to lose; I'll even leave you some nice gear to start Nightmare with - seriously, we're talking a few hours' work)

and it's all in good faith, since I don't play D3 on a laptop anyway :)

We do have a couple people playing the game, so at some point we'll be able to test later levels. Give me a chance to: A) have a holiday (today), B) write a few other articles, C) enjoy the game without more benchmarking of it (which totally kills the fun!). Probably in a week or so I can come back with results from Act II or later.

My diplomacy skills are of the Europe 1914 level; the odds of my being able to sweet talk someone I don't know well into anything are slim to none.

Better results in a week or so isn't that bad a delay. I'm just mildly frustrated since I've had a few people ask what sort of hardware they need to play the game, and it seems that all the numbers I can find are from very early in the game and thus not representative of what's really needed.

Actually, Metro/WinRT won't be used for gaming. If you want a restricted environment, there's already XNA. Games will be too difficult to port, with too little incentive or anything to gain, to target the WinRT framework (or Windows Runtime, as they call it). Game developers will never target Metro/WinRT if they don't have to, and on x86 machines they don't: the desktop is there, and you can still build for Windows 7 and earlier, where most users are. Not much will happen here until the next-gen consoles, either. Plus, Macs have gotten a whole lot better in that department, and plenty of game engines are available now. Taking those kinds of engines and porting them to C++/WinRT isn't something taken lightly; it probably won't be possible without a rewrite, which defeats the purpose. The performance wouldn't be good, and the sandbox is probably too restrictive. In practice that makes it a more restrictive environment than the sandboxed mobile OSes: several mobile OSes run Firefox, for example, but WinRT never will. WinRT won't even run IE.

Did I miss the part where you talk about using an external monitor, or how else were you able to run all of these GPUs at all three resolutions? I'm not saying the data isn't important, as it could be relevant to different notebooks that use the same or similar hardware just with higher-res screens.

Also, I've played this game on an old desktop with a GTX 285 @ 1080p and everything turned up. While that is fairly smooth and playable, I still get quite a few moments of "stuttering" in Hell difficulty. I also play on basically the same Acer notebook with the GT 540M, and even at the lowest possible graphics settings and resolution in Normal mode, it's hard for me to characterize that performance as anything other than horrible in comparison to the desktop.

Yes, all of the higher than 1366x768 results were done on an external LCD where required (which was the case for Llano, Trinity, VAIO C, TimelineX, and Vostro; the other laptops had 1080p displays, except for quad-core SNB which has a 1600x900 LCD and I didn't run the 1080p tests).

Good review for what it is, but I think it could have been a little more complete with some additional information:

1) Use Act 3 Bastion's Keep for the "intensive" case instead of Act 1 Old Town. I think this would be more representative of the game's peak demand. (Probably just a run-through of the signal fires quest, since it's easy to get to.)

2) Include a brief section on how much of an impact additional players have on the game. I find it can actually be quite significant. This doesn't have to be a full-depth review, just a quick look.

Overall, I'm using an A8-3500M + 6750M CrossFire (overclocked to 2.2GHz) @ 1366x768, and my framerates during battles (i.e. when it counts) average about 1/2 to 1/3 of what the reviewer posts, because the game gets much more intensive than Act 1, and having a party also slows it down significantly compared to solo.

I have been testing this out on my W520 for the sake of seeing what I can do to play Diablo and maintain decent battery life.

For what it is worth, turning off shadows and playing @ 1366x768 on the HD 3000 results in roughly 28fps - more than enough to play the game through the first difficulty, anyhow. I have been using this for some time now with 4 players in game. When running @ 1080p, it dips down into the low 20s and occasionally is a problem in Act 3, so I wouldn't suggest it.

Point is, though, that anyone who has a notebook with SB and no video card CAN still play this game, even if it isn't ideal.

Given this is a cross-platform game, it would have been interesting to provide Mac results with similar hardware. I play using a GT 330M and an i7 dual core, and it runs pretty well. I'd like to see how it stacks up to the latest AMD chips and HD 3000 on a Mac.

Anecdotally, Brian and Anand have both commented that Diablo 3 on a MacBook Pro under OS X runs like crap. I'm not sure if they're running on latest generation MBP13 or something else, though, so that's about all I can pass along.

Was there any doubt? OS X is severely lacking in graphics driver support. Apple never gave a rat's rear about this crucial aspect of gaming support. They are always late with drivers and with the latest OpenGL spec.

The recommendations/minimum requirements on Macs are discrete graphics with good drivers, though; i.e. no NVIDIA 7300/7600, ATI X1600/X1900, etc. The starting point is the 8600 GT. Obviously no integrated Intel graphics is enough there. OpenGL 3.2, or OpenGL 2.1 with extensions, should be fine for game developers, and the drivers handle it; NVIDIA and AMD can put in performance improvements if they have the feedback. They could even launch their own "game edition" card for the Mac Pro with their own drivers, outside of Apple's distribution channel; NVIDIA releases drivers on their site from time to time. That said, both the game engine port and the drivers are a bit less optimized than their Windows and Direct3D counterparts. The drivers are quite robust and well working, but might not be that fast. It's mainly a problem for the developers today, though, as most Macs have somewhat decent graphics with maintained drivers and support pretty much all the features you need anyway.

The OS is very dependent on OpenGL, so the support itself is decent and fairly up to date, even if it is not at OpenGL 4.2/3.3 yet. The latest OpenGL 4.2 is not supported by much of the hardware Apple uses anyway: the R700, R600, GF 8M, GF 9M, and their desktop versions don't support more than OpenGL 3.3, which itself is a backport of as much as possible, so 3.2 is a decent level there. Apple always supports the whole API in the software renderer too, so they have no joy hunting the latest features, though the vendors can use any extensions they wish to add those features, and all the supported GPUs support the API too. It's a lot better driver support than for, say, Intel graphics on Linux, and in some regards even on Windows: Intel's Windows drivers don't support OpenGL 3.2 yet, let alone 4.1/4.2.

Some people are severely space limited; others are casual gamers who don't play enough to justify having two computers. As a very mass-market game, Diablo III will be selling a huge number of copies to people in the latter group.

Many people have both laptops and desktops, and regardless of the desktop obviously being superior for gaming, they'd still like to be able to game a bit on their laptop when they're out and about and don't have access to their desktop.

Then, there are people who don't have the space for a desktop, or simply prefer the freedom of being able to move around their home but would still like to play games.

The question is, why do other people's usage models bother you so much? You don't care about the gaming ability of laptops. Fine. Don't pay attention to articles about it. Meanwhile, there are plenty of others, such as myself, who are highly interested.

I am confused about the GT 540M vs. the GT 630M. Isn't the GT 630M just a rebadged GT 540M with higher clocks? Or does it use the new Kepler architecture? I believe only the 640M and above have the new architecture.

The GT 630M is technically a straight-up rebadge of the GT 540M. However, the GT 630M in the ASUS N56VM that we have shows clocks that are quite a bit lower than other GT 540M GPUs. Strangely, those lowered clocks don't appear to matter much in Diablo III, so either NVIDIA's control panel (and GPU-Z) are reporting incorrect information, or the shader cores aren't as demanding as you might expect.

Correct. The Acer is definitely not hitting Turbo speeds, let alone the base 2.3GHz clock. So the combination of faster CPU + slower GPU works out in favor of the N56VM in this instance. I really wish I had a good way to test the N56VM with the fully clocked GT 540M, though.

Google 'ThrottleStop'. Many Acer Aspire users who experience throttling problems in games use it to prevent the underclocking. Note that it will get hot, though, and you might want to raise the back of the laptop off the desk with a binder or something to improve airflow (intake).

Funny enough, I actually reran tests with ThrottleStop already and it just didn't seem to help. I'm not sure what's going on, but even with TS enabled and set to a 20x (or 21, 22, 23, Turbo) multiplier, the system still seems to just plug along at the 800-1500MHz range during testing. I've done everything I know of to make it run faster, and it just doesn't seem to matter. But this particular laptop has always been a bit quirky; I might try looking for a new BIOS again just to see if anything has changed, but I doubt it.

And funny enough, after additional investigation, the issue isn't throttling on the Acer but rather a higher clock on the GT 630M compared to the GT 540M. NVIDIA's updated specs page for the 630M lists 800MHz as the clock, but oddly their control panel is only reporting 475MHz on the ASUS. According to GPU-Z's Sensors tab, however, it really is running an ~800MHz core clock (1600MHz shaders), which accounts for the higher performance compared to the 672MHz GT 540M. I've updated the text in the article to explain this.
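For anyone curious, a quick back-of-the-envelope check shows the clock delta alone is roughly enough to explain the gap. This is just a sketch using the clocks quoted above, assuming performance scales more or less linearly with core clock on the same GF108 core (which ignores memory bandwidth and other factors):

```python
# Rough sanity check: both GPUs use the same core, so performance
# should scale roughly with core clock (memory clocks aside).
gt630m_clock = 800  # MHz, per GPU-Z's Sensors tab on the ASUS N56VM
gt540m_clock = 672  # MHz, stock GT 540M core clock

clock_advantage = (gt630m_clock / gt540m_clock - 1) * 100
print(f"GT 630M core clock advantage: ~{clock_advantage:.0f}%")
```

That works out to roughly a 19% clock advantage, which is in line with the performance delta we measured between the two laptops.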

No offense, but enthusiasts (in the real sense of the word) are always more extreme than your average MBA-wielding blogger. If they wanted something light, they would spare no expense and would have gone with a Vaio Z, some crazy Japanese Fujitsu made of lithium alloy, or a modded Sony UX. PC hardware enthusiasm has nothing to do with Apple commodities that try to be as "safe" as possible.

Suddenly? They were never marketed as gaming rigs; most don't even have dGPUs, and Diablo 3 isn't even one of the 5 most demanding games this year. I dunno what you're getting at; ultrabooks are still great for the purpose they're meant for. Can you get just as much done with an uglier/thicker/heavier $700 laptop? Sure, you might even get a dedicated GPU to go along with it... They're serving entirely different markets, though.

I have to stick to a subset of the possible resolution/detail settings or I'd be testing a single game 24/7 for a week. I've already spent probably 20 hours benchmarking Diablo III, and let me tell you: running the same three minute sequence at least a dozen times per laptop gets to be mighty damn tedious. I did run tests at some other settings, which I commented on I believe, but here's a bit more detail.

For example, on the N56VM, 1080p with all settings maxed but Shadow Quality set to Low results in performance of 20.1 FPS/18.5 FPS for our test sequences -- so that one setting boosted performance by over 50% compared to having all settings at High/Max. What's more, I also tested at max detail 1080p but with Shadow Quality set to Off, and the scores are 27.1/24.8 -- another 35% improvement over Low shadows. Everything else combined (e.g. 1080p but all other settings at low) only accounts for probably 20%. I could test that as well if you really want, but I have other things to do right now.
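If you want to see where those percentages come from, here's the arithmetic spelled out, using the first number of each FPS pair quoted above (the implied High/Max baseline is an inference from the "over 50%" figure, not a separately measured result):

```python
# Percent improvements from the Shadow Quality results quoted above
# (ASUS N56VM, 1080p, all other settings maxed).
def pct_gain(new_fps, old_fps):
    """Percent improvement of new_fps over old_fps."""
    return (new_fps / old_fps - 1) * 100

low_shadows = 20.1  # Shadow Quality = Low
off_shadows = 27.1  # Shadow Quality = Off

print(f"Off vs. Low shadows: +{pct_gain(off_shadows, low_shadows):.0f}%")
# The "over 50%" gain for Low vs. High/Max implies a baseline of
# roughly 20.1 / 1.5 = 13.4 FPS with shadows fully maxed.
```

The Off vs. Low result comes out to about +35%, matching the figure in the comment above.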

I'm mostly thinking that a large majority of laptops sold, even now, have 1366x768 displays. It looks like all of the non-Intel laptops handle playable framerates with low detail at that resolution, so I'm curious how that performance falls as the detail goes up.

In particular, can Llano and Trinity handle high detail at 1366x768? They are (or will be) sold in budget laptops that won't get high-res screens.

However, I understand the time constraints you're working under. Thanks for the comparison, anyway.

I agree with this. I understand time constraints, but honestly, the paradigm that's being followed here (and with a lot of reviews) is simply not representative of real-world usage. It's not the case that people play with low details at low resolutions and high details at high resolutions. *Especially* when you're dealing with laptops. Generally, you're going to have the resolution at the display's native resolution, and going to work with the settings from there.

In any case, the article is still appreciated, and it's possible, at least, to make an educated guess at how the game will run at various resolutions and settings based on the presented info. I'm definitely going to grab myself a nice Trinity-powered laptop as soon as one meeting my desired specs comes out.

Also, yet again we see that HD 4000 does not match Llano, let alone exceed it, as I've seen some people claiming.

This is the whole purpose of having three different settings, discussing what settings we selected and why, etc. Consider the Value setting a "near-best-case" result while still looking decent; in this case, the only thing you can really do to further improve frame rates is to turn off shadows and/or lower the resolution further. If you look at our Mainstream results, you can see what happens as you start to turn up the dials, and the same goes for Enthusiast. I've discussed in the article exactly how much the various elements impact performance, going so far as to include additional results at "Enthusiast 1080p" but with Shadow Quality on Low/Off.

If someone can't get at least a decent idea of where to start in terms of settings and what to expect from their laptop hardware with the information in this article, I'm not sure what I could do to help the situation. Hold their hand and walk through each and every specific setting? Because that tends to come off sounding very condescending if I write that way, and I think most people who care enough to read our articles are much smarter than that.

Fair enough, I understand that. However, I'm not suggesting you write in a hand-holding, condescending manner. Just having three bars on the graph for each resolution (one bar for value, mainstream, and enthusiast settings) would be fine. I understand the time constraints, though, as I said. That would be the ideal, however.

You won't have to worry about that soon for NVIDIA-chipped laptops, as NVIDIA is rolling out automatic best-gameplay settings in their drivers. That's going to be a wonderful thing for the majority of gamers and laptop users who don't have a clue about game settings - I hope it helps increase the user base so computer games overall gain strength.

AMD needs to follow suit quickly, to help all of us with a larger user base, instead of being stupid and lame on the driver side as usual. Of course, I'm scowling at the idea AMD could possibly man up in that area.

Thanks for that bit of info; ignore the fanboy and continue your observations, please. As we were already told by so many users here in the last card reviews, they have 1920x1200 monitors, which are by no means rare, and "all real enthusiasts" have sought them.

So the information you have there is very valuable to all the AMD fans here who own their 1920x1200 displays, which only lost to the new NVIDIA flagship by 9% at that resolution instead of the 14% overall loss at 1920x1080, which Anand doesn't show.

Please ignore the sniping, cursing rude person and continue the observations, as that one surprised me.

"What that means is cutting edge technologies like DirectX 11 aren’t part of the game plan; in fact, just like StarCraft II and World of WarCraft, DirectX 10 isn’t in the picture either. Diablo III is a DirectX 9 title, and there should be plenty of GPUs that can handle the game at low to moderate detail settings."

WoW got a major graphics upgrade with the Cataclysm expansion pack, and it is now one of the few DX11-capable MMOGs released. Your overall point is valid, in that Blizzard makes games so that people with lower-priced systems can play them, but it's a bit out of date when it comes to WoW.

Anyone old enough will remember those articles about mobile graphics: how they sucked, how every year we were supposed to get a 50-100% performance improvement, how Quake 3 didn't work and we could only play SimCity 2000.

And by today's standards, Diablo 3 isn't even groundbreaking in terms of graphics. And yet most of these laptops can't even play the game at an acceptable frame rate (30fps) - and we are already excluding any of the Act 3/4 loads in the game.

And we even have Retina Display resolutions coming. We are talking about 2-4x the pixel density.

I really do hope Haswell will provide 3x the performance of the top HD 4000 numbers. That way we could push things along and ensure that every time I select discrete graphics in a notebook, I am guaranteed at least decent graphics performance.

BTW, that's the reason why consoles are better than PCs for gaming. You invest once, and it guarantees (unless it's an Xbox 360 with the Red Ring of Death) that you will be able to play all available games for as long as you have it. For the price of a console you cannot even buy a good graphics card.

What's your intended resolution and what level of detail are you willing to run at? If you're okay with 1366x768 and Low Shadow Quality, you should be able to play through at least Normal and Nightmare difficulty on any Llano, Trinity, or possibly (if you're tolerant) HD 4000 laptop. If you want higher quality settings or a higher resolution, you'll want probably something with at least a GT 630M level GPU.

It's a bit larger (and should therefore run a bit cooler/better) than the 3830TG used in the benchmarks for this article. At a price of $600, I don't see Trinity A10 surpassing it any time soon, though I do suspect the number of 4830TG units currently available is all that's left, so they might go out of stock in the next month or so.

Secondly, many gamers now are on laptops, not by choice, but by necessity. There are quite a few gamers who are, in fact, space-limited and simply don't have the room for a full desktop setup. I am actually one of those at the moment. I will have more space in the future, at which point I'll get a desktop, but I don't have enough space right now. That is why a laptop is ideal for me. A laptop also has almost everything integrated and makes it easy to be mobile; speakers, trackpad, keyboard, and screen are all in one unit. You can't be lugging a desktop around everywhere. If you're going to a friend's house or visiting somewhere, a laptop allows you to game on the go.

Lastly, laptops are all about cooling. An Acer that's throttling is not going to cut it. The Act 1 benchmarks are not realistic; in Act 3 or 4 with tons of mobs on screen, both the CPU and GPU will be stressed a lot more. A properly cooled and properly designed laptop should be hitting max turbo speeds almost always and should not be throttling at all; properly cooled, the laptop should be running at minimum on base clocks when hooked to the A/C adapter. Gaming on the battery is a bad idea; gaming should be done hooked up to the adapter when possible.

With an i5 or i7 hitting max turbo clocks, combined with a GeForce 540M or 630M, D3 should run smoothly at medium/high settings even in Act 3 or 4. If your laptop is throttling, then of course that's a different story. So in the end, it is possible to game pretty well on a laptop, as long as the laptop has strong cooling.

You'll note that with further investigation into the performance, it does not appear that the Acer is throttling. It simply isn't hitting max turbo during testing because the game doesn't require it. The performance of the much faster CPU in the N56VM is never more than 20% faster than the Acer, and that accounts for the GPU clock speed difference.

As for later acts, give us a bit of time and we'll return to the benchmarks with results from late in the game. We have some other stuff that's higher priority, but we are aware of the fact that the Act I numbers are not fully representative of Diablo III performance. It will probably be a couple weeks, though.

Hey, it pays to work in the industry. One of our hardware contacts managed to get me a code -- actually had to buy a box, open it up, photograph the key, and email that to me. Hahahaha... Something about the address on my Battle.net account not matching the billing address for the CC, so that was just easier than trying to figure it out. Thank goodness for that as well, as there's no way my wife (with a newborn) would be letting me buy Diablo III.

Well, just to give some perspective on the low-end laptops: I try to play D3 on an HD 3200 with a 2GHz dual-core AMD, and it's pretty horrible most of the time. I even have it set to 800x600(!) and get about 10-20fps... I'm looking for a better but inexpensive upgrade. I have a Q6600 & HD 4870 desktop that runs the game pretty well with all settings low or off at the highest resolution, and it looks great and runs OK. I'm wanting to get a laptop with a 6750M...

HD 3200 is sadly very old by today's standards; it's actually not much better than Intel's Arrandale HD Graphics (original Core i3/i5 dual-core laptops). HD 3200 was fine when it came out in early 2008, but then AMD didn't release a significant update (just similarly clocked HD 4200/4250/4290) until the launch of Llano last June. That's over three years without a real improvement in IGP performance, which is pretty much an eternity for GPUs.