There have been a few recent product launches, with more to come in the near future, from AMD, Intel, and NVIDIA. On the CPU side we have Intel’s Ivy Bridge and AMD’s Trinity, both arguably more important for laptop users than for desktops—and in the case of Trinity, it’s currently laptops only! Both products tout improved performance relative to the last-generation Sandy Bridge and Llano offerings, and in our testing both appear to deliver. Besides the CPU/APU updates, NVIDIA has also launched their Kepler GK107 for laptops, and we’re starting to see hardware in house; AMD likewise has Southern Islands available, but we haven’t had a chance to test any of those parts on laptops just yet. With all this new hardware available, there’s also new software going around; one of the latest time sinks is Blizzard’s Diablo III, and that raises a question in the minds of many laptop owners: is my laptop sufficient to repel the forces of Hell yet again? That’s what we’re here to investigate.

Before we get to the benchmarks, let’s get a few things out of the way. First, Diablo III, for all its newness, is not a particularly demanding game when it comes to graphics. Coming from the same company as World of Warcraft and StarCraft II, that shouldn’t be too surprising: Blizzard has generally done a good job at ensuring their games will run on the widest array of hardware possible. What that means is cutting-edge technologies like DirectX 11 aren’t part of the game plan; in fact, just like StarCraft II and World of Warcraft (note: I'm not counting the DX11 update that came out with Cataclysm), DirectX 10 isn’t in the picture either. Diablo III is a DirectX 9 title, and there should be plenty of GPUs that can handle the game at low to moderate detail settings.

The second thing to bring up is the design of the game itself. In a first person shooter, your input is generally linked to the frame rate of the game. If the frame rate drops below 30 FPS, things can get choppy, and many even consider 60 FPS to be the minimum desired frame rate. Other types of games may not be so demanding—strategy games like Civilization V and the Total War series for instance can be played even with frame rates in the teens. One of the reasons for that is that in those two titles, mouse updates happen at the screen refresh rate (typically 60Hz), so you don’t feel like the mouse cursor is constantly lagging behind your input. We wouldn’t necessarily recommend <20 FPS as enjoyable for such games, but it can be tolerable. Diablo III takes a similar approach, and as a game played from a top-down isometric viewpoint, 30 FPS certainly isn’t required; I have personally played through entire sections at frame rates in the low to mid teens (in the course of testing for this article), so it can be done. Is it enjoyable, though? That’s a different matter; I’d say 30 FPS is still the desirable minimum, and 20 FPS is the bare minimum you need in order to not feel like the game is laggy. Certain parts of the game (e.g. interacting with your inventory) also feel substantially worse at lower frame rates.

Finally, there’s the problem of repeatability in our benchmarks. Like its predecessors, Diablo III randomizes most levels and areas, so finding a section of the game you can benchmark and compare results between systems and test runs is going to be a bit difficult. You could use a portion of the game that’s not randomized (e.g. a town) to get around this issue, but then the frame rates may be higher than what you’d experience in the wilderness slaying beasties. What’s more, all games are hosted on Blizzard’s Battle.net servers, which means even when you’re the only player in a game, lag is still a potential issue. We had problems crop up a few times during testing where lag appeared to be compromising gameplay, and in such cases we retested until we felt the results were representative of the hardware, but there’s still plenty of potential for variance. Ultimately, we settled on testing an early section of the game in New Tristram and in the Old Ruins; the former gives us a 100% repeatable sequence but with no combat or monsters (and Internet lag is still a potential concern), while the latter gives us an area that is largely the same each time with some combat. We’ll be reporting average frame rates as well as providing some FRAPS run charts to give an overall indication of the gaming experience.

And one last disclaimer: I haven’t actually played through most of Diablo III. Given what I’ve seen so far, it would appear that most areas will not be significantly more taxing later in the game than they are early in the game, but that may be incorrect. If we find that later areas (and combat sequences) are substantially more demanding, we’ll revisit this subject—or if you’ve done some informal testing (e.g. using FRAPS or some other frame rate utility while playing) and you know of an area that is more stressful on hardware, let us know. And with that out of the way, let’s move on to our graphics settings and some image quality comparisons.

Update: Quite a few people have pointed out that later levels (e.g. Act IV), and even more so higher difficulty levels (Hell) are significantly more demanding than the early going. That's not too surprising, but unfortunately I don't have a way of testing later areas in the game other than to play the game through to that point. If performance scales equally across all GPUs, it sounds like you can expect Act IV on Hell to run at half the performance of what I've shown in the charts. Give me a few weeks and I'll see if I can get to that point in the game and provide some additional results from the later stages.

Diablo III Graphics Settings and Image Quality

This is a laptop-focused article, and for good reason. [Spoiler alert: most desktops with a discrete GPU will be fine running the game; if you have a desktop built within the past five years with a DX9 graphics card, particularly if you purchased at least a midrange (~$150) card with your PC, then it’s very likely you can run Diablo III at 1080p with moderate to high details.] Earlier this year, we created a new set of standards for our mobile gaming tests. Running games at absolute minimum detail settings can often produce playable frame rates, but if the result looks like something from 2005 rather than 2012 in the process (StarCraft II, I’m talking about you!), it may not be an enjoyable experience. We decided to ditch our previous “Low” settings and instead settled on moderate, high, and maximum detail in the games we test, which we’ve labeled Value, Mainstream, and Enthusiast to avoid namespace conflicts. Our standard procedure is to test at 1366x768 for Value, 1600x900 for Mainstream, and 1920x1080 for Enthusiast, and we’ll continue that here.

Other than resolution, there really aren’t all that many dials to fiddle with in Diablo III, and many of the dials don’t dramatically affect performance. One of the biggest impacts on frame rate will come from the Shadow Quality setting, which has Off/Low/Med/High available. Clutter Density also has Off/Low/Med/High settings, though it doesn’t appear to impact performance nearly as much as Shadow Quality; the remaining settings are all limited to either Low or High, along with Anti-Aliasing (On/Off) and Low FX (On/Off—enable for a moderate increase in frame rates at the cost of effects quality). An interesting side note is that where many games take a pretty serious hit in performance when enabling antialiasing—particularly on lower end graphics hardware—that does not seem to be the case with Diablo III; even at 1920x1080 on integrated graphics hardware, we only saw about a 5-10% drop in frame rates with antialiasing enabled.

In order to differentiate our settings, we selected the following configurations. Our Value setting has everything set to Low, no antialiasing, and Low FX enabled. (You can still gain a few more FPS if you turn off Shadow Quality and Clutter Density, but we’ve skipped that as the lack of character shadows makes for a rather drab appearance.) For Mainstream, we switch most of the settings to High (the maximum), turn off Low FX, but put Shadow Quality and Clutter Density at Medium; antialiasing remains disabled. Our Enthusiast configuration has everything set to High (the maximum available), with antialiasing enabled. Or if you prefer, we grabbed screenshots of our settings (at 1600x900 for the captures, though the actual tested resolutions are as indicated):

So what does the game end up looking like at the various settings? We grabbed screenshots at our three detail settings and at 1600x900 resolution (so you can cycle between them and they’re all the same size), using Intel, AMD, and NVIDIA graphics hardware. You can see all of the images in the following gallery, and we’ll discuss image quality below.

As far as image quality comparisons between the three graphics vendors are concerned, there’s not much to discuss. Diablo III isn’t a graphical tour de force, and in our experience at least all three vendors produce similar/identical image quality. For that matter, even comparisons between our Value, Mainstream, and Enthusiast settings suggest the end results are largely the same. The big factor that’s immediately noticeable is the quality of shadows under characters/creatures. Low Shadow Quality gives a blobby shadow, Medium results in a more detailed shadow, and High gives the most accurate shadow. We’ve also included a couple shots at the end with High settings but with Shadow Quality at Low/Off; we’ll discuss what that does for performance later.

We also snagged a few more shots (using just one set of hardware, in this case an NVIDIA GT 630M), including one location showing the spell effects. The latter gives a better indication of how the “Low FX” option does, as the spell blast is missing some detail. If you’re not toting hardware that’s capable of handling maxed out settings, our first recommendation would be to turn down the shadow quality. The High setting looks nicer, sure, but in the heat of battle you’re unlikely to notice the detailed shadows. The other settings often have very little impact on performance, so unless you’re really running on low-end hardware, in most cases the only other item that will have a significant impact on performance is the target resolution. But let’s not get ahead of ourselves; on to the benchmark results.

Diablo III Mobile Performance Compared

So far we’ve determined that Diablo III isn’t a particularly taxing game, especially early on—at least not for your GPU; your mouse buttons might be a different story!—and that AMD, Intel, and NVIDIA graphics solutions deliver comparable image quality. The only question that remains is how quickly they can deliver that result to your display. We’ve used quite a few different laptops to see what sort of performance you can expect with Diablo III. Here’s the quick rundown.

First up, from AMD we have a Llano prototype with an A8-3500M APU and integrated HD 6620G graphics. There are faster-clocked Llano APUs in terms of CPU performance, but by default all of the A8 GPUs run at 444MHz with 400 Radeon Cores. Second is our Trinity prototype laptop with an A10-4600M (HD 7660G graphics), running 384 Radeon Cores at a substantially higher 686MHz clock. A third option from AMD is the discrete Radeon HD 6630M, and we tested three laptops with that GPU; first is the Llano A8-3500M APU, second is a Sony VAIO C with a faster Intel i5-2410M CPU, and third is a Sony VAIO SE with an i7-2640M. This will at least give us some indication of whether or not CPU performance is a factor in Diablo III performance.

Unfortunately, we do have to make a note on the drivers for the HD 6630M laptops: none of the three is able to run the latest AMD reference drivers, as they all use some form of switchable graphics. The prototype Llano system (with drivers from June 2011) can be excused, as there’s not much point for AMD to invest a lot of time improving the drivers or end user experience on that laptop, but Sony’s laptops continue to be a concern with their often-over-six-months-old drivers. The VAIO C is using drivers with a build date of June 2011 (released by Sony in October), while the VAIO SE is actually lucky as it had a driver update from Sony earlier this month; unfortunately, the driver build still appears to date back to December 2011. We didn’t notice any rendering issues with any of the 6630M laptops, but bear in mind that it’s possible performance is lower due to the outdated drivers.

From the Intel camp, we tested three different laptops. On the low end of the spectrum is a Dell Vostro V131 with i5-2410M CPU and HD 3000 graphics. We also tested with a quad-core i7-2820QM and HD 3000 graphics to see how much the slightly higher IGP clocks and significantly faster CPU matter with Diablo III. The third laptop is the ASUS N56VM Ivy Bridge prototype, with an i7-3720QM CPU and HD 4000 graphics. We do have a fourth Intel option on hand, an Intel Ultrabook with IVB ULV, but we can’t report the CPU model yet, and we’re not yet able to discuss its performance, so we’ll hold off on that for a few more days. Anand did test an ASUS UX21A in Diablo III and you can read his comments, but he used a different test sequence and again we can’t name the exact CPU he used, so stay tuned if you want to find out how dual-core (and potentially less expensive) Ivy Bridge matches up against Llano and Trinity.

Finally, from NVIDIA we’ve got the same ASUS N56VM with i7-3720QM, only this time we’ve enabled the GT 630M graphics. We also ran some tests with an Acer AS3830TG that has an i5-2410M CPU with GT 540M graphics. The Acer is known to have issues with CPU throttling in some games, but it does have higher clocks on the GPU than the N56VM, so this will give us some indication of how much—or how little—CPU performance matters with Diablo III. Finally, we also have a second Clevo W110ER in for review, this time from AVADirect, with an i7-3610QM and GT 650M graphics. Overkill for Diablo III? Most likely, but it’s an awfully compact laptop for that much hardware!

Here are the benchmark results; again, keep in mind that the in-town comparisons are using an identical FRAPS run whereas the Old Ruins area is slightly randomized as far as monster locations and quantity and is more prone to variance between runs. Note that we didn’t bother running Sandy Bridge HD 3000 at our Enthusiast settings with the i7-2820QM; it was already struggling at our Mainstream settings, and the i5-2410M results will tell you everything you need to know about how well HD 3000 handles maxed out settings.

Update: As noted earlier, many are saying the later stages and higher difficulty levels can really start to drop frame rates. Take the following graphs as a reference point, and plan on dropping some detail settings and/or resolution later in the game on lower end hardware.

There’s plenty of data to cover, so let’s just start at the top with the discrete NVIDIA GPUs. Not surprisingly, the GT 650M powers through Diablo III without any issues; even at maximum detail and 1080p resolution, it’s still pulling nearly 40 FPS. The second set of GPUs, the GT 630M in the N56VM and the GT 540M in the Acer AS3830TG, are in theory supposed to be roughly the same performance. However, we've seen in the past that the Acer sometimes has issues with throttling, so potentially the GT 540M is running with a thermally constrained CPU in the AS3830TG. The charts above clearly show that the Acer can’t keep up with the Ivy Bridge solution. Either Diablo III is very good at using multi-core CPUs (doubtful, given what we saw with Blizzard’s StarCraft II, not to mention a quick look at Perfmon with Diablo III), or the Acer is once again not hitting higher clock speeds.

Update #2: So it appears that the ASUS N56VM is not running a lower clocked GPU; in fact, the opposite is true. NVIDIA's control panel reports 475MHz on the GPU core, 950MHz on the shaders. I've been a bit confused about the performance since day one, but several other utilities reported 475MHz as well, including GPU-Z. Interestingly however, I just ran GPU-Z with the sensor logging option while doing a FRAPS run in Diablo III. Instead of 475/950MHz, the sensors tab is instead reporting 797.3/1594.7MHz. Mystery solved: the GT 630M in the N56VM is actually clocked almost 20% higher than the stock GT 540M. That would explain the differences seen above.
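As a quick sanity check on that conclusion, the arithmetic is easy to reproduce. Note that the 672MHz stock GT 540M core clock used below is an assumption taken from NVIDIA's published specifications, not something measured in this testing:

```python
# Rough sanity check on the GPU clock readings discussed above.
# 672MHz is NVIDIA's published stock GT 540M core clock (assumed spec);
# the other figures come from GPU-Z sensor logging on the N56VM.

reported_idle = 475.0      # MHz, what the control panel and GPU-Z report
logged_under_load = 797.3  # MHz, GPU-Z sensor log during a FRAPS run
stock_gt540m = 672.0       # MHz, reference GT 540M core clock (assumption)

advantage = (logged_under_load / stock_gt540m - 1) * 100
print(f"GT 630M (N56VM) vs. stock GT 540M: {advantage:.1f}% higher core clock")
```

That works out to roughly an 18-19% core clock advantage, which lines up with the "almost 20%" figure above.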

We did a quick check and found that the typical CPU clocks for the i5-2410M during our test sessions ranged from 800MHz to 1.7GHz, which you can see in the above image. (Side note: we also tested with ThrottleStop active, which is what the above chart shows; it was set to a 21X multiplier, but clearly that didn't work as intended.) The average clock speeds of the two cores during our test sequence are a rather slow 1200MHz and 1085MHz, so clearly the CPU isn't really providing the sort of clocks we usually see on the i5-2410M. However, Diablo III doesn’t appear to need a ton of CPU performance; given the new information we have on the GT 630M clocks (see update above), it appears that Diablo III simply doesn't push the Acer hard enough to activate higher CPU clocks most of the time.

The second grouping of scores is mostly in red/orange, representing the AMD GPUs/APUs. For the red bars, Trinity and Llano both provide acceptable performance at our Value settings, and they’re still fast enough for the Mainstream settings—remember as we mentioned in the intro that Diablo III is actually quite playable at anything above 20 FPS. Once we hit our Enthusiast settings, both drop quite a bit; Trinity remains tolerable, but Llano definitely can’t keep up and you’d need to drop the Shadow Quality to Low at the very least for 1080p. Another interesting discovery is that Trinity with its integrated GPU is still faster across the board than the HD 6630M (though there’s a possibility the HD 6630M is being hurt by the outdated drivers). As for the three-way HD 6630M comparison, CPU performance does appear to help a bit—the i7-2640M is typically slightly faster than the i5-2410M and A8-3500M—but the largest spread is only 15% at our Value settings; at Mainstream the gap drops a bit to 10-12%, while at Enthusiast it’s under 10%. Given the frame rates, the extra 15% never really means the difference between unplayable and playable; all three laptops with HD 6630M tend to handle up to our Mainstream settings quite well.

The final three lines are the blue Intel IGP results. HD 4000 with quad-core Ivy Bridge trails Llano across all settings, though it’s often close enough. Performance at Mainstream is a bit questionable; sure, you can play Diablo III well enough in our experience at 20-25 FPS, but it’s not going to be the smoothest result. Llano may only be 3-4 FPS faster at Mainstream, but that 12% performance increase is just enough to make the result a bit smoother. Your best bet with HD 4000 is ultimately going to be turning the Shadow Quality down to Low/Off, and then running at 1600x900.

As for Sandy Bridge’s HD 3000 IGP, perhaps the less said the better. Even at our Value settings, it only qualifies as tolerable, and at Mainstream it’s quite choppy—you could still play Diablo III at 13-18 FPS in a pinch, but I wouldn’t recommend it, and I doubt it would work well in multiplayer. Once frame rates drop below 15 FPS, it appears the engine starts to slow down rather than just skipping animations. Our New Tristram run usually takes around 20 seconds to complete (even at 20.1 FPS on the HD 4000), but when frame rates are in the low teens the time for the town run increases to around 30 seconds. Single-player is still possible, but that’s as far as I’d go—and it will take longer for everything you do, thanks to the moderate slowdown. When the HD 3000 drops below 10 FPS, what was sluggish takes a major nosedive; the town run required just over 60 seconds to complete, and the Old Ruins run that usually requires about 100-110 seconds clocked in at 308 seconds. Yup, there’s a reason we didn’t try suffering through the Enthusiast benchmark a second time on HD 3000!
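For reference, here's the arithmetic on those run times; the sub-10 FPS engine slowdown works out to roughly a 3X wall-clock penalty. Times are approximate, pulled from the FRAPS runs described above:

```python
# Approximate wall-clock times (in seconds) for our fixed benchmark runs,
# illustrating how the engine itself slows down at very low frame rates.
town_normal, town_low_teens, town_sub10 = 20, 30, 61
ruins_normal, ruins_sub10 = 105, 308  # ~100-110s midpoint for the normal run

print(f"Town run, low-teens FPS: {town_low_teens / town_normal:.1f}x longer")
print(f"Town run, sub-10 FPS:    {town_sub10 / town_normal:.1f}x longer")
print(f"Old Ruins, sub-10 FPS:   {ruins_sub10 / ruins_normal:.1f}x longer")
```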

Other Performance Tests

We did a few other tests to round out our performance information, though we didn't repeat the tests multiple times or run them on all of the systems. For one test, we used our Enthusiast settings but with Shadows on Low/Off with the HD 4000; the resulting scores are slightly better than the Trinity scores with Shadows on High. With Low shadows at 1080p, New Tristram scored 20.1 FPS and the Old Ruins scored 18.5 FPS; drop the shadows to Off and New Tristram runs at 27.1 FPS with Old Ruins at 24.8 FPS. In total, the difference between High Shadow Quality and Low Shadow Quality is over 50%, and going from Low to Off is another 35%. The other test was to use our maxed out settings but at 1366x768, again on the HD 4000. The frame rates were 17.3/16.4, or around 35% faster than at 1080p.
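The Low-to-Off shadow gain quoted above is easy to verify from the FPS figures; this is just a check on the percentages using our HD 4000 numbers at 1080p, not new data:

```python
# FPS figures quoted above: HD 4000 at 1920x1080, Enthusiast settings,
# with Shadow Quality dropped from Low to Off.
low_shadows = {"New Tristram": 20.1, "Old Ruins": 18.5}
off_shadows = {"New Tristram": 27.1, "Old Ruins": 24.8}

gains = {area: (off_shadows[area] / low_shadows[area] - 1) * 100
         for area in low_shadows}
for area, gain in gains.items():
    print(f"{area}: shadows Low -> Off is a {gain:.0f}% FPS gain")
```

Both areas land at roughly 34-35%, matching the figure in the text.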

Given those results, it appears that Shadow Quality is the single most demanding setting, trumping even resolution. On HD 4000, you can basically double your performance at 1080p by turning off the shadows. Without doing in-depth testing (remember, we're looking at about five minutes to set up and run each benchmark setting, so I've already spent around 10 hours just doing the basic set of results shown above, not to mention testing other settings!), I can't say for certain, but my general impression is that the results are similar with other IGPs/GPUs.

Detailed FRAPS Runs and Closing Thoughts

For those of you that want a different view of the gaming action, we’ve selected the highest quality but still playable result for each GPU. In general, that means we wanted average frame rates of 25 or higher, with minimum frame rates always above 15 FPS. Obviously you could tweak settings in other ways and still get playable results (e.g. by dropping the resolution, you might be able to run our Enthusiast settings at 1366x768 instead of Mainstream 1600x900), but we’ve stuck with our three basic categories for the following charts. We’ve ordered them in terms of increasing performance/quality.

Given what we’ve said already, your best results will generally come by keeping minimum frame rates above 20. Assuming there are other segments of the game that will be more taxing than our benchmark sequence, you might still drop into the upper teens, but as long as you’re above 15 FPS you shouldn’t “lose sync”. Even at our Value settings, HD 3000 is already dangerously close to dropping below 15 FPS at times; you might have to give up on Shadows altogether to get acceptable performance. HD 4000 at our Mainstream settings ends up staying above 20 FPS for the most part but rarely gets above 25 FPS; by comparison, Llano’s HD 6620G ranges from around 22 FPS to nearly 30 FPS. For a smoother experience, though, you’ll still want 30 FPS or more, and that’s where the HD 6630M and Trinity’s HD 7660G fall, with Trinity averaging just slightly better performance despite one large dip to the low 20s.

As shown in our earlier charts, the real winner in terms of gaming performance looks like NVIDIA, though the use of Ivy Bridge CPUs for our two fastest test laptops leaves room for debate. The Acer doesn't appear to have any real issues with throttling in this game, however, despite my earlier fears; it looks like Diablo III (at least early on) just doesn't tax the CPU enough to routinely need more than a moderate 1.2-1.6GHz on the i5-2410M. The 15-20% performance advantage of the N56VM over the 3830TG instead comes from a higher clocked GPU, despite earlier indications that the opposite was the case.

Closing Thoughts

Wrapping up, while Diablo III isn’t the most demanding new release, it can still bring basic laptops to their knees. Unfortunately, unlike desktops it’s often not possible (or at least not practical) to upgrade a laptop’s graphics capabilities. I’ve had a couple friends ask for help with running Diablo III on their old Core 2 Duo laptops, and they’re basically out of luck unless they want to purchase a new system. That’s something we’ve tried to explain in our laptop reviews, and Diablo III drives the point home: buying at the bottom of the barrel in terms of GPU capabilities may not matter for you right now, but kids and/or future applications may eventually make your IGP-only laptop insufficient.

In the case of Diablo III, even a moderate HD 3650 or GT 330M should still be able to handle the game in single player on Normal difficulty, but IGP solutions from two or more years back are likely going to come up short. Naturally, anything faster than the GPUs we’re testing here will allow you to increase details/resolution, and it’s nice to see “mainstream” mobile GPUs like the GT 540M/GT 630M able to handle 1080p gaming for a change.

And again, in case you missed it, the later stages of the game, particularly on Hell difficulty level, are said to be quite a bit more strenuous. If you're the type of player that intends to defeat Diablo not once but three or more times at increasingly difficult settings, our results from early in the game are probably not representative of what you'll experience later. Performance does appear to stay relatively consistent among the various GPUs, though, so if you take half of our performance results as a baseline of what to expect, you're probably not far off the mark.