Alienware recently updated their Bronze Editor's Choice award-winning M17x R4 gaming notebook to include Intel's Ivy Bridge processors and optional AMD Radeon HD 7900M series graphics or NVIDIA's new top end GK104-based GeForce GTX 680M GPU. With the move we also get mSATA support inside the chassis. The big draw with our review unit is the NVIDIA GeForce GTX 680M, which promises a substantial performance improvement over last generation's top end GTX 580M/675M, the kind of generational jump we haven't seen on the mobile side in some time.

At the same time, Alienware's M17x R4 remains largely unchanged while Clevo, MSI, and ASUS have all continued to incrementally update their gaming notebook designs. I've also had the privilege of owning my own M17x R3 over the past year and have new insights to offer on what it's like to live with this chassis design after an extended period of time. Is Alienware smart not to mess with what looks like a winning formula, or is the world passing them by?

With the M17x R4 I'm going to be a little lazy and refer you back to my review of the M17x R3. Why? Simply put, I can't find any changes between the two chassis designs...at all. Internally Alienware has definitely updated the M17x, but externally this is the exact same notebook and while I was in love with this design before, time hasn't been as kind to it as I'd like.

Typically we get Ivy Bridge systems with the entry-level Intel Core i7-3610QM, but our Alienware systems tend to be a bit better equipped and that's true here. The i7-3720QM is a healthy step up from last generation's i7-2720QM, able to turbo up to an impressive 3.4GHz on all four cores or 3.8GHz on a single core. It also brings with it Intel's HD 4000 integrated graphics, which NVIDIA leverages with their Optimus technology.

Speaking of NVIDIA technology, the big draw with the M17x R4 is the GeForce GTX 680M. Unlike last generation's GF1x4 derivative GPU, the GTX 680M is based on NVIDIA's current top end silicon. The GK104 in the GTX 680M is the same chip that powers the GTX 680, although here the 1536 CUDA cores have been cut down to 1344. That's about the only cut made, meaning this is basically the same silicon in the very impressive GTX 670, just run at substantially reduced clocks. The core clock now runs at only 719MHz with a boost clock of 758MHz, but the most painful cut is the GDDR5 clock. Where NVIDIA was able to hit a staggering 6GHz on the desktop (and their memory controller allows you to pretty much push the GDDR5 chips to their limits; my GTX 680 is at 6.7GHz on the memory), the GDDR5 on the GTX 680M is running at a paltry 3.6GHz. That means that while generationally shader and texturing power have increased tremendously, the memory bandwidth increase has been much more modest, and that's on a chip that's already throttled largely by memory bandwidth. Interestingly, the GTX 680M in the M17x R4 sports 2GB of GDDR5 while the GTX 680Ms offered by other vendors have 4GB, but this shouldn't be counted against it as even desktop GTX 680s with 4GB of GDDR5 haven't proven to need the extra video memory.
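To put numbers on the bandwidth gap described above, here's a rough back-of-the-envelope sketch (not from the review itself): GDDR5 bandwidth is simply the effective data rate times the bus width, and both the GTX 680M and the desktop GTX 680 use GK104's 256-bit memory interface.

```python
# Rough GDDR5 bandwidth comparison: effective rate (GT/s) x bus width (bits) / 8
def bandwidth_gbps(data_rate_gtps: float, bus_width_bits: int) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return data_rate_gtps * bus_width_bits / 8

# Both parts have a 256-bit memory interface.
print(bandwidth_gbps(3.6, 256))  # GTX 680M:        115.2 GB/s
print(bandwidth_gbps(6.0, 256))  # desktop GTX 680: 192.0 GB/s
```

So the mobile part gets only about 60% of the desktop card's memory bandwidth, which is why the shader and texturing gains outpace the real-world improvement on a bandwidth-limited chip.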

Of course, Alienware offers alternatives to the GTX 680M, which is a hefty $550 upgrade. The default GTX 660M is essentially a glorified desktop GTX 650, built on the Kepler architecture and sporting 384 CUDA cores, a healthy 835MHz core clock, and 4GHz GDDR5. That chip really is a fine starting point for gamers, but leave it to AMD to offer what's probably going to wind up being the best price/performance option (just as they did last year): the Radeon HD 7970M. That's $200 for an upgrade to a fully enabled Pitcairn GPU that offers a notable improvement in performance over the $250 GTX 675M upgrade, which is just a rebranded GeForce GTX 580M. We have a review of the 7970M in the works right now, but there's really no reason to shell out an extra $50 for worse performance with the GTX 675M unless you absolutely have to stay in NVIDIA's ecosystem and don't want to spring for the GTX 680M.

Outside of these components, the other gains are largely incremental. Most of the parts are actually identical to the M17x R3 (including the display), while memory speed has gotten a bump to DDR3-1600 and an mSATA port has been added along with Intel's SRT (Smart Response Technology) caching. We get a boost to Bluetooth 4.0 and the audio software/hardware has been bumped to SoundBlaster's Recon3Di. Nothing too exciting here.


You mentioned that ASUS improved their gaming line with respect to performance and cooling. Which of ASUS's notebooks has a 680M GPU? Oh that's right, none of them do! The cooling in the M17x-R4 and its bigger brother the M18x-R2 is among the best in the industry (if not the best). Clevo (and its respective clones) are close, but their newest P and HM models aren't quite as good.

I mean maybe it's worse when they both have 660s, but...unless you're actually comparing them like that, it's not valid.

The price comments are strange too.

I just configured an M17x-R4 with a better CPU and a GTX 680 versus what the Asus G75 ships with, and it was only a few hundred more. That by comparison actually makes the R4 seem like the better deal.

MOST disappointed though in this Optimus stuff! Inexplicably Anandtech keeps referring to it as though it's a good thing, when of course it's horrible. Geez, IN THIS REVIEW it talks about a big game not working right...and STILL acts like Optimus is a positive?

The ONLY reason I don't own an R3 is because Asus' G74 DIDN'T use Optimus. Apparently the G75 does...which is a huge setback.

We'd JUST finally gotten to the point where notebooks were getting real driver support...something Anandtech long championed...and now that work is all being destroyed by Optimus and Enduro. I NEVER see people in real life wanting those, and yet Anandtech continues to ignore the problems and refuses to slam notebooks for including them, when just a couple of years ago they stood up against the ridiculous driver support situation.

And again, we have Wolfpup railing on Optimus. Tell me what specifically doesn't work with Optimus? Don't tell me Total War: Shogun 2, because it works fine on the older driver set, which you can revert to if you happen to play that particular game at max settings. That's a driver bug and it will get fixed -- and unlike Enduro (up to this point), you get regular driver updates for Optimus laptops. Every single Optimus (and non-Optimus) NVIDIA GPU gets a driver update at the same time as the desktops. The only exception is brand-new laptops where a new driver comes out within a couple weeks of the laptop launch; you might have to wait an extra month or two for the next Verde driver release before it gets support.

Personally, every person I know who plays games on a laptop and has used Optimus likes the feature. I don't know you, but I think you're in a very small minority. Even your last comment (in the AMD article) had someone specifically call you out and say "I bought a laptop specifically because it had Optimus." My guess: you actually don't realize that all NVIDIA Optimus laptops get driver updates at the same time as other discrete-only GPUs. Which would show how much your opinion is worth, as it's based on a misunderstanding of reality.

See, I don't understand where this discrepancy is coming from. ALL I see on message boards are non-stop Optimus problems. Blue screens. Games crashing, games refusing to use the dedicated GPU, stutter.

As near as I can tell it's a very, very small minority who LIKE Optimus. I literally only see positive things about it in some reviews like this; my assumption has been that the hardware hasn't really been "lived with" when these positive comments appear. Maybe I'm wrong though? But what's causing so many people to have problems with it?

I literally see people dealing with BIOS mods to get rid of switchable graphics; that appears to be common among people sophisticated enough to care about GPUs in the first place. As in, some notebooks have no connection between the GPU and the display, but some do, and with a modded BIOS can use it...NOT something I want to do, but there are active threads about multiple systems where that's just considered "normal" if you're going to consider a particular system. (I think the M14x physically CAN'T use the GPU without Optimus, but some HP models and I think the higher end Alienware models apparently can.)

And yes, I know NVIDIA supports this with driver updates, profile updates, etc. I never said they didn't. That doesn't change the problem: at best this hurts performance and causes issues. Outside the world of reviews, people "in the know" (which granted is probably 1% of people buying systems...but probably higher than that for people buying EXPENSIVE systems) avoid it.

I'm honestly not sure where the discrepancy is coming from, as really, Anandtech for me has always been the gold standard for doing honest, thorough reviews, avoiding/disparaging synthetic benchmarks and the like, but try finding anyone on, say, notebook review forums who HASN'T had a problem with Optimus on an M17x.

I guess I feel like this must be coming from just not living with the systems that are being reviewed, or something...maybe not using a broad enough swath of games on them. I'm not accusing you guys of lying, I really do trust you...but there really is some widespread problem here, or I couldn't be seeing thousands of posts about fixing problems with it all the time.

ASUS now supports the GTX 670M in the G75, and while it's not at the same performance level as the GTX 680M, it's not that far off in terms of power draw. What I want to see is ASUS with a GK106 based notebook, and we'll likely see that soon enough. I just don't know why NVIDIA used up the GTX 670M and 675M names on rebadges, as now there's no spot for the GK106 to slot into. GTX 670M SE? Guess we'll find out some time in the next couple of months.

What I really don't get is why ASUS has never made the effort to get the top-tier GPUs into their G5x/G7x of late. Ever since the 5870, all of their notebooks have used third tier mobile GPUs (meaning, three steps down from the fastest cards), and yet their cooling appears to work very well. All they need is a bigger power brick and a slight ramp in fan speed and they should be set. My guess is that there just aren't that many high-end gaming notebooks being sold, so ASUS would rather target the $1400 market instead of the $2000+ range.

And yes, I know the top-end G75VW-NS72 costs over $2000. It also comes with 16GB RAM, 3720QM, Blu-ray, and a 750GB HDD with a 256GB SSD -- and a 3GB GTX 670M. I wouldn't necessarily go out and buy one, since you can get the same result by purchasing the G75VW-NS71 and upgrading the RAM ($45), HDD ($95), SSD ($195) and Blu-ray combo ($85). So $1400 plus $420 gives you the equivalent of the $2100 NS72. :-\ Or you could get 95% of the way there by just buying an SSD for $195 and sticking with 12GB RAM, 500GB HDD, and no Blu-ray support.
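For what it's worth, the DIY upgrade math above checks out; a quick tally (using the prices quoted in the comment) shows the self-upgraded NS71 coming in well under the preconfigured NS72:

```python
# DIY upgrade tally for the G75VW-NS71 (prices as quoted in the comment above)
upgrades = {"RAM": 45, "HDD": 95, "SSD": 195, "Blu-ray combo": 85}

base_price = 1400
total = base_price + sum(upgrades.values())
print(total)  # 1820 -- versus $2100 for the preconfigured NS72
```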

Yeah, I'm starting to get disappointed that they're not offering better options. Even the GTX 675MX sounds like it would be a pretty great choice (or heck, the 675M).

I've been VERY impressed by the cooling in my G74 (yes, purchased because it DOESN'T have Optimus)...I've literally had it Folding on both the CPU and GPU 24/7 since January, and...zero issues. It appears to be running identically today as it did 10 months ago. The entire time the CPU has even maintained a turbo boost (or whatever it's called) from 2 to 2.5GHz, the temps are fine, etc.

Dear Alienware, please fire your marketing department and hire me. I'm sure I can come up with something consumers will remember, unlike "M17x R4." It doesn't exactly roll off the tongue. Take 10 people, tell them all about the new "M17x R4," and a day later ask them to tell you the name of the product. They won't remember. A certain highly profitable computer company realized this years ago.

I think it's fine compared to what many other companies do. M designates mobile, 17 the screen size, and R4 the 4th revision. It's easy to identify and short. On the other hand, companies like ASUS and HP have bizarre naming.

Since they've got multiple laptop sizes it'd end up having to be something like the "17 inch Alienware 4"

However, I'm not optimistic about Joe Fragfest's ability to remember anything more than "Alienware" and possibly "17 inches," and I don't see any reason to compromise the model number when only geeks/tech support are going to use it anyway.