
61 Comments

You mentioned that ASUS improved their gaming line with respect to performance and cooling. Which of ASUS's notebooks has a 680M GPU? Oh that's right, none of them do! The cooling in the M17x-R4 and its bigger brother, the M18x-R2, is among the best in the industry (if not the best). Clevo (and its respective clones) comes close, but their newest P and HM models aren't quite as good.

I mean maybe it's worse when they both have 660s, but...unless you're actually comparing them like that, it's not valid.

The price comments are strange too.

I just configured an M17x-R4 with a better CPU and a GTX 680 versus what the Asus G75 ships with, and it was only a few hundred more. That by comparison actually makes the R4 seem like the better deal.

I'm MOST disappointed, though, in this Optimus stuff! Inexplicably Anandtech keeps referring to it as though it's a good thing, when of course it's horrible. Geez, IN THIS REVIEW it talks about a big game not working right... and STILL acts like Optimus is a positive?

The ONLY reason I don't own an R3 is because Asus' G74 DIDN'T use Optimus. Apparently the G75 does...which is a huge setback.

We'd JUST finally gotten to the point where notebooks were getting real driver support... something Anandtech long championed... and now that work is all being destroyed by Optimus and Enduro. I NEVER see people in real life wanting those, and yet Anandtech continues to ignore the problems and refuses to slam notebooks for including them. Where's the Anandtech that, just a couple years ago, stood up against the ridiculous driver support situation?

And again, we have Wolfpup railing on Optimus. Tell me what specifically doesn't work with Optimus? Don't tell me Total War: Shogun 2, because it works fine on the older driver set, which you can revert to if you happen to play that particular game at max settings. That's a driver bug and it will get fixed -- and unlike Enduro (up to this point), you get regular driver updates for Optimus laptops. Every single Optimus (and non-Optimus) NVIDIA GPU gets a driver update at the same time as the desktops. The only exception is brand-new laptops where a new driver comes out within a couple weeks of the laptop launch; you might have to wait an extra month or two for the next Verde driver release before it gets support.

Personally, every person I know who plays games on a laptop and has used Optimus likes the feature. I don't know you, but you're in a very small minority I think. Even your last comment (in the AMD article) had someone specifically call you out and say "I bought a laptop specifically because it had Optimus." My guess: you don't actually realize that all NVIDIA Optimus laptops get driver updates at the same time as other discrete-only GPUs. Which would show how much your opinion is worth, as it's based on a misunderstanding of reality.

See, I don't understand where this discrepancy is coming from. ALL I see on message boards are non-stop Optimus problems. Blue screens. Games crashing, games refusing to use the dedicated GPU, stutter.

As near as I can tell it's a very, very small minority who LIKE Optimus. I literally only see positive things about it in some reviews like this. My assumption has been that the hardware hasn't really been "lived with" behind these positive comments, but maybe I'm wrong. What's causing so many people to have problems with it?

I literally see people dealing with BIOS mods to get rid of switchable graphics; that appears to be common among people sophisticated enough to care about GPUs in the first place. As in, some notebooks have no direct connection between the GPU and the display, but some do, and with a modded BIOS can use it... NOT something I want to do, but there are active threads about multiple systems where that's just considered "normal" if you're going to consider a particular system. (I think the M14x physically CAN'T use the GPU without Optimus, but some HP models and I think the higher end Alienware models apparently can.)

And yes, I know Nvidia supports this with driver updates/profile updates, etc. I never said they didn't. That doesn't change the problem: this at best hurts performance and causes issues. Outside the world of reviews, people "in the know" (which granted is probably 1% of people buying systems... but probably higher than that among people buying EXPENSIVE systems) avoid it.

I'm honestly not sure where the discrepancy is coming from. Really, Anandtech for me has always been the gold standard for doing honest, thorough reviews, avoiding/disparaging synthetic benchmarks and the like, but try finding anyone on, say, the notebook review forums that HASN'T had a problem with Optimus on an M17x.

I guess I feel like this must be coming from just not living with the systems that are being reviewed, or something... maybe not using a broad enough swath of games on them. I'm not accusing you guys of lying, I really do trust you... but there really is some widespread problem here or I couldn't be seeing thousands of posts about fixing problems with it all the time.

ASUS now supports the GTX 670M in the G75, and while it's not at the same performance level as the GTX 680M, it's not that far off in terms of power draw. What I want to see is ASUS with a GK106-based notebook, and we'll likely see that soon enough. I just don't know why NVIDIA used up the GTX 670M and 675M names on rebadges, as now there's no spot for GK106 to slot into. GTX 670M SE? Guess we'll find out some time in the next couple of months.

What I really don't get is why ASUS has never made the effort to get the top-tier GPUs into their G5x/G7x of late. Ever since the 5870, all of their notebooks have used third tier mobile GPUs (meaning, three steps down from the fastest cards), and yet their cooling appears to work very well. All they need is a bigger power brick and a slight ramp in fan speed and they should be set. My guess is that there just aren't that many high-end gaming notebooks being sold, so ASUS would rather target the $1400 market instead of the $2000+ range.

And yes, I know the top-end G75VW-NS72 costs over $2000. It also comes with 16GB RAM, 3720QM, Blu-ray, and a 750GB HDD with a 256GB SSD -- and a 3GB GTX 670M. I wouldn't necessarily go out and buy one, since you can get the same result by purchasing the G75VW-NS71 and upgrading the RAM ($45), HDD ($95), SSD ($195) and Blu-ray combo ($85). So $1400 plus $420 gives you the equivalent of the $2100 NS72. :-\ Or you could get 95% of the way there by just buying an SSD for $195 and sticking with 12GB RAM, 500GB HDD, and no Blu-ray support.
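For anyone checking the math, the upgrade totals work out like this (prices are the ones quoted above, not current street prices):

```python
# Prices quoted above for turning a G75VW-NS71 into an NS72 equivalent
base_ns71 = 1400
ns72 = 2100
upgrades = {"RAM": 45, "HDD": 95, "SSD": 195, "Blu-ray combo": 85}

diy_total = base_ns71 + sum(upgrades.values())
print(diy_total)         # 1820 -- vs. $2100 for the NS72
print(ns72 - diy_total)  # 280 saved by doing the upgrades yourself
```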

Yeah, I'm starting to get disappointed that they're not offering better options. Even the GTX 675MX sounds like it would be a pretty great choice (or heck, the 675M).

I've been VERY impressed by the cooling in my G74 (yes, purchased because it DOESN'T have Optimus)... I've literally had it Folding on both the CPU and GPU 24/7 since January, and... zero issues. It appears to be running identically today as it did 10 months ago. The entire time the CPU has even maintained a turbo boost (or whatever it's called) from 2 to 2.5GHz, the temps are fine, etc.

Dear Alienware, please fire your marketing department and hire me. I'm sure I can come up with something consumers will remember, unlike "M17x R4". It doesn't exactly roll off the tongue. Take 10 people, tell them all about the new "M17x R4", and a day later ask them to tell you the name of the product. They won't remember. A certain highly profitable computer company realized this years ago.

I think it's fine compared to what many other companies do. M designates mobile, 17 the screen size, and R4 the 4th revision. It's easy to identify and short. Companies like ASUS and HP, on the other hand, have bizarre naming.

Since they've got multiple laptop sizes it'd end up having to be something like the "17 inch Alienware 4"

However, I'm not optimistic about Joe Fragfest's ability to remember anything more than "Alienware" and possibly "17 inches", and I don't see any reason to compromise the model number when only geeks/tech support are going to use it anyway.

Desktops? Yes. Notebooks and laptops? Only if you're willing to go with one of the Clevo or MSI offerings, which both have a large number of issues. Alienware's designs definitely aren't perfect, but I can guarantee you that the M17x R4 is better than the Clevo P170EM in so many ways that it's not even close. The only thing going for Clevo is pricing, but to save $300 on a $2500 notebook you have to get an inferior keyboard, touchpad, firmware (power management), and chassis.

Not true: some of us simply can't lug a desktop & screen across the world, and are at the mercy of what laptop makers offer.

My M18X R2 is head and shoulders above my last three Clevos, in terms of construction, performance, and audio [oh, but the glossy screen!].

Without it, I'd never game, as I couldn't carry around an Xbox / controllers / PSU / games as well as a laptop.

In addition, our in-house software is very heavy on CPU / memory, crashes frequently (necessitating reboots), and i7 Extreme, twin SSDs in RAID0, & 16GB of memory make a nice combination for getting things done, which is a BIG part of the draw for me on a purchase such as this.

If you know where I can get this performance, in a mobile package, for less, please enlighten me. Do try to remember it must get past the lady at the 'check-in desk', and a desktop & monitor won't cut it.

As for the name, suits me fine, but as a biker, I'm used to number/letter-names.

Wrong. I am in the industry and have built all my laptops up to this point. Yes, Alienware is expensive, heavy, and usually needs a cooling pad. I bought the M17x R4 because of the form factor and packaging. All of the other gaming laptops are boring! Samsung's Series 7 Gamer is a boring piece of charcoal color. ASUS isn't any better. It's like buying a car: am I gonna buy a Jeep Liberty or a Dodge Nitro when given the choice? I go for looks and packaging. Alienware's aesthetic is slick and sexy. In essence I'd say Alienware's marketing works just fine.

M17x is the generic name, and as redchar mentions, the M even stands for mobile.

Really the 'x' is the only part of the name that doesn't seem to mean anything. Personally I think this is one of the absolute best computer names on earth since it actually means something and isn't 308ch792y8-du219 like most computer models are lol

Clevo should fire their marketing department... why is there no P170-based system in your benchmark comparison? Only two others with 580Ms as comparison, and the one in the MSI barebones chassis (with the 675M) throttles like a b**ch.

Where are the other models with the 680M or 7970M? Clevo's competing products offer better price/performance, and the cooling is up to scratch (I have a P150HM/2760QM/GTX 580M).

Similar hardware for a good discount; they have their issues (keyboard...) but it is just a glaring omission for this review not to consider ACTUAL competing products from the same class, whether that's Clevo's fault for not supplying them (my suspicion) or AT's for not including them. Even last gen would be a worthy comparison, but for the only Clevo in this table to be an 11 inch with a mid-range GPU is nuts...

It looks better in images... Dustin hasn't actually used it in person I don't think, and I can attest that the new keyboards actually feel worse than the old ones (and continue to have wonky layout issues). They fixed the number keypad but screwed up the Windows key, took out the context key, and put two backslash keys on the keyboard. I understand Clevo targets an international community, but they should just have a few separate hardware layouts for different regions rather than reusing the same layout and relabeling keys.

Either I'm confused or there is an error in those graphs. The Samsung Series 7 and the Clevo W110ER both have 62Wh batteries (according to the charts) and the Samsung beats it in all the tests. Yet when you normalize to min/Wh the Clevo appears to be more efficient.

The Samsung has a 77Wh battery; I've updated the charts for Dustin. As for the W110ER, I'm still not sure how Vivek got those numbers, so you'll have to ask him. Unfortunately, he no longer has the Eurocom Monster 1.0 -- I wonder if Eurocom actually managed to fix the battery life somehow and other Clevo W110ER units are still getting crappy power optimizations? I tested a W110ER from another company and got half the battery life; they ended up asking for the unit back to "look into the problem" and never sent another, so I assume there's a core issue that Clevo isn't fixing.
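To illustrate why the wrong capacity skews the normalized chart (the runtime figure here is purely hypothetical; only the capacities come from the discussion above):

```python
# Hypothetical runtime in minutes -- a made-up number for illustration
runtime_min = 400

wrong_wh = 62   # capacity originally listed in the charts
actual_wh = 77  # the Samsung Series 7's real battery capacity

# Dividing by too small a capacity inflates the efficiency figure
print(runtime_min / wrong_wh)   # ~6.45 min/Wh -- inflated
print(runtime_min / actual_wh)  # ~5.19 min/Wh -- corrected
```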

Thanks for always replying to my posts about the W110ER's battery life. :)

I have no idea how many of them have been sold, but I have to consider it "popular" for a niche device, so if anyone at Anandtech wanted to look into it deeper I think there would be an audience eager to read their results. Maybe even see how a BIOS mod changes things (http://biosmods.wordpress.com/w110er/).

Sadly, as you can imagine no one is really interested in sending us an "older" laptop like the W110ER, especially if all we're going to do is double-check the battery life (and probably end up disappointed). For the record, my test results from a system we ended up sending back before completing the review show the following with a 3610QM CPU:

Idle: 217 minutes
Internet: 209 minutes
H.264: 187 minutes

That last item tells you just how badly the battery life is (was?) optimized on that particular unit, as H.264 battery life is typically 2/3 of the Internet battery life, which in turn is about 80% of the Idle battery life. Based on those estimates, assuming the H.264 result is a good starting point, the W110ER should be getting 280 minutes Internet and 350 minutes Idle, and of course the H.264 result is already low to begin with.
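Spelled out, the estimate just applies the two rules of thumb above (H.264 is about 2/3 of Internet; Internet is about 80% of Idle) to the measured H.264 number:

```python
h264 = 187  # measured H.264 result on this unit, in minutes

internet_est = h264 / (2 / 3)  # H.264 is typically 2/3 of Internet
idle_est = internet_est / 0.8  # Internet is typically 80% of Idle

print(internet_est)  # ~280 minutes expected (measured: 209)
print(idle_est)      # ~350 minutes expected (measured: 217)
```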

Vivek's numbers on the Monster 1.0 actually seem perfectly legit (407, 338, 218 means Internet is 55% better than H.264 and Idle is 20% better than Internet). So the question is, how did Eurocom get such good results when no other W110ER seems to? Clevo is often pretty lousy at power management, and the P170EM and P150EM fall right in that same category. They should be paying Eurocom for whatever fix is present in the Monster.
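A quick check of those ratios on Vivek's numbers:

```python
# Monster 1.0 results in minutes: Idle, Internet, H.264
idle, internet, h264 = 407, 338, 218

print(internet / h264)  # ~1.55: Internet is 55% better than H.264
print(idle / internet)  # ~1.20: Idle is 20% better than Internet
```

Both ratios land right on the typical scaling described above, which is why the numbers look internally consistent.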

General settings: use the power saver profile, minimum CPU set to 0%, maximum set to 100%, and cooling set to passive. HDD set to go to sleep after 1 minute. WiFi is set to maximum power savings. Display is set to 100 nits (not sure where that is on the W110ER -- I think it's two or three steps down from max), and the display shouldn't turn off or dim. The critical battery action should be shutdown at 1% (or if you can't set it that low, 3%), with 0% reserve battery and no sleep warnings. Basically, we're setting things up for best-case battery life.

Idle testing: run the laptop until it's out of power (<3% battery). Audio should be muted, WiFi disabled, and that's it. I use a batch file to spit out the time every minute to a text file, and then you just subtract the start time from the finish time to get battery life. (Note that this is not truly idle, as the batch file needs to access storage every minute.)
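The logging idea is trivial to reproduce. Here's a rough Python equivalent of that batch file (a sketch of the approach, not the actual AnandTech script; the function names and timestamp format are made up):

```python
import time

def log_time_every_minute(path, iterations=None, interval_s=60):
    """Append a wall-clock timestamp to `path` once per interval.
    On a real rundown you let this run until the laptop dies;
    `iterations` is only for testing the script itself."""
    n = 0
    while iterations is None or n < iterations:
        with open(path, "a") as f:
            f.write(time.strftime("%H:%M:%S") + "\n")
        n += 1
        if iterations is None or n < iterations:
            time.sleep(interval_s)

def logged_minutes(log_text, interval_s=60):
    """Runtime implied by the log: (entries - 1) intervals, in minutes.
    Equivalent to subtracting the first timestamp from the last."""
    entries = [line for line in log_text.splitlines() if line.strip()]
    return (len(entries) - 1) * interval_s / 60
```

Because the log is written every minute, the last entry minus the first gives battery life to within a minute, even though the shutdown kills the loop mid-interval.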

Internet testing: we open saved versions of four web pages every 60 seconds in Internet Explorer. Again I use a batch file to do this, which also spits out the time every minute. IE is set to empty temp files on exit. The batch file closes IE and restarts it, with the home pages set to these four: http://images.anandtech.com/reviews/mobile/interne...

Sounds like the Monster does have better battery life than a stock W110ER, though, given you're able to watch four hours of movies.

I'm able to set most of those settings to spec. The WiFi adapter is disabled and WiFi turned off. I've set the display at 40% brightness on battery. It is set up for 1% battery life shutdown, and sleep is disabled. Everything else is set to your suggestions.

The laptop is currently running. 11 minutes have passed to reach 96%. Projected idle is around 260 minutes. Will report again once it is complete.

Ok, there's a distinct possibility I have been testing in High Performance mode this entire time. If that is the case, then there's a battery floor for 266 minutes, and Vivek's numbers make more sense.

Additionally, I am distinctly not able to guarantee 100 nits.

I cannot do further testing today. Either way, we have another benchmark.

So you measured 266 minutes at idle? Seems rather low compared to Vivek's numbers, unless some other setting is messed up. I just wish I had done the Monster 1.0 testing so I could respond with confidence regarding the numbers.

My setup is not a perfectly clean Win 7 build. I discovered I still had Lavasoft Ad-Aware and some other processes in the background (using minimal CPU, not doing active scans). I also suspect I have been using High Performance mode, requiring a minimum 100% CPU as opposed to 0-5%, and the brightness seemed higher than 100 nits even at 40%. That last bit is a subjective look, as I do not have light measuring equipment on hand.

I have the AUO matte display, which likely has different brightness characteristics than the one you and Vivek received.

So there are multiple variables. Mine look like there's been CPU usage, as it's about what I'd expect from watching videos.

Coasting by on the same chassis is not good. Happens across this company's entire lineup; my M18x R2 shares the same glaring design flaw (melting SLI cable hehe) as the R1, something they should have taken the time to fix. Good thing there's a workaround for people willing to pop the lid on the machine.

Speaking of price, if you're in the military you might be surprised at what you can get these machines for, but take my advice: skip the SLI laptops. The SLI cables aren't built to last.

I'm envious of your AW experience :) My R1 and replacement R2 both had cables that fried on top of the heat-sinks. It was a big sigh of relief when I discovered how easy it was to fix. I'm a happy AW camper now though.

I think that's the same laptop platform that they reviewed with iBuyPower branding a month ago. The specs aren't identical to Laptop Mag's test model, but reviewing two models of the same platform offers little value compared to looking at a second vendor's design.

High clock rates suck power, and in mobile platforms they want to minimize that as much as possible. They're also not stupid, and I see no reason not to assume their default GPU/memory clocks were picked to give the maximum average fps within their target TDP.

I've owned and spent hours and hours with both single and dual configurations of the 7970M and 680M (I have an M18x with dual 680Ms right now). In a single-GPU configuration on an Alienware, the 7970M is the way to go: drivers are nearly as good as NVIDIA's and performance is neck-and-neck with the 680M (with the 680M being about 7% faster on average). The $250 lower price of the 7970M vs. the 680M wins.

That said, a single 7970M has problems on Clevo laptops in the form of AMD's god-awful power management system, Enduro. You see around 10% reduced performance on Clevos/Sagers when compared to Alienwares, which do not have the Enduro issue. If you're going Clevo/Sager, go 680M.

Finally, 7970M CrossFire drivers are a bit of a mess: lower gains than NVIDIA's and, in some games, no gain at all over a single GPU. If you're going dual GPU in a laptop, go 680M SLI.

Enduro can be even more than a 10% loss of performance -- try more like 30-50% in some games (depending on settings). However, AMD is aware of the problem and tells me they're working on a fix that should hit in the next month. We'll see... 7970M Clevo review coming soon from me, though!

Really a pity they didn't do any updates except replacing the innards. If there ever was a time to buy a gaming laptop, it would be today: 60 fps @ 1080p, with the hardware staying up to date until late 2013 when the new console generation hits. Not bad.

Holding on to my M17xR2 until a "good" change comes along. I can't give up my screen resolution as I use mine for both work and home; I need the extra space. The second issue is that all the new versions are rehashes to me, as you've said. I wouldn't mind a complete overhaul of the platform, as it is beginning to get stale and outdated, and, as someone mentioned before, littered with defects carried over from previous revisions if they haven't addressed them (whether yours have them or not).

The R2 still runs great and I can get better performance if I do a video and CPU upgrade in the future. With this in the mix, I'm not looking forward to buying another anytime soon. Changing the screen back would be an instant win/buy for me. They just have to do it :)

The Alienware M17xR2 has a 16:10 aspect ratio RGB LED screen, and has 2 MXM slots for SLI and CrossFire. The BIOS is also rather flexible, as it allows up to 16GB of RAM and GPU upgrades. I know people who have gotten 7970Ms to work in CrossFire on the R2. It is the modder's laptop. :)

Just wondering if Alienware 680Ms have a vBIOS mod like the MSI and Clevo 4GB models that are letting guys get over 8,000 3DMark 11 scores? If not, does the chip overclock well at standard voltage?

As an M17xR4 owner I have to point out an issue for potential buyers. I purchased the WD My Passport 2TB external USB 3.0 drive for it, and it has a problem where it constantly disconnects. Sometimes it can transfer large files, but as it moves to a second or third file it disconnects.

I have been troubleshooting this in a forum with others since July. There has been an Alienware tech there who was not much help, and I called in on my own but got nowhere. They do not support third party hardware, even though it is "universal." They will not tell us the results of their testing, or give us a work-around.

The problem is not just the big drive I have, but nearly all of the WD line. My guess is that the WD SES driver conflicts with the Intel chipset eXtensible driver, but I get no traction from WD either. The drive works fine in other computers, and it works fine when connected to the M17xR4 in the USB 2.0/esata port.

What irritates me most is the complete unwillingness to do anything. Support is one of the reasons I justified the purchase price. I found a work-around: just use a USB 3.0 hub, powered or not. I got a small 2-port hub, and while it is another adapter I need to carry, it is not too bulky. Why could WD or Alienware not suggest that? I think they do not want to admit any kind of responsibility.

While most of this has been negative, I do otherwise love this laptop. I chose it over the Clevo units due to design and sound/thermal characteristics. I do love the keyboard, although only an Alienware has me wanting more. I wish the colors would smoothly transition constantly rather than going from one to another and then popping back. It is certainly bigger than the standard 17" size that most bags support, but it never fails to impress people when I pull it out.

A laptop which uses a previous-gen graphics card, the 675M (a rebadged 580M), for more than $3,000... that's what is on the Alienware India site. It seems as if they don't want to sell any laptops in India, charging Rs 1,70,000 without even offering the 680M card. They are dumping the R2, R3, etc. that US customers won't buy on Indian consumers, and for more money.

I'm really gonna sound like an idiot here, because reading your posts I understood about 1 out of 10 words/numbers you put there, so I'll try my best. I'm a young gamer and I've been saving up for my own laptop for a while now, and I really love the looks of Alienware... I'm looking for high performance and speed. I'm not a huge whiner about screen quality... but I would love the extra convenience of having something like that... Would this laptop be a good recommendation? I just need a pointer because I really want something quality :)