Also a big point of discussion for the Clevo W230ST is the CPU options: the 4700MQ, 4702MQ, 4800MQ, and 4900MQ. It seems most on the forum are recommending the 4800MQ, but reading this it sounds like even the 4702MQ can get quite toasty.

Damn, not sure I can wait that long before ordering it... Oh well, I'd love to see it run through the paces of an in-depth review all the same. I'm tossing up between the CPUs; the lower TDP of the 4702MQ is tempting over the faster clocks.

I don't expect it to be much faster at all -- we're still looking at i7-4700MQ with GTX 765M. With that CPU and GPU and a 256GB mSATA SSD + 750GB HDD, the cost is $1800. You also get a 1080p LCD. The display is really going to be the reason to go with the Alienware 14 I think, and to get that you'll have a heavier laptop that's twice as thick. On the other hand, I expect it will run cooler and quieter thanks to the size, but over six pounds means you can readily compare the AW14 to 15.6" options out there.

The Clevo W350ST is a 15.6" laptop with similar dimensions to the AW14, and with similar components it can be had for around $1500. Not that I expect the Clevo W350ST to be built better than the AW14, but the dimensions are 14.72" x 9.84" x 0.64"~1.68" and it weighs 5.95 pounds where the 14" AW measures 13.31" x 10.17" x 1.58"~1.64" and weighs 6.12 lbs. So the Clevo is wider but not as deep and has more of a wedge shape, and it weighs a bit less.

Anyway, we're definitely going to see about reviewing the Alienware 14 and 17 as soon as we can.

From all the reviews and user commentary I've read, the Alienware 14 is definitely not "cool and quiet"; some users have complained about how loud it is. Also, be prepared for the Alienware 14 to play games at lower frame rates than the Razer, since its 1080p panel pushes roughly 44% more pixels at native resolution than the Razer's 1600x900.

"Cool and quiet" with these kinds of specs in a reasonably portable machine doesn't really seem possible with the current crop of CPUs/GPUs. Do you want quiet or do you want adequate cooling? I'm not really comfortable with a brand new machine hitting 98 degrees Celsius under load, or even the 93 degrees the Blade hit. I'll deal with a louder fan if it means I don't have to worry as much about throttling (or heat-induced crashes), especially on a gaming-focused machine where I'd have the sound cranked up a bit or headphones in while using it the way it was intended.

The current Retina gets quite warm. I used one to play StarCraft and its keyboard is unusable due to how hot it gets under load. The design is beautiful but it does not dissipate heat well at all. If you intend to game, I don't think the Retina is the way to go.

It totally baffles me not only that they include such a crappy screen (1080p panels can be had for like $50, and should be even less if you buy thousands of them), but also the glossy plastic exterior. MSI has been doing this for such a long time, you'd think they would understand that this doesn't appeal to the European or US market as much as it does to the Asian market. The only thing they really get right is the price, although for me this still has too many downsides. At least the CPU+GPU is a better pairing than in their GX60, where the AMD CPU halves the GPU performance because it's so slow.

Trinity/Richland simply aren't that powerful. Enduro might be part of the problem, but we can't test the notebook with Enduro disabled. We're working to get a desktop simulation of the system going, but I don't have the necessary parts so it has to be someone else. I think Ian (over in the UK) may be doing this at some point in the next month or so.

It seems that the 14" size is still not that common, and all the available panels are older, mediocre TN-film ones with HD+ resolution. But, as the Lenovo X1 Carbon shows, at least it can have decent brightness output.

While your tests are normally quite comprehensive, one thing I don't understand is why you don't have a battery gaming test? (ditto for the Blade 14 review) After all this is a gaming laptop. A likely use case is to use it for, er, gaming on a long plane or train ride...

I suspect the gaming loads are going to be significantly different from your Heavy test with full GPU usage. To me, whether I'm going to get 1 hour or 2.5 hours of Skyrim has a significant impact on my purchase decision. I suspect it will for others as well.

If you want decent gaming performance while on battery power, I don't know of any specific laptop offhand that will get more than an hour or so. Consider: the CPU has a 37W TDP, and under gaming loads it will at least get pretty close to that number. The GPU meanwhile also has a TDP in the realm of 35-40W. Add in the display, HDD/SSD, motherboard, chipset, etc., and under a gaming load (without throttling the GPU way down to, say, 1/3 the normal performance), you're looking at 80-100W of power use. Result: less than one hour of gaming.
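A quick back-of-the-envelope check of that math. The wattages below are the rough figures from the paragraph above, not measurements from any specific laptop, and the 65Wh battery capacity is a made-up example:

```python
# Back-of-the-envelope gaming battery life from a component power budget.
# All wattages are ballpark assumptions, not measured values.

def gaming_runtime_hours(battery_wh, system_draw_w):
    """Hours of runtime at a constant system power draw."""
    return battery_wh / system_draw_w

cpu_w = 37    # quad-core i7 TDP, roughly reached under gaming loads
gpu_w = 38    # mid-range GTX mobile GPU, ~35-40W TDP
rest_w = 15   # display, storage, motherboard, conversion losses (guess)

total_w = cpu_w + gpu_w + rest_w   # ~90W, inside the 80-100W estimate
print(round(gaming_runtime_hours(65, total_w), 2))  # 65Wh pack -> 0.72
```

So even a fairly generous 65Wh pack lands well under one hour at a full gaming load, which matches the "less than one hour" conclusion.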

I'll go ahead and run a gaming workload tonight -- I'll probably just load Skyrim, sit in Whiterun, and let time pass. My guess is at 200 nits (80%) with the GPU set to "Prefer maximum performance" ("Adaptive" in the NVIDIA Control Panel will lower clocks on battery power), we'll be lucky to break an hour. Stay tuned...

Okay, so that didn't take long to see what happens in terms of performance. Plugged in, Skyrim is running at around 67FPS in Whiterun, at the "High" preset with 0xAA and at 1600x900. Unplug it, and even with the GPU set to "Prefer Maximum Performance" it drops down to 30FPS. Looks like NVIDIA is detecting battery power and shooting for 30FPS, and anything more than that is unnecessary. It's not actually a bad way of doing things, but don't expect the same performance on battery as what you get plugged in.

Specifically: plugged in, the GTX 760M is running at 718.5MHz/4008MHz GPU/RAM, at 0.893V. The GPU load is 90-97% at these settings. Unplug the laptop and the clocks drop to 627.1/4008MHz, but more importantly the GPU load is at 40-50% and FRAPS is reporting a steady frame rate of 31FPS.

I would be very surprised if the Razer doesn't do something similar to the GE40 on battery power. It's basically the GPU not running at full speed because it doesn't need to in order to hit 30FPS -- so in games like Metro it will actually run at full speed, but in others it won't. Anyway, I didn't test the Razer so I can't say for certain, but I've yet to see a GTX notebook that doesn't adjust the GPU clocks/performance down on battery power.

Cheers for the response - much appreciated. Sorry for sounding a bit grumpy earlier!

The reason I ask is I've been getting wildly different estimates of gaming battery life for the Razer Blade 14 all over the web -- some websites say 1 hour, some say 2-3. It would be good to have some definitive numbers.

I don't think gaming battery life is /always/ that bad. I can get 2-ish hours of gaming on my M11x on battery power. Sure, for classic "gaming laptop luggables" where the battery is effectively a 2 kilo UPS you're lucky to get beyond 45 minutes, but for smaller form factor gaming laptops I think you can do better than that.

1 hour isn't really sufficient, because as soon as you boot up you have a lurking paranoia about the battery meter going down. 1.5+ hours is where you can have a proper "session" between destinations...

I can play casual games (think World of Warcraft, Portal, etc.) on my X230 for about 2.5 hours. I'm expecting the Haswell version to do slightly better. Note it's on a low-res screen; hopefully those are a dying breed, but every laptop seems to keep coming out with them. A higher-res screen is going to use more GPU and battery, so it might be a wash.

That was sort of the point I was trying to make: if you have more than just a CPU with iGPU (or an APU in the AMD world), you can't get reasonable gaming performance and great battery life. Games will max out the TDP of both the CPU and GPU, generally speaking -- though it's possible to throttle the GPUs to help reduce the load, which both AMD and NVIDIA do by default. If you have a 70Wh battery and a CPU/APU that has a 17W-35W TDP, though, you can get 2+ hours. You just won't be running at high detail and 1600x900.

Jarred, please do a review of the Alienware 14 and 17. I think they have started to offer some decent IPS screens with matte options and I would love to read Anandtech’s take on these laptops before making a purchase.

1. The 768p panel is too far the other direction and it's not high quality, but performance at that resolution is great.
2. The 1080p panel will be slower than the Razer 14. If you ask me, 1080p for gaming on a 765M is not going to give much longevity. The panel itself is above average, so if you're planning on doing work on it and gaming is secondary, it's probably a better overall option.
3. I've read the Alienware 14 is hot and loud. The Razer 14 is hot, but not that loud.
4. The Alienware 14 is much bigger and 2 lbs heavier.
5. The Alienware 14 can be cheaper or about the same price as the Razer depending on config; the Razer is only configurable in storage size.

I don't think it's fair to include the GE40 in that discussion, as it's outclassed by both of them from a gaming standpoint.

I think AnandTech needs to give more information about cooling in notebook reviews.

For starters, only the temperatures are reported, but I think the temperatures are actually largely irrelevant. It doesn't matter to me if the CPU is hitting 70C or 95C as long as it does not throttle down.

I usually load up Prime95 and take a close look at the 'package power' meter in HWiNFO. If it ever drops below the TDP, the cooling is already in trouble under a basic CPU-only load.
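For what it's worth, that "package power" check can also be scripted on Linux via the RAPL energy counters in sysfs (HWiNFO itself is Windows-only). A minimal sketch, assuming an Intel CPU with the standard powercap path; the path and the need for root access may vary per system:

```python
# Sketch: average CPU package power from two cumulative RAPL energy
# readings -- the same figure HWiNFO reports as "package power".
import time

# Usual Intel powercap location on Linux; treat this path as an assumption.
RAPL_PATH = "/sys/class/powercap/intel-rapl:0/energy_uj"

def watts_from_energy(e0_uj, e1_uj, interval_s):
    """Convert two cumulative microjoule readings into average watts."""
    return (e1_uj - e0_uj) / 1e6 / interval_s

def package_power(interval_s=1.0, path=RAPL_PATH):
    """Sample the package energy counter twice and return average watts."""
    with open(path) as f:
        e0 = int(f.read())
    time.sleep(interval_s)
    with open(path) as f:
        e1 = int(f.read())
    return watts_from_energy(e0, e1, interval_s)
```

While Prime95/mprime is running, a reading that sags below the CPU's rated TDP means power or thermal limits are already kicking in.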

Stressing the GPU in a controlled environment is a bit harder to do. I like rthdribl, but the scene complexity is a bit low for modern GPUs. Maybe try one of Unigine's older tests; I'm not sure if they support benchmark looping.

If it can't run Prime95 and a GPU stress test at the same time without throttling on the CPU side, it's probably not up to the task of playing games all day long. That's valuable information, much more valuable than the temperature of the CPU. The GE40 reaches 98C under load -- but at what clock speed is it running? 800MHz? 2.x GHz?
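The clock-speed question is easy to answer with a logger: sample the CPU frequency during a combined stress run and summarize afterwards, so a review can report "it held 2.8GHz average" rather than just a temperature. A Linux cpufreq sketch; the sysfs path, sample counts, and the 2200MHz base clock are illustrative assumptions:

```python
# Sketch: log CPU clocks during a stress test, then summarize the log
# to spot throttling below the rated base clock.
import time

# Standard Linux cpufreq location (value is in kHz); an assumption here.
FREQ_PATH = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq"

def sample_clocks(samples=300, interval_s=2.0, path=FREQ_PATH):
    """Collect CPU frequency samples in MHz over samples * interval_s."""
    log = []
    for _ in range(samples):
        with open(path) as f:
            log.append(int(f.read()) / 1000.0)  # kHz -> MHz
        time.sleep(interval_s)
    return log

def summarize(log_mhz, base_mhz=2200):
    """Return (min clock, average clock, fraction of samples below base)."""
    below = sum(1 for mhz in log_mhz if mhz < base_mhz)
    return min(log_mhz), sum(log_mhz) / len(log_mhz), below / len(log_mhz)

# summarize([2800, 3200, 2100, 2400]) -> (2100, 2625.0, 0.25)
```

Any nonzero "fraction below base" during a Prime95-plus-GPU run would be exactly the throttling evidence the comment is asking for.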

I would have reported throttling had it been evident. The CPU speed is dropping down from max turbo, but it's staying in the rated 2.2-3.2GHz range throughout all the testing that I logged. In fact, for Metro: Last Light, out of 500 data points over 1000 seconds, there was only one instance of 2.3GHz; everywhere else the CPU clocks were at least 2.4GHz and the average was 2.8GHz. I won't go so far as to say that the GE40 will never throttle while gaming, but it's definitely doing better than the Dell XPS 15 or Samsung Series 7 managed under similar testing.

Just in case you do review the Alienware 17, please do note that it throttles above 77C. The same goes for my current M17x R3. Despite reading almost every review on the net, I found out about this only after buying it; none of the reviews mentioned it. And this is not a throttle that can be ignored by someone like me in India, where ambient temperatures are quite high -- I regularly hit 77C and it is a huge pain. So in your reviews, please mention this. It would also be great if your gaming laptop reviews listed GPU temperatures at different ambient temperatures (like running the tests with the AC on and then off). And, as always, keep up the excellent reviews :)

Agreed that 1080p is unlikely to be usable in newer games, but it should be usable for other tasks (older games, video, etc.) and it's always good to have options, especially cheaper ones!

From the very limited YouTube hands-on videos, the screen is supposed to be fairly decent; hopefully it'll be better than those on the GE40 and Razer Blade.

IPS is nice, but only really addresses viewing angle washout and not a must-have for me. Image quality would be more important, and non-IPS can equal or better IPS panels in image quality - look at the Sony Z for great non-IPS displays (the Sony S13 screens however...).

According to Gigabyte's site, the P34G will have a 14" FHD AHVA display, which is a sort-of-IPS panel type from AUO. Notebookcheck tested the Clevo W740SU, also with a 14" FHD AHVA panel, and it had excellent viewing angles and color reproduction, although brightness and contrast were only decent.

Thanks, hadn't seen that one yet. With what's currently possible in hardware, these are getting closer to an ideal machine for me (until integrated graphics can do 900p+ gaming at high settings). I _really_ like the understated design of that one. I'm not saying it's beautiful, but compared to most gaming laptops it looks worlds better. Hopefully the non-IPS panel is of decent quality. Too bad about the tiny 47Wh battery though; that will be a deal breaker. Take the same design, throw in a high quality 3200x1800 panel (game at 1600x900), drop the VGA for mini-DP, and put in a ~80Wh battery and I'm sold. I think we'll see something like that soon. At least I hope we will.

Once again... no Thunderbolt. These machines would make great travel video editing machines... if they would include some way to actually stream video into them. (You can convert FireWire to Thunderbolt.) Better LCD, include Thunderbolt... I'll be ready to buy.

Streaming video from what? Many cameras just use USB for streaming because even USB 2.0 can handle most streams. Some older cameras do have FireWire, it's true, but it can be kind of niche. And if you're oversaturating a USB stream, the camera usually comes with an Ethernet port anyway -- and that only really happens if you're shooting 4K uncompressed. Kind of sounds like your gripe is more about wanting something than needing it?

A better use for Thunderbolt (and Thunderbolt 2) is external GPUs. Imagine these thin-and-lights sporting a decent 760M, so you can carry it around and game as you want (while plugged in), but then bumping it up to SLI 780s once you get home? Gigabyte's got Thunderbolt on the P35K, but unfortunately not the P34G...

They just need to fix the inability of eGPUs to output to the laptop screen... though, then again, if you have an eGPU setup, you'd likely have a 21"+ monitor too.

Careful when upgrading radios; it seems they're all putting in whitelists pointing to the new parts they sell... I have a 3G radio in my X220 which works great. I popped it into the X230 I *had* to upgrade to (call me weak) and sure enough it wouldn't boot up -- the "old" card wouldn't work. So I had to pony up 125 bucks for the "new" card... which is the exact same speed, lol.

The bigger OEMs (Acer/Gateway, Dell, Sony, HP, Lenovo) all definitely have whitelisting in the BIOS and it's difficult to upgrade the WiFi. I know you can often cover one of the pins on the adapter so that the card will always be on (I've done this on an Acer in the past), but that's not a perfect solution either. The whitebook laptops (like MSI, ASUS, Clevo) in my experience are less likely to lock out other WiFi adapters, but short of trying it out (or Googling), I don't know if they'll actually work as well. Hence why I say, "You could try to upgrade WiFi as well—I don’t know if there’s any device whitelisting in the BIOS by MSI; hopefully not, as slapping in a better 802.11ac WiFi adapter would be a handy upgrade."

The biggest issue with LCDs is that many of the lower end laptops ship with single-link LVDS connectors that basically max out at 1366x768 -- they don't have the wires in half of the cable to transmit data. If you have a dual-link LVDS cable, you can pretty much use anything up to WUXGA resolution in theory, but again there are other aspects to consider.

One potential problem is with the whitelisting you mention. It's there on WiFi in theory to keep people from using unauthorized hardware -- FCC requires certification for any wireless devices. Well, there's also stuff in the firmware sometimes for LCDs. I had a Dell XPS 15z a while back where the LCD cracked. I got a replacement and while it would connect and power up fine, it never displayed content properly -- it was all garbled. The company that sold the LCD panel had me ship the original cracked panel back to them and they were able to copy over the firmware or something for the display so that the laptop would recognize it and work. Welcome to the proprietary world of laptop displays and wireless networking. Ugh.

Actually, if you're just doing a RAM upgrade, MSI will still honor the warranty. I've had experience with that before: I called and spoke to tech support, and they replied saying a RAM upgrade doesn't void the warranty.

What's starting to annoy me is that a lot of these laptops that aren't Ultrabooks and whatnot still come with a VGA port. I mean, sure, a lot of external displays can take it, but it just feels very out of place. Throw in a Mini-DP and another USB port!

Yup. There are a ton of projectors out there that pre-date HDMI and DP. It would definitely be nice to be able to run three digital displays off of a laptop, but most companies are so busy cutting corners that it's not even a minor consideration.

Engineering: "For $5 extra, we can add two more digital video outputs."

Management: "What!? Forget that -- we can save $0.05 by using 100Mbit Ethernet instead of Gigabit Ethernet. That's what we need to do! And while we're at it, find a cheaper LCD -- there's no sense spending $75 for a display that most people won't notice! All they care about are mega-whatevers and giga-things, and we need to give them lots of those. Screw abstract ideas like build quality and color accuracy...."

Hahaha...you brought up one of the things that drives me bonkers. There's no excuse for 100Mbit Ethernet STILL showing up in so many laptops. The BOM difference has to be almost non-existent these days, and yet we still see machines north of $500 only offering 100Mbit. I've seen so many machines in the past year that look kind of interesting as low cost general compute devices, then I scroll down and see "Fast Ethernet". It's like a cruel joke.

We have exactly opposing sources of annoyance :-) I am looking for these kinds of machines because I want something that I can take with me on business trips: powerful enough to game on in hotels in the evenings, and good for doing my work on client premises. I give a lot of presentations, so connectivity to projectors and external screens is important. Some of my colleagues use dongles to connect HDMI or some other port to the VGA inputs that are everywhere, but often those dongles are lost, or broken, or get twisted so that the picture suddenly disappears... not good. I have had to pass on quite a few machines because they don't have a VGA port anymore (for example, the Razer Blade does not have one).

Alright, great review in general - helpful, objective and well-written and I really mean all that. Thank you for that.

However, every once in a while I come across a comment or conclusion in one of the articles I read which makes me go - whaaat?! Why did they (this is not aimed directly at Mr. Walton) write this...

I don't mean to be disrespectful, but how can you say "The truth is, we’re probably still a couple years away from seeing this level of performance in a laptop this size that runs cool and quiet" when such a laptop exists today -- not to mention your colleague reviewed it and reached the opposite conclusion.

Not to mention the following statement, or, more importantly, what it implies: "... physics can be such a drag." Yes, indeed, physics not only can be but always is a drag. Why would you, however, have your audience believe that laptops such as the Razer Blade 14 and the MSI GE40 are the best thing for thermal design since heatsinks? Yes, the Razer is very well-engineered (the MSI is just bold and reckless), but neither of them pushed the envelope to the limit. I'm not trying to play wannabe thermal engineer here, but if we have to talk about the facts of physics, better thermal solutions for thin laptops can be designed right now, not a couple of years from now. Yes, it's easier to cool more efficient and thus cooler parts, but thermal design isn't stuck, and we're far from the best that can be done in this field. The fact that it hasn't been done so far doesn't mean it's impossible; I suspect it's a matter of unwillingness/cost rather than a physics barrier. Designs such as the rMBP 15 and the Blade 14 are but a glimpse into what can be done with custom cooling solutions.

I'm sorry for this rant, but it's really unpleasant to read such a conclusion to an otherwise excellent review. I hope that at least a few people agree.

Which laptop have we reviewed that's thin, light, and powerful without running either hot and/or loud? The Razer Blade 14 hits 93C under stress testing, just slightly lower than the GE40. It could cool more with higher fan speeds, but I consider "cool and quiet" to be more like 70-80C max on the CPU and GPU with fan noise well under 40dB. We are nowhere near that level right now.

The rMBP 15 is larger and while it doesn't get quite as hot, it's not exactly cool under a full load. More importantly, it's slower on the GPU than either of these laptops by a sizable gap. GT 650M is half the cores of the GTX 760M/765M -- it's not a bad GPU, but it will struggle quite a bit with higher detail gaming. I'm not trying to say any of the current designs are bad for trying, but if you want something that's as fast as the GE40 in a similar size chassis, and you don't want it to run loud or hot? It's just not going to happen right now. You either need to be thicker/larger to get more cooling, or you are going to make some noise to cope with the heat, or you're going to get hot. We've seen different laptops balance those factors to varying degrees, but really we will likely need one or two more generations of iGPU upgrades and at least one more process shrink to get to the point where we can have a combined 35W TDP for the CPU and GPU with the performance of the GTX 760M.

The Crystalwell parts from Intel offer a bit less performance than GT 650M. Maybe Broadwell can get us there, but more likely it will be Skylake -- and when Skylake arrives, we will want twice the performance of its iGPU, just like today, in order to run all the games at acceptable levels with moderate-to-high detail settings.

Thank you for responding; you should know it was not my intention to be offensive. I agree with the vast majority of what you say, and I also like how you speak your mind freely and tackle different issues - $5 extra for video outputs vs. $0.05 less for Ethernet – so true. I’m not trying to question your knowledge, on the contrary – I take your professionalism for a given. It’s more of an issue of presentation and possible misinformation. Which, as I’m sure you’ll agree, is not the intended effect. It’d be unfortunate if I’m being perceived as some sort of crazy internet troll, I’m just trying to give feedback here.

The thing is that your conclusion might be read and understood as ‘if you want a thin & light gaming laptop the MSI GE40 compromise is your best bet for now and that’s because it’s physically impossible to do better’. None of which is actually true (again we might be arguing linguistics here); and I’m not acting crazy, I’ve had 4 people, with and without technical knowledge, read the last page of your review and they all agree that’s a very likely interpretation.

Yes, Blade 14’s CPU goes into the lower 90s and while that’s hot, it’s not out of spec. In my opinion, there are more important factors to consider. Your colleague concluded that Blade 14’s cooling solution is quite adequate and capable of handling the heat. He also noted there’s room for improvement which is absolutely true. It’s not just a matter of raw numbers, for example, both Blade 14 and GE40 hit the 90s, however the former has a far superior cooling solution and shouldn’t have problems in the long run.

If a certain temperature doesn’t cause throttling and is safe to run at for years, and provided the surface of the product it’s in doesn’t get hot to the touch, what is the problem? No, it isn’t optimal, it isn’t perfect but it gets the job done; let’s say it’s ‘consumer-ready’. I’m not saying we should be content with running microprocessors at 90+C, but I also don’t think we should accept huge “laptops” *cough*Alienware*cough* and wait for integrated solutions to get better for ‘thin & light performance’. Besides, as you said, by the time iGPUs get at this level of performance it wouldn’t be sufficient anymore.

I also cannot accept what you seem to continually suggest – that we cannot get better cooling in the same form-factor. Getting thicker is the crudest way to better cooling performance and not the only solution. We can use denser and wider (not taller) heatsinks, bigger and more efficient fans, better materials throughout, etc. Those things would take up more space, but space management is not optimal nowadays either. We could go even further into more exotic solutions and by exotic I don’t mean crazy expensive. There’s a whole world of possibilities out there but it takes time, money and innovative engineers.

Yes, you’re correct, we probably won’t get better products right now but not because of physics.

I apologize if I’ve been a little too harsh in my quickly-written first post. I sincerely hope to get another response from you. Thank you for your time.

I figure when Razer and Apple have products at $2000 and can't get cooling to the point where we think it's working ideally, it probably won't happen right now. There's definitely a balancing act and space management could be improved, but there are tradeoffs with changing the way space is used as well. Go with mSATA or M.2 SSDs for instance and you drastically limit what options are available. Better fans and heatsinks are certainly a possibility, which is something I hope a revised GE40 would look into, but that's probably a 5C or maybe 10C difference at best.

Anyway, I've tweaked the conclusion a bit to make it clear where I think the GE40 stands. Razer is a bit cooler, quieter, faster...and a lot more expensive. Both need better displays. GE40 also needs improved industrial design IMO. Long-term, the Razer is less likely to have problems I think, but what I've learned over the years is that "long-term" is very hard to judge. Fans can fail on expensive as well as cheap laptops, and while I'd hope the fans used in more expensive laptops would last longer... well, I wouldn't make a bet on that! (I have had plenty of $400-$500 GPUs that had fans wear out after a year or two.)

So which laptop is truly "better" out of the Blade 14 and GE40? Probably the Blade, but roughly $700 extra to get there makes it pretty much a wash in my book. If the Blade had an IPS panel, then it would be just about perfect -- expensive, yes, but with no huge flaws. The GE40 needs far more work before it could get the same sort of recommendation.

The revised conclusion looks great; the amount of detail you go into and the way you’ve put it all is very neat and clear. And yes, I absolutely agree with you -- the GE40 needs a major cosmetic overhaul, the Blade is too expensive in general, and the lack of a proper LCD in such a product is just outrageous. I genuinely wonder what happened; perhaps they were way into developing a 14.0” chassis and by the time they realized they couldn’t source a good screen of that size it was already too late to go back to the drawing board?! Then again, their standard-sized 17.3” model lacks an IPS screen too. Anyway, I never argued these points; it seems as though this has turned into a GE40 vs. Blade (with Apple thrown into the mix) sort of thing, and that was not my intention.

“Long-term” is a very elusive notion indeed. Hardware can always fail, but having a solid design and build quality goes a long way towards reducing the odds of failure, enables the use of more powerful components (at a certain cost, yes), and helps maintain optimal functionality throughout the lifetime of the product. Yes, fans are finned, plasticky, spinny little devils which can and do fail -- what can a man do... I know this reads like a bunch of marketing nonsense, but I really believe in those design ideals.

Razer and Apple sell $2000+ laptops, and they also pocket significant margins (Apple for sure), so their main priority probably isn’t optimal cooling but one that just gets the job done; that seems to be the unfortunate reality of today. Despite this, they are among the few companies making advances in “slim-chassis cooling solutions”. I only cited the rMBP 15 since it was the first laptop to feature a 45W quad-core CPU in such a slim chassis. I wouldn’t bet on Apple, however, to go any further than perfecting their current design, as they probably won’t need to cool a hotter CPU (or GPU for that matter) in the future.

Limited options and upgradability have become synonymous with truly optimized and thin designs. I am all for swappable memory and storage, and I believe the industry can come up with a very slim, single-sided module which shouldn’t have a negative impact on slim designs. In my eyes, mechanical storage is a no-go for such designs (even a 7mm 2.5” drive); even if there’s enough space for it, it’d make more sense to put in more battery instead. Nowadays even mSATA looks big -- the smallest M.2 implementation looks better, especially for RAID 0 designs. Once you delve deeper, even the keyboard becomes a problem, and you just can’t ship a great keyboard without proper travel. All of these cannot become standard soon enough.

I would like to thank you for listening to feedback and taking the time to revise the conclusion. I hope I haven’t been too much of a nuisance. Have a great vacation :)

Excellent article. I just ordered a Sager-branded W230ST (4700MQ/765M), which is considerably thicker than the GE40 and supposedly has a good LCD. I'm hoping the increased thickness will allow sufficient cooling and thereby help avoid both throttling and excessive temperatures in general. There are no reviews yet, so wish me luck.

Nice article. I'm in the market for something this small and this was on my list (not anymore). LCD is junk? Classic! Please review the Clevo W230ST, and fast! Hopefully the screen is as promising as it sounds, and the 765M can hopefully be decent at 1080p.

Even if W230ST's LCD is not as good, you can easily swap it with a 13.3" IPS 1600x900 screen from LG; this will also increase performance. You can't do this with the Razer Blade 14 or the MSI GE40 because of their unusual screen size -- there are currently no 14.0" IPS screens (900p or 1080p) on the market. Hope this helps.

I just saw that the W230ST is already available at Mythlogic and they list the screen as IPS. Their pricing tends to be among the best; customization options are plentiful too, e.g. you can get an 840 Pro mSATA drive.

Clevo's cooling is generally better than MSI's so if you have to order right now and you've been considering the GE40, it's a pretty safe bet to get the W230ST.

Just a side note: the battery life of this laptop is so good because there is an extra battery in the optical drive slot. Although this is a welcome addition to a laptop, it shouldn't be overlooked. This laptop isn't much more efficient than any other; it just has a bigger battery.

Sorry, but that's just not true. The optical slot is where the 750GB HDD is sitting. I took pictures of the internals (see the first gallery on the first page) and there's definitely no extra battery to be found.

When I am playing a game, I am willing to tolerate the temperature and noise. But I don't want high temperatures and noise when I am just browsing the internet or editing a document. Is there any way I could set it to low temperature or low noise in these cases, or does it automatically run slower under these conditions so it will be cool and quiet?

Hi Jarred, I just purchased an MSI GE40 with a GTX 850M and 4GB of RAM here in China. The guy sent me an extra 4GB of RAM and told us we would have to install it. Can you post some more pictures of how you removed the back cover? I don't want to break anything doing this job. Regards, Alex