In the latest wave of high-end, ultra-thin laptops, the sexy stat hasn't been processing power or even battery life. High-PPI resolution has set tongues wagging, and the sweet spot as of late has been 3200×1800, a pixel-quadrupling of 1600×900 that has blown past the latest MacBook Pro and excited buyers of systems like the 13.3-inch Yoga 2 Pro.

It's a great spec for smooth-looking Web and document browsing, but so far these high-res portable screens haven't shipped with the hardware needed to render 3D action at full resolution. Cue the Razer Blade laptop line, which has spent the past two years chasing thinness and portability without sacrificing performance, and which will now drive a 3200×1800 display. It's a multitouch display at that, with power to spare thanks in part to Nvidia's new line of graphics cards.

Next month's updated Razer Blade, starting at $2,199, will launch with an Intel i7 quad-core processor, a GeForce GTX 870M video card with 3GB of GDDR5 memory, and a 14-inch multitouch display. For that boost, the Blade actually grows a bit larger than its predecessor, jumping from 0.66" to 0.7" in thickness and gaining a few ounces to 4.47 lbs.

Its bigger brother, the Razer Blade Pro (starting at $2,299), will see a similar spec bump, but it might be the jealous older brother this time. For one, its 17.3-inch display lacks multitouch and tops out at 1080p. Its own brand-new Nvidia video card, the GTX 860M, maxes out at 2GB of GDDR5 memory, though Razer has yet to confirm whether this card will sport the new Maxwell architecture. At the very least, the Pro model will again include Razer's Switchblade mix of mini-screen buttons and an LED touch panel to the right of the keyboard, along with pre-installed productivity software for the likes of Adobe apps, GIMP, and Maya.

The "smaller" model is supposedly soaking up higher specs to power gameplay at ultra-dense resolutions. The bigger question, really, is whether the system will come out of the gate with Windows 8.1 optimizations for touch use. Our recent tests with a 3200×1800 Windows 8 touchscreen have proven difficult, particularly when trying to tap on-screen elements that haven't been adjusted for the denser resolution. Razer would be wise to make its menu buttons much easier to hit.
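To put the touch-target problem in numbers, here is a quick sketch. The 14-inch diagonal is the Blade's; the 23-pixel control height is a hypothetical stand-in for a Windows UI element that hasn't been scaled up for the denser panel:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a panel, from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

density = ppi(3200, 1800, 14.0)   # ~262 ppi on the Blade's 14" panel
button_mm = 23 / density * 25.4   # an unscaled 23 px control is only ~2.2 mm tall
```

A roughly 2 mm target is far smaller than a fingertip, which is why unadjusted UI elements become nearly impossible to hit by touch at this density.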

The text has been updated to remove references to Maxwell GPU architecture in both Razer Blade models.

The 14" Blade sounds fantastic. When I was shopping for a notebook 18 months ago there was nothing remotely like this: high res, a high-end GPU, and thin and light. I'm sure it gets hotter than the sun and louder than a jet engine but it's still impressive.

This is the sort of thing that might make me reconsider building a new desktop when mine gets too old. One machine to replace both desktop and notebook.

I'm sure it gets hotter than the sun and louder than a jet engine but it's still impressive.

It doesn't. I have the first gen 17", the second gen 17" and the first gen 14". Razer made a lot of cooling improvements between the first gen 17" and its successors. They're audible at load, but it's not a high pitched fan.

The 14" vents heat up between the chassis and the screen. The right palm rest gets a bit warm, but no surface gets uncomfortable as long as the laptop is on a desk. Can't actually put it on your lap when playing a CPU/GPU intensive game as the cooling fan intakes are on the bottom of the chassis. For regular websurfing and office work, though, it doesn't get warm.

So is the Pro actually going to have better 3D performance (including games), since it has so much less resolution to push? Personally I'd be quite happy if I could get 1920x1200 on that 14" - that would be about perfect. The majority of applications can't really use the full beauty of the 3200, and pushing fewer pixels should help with performance AND battery life.
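For a rough sense of the fill-rate gap being described, compare raw pixel counts (back-of-the-envelope only; real performance also depends on shader load, memory bandwidth, and so on):

```python
native = 3200 * 1800    # native panel: 5,760,000 pixels per frame
lower  = 1920 * 1200    # suggested alternative: 2,304,000 pixels per frame
ratio = native / lower  # the GPU pushes 2.5x as many pixels at native res
```

All else being equal, rendering 2.5x the pixels per frame is a substantial extra load, which is why dropping below native resolution helps both frame rates and battery life.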

This came with the announcement of the new 8xx Nvidia cards...right after I got a laptop in Dec (and the one I got is getting a direct upgrade to the new GPU). The new cards will have more dynamic battery drain, locking in at speeds that meet your target fps/res in your current game. Playing an older game? The system dials it all back to save battery life. Interested to see how it pans out. At least the 7xx cards are getting Shadowplay support.

I doubt it will make any sense to run games at the actual native res, but the qHD+ display means you can run games at 1600x900, and with simple pixel doubling achieve a very sharp image. Nvidia's new mobile GPU should let most games run on max settings at that resolution too.
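The "simple pixel doubling" mentioned above can be sketched as a nearest-neighbor 2x upscale: each 1600x900 source pixel maps to an exact 2x2 block on the 3200x1800 panel, so no blurring filter is involved. This is a toy illustration of the idea, not what any GPU scaler actually runs:

```python
def pixel_double(image):
    """Upscale a 2-D image (a list of rows) by 2x with nearest-neighbor:
    every source pixel becomes a 2x2 block, so a 1600x900 frame maps
    exactly onto a 3200x1800 panel with perfectly sharp edges."""
    doubled = []
    for row in image:
        wide = [p for p in row for _ in range(2)]  # duplicate each pixel horizontally
        doubled.append(wide)
        doubled.append(list(wide))                 # duplicate the row vertically
    return doubled
```

Because the scale factor is an exact integer, no interpolation between neighboring pixels is needed, which is why the result stays sharp instead of looking smeared the way non-integer scaling does.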

The "R" key, for example, looks like an uppercase Cyrillic/Greek "G"! This looks stylish to people who only speak English, but is not impressive to people who use multiple scripts!

Is there any possibility of a Dvorak keycap layout? As an option, straight out of the box? Or is it possible to rearrange the keycaps? It's about time for Dvorak to be on almost equal terms with QWERTY, at least to be a serious option for people buying a premium laptop willing to pay $10 or $20 extra for this option! This is particularly important with illuminated keys! Who will be the first manufacturer to supply a decent product to this unsatisfied market niche?

Great for people who want to look at videos and pictures, total waste of time for anyone wanting to do real work.

Let's face it, any resolution in excess of what is required to render text in sufficient detail to avoid eyestrain is a waste of resources on a work machine for anyone not doing graphics of some kind. All it does is cost processor cycles, increase GPU costs, increase heat and noise, and reduce battery life.

Not saying I'm considering buying *any* laptop for this level of performance but I work with digital photography, 3d graphics, animation, and video for a living and for fun/personal projects.

If I was in a position where I didn't have access to a more affordable desktop with similar specs and a nice, big, high-resolution IPS monitor, I would most definitely want something with a decent GPU and a high resolution monitor.

I see you did acknowledge people who might be "doing graphics of some kind" but that can cover a lot of people these days. Even if I wasn't doing professional graphics/multimedia work, I could think of several reasons why I might want a laptop that was more capable at both displaying and rendering detailed graphics.

Don't get me wrong...I'll take a desktop for that sort of thing over a laptop any day of the week. It's just a lot easier and more affordable to get that sort of performance. Still, loads of people either need to be portable or they find the option to be worth the price premium. I'd wager this falls along the same lines.

Now, if you want to talk about that weird green "techie" font on the keyboard, I'm shaking my head as well.

Correction: the 870M isn't Maxwell, it's Kepler. It's a cut-down 780M (but bigger than the 770M) along with Nvidia's new battery boost tech.

Can't tell if the same holds true for the 860M, because Nvidia have annoyingly created TWO chips (one Kepler, one Maxwell) that are both called 860M, as detailed in this AnandTech article. Yes, it's an absolute dick move.

It's so close to perfect, but 8GB of non-upgradable memory on a $2,200 laptop is a deal breaker for me as a multitasking developer. I cannot comprehend why OEMs insist on hamstringing great laptops with insufficient memory: 16GB versus 8GB is an $80 difference that I would gladly pay.

Let's face it, any resolution in excess of what is required to render text in sufficient detail to avoid eyestrain is a waste of resources on a work machine for anyone not doing graphics of some kind.

Most people can tell the difference between text printed on paper at 300 dpi or 600 dpi. So if the screen has not yet reached 600 dpi, it isn't good enough yet.

You misunderstand printing versus screen DPI. Printers can only print one color per dot, while screens can display the entire spectrum in a single dot, so a printer needs to combine multiple dots, often three or four, to achieve the desired color.

For example, to print a shade of gray a printer can't actually mix the black ink with white ink or dilute it somehow; the printer must instead produce a very fine raster that fools the eye into seeing gray rather than a number of black dots.

(edit: there are techniques to achieve more than the regular four CMYK colors in printing, but you are seldom going to see more than 64 different colors per point, even in a high-end printer).
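The gray-from-black-dots trick described above is halftoning. A minimal sketch using a 2x2 ordered-dither (Bayer) threshold matrix, which trades spatial resolution for apparent gray levels:

```python
# 2x2 Bayer ordered-dither matrix, normalized to thresholds in [0, 1)
BAYER2 = [[0 / 4, 2 / 4],
          [3 / 4, 1 / 4]]

def halftone(gray, width, height):
    """Render a uniform gray level (0.0 = white paper, 1.0 = solid ink)
    as a raster of pure black (1) / white (0) dots, the way a printer
    fakes gray using only black ink."""
    return [[1 if gray > BAYER2[y % 2][x % 2] else 0
             for x in range(width)]
            for y in range(height)]
```

For a 50% gray this produces a checkerboard: half the dots inked, which the eye averages into a mid gray when the raster is fine enough.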

Except that when printing text, raster-based grays and color reproduction are generally not an issue. That is because text is generally solid black and doesn't require any variance within each individual letter. When you look at a laser printer printing black text at 300 dpi and compare it to the same printer printing black text at 600 dpi, the letters will be much sharper, with a much more defined edge, and will almost jump off the page at you.

Personally, I would rather have the original Blade 14's resolution with a better quality display and the 870m.

I really don't see many people gaming at native res even with AA turned off. That means running at less than native res and relying on adapter or display scaling. My understanding from reading through MBPr gaming threads is that NVIDIA does some form of bilinear filtering as opposed to true pixel doubling or nearest neighbor filtering. When asked about it, supposedly NVIDIA reps said they aren't even thinking about it because there's no demand. As for display scaling, in general I hate it, especially when it comes to old games that are stuck in 4:3 land.

On the one hand, demand will surely help push Microsoft to improve Windows' HighDPI modes (they still mostly suck), and NVIDIA to implement a better resolution scaling model. On the other hand, I really don't want to be among the waves of early adopters for this.

The original Blade 14 had great battery life in non-gaming scenarios. I am very curious how the new model will compare in both battery life and heat production/fan noise.

Really really do want a nice highres display in a smallish package. Can take or leave the touchscreen.

But why in the name of all that's holy are you jacking around with the keyboard in such a way?? You put some magical LCD touchpad thing on there and still felt you had to jerk around with the lower-right area of the keyboard and replace buttons I use all day long with your stupid backlighting (or whatever) button and some arrow keys?

Put the keyboard back how it's supposed to be, give me a trackpoint and a normal numeric keypad in place of that horrible LCD thing, and I'll order one tomorrow.

How's the DPI scaling going, though? Quite a few of the applications I use in my Windows 7 VM on a rMBP have horrible support for 150% or 200% scaling to the point where I navigate by guessing what UI elements do. Jamming 'retina' resolutions into a screen only works well if you can actually use the system.

How's the DPI scaling going, though? Quite a few of the applications I use in my Windows 7 VM on a rMBP have horrible support for 150% or 200% scaling to the point where I navigate by guessing what UI elements do. Jamming 'retina' resolutions into a screen only works well if you can actually use the system.

Developers have been half-assing high DPI support for years now. Sometimes they do it part way and then click the button in the resources that claims high DPI support, but then QA never tests it and it bit-rots. They've been getting away with it because Windows computers have not been using high-resolution displays. Now that there's more demand, I expect customer pushback and bug reports will make them do it right.
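The per-window work developers skip is mostly this conversion: laying out UI in 96-dpi "design" units and multiplying by the monitor's scale factor everywhere pixels are touched. A toy sketch of the idea (`scale_rect` is a hypothetical helper for illustration, not a Windows API):

```python
def scale_rect(rect, dpi):
    """Scale a (left, top, width, height) rectangle authored at the
    classic 96 dpi baseline to device pixels for a display reporting
    `dpi`. Apps that skip this step, or apply it in only some code
    paths, end up with the tiny or misaligned UI described above."""
    factor = dpi / 96.0
    return tuple(round(v * factor) for v in rect)
```

At 192 dpi (200% scaling) a 120x23 control becomes 240x46 device pixels; an app that declares high-DPI support but never applies the factor draws it at quarter area on the dense panel.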

Why is it so difficult to find a quad core notebook that isn't huge/doesn't cost an arm and a leg now?

This notebook and the 15" retina MBP are the only two that come to mind that have quad core CPUs and aren't 10lb behemoths. I was looking to buy a notebook and couldn't find anything that was quad core that wasn't over $2000 or that didn't weigh 10lbs (exaggeration).

Why can't we have close to ultrabook weight with a quad core CPU? This notebook ticks all the boxes for the specs I was looking for (wish I could find something without a separate graphics card, rather something that uses iris pro...linux doesn't play too well with graphics cards), but JFC it is expensive. It costs more than the MBP. So does the Samsung ATIV 9. What kind of topsy-turvy world are we living in where Apple notebooks are no longer the most expensive on the market? LEL.

So far the only notebook that has this gorgeous screen and isn't hyper expensive (relatively) is the Yoga 2 Pro, which has poor battery life and a dual-core CPU.

In summation, I have given up my hunt for a notebook that has the specs I want and decided to wait for the next mac mini. If it doesn't have iris pro and AC wifi, gigabyte brix it is.

Why is it so difficult to find a quad core notebook that isn't huge/doesn't cost an arm and a leg now?

This notebook and the 15" retina MBP are the only two that come to mind that have quad core CPUs and aren't 10lb behemoths. I was looking to buy a notebook and couldn't find anything that was quad core that wasn't over $2000 or that didn't weigh 10lbs (exaggeration).

Why can't we have close to ultrabook weight with a quad core CPU? This notebook ticks all the boxes for the specs I was looking for (wish I could find something without a separate graphics card, rather something that uses iris pro...linux doesn't play too well with graphics cards), but JFC it is expensive. It costs more than the MBP. So does the Samsung ATIV 9. What kind of topsy-turvy world are we living in where Apple notebooks are no longer the most expensive on the market? LEL.

So far the only notebook that has this gorgeous screen and isn't hyper expensive (relative) is the Yoga 2 Pro; which has poor battery life and a dual core CPU.

In summation, I have given up my hunt for a notebook that has the specs I want and decided to wait for the next mac mini. If it doesn't have iris pro and AC wifi, gigabyte brix it is.

It takes money to develop this type of ultrabook, I imagine. You have a quad-core CPU in there, so you have to manage the heat output without making the machine uncomfortably hot or loud for the user. That takes money for research, hence the cost.

I really don't want a touchscreen in my laptop, but that is personal preference. You can see by looking at that Dell that the i7 model starts at $1800 and has a graphics card (not iris pro...in linux this is a really important thing. Graphics card drivers have become much more stable over the years, but they still have a long way to go).

Just looking at that Dell, I would say that for the money it isn't a bad deal. I actually looked at that model for someone else, and the only reason I shy away from it is the graphics card. If I could get that exact model without the touch screen + same resolution - graphics card + Iris Pro, I would buy it in a heartbeat. OTOH, the 15" MBP (lower resolution screen...that will probably change soon) comes with Iris Pro, a retina display, and a PCIe SSD (huge difference in speed there), has no touch screen, and is $100 more than the i7 Dell. At that point it comes down to personal preference. AFAIK you can't upgrade the RAM on the MBP, and I would wager you probably can't do it on the Dell either (soldered to the MoBo...boo). I think I am still just going to hold out for this next crop of Mac releases. Chances are pretty good that Apple will bump the screen resolution on the laptops soon, and the Mac mini will likely get Iris Pro and AC wifi. A Mac mini or Gigabyte Brix means I need a VESA mount + monitor and a place to put that stuff, but it's significantly cheaper with the specs I want. No portability, though.

I wish there were a laptop maker out there that would literally let you customize every single modular piece of their laptops. Meaning if I want an i3 CPU + graphics card + QHD+ display, let me do it. For most people, dual-core i3s are going to be fine for what they use their laptops for. It's us power users that get shafted.