91 Comments

Assuming its TDP is close to that of the 750M, yes, it could end up in the new rMBPs.

However, what Apple did with the 750M in the rMBP was overclock it, and use its lower TDP compared to the 765M to keep heat down. In that sense, we may very well see an overclocked 850M in the new rMBP.

Only time will tell. Razer did manage to put the 765M in their Blade 14" laptop, so if Apple is willing to take the slight hit on battery life, they may decide to give the 860M a shot :)

But we'd still have to pay 700 extra for it, presumably :/ The Iris Pro is good enough and all, but for some work (even non-gaming, and not quite high-end enough for a workstation card to be worth it) a GeForce or Radeon is simply required, so you have to pony up that huge extra fee.

Ha. I just started a thread on the AT forums about the MSI GS60 saying that the only other laptop I'd look forward to is a refreshed Razer Blade 14, and here we are. Very opportune timing. If that Blade has a significantly better screen than the last generation, I may finally buy a laptop again.

Jarred, regarding the ultra-high-res displays: have you already put together, or do you plan to put together, an article on gaming and upscaling on 3K and 4K screens? More and more laptops are coming out with these "retina"-style displays where the graphics card won't be able to run games at native res. I'd like to see how games would look/perform with upscaling. For example, running the game at 16:9 resolutions like 1080p with various settings on a 3K screen.

I think it'd be an important article because it would affect my buying choice when offered multiple screen options with the same laptop.
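For what it's worth, a quick sketch of why the render resolution matters on these panels: only resolutions that divide the panel evenly get a clean integer scale factor. (3200x1800 is an assumption here, chosen as one common "3K" laptop resolution; the list of render resolutions is also just illustrative.)

```python
# Which common render resolutions map onto an assumed 3200x1800 "3K"
# panel with an integer scale factor? Non-integer scaling tends to
# produce blurrier results than an exact 2x or 4x stretch.

panel_w, panel_h = 3200, 1800

scaling = {}
for w, h in [(1920, 1080), (1600, 900), (1280, 720), (800, 450)]:
    sx, sy = panel_w / w, panel_h / h
    # Clean only if both axes scale by the same whole number.
    scaling[(w, h)] = (sx == sy) and sx.is_integer()
    print(f"{w}x{h}: scale {sx:.2f}x, integer scaling: {scaling[(w, h)]}")
```

So on such a panel, 1600x900 (an exact 2x) should upscale more cleanly than 1920x1080 (a 1.67x stretch), which is exactly the kind of trade-off a comparison article could test.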

I commented on this in the Dell XPS 15 review. Basically, upscaling (or running at non-native) is much less noticeable in many respects as the pixels are so small. The other side of the coin is that there are way too many Windows apps that I use that don't work well with high DPI. In a few years, I think it will all get sorted out, but right now high DPI is "bleeding edge" stuff.

Possibly, but I thought he was primarily concerned with making the chassis and waxing eloquent on the joys of the material choices and such. Speaking of which, what are you working on these days? Hahaha....

Wallet is never prepared. I just club it on the head and drag it out. This will be one hell of an expensive purchase if I end up buying it. I'm really curious about the heat and noise generated by this beast. Looks like it's shipping in April? I'm hoping AT gets a review sample.

This is great news! I look forward to seeing laptops become thinner and more powerful. I know they will never fully reach desktop performance, but it makes me smile seeing them gain. I recently purchased a Sager NP9150 with a 7970M, and I am honestly using my desktop less and less as I enjoy the benefits of mobility. It's an awesome time for technology!

I'm still rocking a big Alienware 17x R3 with a 580M, so I have been eyeing the upgrade paths. But I can't help being very disappointed that the "880M" is not Maxwell technology but just re-branded old stuff. I don't care if it's 20% faster. I don't mind carrying around a large laptop. I'm not a frail girl.

Not only in time for back-to-school, but in time for Broadwell. The power improvements of Broadwell + Maxwell at the very high end will be an incredibly refreshing change in gaming-laptop battery life.

Hm, I think the only disappointing factors are the use of Kepler parts alongside Maxwell (although not surprising) and the lack of any distinction between the two architectures at the 860M level. However, I am looking forward to coverage of 860M (Maxwell) parts, as I think that will finally be a good sweet spot for me. I have an aging gaming laptop (Dell M1530 with an 8600M GT), and I've grown very used to my ASUS Zenbook... especially its weight (or the lack thereof). So I'm hoping for a (relatively!) lightweight laptop that has some decent gaming power under the hood.

Kepler? How about that Fermi SKU? They have to be blowing the dust off of some old chips to still be selling those. Really, the most disappointing thing about the 800 series so far is that there is still no word on desktop parts. As most people are saying, they're likely waiting for TSMC 20nm. That is a long gap; the entire desktop gaming GPU cycle has been shifted half a year because of it. I can't imagine NVIDIA and AMD expect to get "back on track" by releasing 900 series parts a few months after the 800 series.

You can't just look at the CUDA core counts -- the consoles are using AMD chips for one, and AMD shader cores have almost always been "slower" individually than NVIDIA shader cores. I suspect the GTX 750 Ti will give the PS4 and XBOne a run for the money in GPU performance. Couple it with a decent CPU and the only thing really holding us back on PCs is the OS and the need to support "infinity" different hardware configurations.

It plays Titanfall on medium settings at 1080p/60 FPS. The Xbox One uses similar settings but renders at only 792p, then upscales. So yes, it's faster, if you dismiss variables like CPU and platform performance.

No real way to compare unless someone pairs a console-class Jaguar CPU with a 750 Ti.

While that would be nice to see, I don't think that's the point. The OP said the 750 Ti can't make for a viable gaming platform compared to current gen consoles, because it is "slower."

Clearly it can. It's outperforming both consoles with some CPU/platform setups. The 860Ms in these laptops would likely be paired with i5s and i7s, which will yield much better CPU performance than the Jaguar APUs in the consoles.

Exactly. Especially since the "860M (Kepler)" is crippled by its memory bus. What's the point of putting such a large, powerful, and expensive chip in there when many shaders are deactivated and not even the remaining ones can use their full performance potential? You also can't make a huge chip as energy efficient as a smaller one by disabling parts of it.

A GK106 with full shaders, a 192-bit bus, and moderate clocks would have been more economical and at least as powerful, probably a bit faster depending on game and settings.

Yet in power efficiency the "860M (Maxwell)" destroys both of these configurations, which makes them rather redundant, especially since it should be cheaper for NVIDIA to produce the GM107. Do they have that many GK104s left to throw away?

GeForce Experience is pretty awesome. It used to be that whenever I got a new game, I'd have to spend a lot of time trying to figure out the right settings for it; the default settings would usually either run way too slow or way too fast. But with GeForce Experience, they've already tested the game with the same CPU and GPU that I've got, and its defaults are generally a good balance of quality and performance. So what used to be an involved process of play-tweak-play-tweak-play is now just "mash button and go".

That said, it could still use some polish. I don't know if it's still in beta, but it feels like it. It's not uncommon for it to start reporting "game cannot be optimized" for games that it DOES support (and that you have previously optimized), which usually requires a reboot. And a few months ago NVIDIA pushed a self-update that caused it to go completely nuts, locking up the machine (a trip to their forums indicated it happened to everybody who got the update before they fixed it).

"Where things get a little interesting is when we get to the GTX 860M. As we’ve seen in the past, NVIDIA will have two different models of the 860M available, and they’re really not very similar (though performance will probably be pretty close)."

No... performance won't be close. This is a laptop, where power efficiency is part of performance, and you, me, and the author all know power consumption of the "Maxwell 860M" is going to be less than the "Kepler 860M". The article should be absolutely SLAMMING NVIDIA for calling two very different parts the same thing. The video card numbering schemes are confusing enough to laymen (I get asked to explain them regularly, since I'm the go-to hardware guy in my circle of friends, relatives, co-workers, casual acquaintances, and all their friends...). It's going to be impossible to tell them that it depends on which GTX 860M they get, and they probably can't tell which they'll get until the computer arrives...

We've complained many times about overlapping names in the past. There were, for instance, two completely different versions of the GT 555M (which later became the GT 635M, I believe). And performance and battery life are not "the same" -- particularly since the GPU is usually off when you're on battery power. If you want to play games while unplugged, well, that could be a different story. Anyway, we pointed it out, said it was a dumb overlap more or less (the "interesting" was meant sarcastically, not as "wow, this is really interesting"; perhaps that wasn't properly conveyed, though I'd suggest the rest of the text supports that), and moved on. If the Kepler variant is widely used, we'll certainly complain about it.

With Cherry Trail, they will be able to put the HD 4400 level of performance in Atom chips.

Both Nvidia and Intel have secret sauce to tremendously improve performance/watt in the next few years or so to push HPC.

Broadwell should be the first result for Intel in that space, while Nvidia starts with Maxwell. The eventual goal for both companies is 10 TFLOPS DP at about 200W in the 2018-19 timeframe. Obviously the efficiency gains get pushed down into graphics.
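Expressed as perf-per-watt, that target works out as below. (The 10 TFLOPS / 200W figure is the one quoted in the comment; the Tesla K40 numbers are approximate public specs, included only for scale.)

```python
# Efficiency implied by the stated 2018-19 goal: 10 TFLOPS double
# precision in a ~200W part.
target_gflops_per_watt = 10 * 1000 / 200
print(f"Target: {target_gflops_per_watt:.0f} GFLOPS/W (DP)")

# For comparison, a current Tesla K40 (~1.43 TFLOPS DP at a 235W TDP;
# approximate public figures) manages roughly:
k40_gflops_per_watt = 1.43 * 1000 / 235
print(f"Tesla K40: {k40_gflops_per_watt:.1f} GFLOPS/W (DP)")
```

That is roughly an 8x jump in DP efficiency over four to five years, which is why it has to come from architecture and process together rather than clocks.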

No, I just don't think 2MB is going to effectively hide the fact that you're using 2GB of textures and trying to deal with most of those using a rather tiny amount of memory bandwidth. Does the Xbox One's eSRAM effectively make up for the lack of raw memory bandwidth compared to the PS4? In general, no, and that's with far more than a 2MB cache.

It can help, sure, but you're comparing a chip with 640 Maxwell shaders at 1189MHz to a chip with 768 Kepler shaders at 1032MHz (plus Boost in both cases). Just on paper, the GTX 750 Ti has 4% more shader processing power. If bandwidth isn't the bottleneck in a game -- and in many cases it won't be with 86.4GB/s of bandwidth -- then the two GPUs are basically equal, and if a game needs a bit more bandwidth, the 650 Ti will win out.

Contrast that with what I'm talking about: a chip with less than 20% of the bandwidth of the 750 Ti. It's one thing to be close when you're at 80+ GB/s, and quite another to be anywhere near acceptable performance at 16GB/s.
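A back-of-envelope check of those bandwidth figures. (86.4GB/s is the 750 Ti number quoted above; the 64-bit DDR3-2000 configuration used to reach 16GB/s is an illustrative assumption, not a spec from the comment.)

```python
# Verify the 750 Ti's quoted bandwidth: 128-bit bus, GDDR5 at 5.4 GT/s.
bus_bits, gt_per_s = 128, 5.4
full_bw = bus_bits / 8 * gt_per_s          # bytes per transfer * rate
print(f"750 Ti: {full_bw} GB/s")

# An assumed 64-bit DDR3-2000 configuration yields the 16 GB/s figure:
low_bw = 64 / 8 * 2.0
ratio = low_bw / full_bw
print(f"Low-end config: {low_bw} GB/s, or {ratio:.1%} of the 750 Ti")
```

The ratio lands at about 18.5%, matching the "less than 20% of the bandwidth" claim above.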

"Speaking of which, I also want to note that anyone that thinks “gaming laptops” are a joke either needs to temper their requirements or else give some of the latest offerings a shot."You realize that you are speaking to the "PC gaming master race", right? :PReply

Because outward-facing bandwidth is scarce and expensive. Even with Thunderbolt 2 (which has seen very underwhelming adoption from OEMs), the GPU will spend a great deal of time waiting around to be fed.

The few videos on YouTube show that it is possible to run graphics cards over Thunderbolt 1's PCIe x4 link and achieve 90%+ of the card's performance when connected to an external monitor, and 80%+ when feeding back through Thunderbolt to the internal monitor.

So basically eGPUs could really be a thing right now, but no one is making a gaming-targeted PCIe Thunderbolt enclosure. (Sonnet's needs an external PSU if you are trying to put a GPU in it.)
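The bandwidth gap being described can be sketched with peak one-direction figures from the public specs. (Protocol overhead is ignored, so real throughput is lower; the choice of links to compare is mine, not the commenter's.)

```python
# Peak one-direction bandwidth of the relevant links, in GB/s.
# Thunderbolt figures are the headline Gbit/s ratings divided by 8;
# PCIe figures use per-lane effective rates (2.0: 0.5 GB/s/lane,
# 3.0: ~0.985 GB/s/lane after 128b/130b encoding).
links = {
    "Thunderbolt 1": 10 / 8,
    "Thunderbolt 2": 20 / 8,
    "PCIe 2.0 x4 (what TB1 tunnels)": 4 * 0.5,
    "PCIe 3.0 x16 (desktop slot)": 16 * 0.985,
}
for name, gbs in links.items():
    print(f"{name}: {gbs:.2f} GB/s")
```

So a Thunderbolt 1 eGPU has well under a tenth of the host link of a desktop card, which also suggests why driving the internal display (frames copied back over the same link) loses roughly another 10% in those tests.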

Because GPU makers can sell mobile chips for a huge premium over desktop chips. A 780M, which is roughly comparable to a desktop 660, nets NVIDIA a hell of a lot more cash. A 660 can be picked up for around £120 these days, whereas on a Clevo reseller site in my country a 770M-to-780M upgrade costs £158.

Nvidia knows that a large percentage of very high-end gaming laptops (780M) just sit on desks and are carried around very infrequently; that market could easily be undercut on price with an eGPU.

My 13-inch laptop, 750 Ti ExpressCard eGPU, and the PSU for the eGPU (an Xbox 360 brick) can still easily fit into a backpack for taking round to a friend's, and it cost much less than any similar-performing laptop available at the time. And when I don't need that GPU power, I have an ultraportable 13-inch at my disposal.

So I take it NVIDIA hasn't hinted at the possibility of G-Sync chips being included in laptop panels? I think they'd make the biggest impact in laptops, where sub-60 fps is practically a given on newer titles.

I asked about this at CES. It's something NVIDIA is working on, but there's a problem in that the display is being driven by the Intel iGPU, with Optimus working in the background and rendering the frames. So NVIDIA would have to figure out how to make Intel iGPU drive the LCD properly -- and not give away their tech I suppose. I think we'll see a solution some time in the next year or two, but G-Sync is still in its early stages on desktops, so it will take time.

I'm surprised you guys didn't say anything about the 850M not supporting SLI. I was expecting a paragraph deriding that decision by Nvidia. I'm really upset. I would have loved to see how energy efficient SLI could get. Lenovo has had that laptop with 750M in SLI for about a year now and I've thought that was kinda stupid.

But considering how power efficient Maxwell is maybe that could actually be a good idea now.

Maybe they'll still do it with underclocked GTX 860M's.

Hm, I bet that's what NVIDIA wanted: to discourage OEMs from buying two cheaper GPUs per laptop instead of ONE hugely expensive one, by preventing them from SLI'ing the best value GPU in their lineup.

I've never been much of a fan of SLI in laptops. Scaling is never perfect, often there is more difficulty with drivers and game compatibility, and battery life can take a hit as well. I mentioned it didn't support SLI, but I can't say it bothers me much. 860M Maxwell supports it (apparently) and will use the same chip, so really it's only going to be a small bump in price to go from two 850M to two 860M -- assuming an OEM wants to do SLI 860M that is.

I think the thesis of this article is right on: it's a great time for gaming notebooks. The idea just four years ago that I could play the newest games at highest settings at 2560x1600 at more than respectable FPS was unthinkable. My Sager with dual 780Ms does that with legs to spare, and I should be able to do this with a SINGLE GPU once high-end Maxwell mobile chips come later this year -- simply amazing.

Only if you're using the iGPU to do something, but since I'm discussing system bandwidth vs. GPU bandwidth I didn't get into that. I suppose something like Kaveri will end up with about the same 16GB/s of bandwidth from the system RAM (with the remaining bandwidth going to the CPU), but really Kaveri will still only be a "moderate" GPU performance level.

Does anybody see why the Maxwell part is the starting point while Kepler holds the higher end of the range? They are squeezing blood out of Kepler before a complete switch to Maxwell. It is a very clever trick to pull, and it buys NVIDIA time to carefully craft the performance of higher-end Maxwell parts to suit the performance/price model they want. This release alone seems enough to maintain their discrete GPU market on laptops while AMD struggles with their mobile market. And it keeps Intel's IGP at bay, except for non-gamers who do not care about discrete graphics.

I am one of those people who still think laptop gaming "is a joke." One hour of gaming on battery is limiting enough, but more to the point, most gamers are going to want a mouse rather than a trackpad, and therefore a table. At the point you're gaming at a table, why not build a solid desktop and buy an ultraportable for less? Are people unable to spend a 1k-2k budget better in this regard?

I am grateful to see real performance hitting laptops -- the lower 850M = 580M comparison in the illustration, for example. But from a purely cost perspective, I am unconvinced that an upgradable desktop plus a cheap, slim, low-power laptop isn't better for the vast majority, while also being cheaper. Especially in the long run, as you can upgrade a desktop, while a 1k-2k laptop will depreciate in a way a $200-$300 one won't. For example, who is going to buy that 580M laptop for even half its list price now? A cheap laptop is almost always worth $100.

Jarred: Having a gaming laptop with Optimus, I'm not convinced it offers any benefit. Since I leave my machine plugged in all the time, it really doesn't give me anything. On the contrary, I have the following main issue with Optimus: sometimes software doesn't use the dGPU, or the dGPU fails to kick in. I've experienced this with at least two titles: AutoCAD 2013 and Zen Pinball 2. In the first case, AC displays an error dialog saying it can't find a real GPU and exits to the desktop. I've actually resorted to running AC2013 on my old ThinkPad that has a discrete GPU. In the second, Zen Pinball 2 (via Steam) apparently finds and runs on the HD 4000 iGPU, but is horribly laggy. (Other Steam-based titles seem to run fine.) Optimus for big, power-hungry laptops is probably a half-baked idea given how well recent major GPUs drop to idle anyway, and I will be shopping for a discrete GPU only in the future.

Hey Jarred, I'm a little confused by this quote: "One nice benefit of moving to the GTX class is that the 850M will require the use of GDDR5". It doesn't seem to be true, since geforce.com states both DDR3 and GDDR5 are possible for the 850M.

Good grief, OEMs aren't listing which 860M is in which laptop SKU, which leaves me believing that they're actually mixing Maxwell and Kepler bins. That pisses me off greatly, because one chip has a higher value than the other and they're leaving it literally up to luck. If I get a laptop this summer and it turns out to have a Kepler chip, I will return it without hesitation and try a different site to buy from.

Also, the quoted max of 2GB for the 860M is interesting, because Gigabyte's P34G v2 product page shows the 860M with 4GB GDDR5.