Hard Choices Update: Mobile Gaming On The Cheap


We’ve dabbled with mobile gaming machines in Hard Choices passim. Gaming lappies are great, but they’re also punitively pricey. What if laptops with tolerable gaming chops were on the verge of an epic price drop? That might just be the case courtesy of Intel’s upcoming Haswell processors and a funky little software layer from a little known third party. For clarity, the context here is relatively low-end gaming portables, not full-on desktop replacement sorts. But if you’re strapped for cash or just looking for something casual for away days, read on.

This story starts at the tech jamboree that is Intel Developer Forum in San Francisco back in September. It was the first I’d missed in the better part of a decade I’ve been a hardware hack. But even from my dislocated vantage point in Blighty it was obvious that IDF 2012 was a dud. Intel doesn’t do exciting CPUs any longer. It’s struggling to get its chips into phones, which is its number one priority right now. Not a lot going on.

But Intel did dish some deets on the upcoming Haswell architecture, which is basically its next CPU design for boring old PCs. The chips are due out early next year and likely to be known as the Intel Core i-something 4000 series. The CPU side of Haswell looks super boring. No more than four cores, a few tweaks to unlock a little extra IPC, maybe 10 to 15 per cent more performance. Yadda yadda.

Now, as we all know, Intel’s mainstream CPUs include an integrated graphics processor on-die. Haswell will be no different. Intel hasn’t completely spilled the beans on Haswell’s updated 3D engine. But it’s divulged enough dirt to provide some insight.

Combine that with the work a certain boutique graphics outfit known as Lucid Logix has been doing to improve frame rates on low end GPUs and you have the prospect of properly playable integrated graphics.

And integrated graphics are cheap. In fact, they’re more or less free. So let’s look at the details, starting with the hardware part of the equation. Haswell carries over pretty much the same graphics execution units as the current Ivy Bridge gen of Core i-something 3000 chips. That includes the Intel Core i5-3570K, which is the RPS gaming chip of choice on the CPU side.

So it’s mostly clocks and unit counts that will separate Haswell from earlier Intel graphics. For the record, there will be three hardware options: GT1, GT2 and GT3. They’re thought to offer 10, 20 and 40 graphics execution units respectively. For context, the fastest current Intel Core processor has 16 graphics units. Sorry, I know this stuff is a bit dull, but it’s worth understanding.

Intel’s claiming Haswell graphics will be twice as fast as Ivy Bridge, so the assumption here is that the 40-unit version will be slightly downclocked in the quest for better power and thermal management. Whatever, it’s a big step up in terms of hardware.
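As a back-of-envelope sanity check of that "twice as fast" claim, raw shader throughput scales roughly with execution unit count times clock speed. Here's a quick sketch of the arithmetic; note the clock figures are my own illustrative assumptions, not anything Intel has confirmed:

```python
# Rough, illustrative model: GPU throughput ~ execution_units * clock.
# The Haswell GT3 clock below is an assumed figure chosen to show how
# a downclock could still satisfy the "2x Ivy Bridge" claim.

ivy_bridge_hd4000 = {"eus": 16, "clock_mhz": 1150}  # fastest current part
haswell_gt3 = {"eus": 40, "clock_mhz": 920}         # hypothetical downclock

def relative_throughput(a, b):
    """Ratio of a's raw shader throughput to b's."""
    return (a["eus"] * a["clock_mhz"]) / (b["eus"] * b["clock_mhz"])

speedup = relative_throughput(haswell_gt3, ivy_bridge_hd4000)
print(f"GT3 vs HD 4000: {speedup:.2f}x")
# With these assumed numbers, 40 EUs can run a full 20 per cent slower
# than HD 4000 and still deliver exactly double the throughput.
```

In other words, the 2.5x jump in unit count leaves plenty of headroom for the downclocking the power budget will demand.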

The other part of the package is Lucid’s new Dynamix software. Lucid is the graphics upstart that has enabled, among other things, Intel to fix its broken integrated graphics so that you can use the QuickSync hardware video encode engine with a proper discrete graphics card installed. Lucid’s party trick is basically getting GPUs to run in parallel, even when they’re made by different companies.

Its latest ruse involves a software layer that sits in front of a PC game and has a sniff of everything being asked of and sent to the graphics card. Every frame in the graphics pipeline is analysed and optimised for performance.
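Conceptually, that kind of shim works like a proxy wrapped around the graphics API: it sees every draw call before the driver does and can substitute cheaper work. A toy Python sketch of the idea follows; the class names and cost model are invented for illustration, and the real product hooks Direct3D/OpenGL at a much lower level:

```python
# Toy illustration of an API-interception layer: a proxy sits between
# the game and the "GPU", inspects each draw call, and swaps expensive
# effects for cheaper approximations. All names and costs are invented.

class GPU:
    def draw(self, call):
        return call["cost"]  # pretend the cost is time spent rendering

class OptimisingShim:
    """Intercepts draw calls and substitutes cheaper versions."""
    def __init__(self, gpu):
        self.gpu = gpu

    def draw(self, call):
        if call["cost"] > 5:  # an "expensive" effect, e.g. soft shadows
            call = {**call,
                    "cost": call["cost"] // 2,          # render it cheaper
                    "quality": call.get("quality", 100) - 3}  # small IQ hit
        return self.gpu.draw(call)

frame = [{"name": "shadows", "cost": 10}, {"name": "ui", "cost": 1}]
raw = sum(GPU().draw(dict(c)) for c in frame)
shimmed = sum(OptimisingShim(GPU()).draw(dict(c)) for c in frame)
print(raw, shimmed)  # 11 6: big effects halved, cheap ones untouched
```

The key design point is that the game never knows the shim is there; it just issues its normal draw calls and the layer quietly rewrites the expensive ones.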

Not exactly the obvious choice for a gaming portable

And yes, this does mean reducing image quality. According to Lucid we’re talking a few percentage points of image quality in return for – wait for it – a doubling of performance. Lucid was showing the technology off to IDF attendees and according to a chum of mine, it really works.

Whether you’ll be happy to stomach – or even notice – the drop off in quality is, of course, the key question. I haven’t seen it in action, so can’t comment. But if it is acceptable, then you have the prospect of integrated Intel graphics four times faster than today’s. And I would guesstimate that will put you in playable territory for a majority of games.

The implication is that all manner of laptop form factors will suddenly become viable for pukka PC gaming, including Ultrabooks and maybe even tablet convertibles, though we may have to wait one more processor generation for the latter to come.

There’s one obvious snag, of course. Intel’s record for producing good, reliable graphics drivers with broad game compatibility? It doesn’t have one. Things have undoubtedly been getting better, and this is one area where the carry-over execution unit architecture will help. It’s not a new design needing all-new drivers. So, you never know.

45 Comments

You may laugh, but personally I quite like being able to relax on the sofa in the living room playing BF3 as well as I do on my desktop (albeit with marginally lower settings; given that the screen is 10″ smaller at the same resolution, it actually looks better).

Then be able to hop on the train and mess around on Just Cause 2 for the duration of a journey, lovely thank you.

1920×1080 on a 26″ monitor was fine until I saw the same resolution on a 17″ Laptop :P

Besides, if I got a monitor that supported a higher resolution I’d need a graphics upgrade to something that could drive it well, so it’d cost a few hundred pounds for a new higher-res monitor and a new card. Why spend all that when I’ve just got a laptop that can play things perfectly well and, thanks to the higher pixel density, still looks nice and shiny?

Just as a side note, until I actually got a decent spec laptop, I was firmly in the ‘Desktop or nothing’ camp; I couldn’t see why someone would want something less powerful for more money.

But after a week of using a decent laptop, I started to wonder how I managed being cooped up at a desktop all the time. Now I can actually game and not have to go hide in a nerdcave to do it!

Also, since it’s on your lap, much closer than a screen at a desk, you don’t really need anything more than 17″, so unless you’re really fussy about keyboards all you need is an external mouse (or gamepad) and you’re good to go.

IMO, a laptop can never beat a desktop, due to a desktop sporting a full size keyboard, big monitor and mouse by default. On top of that, the same specs for desktop vs laptop will cost a lot more for the laptop.

That said, when I go to visit buddies for some gaming, it’s nice when my laptop doesn’t struggle too much, despite it being 12 months old and only costing me £600 at the time. It even runs ArmA 2/Day-Z smoothly at low settings!

It’s also especially handy now that a lot of my gamer buddies live in London and I’m working in Munich. Makes the commute for our (1-2 times) yearly LANs much simpler. :)

My desktop is a beast, but my laptop does a good job if I take my mouse with me. I’d never use it as a full-time replacement though. Anyone who does is a lot different to me I guess, or has different needs (see above). I can’t imagine spending a bunch more cash for the same specs (and a smaller screen) unless my job meant I was travelling all the time.

There are lots of laptops with full-blown keyboards, numpad and all. Laptops are very often built with gaming or video editing in mind. Of course, those tend to be less comfortable when actually placed on top of your lap. But I use a 19″ Dell with a full keyboard, and luckily I do own tables. And now I can move between them.

To me, the problem with gaming laptops isn’t that desktop gaming is better (though it obviously is, even if squeezing more pixels onto a smaller screen increases the cuteness factor); it’s that a laptop capable of serious gaming will be an inferior mobile device. It will be larger, heavier, hotter, louder, less convenient and less reliable than the device I would likely carry around if I didn’t care about gaming. (I believe this is true even if you don’t care about cost: if a manufacturer can make a gaming device of a given size, weight, power consumption, and noise, they could make a smaller, lighter, more efficient, quieter one if they ignored gaming requirements.) That means I’ll either be stuck lugging around that device, I’ll end up having to manage even more devices (a gaming beast for longer trips, an iPad, Nexus or Surface for walking around with), or I’ll simply leave the device behind most of the time.

Given all of the other possible entertainment that other devices are capable of (reading, writing, video, older games, phone/tablet games), it strikes me as a bizarre trade off. And I think this will only become more true in the future if gaming moves in the direction of more diverse hardware (motion controls? VR headsets?)

Exactly. I have a really nice desktop that I hardly ever get to use because I have a 14 month old daughter at home I babysit during the week (I work only Fri-Sat). Most of my “free” time is the last half of my shifts at work, and if not for a decent laptop I wouldn’t get to play games at all.

It’s a ways off, but I seem to remember Intel saying that 2014’s Broadwell will be coming with a complete redesign of the graphics architecture? As impressive as Intel’s gains on the GPU front have been the past few years, I’m looking forward to seeing what they do next.

I’m running an Ivy Bridge laptop with a GTX 680M and I’ve had zero problems gaming on my laptop. I think the big problem is expecting to run CoD17 on a $500 notebook at 1080p with “Ultra” graphics settings.

It’s all about expectations, but gaming on an appropriate notebook is WELL beyond “competent” and firmly in “good” territory.

“Lucid was showing the technology off to IDF attendees and according to a chum of mine, it really works.”

That would be a first, wouldn’t it? Their Virtu technology, which they promise adds 60% to your 3DMark scores and FPS in games, is a scam, literally! And yet, somehow they were able to convince ALL major motherboard makers to support it. I have no idea how they did it; either it was through blackmail, or Asus et al. are really dumb and didn’t bother to check their claims.

To conclude – everything Lucid says is pretty much bullshit. Except for the QuickSync part, but you can just switch the monitor cables, and why would you bother with their solution to a non-existent problem?

Yeah, this is the sort of thing that sounds amazing on paper but will probably end up feeling half-baked in practice. Aside from image quality degradation, will there be any additional latency or miscellaneous rendering bugs? Those are the sorts of details that could keep this from taking off.

A scam doesn’t mean “half-baked”, it means “not baked”.
This is what they do: they interleave normally rendered frames with pure black frames “rendered” in record time, so naturally your FPS increases, and why wouldn’t it!
More about their method here : link to behardware.com
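The arithmetic of that alleged trick is easy to demonstrate: interleave near-free frames with real ones and the frame counter roughly doubles even though no extra real content is rendered. The figures below are illustrative, not measurements:

```python
# Illustrative model of FPS inflation via interleaved "null" frames.
# Each real frame takes 20ms; a black/duplicate frame takes ~0ms.

real_frame_ms = 20.0
null_frame_ms = 0.1

def fps(frame_times_ms):
    """Frames per second given a list of per-frame render times in ms."""
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

honest = [real_frame_ms] * 50                 # 50 real frames
padded = [real_frame_ms, null_frame_ms] * 50  # a null frame after each real one

print(f"honest: {fps(honest):.1f} fps")  # 50.0 fps
print(f"padded: {fps(padded):.1f} fps")  # ~99.5 fps, same real content
```

The counter doubles while the amount of actual rendered imagery stays exactly the same, which is why benchmark numbers alone can't distinguish this from a genuine speedup.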

Also, judging by the article’s description of software that runs alongside the game analysing and optimising every frame, surely the overhead of that extra software would eat into the gains. Simply turning down the graphics settings would be a much easier and more direct solution: if you’re going to end up with reduced visual quality either way, why give yourself the unnecessary overhead of extra software?
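The commenter's overhead worry can be put in numbers: any per-frame analysis cost eats into the optimisation gain, and the shim only pays off while the time saved exceeds the time spent analysing. A hedged sketch of that trade-off, with all figures invented:

```python
# Net frame time with an optimising shim:
#   new_time = original_time / speedup + per_frame_overhead
# The shim only wins while the overhead is smaller than the time saved.

def net_frame_ms(base_ms, speedup, overhead_ms):
    """Frame time after optimisation plus the shim's own analysis cost."""
    return base_ms / speedup + overhead_ms

base = 33.3  # ~30 fps before optimisation
for overhead in (0.5, 5.0, 17.0):
    after = net_frame_ms(base, 2.0, overhead)
    print(f"overhead {overhead:4.1f}ms -> {after:5.2f}ms/frame")
# With 0.5ms of overhead the 2x claim mostly survives; at 17ms the
# "optimised" path is actually slower than just rendering normally.
```

So whether the approach is viable hinges entirely on the analysis being cheap relative to a frame, which is presumably why it targets the slowest, most expensive frames.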

My Asus netbook does a pretty good job of running Torchlight II and anything less demanding than that. Only cost $350. Heck, the Nexus 7 has a quad-core that doubles as a GPU at what? 1.2GHz? I suppose it really depends on what you consider “modern games,” but we’re pretty much there already when our tablets can emulate consoles from just one generation back.

I was really surprised not to see any mention of the Nexus 7 in this article. For $200 (or the equivalent in € or £) you get a very nice tablet with a design adapted to gaming: the 7″ form factor is pretty close to handheld consoles and the bezel is used to make the tablet ergonomic while playing.
There are some nice games on Android. Nothing comparable to Dishonored or XCOM, but Osmos or Waking Mars are perfect to chill out with for 20 minutes.

I own both an Android phone and a DSi, and I’ve played games on both. I like the Android for almost everything, but for gaming the DSi feels much better in the hands. The Pandora is also a completely open system, whereas with any Android phone you have to root it. Another problem for me is that when I try to play my PS1 games on an Android phone it stutters and just isn’t good. It’s easier just to burn the game and use my modded PS2.

Mine is a Lenovo thing that was only a couple of hundred quid (as I recall). I use it most days on my commute either for hobby coding or for games and have played loads of great stuff on it. In particular, anything remotely old (check out GOG if you haven’t) often plays fine as well as more recent games that are not too graphics dependent.

Examples of games I have played mainly or exclusively on netbook rather than my main PC this year:

Space Chem
Fate of the World
X-Com (the original one)
FTL
To the Moon
Dwarf Fortress
Planescape: Torment
The Longest Journey (just finished it last night — brilliant!)

All were perfectly fine on netbook (subject to getting used to using trackpad rather than mouse) and the biggest issue is usually the morning sun through the windows of the train (which I understand to be hardware independent). As far as Steam is concerned, that also works fine provided you remember to switch to offline mode before going out on the move.

A £10 Logitech wireless mouse (with discrete micro dongle) makes for a great netbook gaming “upgrade”. I found a lot of games that would run on my netbook were fairly hampered by the lack of a mouse. It’s worth getting one for sure if you do any kind of mobile gaming.

I may soon upgrade to one of those swanky Logitech G700 models, as that’s a lot more familiar feeling, since i use a G500 on my desktop.

The best way to go wireless mouse on modern laptops is Bluetooth. Nearly all laptops have WiFi combined with Bluetooth on the same internal card. Don’t need no stinkin’ dongle. I have two from Amazon:
Connectland CL-MOU23014 Bluetooth Wireless Optical Mouse

I’m thinking of buying a Lenovo Y580 as a laptop/gaming device which I intend on using for the next couple of years. Main thing is that I’m studying computer science and I like to do my coding in odd places, like the toilet (seat), which is downright impossible with a desktop. Plus it’s cheap and it supposedly plays BF3 on high settings :)

The 3 most important parts in a gaming laptop are the CPU, GPU and screen. The $999 model looks pretty good on all those fronts :D

I’ve been laptop gaming for a few years now, my current rig is an i7 740qm, and 1gb gpu (Amd 5650). It’s only this year that I’ve had to lower from ultra and high to the midrange settings for more than the occasional game.

Check the GPU here (link to notebookcheck.net) and as long as it’s in the top set (74 at the moment) you should be good for at least 3 years, provided you don’t mind lowering the graphics settings eventually.

I’ve been light gaming on my Asus 1201N Netbook++ and it has worked reasonably well for older games. (Civ IV, Bloodbowl, etc.)

I find that it is fun to tinker with on the couch next to the missus while she watches TV or reads her Kindle in bed.

Since it is dying I’ve just ordered a Samsung Series 5 with a NVidia 620GT, Intel i3, 6gig of RAM etc. At 14″ it is a little large but it was a really cheap ultrabook with enough grunt to play Skyrim at medium details and also allow me to do some software development.

As for this magic Dynamix software – it is pure snake-oil. Development studios spend thousands of man-hours optimising graphics code down to the driver level in conjunction with the chip manufacturers. If it were so easy to double performance by optimising the graphics code, all games would run at twice the frame rate right now.

It’s about time that AMD gets back in the business. The 3570K is 10-15% better than the 2500K and now another 10-15% improvement. They’re not even adding more cores. Bah, as long as there’s no competition Intel can just set the pace and do evil stuff like locking their CPUs.

On the plus side, it would be nice to see some use for the graphics unit in my 3570K. I never got the point of it; everyone who buys one of those CPUs will have a dedicated GPU. It would be nice to see it actually help my Nvidia graphics card, even if it’s just adding a few fps.

My first gaming laptop was a 2004 15 inch Compaq with an ATI Mobility Radeon 9200 and 32M of dedicated graphics RAM. I used it on trips and in my living room. I did a lot of gaming and it served me well. I replaced it with another laptop, 15 inch MSI GX630, AMD 2-core Turion, Geforce 9600M GT that I still use at home and on the road. It does a great job gaming on Steam. Through all this and before I have built numerous desktop PCs for gaming housed in my home office/lab. The final one was a noisy space heater powerhouse. Last July I replaced it with an HP 17 inch laptop – Intel quad-core CPU and GT650M 2GB graphics. It runs my Steam games just as nicely as the old space heater. No more desktops for me. I am looking to replace the MSI, possibly with a Surface Pro or Razer Edge. BTW, my power bill has gone down a bit since I got rid of the two desktops I used to run.