The future is 4K, and it's the PC, not next-gen, leading the charge


There was a time, back in the late nineties and early noughties, when console graphics were king. While the PC industry and the likes of the now-defunct Silicon Graphics poured most of their efforts into making specialist chips for specialist 3D workstations, it was Sony and Nintendo that led the charge on 3D graphics for the consumer. By the time the PlayStation 2 launched in early 2000, console graphics had far surpassed anything available on the humble PC, much to the chagrin of dedicated PC players.

Fast-forward a few years, and huge investment in R&D, tighter fabrication processes, and freedom from the strict power constraints of the console--not to mention a much-improved DirectX from Microsoft--saw the PC steadily claw back the performance crown. Even the last bastion of console overkill, the PlayStation 3 and its Cell processor, was backed up by what was essentially a standard 7000-series Nvidia graphics card, with performance to match.

Nowadays, it's tough to imagine even a company as large as Sony putting as much time and money into developing something as esoteric as the Cell processor, which is probably why both it and Microsoft have eschewed such tech in favour of off-the-shelf (or thereabouts) chips from a company that makes them for a living. That certainly has its benefits (it's simpler for developers, for one), but the new consoles aren't challenging the best the PC has to offer in the same way that the PS3 and Xbox 360 did.

Where those consoles ushered in the HD generation, such a sea change isn't in the cards this time around, at least in terms of whizzy visuals. Instead, the PS4 and Xbox One peak at a nice but hardly cutting-edge 1080p at 60fps for games. That's the standard for now, but if big tech trade shows like CES and IFA are anything to go by, it certainly won't be for long.

"…they aren't challenging the best the PC has to offer in the same way that the PS3 and Xbox 360 did."

4K, or Ultra HD as it's otherwise known, has been around for a while in cinemas and video production houses, but it's only in the last year or so that the technology has switched from completely absurd pricing to something that's a little more accessible for the average Joe. A decent 60Hz 4K TV or monitor still runs you a couple of thousand dollars, but as the technology matures over the course of the next few years, prices will fall. And if you're willing to take a punt on a cheap Chinese import, a 50-inch 4K TV can be yours for less than $1,000 right now.

So why would you want 4K? Think of it like the first time you saw the iPhone 4's retina display: the crispness of the text, the pin-sharp pictures, and the horrible realisation that from that point on, anything less would look like garbage. It's a wonderful thing to see with your own eyes. And, unlike the 3D technology heavily pushed to get us all to buy new TVs, it's unlikely to be a fad. After all, it's an easier upsell. 4K is more--more pixels, more sharpness, more definition--and it's easily demoed in stores. If there's one thing people love, it's more.

As in the transition from SD to HD, there's not a whole lot of native 4K content out there at the moment, but if you're a well-heeled PC gamer who doesn't mind a little bit of fiddling, you can get in on the action right now. Monitors like ASUS' PQ321 31-inch display are slowly hitting the market, and while it's hardly cheap at $3,499, that's far better than the $20,000 such monitors once cost.

Rendering the 8 million pixels of a 4K set in real time is a tough challenge, even for the most powerful of PCs. And it's made all the more difficult by how 4K monitors currently work. Rather than being driven as one giant display, they are actually addressed as two 1920 x 2160 tiles stitched together, so clever software from the likes of Nvidia and AMD is needed to prevent any noticeable vertical tearing or artefacts between the two halves. There were some problems in the very early days of 4K, but the latest drivers from both companies seem to have ironed out most of the issues.
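To put those numbers in context, here's a quick, purely illustrative back-of-the-envelope calculation (in Python); the figures follow from the resolutions quoted above:

```python
# Illustrative pixel math for an early tiled 4K (Ultra HD) display,
# which is driven as two 1920x2160 halves side by side.

TILE_W, TILE_H = 1920, 2160        # one half of the display
UHD_W, UHD_H = 2 * TILE_W, TILE_H  # the full 3840x2160 image
HD_W, HD_H = 1920, 1080            # plain 1080p, for comparison

uhd_pixels = UHD_W * UHD_H
tile_pixels = TILE_W * TILE_H
hd_pixels = HD_W * HD_H

print(f"4K pixels per frame: {uhd_pixels:,}")   # 8,294,400 -- the "8 million"
print(f"Pixels per tile:     {tile_pixels:,}")  # 4,147,200 each
print(f"Load relative to 1080p: {uhd_pixels // hd_pixels}x")  # 4x
```

In other words, a GPU has to shade four times as many pixels per frame as at 1080p, which goes a long way towards explaining why even high-end hardware struggles.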

Indeed, we didn't spot any of those issues as we gawked in amazement at Metro: Last Light and Battlefield 3 being run in 4K. The benefits of cramming so many pixels into a display are open to debate (and be sure to watch Reality Check for an insight into that), but when you're sitting just a few feet away like the typical PC user does, everything looks crisp and clear, and the 31-inch size of the monitor does wonders for sucking you right into the action.

"You need a very powerful PC to drive the display."

But it's worth reiterating that you need a very powerful PC to drive the display. Our test rig--despite sporting an Intel i7, Samsung Evo SSD, 16GB of RAM, and a killer graphics card in the form of Nvidia's Titan--struggled during some of the more demanding games. The opening of Crysis slowed to a crawl, forcing us to knock down from ultra to mere high settings, while busier sections of BioShock Infinite suffered from some mild chugging.

There are also some games, such as Skyrim, that don't have particularly high-resolution textures, meaning they look a little worse in 4K, thanks to the blurring effect of stretching the textures out. More games are poised to adopt higher-resolution textures, though, and it wouldn't be a surprise to see next year's PC games adopt 4K assets as standard over the 2K ones currently used. But the fact is, most games work fine at decent frame rates and look spectacular. There's not even a need to use antialiasing, thanks to those tightly packed pixels taking the place of blending colours to smooth edges. And it all works on technology, albeit high-end technology, that you can buy right now.
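The "tightly packed pixels" point can be made concrete with a rough pixel-density comparison (an illustrative Python sketch; the 24-inch 1080p monitor is an assumed typical desktop display, not one tested here):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch along the diagonal of a display."""
    return math.hypot(width_px, height_px) / diagonal_in

# 31-inch 4K monitor vs a typical 24-inch 1080p monitor
uhd = ppi(3840, 2160, 31)   # ~142 PPI
hd = ppi(1920, 1080, 24)    # ~92 PPI

print(f"31-inch 4K:    {uhd:.0f} PPI")
print(f"24-inch 1080p: {hd:.0f} PPI")
```

At roughly 142 PPI, individual pixels (and hence jagged edges) are far harder to resolve from a normal desktop viewing distance, which is why antialiasing matters so much less.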

The beauty of the PC is that today's high-end tech is tomorrow's mid-range. In as little as a year's time, playing 4K games on the PC is going to be much cheaper, and they'll perform even better too. That leaves the next-gen consoles in something of a predicament. If a $3,000 PC equipped with the most powerful GPU around can only just cope with 4K, what chance does the PS4 or Xbox One have? Deeper access to the hardware and a lighter OS get you only so far.

Raw power and eye-popping visuals certainly aren't a prerequisite for making a great game. Indeed, as the late, great godfather of Nintendo, Hiroshi Yamauchi, once said, "We cannot guarantee interesting video games through the use of better technology." But as living rooms across the land begin to move to 4K, the next-gen consoles won't be able to deliver that content. Sure, Sony has confirmed that the PS4 will support 4K films and photos, and Microsoft has said that the Xbox One will too. They may even support lighter 2D games in 4K further down the line.

But the big blockbuster AAA experiences in 4K? That will be the domain of the PC. It's a platform that's easily adapted for new technologies, and one that's only going to get more powerful as the years roll by. It's even got its sights set on the living room with the likes of Valve's Steam Machines and SteamOS. Honestly, once you've seen what games look like in 4K, those next-gen consoles are far less attractive. You're simply not going to want anything less.

You know, the Nvidia Garage demo was released with the GTX 400 series, and only in 2014 will such tech be implemented in a game, and that's Project CARS, and this guy's talking about the PC leading the future, lmao. Such a joke.

PC gaming is dead. A platform that gets no attention from developers is just useless.

I'm sorry to say it, but I'm sick and tired; the extra quality you get on PCs is not worth it anymore for me.

To get REASONABLE 4K gaming will cost AT LEAST $3,500 right now; that's 4K resolution with CONSISTENT 30+ fps gaming. It's too expensive, and it WON'T be affordable for at least another two years. The PC itself isn't even the problem; getting a single-Titan PC with a Core i7 is not too terrible in price for a gaming rig, but it's finding a decent monitor that's both 4K and fast (10ms or lower). At the moment, next-gen consoles have the window to capitalize and have a great market for gaming that is better suited to NORMAL people. I am a PC gamer, I LOVE my PC for gaming, and I have been running games at 1920x1080 at 40+ fps for YEARS, and frankly THAT is acceptable and what I expect to see on the next-gen consoles. 4K is great, but seeing it SIDE BY SIDE with 1920x1080, though it's a LOT more resolution, it's NOT worth the cost. I judge this because SD to HD was a HUGE difference and DROP DEAD noticeable... HD to UHD is NOT that fantastic, and it's certainly not worth the cost right now in my opinion.

I would be more concerned about how they can use the raw horsepower (on PC or next-gen consoles) to improve the ACTUAL RENDERING. Things like how good skin looks, how much crap you can stuff into a scene. Larger worlds, more crowded worlds, further line of sight in a manner that actually contributes to gameplay (sniping simulator? I don't know). The point is, most current-gen games are simply cleverly designed to put their best foot right in the middle of the scene and blur out the rest of the background. It's either a depth-of-field effect, or excessive shininess everywhere, or really crappy background textures. The only things that look good most of the time are character models and, in the case of first-person shooters, weapons. The rest of the effort is spent pulling off big effects that you won't be looking too closely at anyway. Few games have been good with attention to detail and filling scenes with at least decent-looking crap--like the original Crysis and the Uncharted series. But with the additional horsepower available with next-gen, it would free up a much wider range of developers to put more detail (actual, interactive detail, not just richer wallpapers) into their games. That's what would really excite me.

The MGS V playthrough was a good example of that. Rain spattering off all surfaces. Smooth, lifelike (obviously mo-capped) animations everywhere. The camera zooming from nasty scars out over a sprawling, living complex with people doing stuff even in areas you can only see out of a corner of the screen. All with nary a snag. Now I don't know what they were running it on, but that's (and hey, Ryse was looking pretty good too) the kind of detail I want to see in speedy big-budget games going forward.

Of course PC will have better graphics than consoles. You can't just rip out a graphics card or upgrade the memory in a console like you can with a PC. Consoles have to make do with the stuff they get built with from day one, and so far they've done a pretty good job (Uncharted and The Last of Us look damn good considering they're running on 2006 tech). No matter how powerful a PC is next to a console, I will always choose the console. It's not that I don't understand why PCs are more powerful; it's just that I don't care.

Also, it's somewhat absurd to think PS2 graphics were so far ahead of PC in 2000 when in 2004 PC graphics were SO MUCH AHEAD of the PS2 (Far Cry, Doom 3, Half-Life 2). There was evolution in PC graphics from 2000 to 2004, but not SO MUCH as to say the PC was FAR BEHIND in 2000 and FAR AHEAD in 2004.

Consoles had far surpassed anything available on the humble PC by the time the PS2 arrived??? I have never read so much nonsense on GameSpot before.

It's usual for consoles to launch with better graphics, just to be again surpassed by PCs 1-2 years later.

But FAR SURPASSED is too strong of a term.

Some games from PC in the year 2000

Deus Ex

American McGee's Alice

Vampire the Masquerade: Redemption

Giants Citizen Kabuto

Project IGI

Thief 2

Tomb Raider 5

I am not talking about good games, I am talking about games with good graphics for the time, that were not behind most of the games being released on consoles at the time (and as we know, even though console hardware does not improve, as years pass, developers are able to pull a little more juice from the same hardware... back in 2000, they hadn't learned yet, so most console games were NOT much ahead of the ones I cited).

I can't stand playing at 720p. Weirdly enough, on my 24" monitor and 42" TV 1366x768 is doable, but there's a noticeable difference and I prefer 1080p all over, whether 50cm from the monitor or 3m from my TV!

4k res is awesome but i don't get it, only the gamers below 18 (who mostly aren't able to afford the rig or screen) have their eye retina density high enough to comprehend it. 1600p is understandable, above that is plain stupid at most ages..

At 50" or above, sure, gimme 4K, but what's the use for paint on the walls then?

I'm super excited for the 4k gaming era. I don't think it's come quite yet -- the prices are just too high -- but in the next year or two, we'll be there, and you can bet your ass there'll be some hardware capable of running it at 60fps.

I remember when we were told that everything was going HD. People were skeptical then, just as they are now. The rest of the world had been doing it for years. It really didn't take that long for the conversion once their minds were set. It will happen. Not right away, but it will start to happen slowly, then it will build up speed. Why worry about it? It will happen when it happens. No amount of debate will change that.

I once changed the settings on my mate's 1080p TV to 720p without him knowing. I told him about a month after, when he was pointing at it and saying "I can't believe you don't see how much better this is than 720p". When he gets a 4K TV, I plan to change it to 1080p and never tell him...

So I did some checking, and the PS2 can do the following resolutions: 480i, 480p, 576p, 720p, and 1080i. Only a handful of games ever released for the PS2 do anything above 480i or 480p. There were still a lot of games on the PC at that time (the PS2 was released in 2000) that were 640x480 (better than 480i and the same as 480p), but many of the games would go up to 1024x768. RollerCoaster Tycoon, for example, supported both 640x480 and 1024x768, and it was released in 1999. That's a 60% higher vertical resolution than 480i and 480p and is still better than 720p.
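The raw numbers in that comparison are easy to check; a quick illustrative sketch in Python (note the percentage depends on which resolution you use as the baseline):

```python
# Vertical resolution comparison from the comment above.
ps2_lines = 480   # 480i / 480p, the common case on PS2
pc_lines = 768    # 1024x768, e.g. RollerCoaster Tycoon (1999)
hd_lines = 720    # 720p, for reference

increase = (pc_lines - ps2_lines) / ps2_lines * 100
print(f"1024x768 has {increase:.0f}% more vertical lines than 480p")
print(f"768 lines is {'higher' if pc_lines > hd_lines else 'lower'} than 720p")
```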

I've been playing PC games for 25 years, and consoles have NEVER had better graphics than PCs. I'm very curious about where this information came from. Old CRT TVs (which were what we had back in the PS1 days) didn't have the resolution that PC monitors did, so artifacting was an issue, and by the time the PS2 was released, Nvidia and AMD were slugging it out for home graphics dominance and we were getting new graphics card releases from both companies a couple of times a year.

:-) Life marches on! Sure 4K is pricy now and early adopters will pay the price... but at some point 4K will be the norm and then it'll be 8K (and everyone will shout and scream we don't need it)... then 16k etc.

This is the inevitable march of technology--it's a good thing, really. I remember people saying we didn't need any more power (or resolution) when I had an Amiga 1200 with 8 meg of RAM (that was a lot)--both statements totally laughable now.

I think what really gets people's backs up is when they don't get a good long use out of their investment. If I spend a wedge of cash on a brand new TV and games system, I don't really want another next-generation standard to come along for a while ;-)

PC owners, however, are kind of used to it. Sure, my monster rig is perfectly capable of playing most things maxed out at 2560x1600, but I'm used to swapping out graphics cards, motherboards, and processors--it's a way of life and part of the fun. I think console owners "may" start to see a shorter cycle than the usual 7-year one, especially as the parts used in them are now more 'generic' than they used to be.

Either that or there will start to be a bigger disparity between PC games and consoles...

Bottom line is you need to have cash if you want to be at top or an early adopter and as always - it sucks to have shit forced upon you, especially if you can't afford it!

But then again, 4K makes sense in PC gaming when you are sitting close enough to see the pixels and the jagged edges of a 1080p monitor. I don't think it makes much sense if you're sitting on your sofa and playing in your living room on a 37- or 42-inch set. Since most consoles are going to be played in that scenario, it is pointless to put in better but expensive hardware for a feature that people won't really notice or benefit from... like an octa-core or x64 processor in a smartphone =). But with that said, I don't think this generation will last as long as the outgoing one, since the technology will just be a little too outdated in 3 years.

This guy's right... I see the error of my ways. I mean, why spend $400 on a new system when I can go out and buy a beefy desktop for $3000 and a $4000 monitor... I've been seriously cheating myself... Especially since I can buy a laptop *thinking of the portability aspect here* that runs current gen on max settings for like $1500... *sure, building my own is cheaper, but I'm just looking at it from his point of view*

My friend finished The Witcher 2 on forced 800x600; to him it's the best game he ever played. For true immersion, graphics mean fuck all, really. So you need a beefy rig to power the display, but what's going to power the games? You think you know the answer, but it's not that simple. The PC is the most powerful, but most games don't use that power--well, they do, but only for graphics, and you would pay a few grand just for that.

4K isn't just about video games. It's about a lot of entertainment. As 4K becomes cheaper, more people will use it. There are already 4K cameras for $3K. However, don't count on it being around for long. It's an interim format on the way to 8K, which will be the next big thing. Still, there is a big push for 4K right now in many areas of media, and it could very well sweep the market. The only thing lacking right now is cohesion between displays and methods of getting 4K. Theaters, atm, aren't really equipped with 4K projection, so any movie shot in 4K won't really matter because it won't be shown in 4K. Video games might have a better chance, as they could actually be displayed in 4K, given that you have a 4K TV or monitor. But for now, there's really no telling whether or not it will catch on.

I'd rather have a nice OLED TV to play my consoles on than 4K resolution on my PC for gaming.

Yeah, high res is nice and all, but nice graphics don't make a bad game good, or a good game better.

4K won't be standard for a very long time, and when it becomes mainstream it won't be because of PC gaming. It will be because of home consoles making it mainstream, because consoles are what the industry follows the most. If it isn't home consoles, it will be the movie industry, but even that isn't a given, considering the movie industry backed 3D and it never took off. So when 4K gaming becomes widespread, it will be when there are new consoles that support it.

I really don't get the obsession with resolution when there are so many more types of eye candy. Try this: play something like the Witcher 2 on PC with the highest graphical settings and resolution set to 1280x720 (no AA). Then play it again with the graphics at Medium and res bumped up to 1920x1080. In my mind there is no contest--lower res and higher quality looks better every time. Maybe this is just me.

@uncagedpaul_86 which is a mistake on their part. Sony, MS, Nvidia/Intel, and AMD could make so much off of the consolite kids begging for upgrade parts every holiday. Maybe they've got some plan to offer versions with better GPUs and more memory as the years go by, just to keep bankrolling everyone stuck on consoles.

I know plenty of people who've bought at least 3 of each Xbox and PS from this gen just because they had a new design or some game art painted on them, or for the bigger storage capacities. Don't worry, there are millions being pumped into their advertising and profit margin research just to keep banking as the years go by. I imagine, just as the PC community gets a new GPU released every month or so that has slightly better cooling or is 50MHz faster than the last, the consoles will be pulling the same.

@rogerpenna 1 or 2 years later? In this day and age, it seems more like a week later when it comes to PC. PCs are already far more powerful than either of the two consoles, and they haven't even been released yet.

"4k res is awesome but i don't get it, only the gamers below 18 (who
mostly aren't able to afford the rig or screen) have their eye retina
density high enough to comprehend it. 1600p is understandable, above
that is plain stupid at most ages.."

I've been to trade shows overseas, and I've seen both 4K and 8K TVs/monitors of various sizes, ranging from 28" to 100+", within close enough proximity to compare. And the difference is mind-blowing; I'm 30 years old and have 20/20 vision.

@Slagar 3D is the present. Waiting on my NVIDIA 3D Vision 2 kit to start using my new monitor now. Shit's still too expensive though, $350 for 27" 144Hz, but once you get that upgrade itch it never goes away. Maybe next will be 4K 3D for me, in like 10 years and if I ever get a promotion.

@RPG_Fan_I_Am $4000? In order for me to spend $4000 on a PC I would have to add so much trivial shit. Your $1500 laptop will not run Metro: Last Light at maxed settings with full SSAA and all those goodies; also, just get a decent-sized screen to game on.

For $1200 you can easily build a PC that will run everything. Afterwards, you just sell your GPU every 2 years, take the money, add $300 extra out of your pocket, RESEARCH the best graphics card in your budget frame, and bam.

For $150 a year you have a gaming rig that can constantly run every game maxed.
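That rolling-upgrade math can be sketched out explicitly (all figures are the commenter's own, not verified prices):

```python
# Rolling-upgrade cost model from the comment above: build once,
# then sell the old GPU every two years and top up with fresh cash.
initial_build = 1200      # up-front PC cost (comment's figure)
top_up = 300              # extra cash added per GPU swap
swap_interval_years = 2

yearly_upgrade_cost = top_up / swap_interval_years
print(f"Ongoing upgrade cost: ${yearly_upgrade_cost:.0f} per year")  # $150
```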

@RPG_Fan_I_Am You don't need to spend that range of money to top the PS4 and Xbox One; that's to support 4K full blast, and there's no way either console could do that, so your option of spending $400 vs $3-4000 isn't a strong one. With that said, if you're a smart shopper and have patience, you can build a rig like I did last year for $748 instead of $1,300 with savings and rebates, and it still smokes the next-gen systems. Consoles can never stack up against the PC, because the PC will always, always have the flexibility to surpass them each and every time.

@WolfgarTheQuiet Well, first off, The Witcher series in general has the capacity to look amazing if run with good hardware, and the better a game looks, the more immersion potential it has, at least from a graphical standpoint. Secondly, you could certainly run The Witcher 2 at 1080p easily enough with some generic, mainstream(ish) ~$125 GPU, depending on the other graphical settings you are running. There's nothing wrong with less graphical intensity (why do you think handhelds still do so well?), but it's always nicer to have the eye candy if possible.

@WolfgarTheQuiet You are wrong, though. With all the highest settings and resolutions, the immersion is more intense. Not saying that they aren't still great games; it's just like the difference between watching a movie in an XL theater with super surround sound or watching one on your old mono glass-screen television. There is a huge difference.

@WolfgarTheQuiet Yeah. I love to play older games, from the PS1 era and earlier. Sure, they look like shit, but they are still great games. I play most of my PC games on a mid-range laptop. I can't play in full HD with everything maxed, and I don't care. The games work fine at least, and that's all I care about.

@nomadic_topgun I never understand the point of posts like that. If Ferraris were cheaper, a whole lot more people would be driving them as well, but there's no real point in any of us posting on the Ferrari forums about how, if/when the prices drop, our interest will increase in their product.

@lordgodalming I play more older games as time goes by. I haven't upgraded my rig since 2006; that's when I dedicated myself to casual console gaming. Give me story, like The Witcher games, any time, even if I don't play them at max. My friend finished The Witcher 2 on forced 800x600; to him it's the best game he ever played.

@headbanger1186 @RPG_Fan_I_Am Most consolites don't seem to understand this, babbling on about $5,000 systems that us "hermits" build. I have just over $800 in my current build and haven't found a game yet I can't run at its highest settings at 1080p, usually between 40-60fps. And they never seem to figure in all the other uses we get out of these systems: burning movies, music, creating artwork, producing text documents, studying; the list is infinite. Even those things that a console may be able to do are very limited, and still the others won't be implemented in this "next" gen that is supposedly so great.