4K Monitors: Everything You Need to Know

Ultra HD (UHD) is the next-gen PC resolution—here’s why you have to have it

Dream Machine 2013 had some bitchin' hardware, but most of it was available at retail for any well-heeled hardware hound. One part, though, was not available to the unwashed masses: its glorious 4K monitor. You see, 4K was an otherworldly resolution back in mid-2013, offering four times the pixels of 1080p—at a wallet-busting $3,500.

Now, though, 4K is available and relatively affordable, and all modern games support it, making it one hell of an upgrade. Over the next few pages, we'll tell you all about 4K, show you what you need to run it at its maximum output, and explore 4K gaming benchmarks, too. But as sweet as it is, it's not for everyone, so read this guide before making the move.

What is 4K?

A slight misnomer, but catchier than Ultra HD

To put it simply, 4K doubles the standard 1920x1080 resolution both vertically and horizontally to 3840x2160, quadrupling the pixel count. We can already see you folding your arms and scanning the page for a downvote button, saying, “That’s obviously not true 4K. It only adds up to 3,840 horizontal pixels.” Fair enough: “true” 4K is a movie-industry term. When you hear about films being shot in 4K, they’re typically captured at 4096x2160, with a roughly 17:9 aspect ratio. On the PC side, we generally run with the television makers, who have mostly settled on 3840x2160, which keeps the same 16:9 aspect ratio as 1080p. Despite falling short of 4,000 horizontal pixels, television and monitor makers have all settled on 4K as the term to push, rather than Ultra HD. In other words, we don’t make up the buzzwords, so hate the game, not the player.
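If you want to sanity-check that pixel math, it only takes a couple of lines. Here's a quick sketch in plain Python; the only inputs are the resolutions named above:

```python
# Pixel counts for the resolutions discussed above.
full_hd   = 1920 * 1080   # 2,073,600 pixels (1080p)
uhd       = 3840 * 2160   # 8,294,400 pixels ("4K" on the PC/TV side)
cinema_4k = 4096 * 2160   # 8,847,360 pixels ("true" 4K in the movie world)

print(uhd // full_hd)         # 4 -- doubling both axes quadruples the pixels
print(round(4096 / 2160, 2))  # 1.9 -- a bit wider than 16:9's ~1.78
```

Note that the 4096-wide cinema frame only adds about 7 percent more pixels than UHD, so the gap between the two "4Ks" is much smaller than the naming fuss suggests.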

In a historical context, 4K is simply the next rest stop along the path of technological progress. Way back in the day, we ran our displays at 640x480, then 800x600, then 1024x768, and then 1280x1024, and so on. As graphics cards became more powerful, we were slowly able to bump up our display resolutions to where they are now, which for a large majority of gamers is 1920x1080, or 1080p. Society has settled on 1080p as the go-to resolution right now, simply because those monitors (and TVs) are very affordable, and you don’t need a $500 graphics card to run today’s games at that resolution.

Not everyone is average, though. Many enthusiasts today run 27-inch or 30-inch monitors at much higher resolutions of 2560x1440 or 2560x1600, respectively. That may seem like a step up from 1920x1080, but a GeForce GTX 780 Ti or Radeon R9 290X isn’t even stressed by 2560x1440 gaming. Factor in PCs with multiple GPUs, and you start to wonder why we’ve been stuck with 2560x1600 for more than seven years as everything else has leapt forward. We won’t question that wisdom, but we do know that it’s time to move forward, and 4K is that next step. Looking ahead, the industry will eventually move to 8K, which quadruples the pixels again, and beyond. In fact, some vendors already demonstrated resolutions way beyond 4K at CES 2014, including Nvidia, which ran three 4K panels at a combined 12K using four GTX Titans in SLI. For 2014 and beyond, though, 4K is the new aspirational resolution for every hardcore PC gamer.

It’s All About the Pixels Per Inch

You know how every time we pass around the sun once more and it’s a new year, people joke about their “New Year’s Resolution” being some sort of super-high monitor resolution? Well, we do it, too, because as hardware aficionados there’s always room to grow and new boundaries to push. We want our hard drives to get bigger right alongside our displays, so the move to 4K is something we have been looking forward to for more than a year; as resolution scales up, so does the level of detail rendered on the screen. The official term for this spec is pixels per inch, or PPI, and it’s a good yardstick for how much detail you can see on your display.

To see how PPI varies according to screen size, let’s look at a few examples. First, a 24-inch 1920x1080 display sports roughly 91 pixels per inch. If you bump up the vertical resolution to 1,200 pixels (typical of some 16:10 IPS panels), you get a PPI of 94. Crank things up a notch to 2560x1440 at 27 inches, and the PPI goes to 108, a bump of just under 20 percent, and probably not very noticeable. Moving to 2560x1600 on a 30-inch panel, you actually get lower density, arriving at a PPI of 100. To put this in the context of a mobile device, the Google Nexus 5 smartphone has a 4.95-inch display running 1920x1080, giving it a crazy-high PPI of 445. The iPad’s 9.7-inch screen delivers a PPI of 264, and the iPhone 5’s is 326. Clearly, there’s a lot of room for improvement on the PC, and 4K is a step in the right direction.
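All of those figures fall out of one formula: the number of pixels along the diagonal divided by the diagonal size in inches. Here's a minimal Python sketch using the screen sizes cited above (we round to the nearest pixel, so a couple of results land one pixel above the truncated figures quoted in the text):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: pixel count along the diagonal / diagonal in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 24)))    # 92  -- quoted above as "roughly 91"
print(round(ppi(1920, 1200, 24)))    # 94  -- the 16:10 IPS panel
print(round(ppi(2560, 1440, 27)))    # 109
print(round(ppi(2560, 1600, 30)))    # 101 -- lower density on the bigger panel
print(round(ppi(1920, 1080, 4.95)))  # 445 -- the Nexus 5
```

The same function covers the 4K panels in the next section: plug in 3840x2160 at 31.5 inches and you get the Asus PQ321Q's 140 PPI.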

Pixel Peeping

Now, let’s look at the PPI for a few of these fancy new 4K monitors coming down the pike. We’ll start with the model we used for Dream Machine 2013, the Asus PQ321Q. With its 3840x2160 resolution spread across 31.5 luscious inches, its PPI is a decent 140, a noticeable increase over a 2560x1600 display. Not enough for you? Dell has a new 24-inch 4K monitor dubbed the UP2414Q that shrinks your icons while retaining their sharpness, and it has the highest PPI yet for a desktop panel at a skyscraping 185. Slightly below the Dell is a new Asus LCD named the PB287Q, which at 28 inches has a more modest PPI of 157. Keep in mind that in some cases this is a 50 percent increase in pixels per inch, so when you tally that up across the entirety of the display, it adds up to quite a few more pixels, and a lot more detail that is visible even to the naked and semi-clothed eye.

Comments

I'm looking forward to having a text display that truly is indistinguishable from high-quality print. (Retina isn't quite there yet, but the PPI is getting close.) But for gaming?

Games already have more detail than anyone really cares about. The best games these days are not the polygon-overload triple-A titles - they're indie games that barely use the screen technology we've already got. Even with the triple-A texturefests, you can't really see the pixels on your current HD display. You've got to go to a stupid-big screen, or press your nose up against the panel.

I agree with some of the other comments... more monitors, or wider monitors. Or how about better image quality? LCD technology still isn't delivering true blacks or a particularly good range of dark shades. And it's not totally blur-free (though I admit, I really don't notice the blur some people seem fixated on). 4K won't shift any of these limitations. We're getting 4K just because it's easier for the manufacturers to shrink the pixels than to tackle the subtler, but more meaningful, challenges.

More pixels are always better, but the one thing this article proves pretty conclusively is that 4K is going to be a very bad investment for quite a while - say 2 or 3 years. At that point we'll suddenly find that all monitors, right down into the $200 range, are 4K, and that even $100 graphics cards support them pretty well, without liquid-helium cooling. And we'll realize that - like 3D - 4K just wasn't that big a deal.

I've hit 4K while running PSP emulators on my new ASUS ROG G750JM-DS71 laptop, using only a single 2GB GTX 860M. It's still playable, but the average fps dropped to 30, and that's with 16x TXAA enabled. I also got bad audio lag. The upside of 4K emulation: the game I played (Ghost of Sparta) looks ridiculously better than before.

I understand that 4K emulator resolution is not real 4K, but still: I'd stay away from 4K gaming in general for now, until the technology improves over the next 5-6 years.

Everything I've been reading tells me that ultra-wide monitors are the next big thing. They're easier to drive with graphics cards, and one ultra-wide replaces a dual-monitor setup, minus the center bezel. Gamer budgets are going to follow immersion.

Agreed. I've been using a multi-monitor gaming setup for years now, first with Eyefinity and now with Nvidia Surround. Add in TrackIR and it's all the immersion you'll ever need, far superior to merely increasing resolution. Like you, I see double-wide (so-called "ultrawide") monitors making a far bigger splash than 4K. Past a certain point, there is a limited usefulness to increasing resolution, resulting in diminishing returns on investment. For most users -- including gamers -- we're already past that point.

Looking through the electronic equivalent of a paper tube, no matter how fancy the tube, cannot imitate the way our eyes view the world. On the rare occasion that I play a non-Surround game, it doesn't last very long. I simply can't stand the feeling of claustrophobia that gaming on a single screen generates for me now that I'm used to a much better range of vision and peripheral motion cues.

How about: "There's no single card solution that will afford playable frame rates at this resolution with any of the recent AAA titles—expect to shell out thousands for multi-GPU rigs and get barely tolerable frame rates."

GPUs are not ready for 4K, not unless all you do is word processing and web browsing. But thanks for the shameless pimping.

This sums up the enormous hurdle for the AAA gaming community when it comes to 4K resolution: the price barrier to entry for the multi-card setup needed to push a modern FPS on a 4K monitor is huge.

I'd rather game across three (small-bezel) 24-inch HD monitors than a single 28-inch 4K display. If I could get an ultra-wide IPS monitor that could do this for me without the bezels, I'd consider it.

GPUs have become more efficient, but we still haven't seen any single (reasonably priced) cards that can play triple-A FPS games at 4K resolution with consistently playable frame-rates.

SLI is really an inefficient use of power and money for the most part. We need some single card advancements that will help boost 4K accessibility to the masses of gamers that can't (or won't) spend $3000 to $5000 on a single PC build.

The Nvidia 800 series should have been out by now: not re-branded old chips, but actual new 870 and 880 parts. Much of the hardware development still seems focused on mobile or streaming technology.

The real problem I'm seeing is that the display industry seems to be having a tough time moving on from 1080p. Even the ultra-wide monitors that won't break the bank are still 1080p-class (2560x1080). Everything beyond 1080p starts costing a lot more.

Though from a manufacturing standpoint, I can understand. Given the same surface area, there's a lot less chance of failure if you have fewer elements to build and test. Then again, I still don't see that as a good excuse not to move beyond 1080p.

From another angle on gaming... why does it HAVE to be at the native resolution? Yes, running below native introduces ugly interpolation, even with a nice evenly divisible resolution, but 1080p on my 1440p panel never really bothered me. As it stands, you can get a 4K monitor now and enjoy the content you'll most likely display on it anyway: GUIs, which are a joke for any modern GPU to render, even at 4K.

Heck, given a large enough 4K monitor, you can still play around with 21:9 resolutions and such.

GPUs were always playing catch-up with monitor tech, at least until a few years ago, when display manufacturers decided 1080p was good enough and squatted on it.

Gaming at non-native resolutions just looks blurry to me; it's not like the old days on a CRT, where it still came through well. What I'd like to see is more support from GPU makers for different scaling methods: with nearest-neighbor scaling, you could run a 4K monitor effectively at 1080p and it would look totally sharp, just with larger pixels. You'd keep 1080p as an option for gaming, but 4K for other applications.
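For what it's worth, the nearest-neighbor idea above really is that simple: at an integer factor, each source pixel just becomes an NxN block of identical pixels, with no interpolation to smear edges. A toy sketch in plain Python (images as nested lists; an actual GPU scaler does this in hardware, so this is strictly illustrative):

```python
def upscale_nearest(image, factor):
    """Integer nearest-neighbor upscale: repeat each pixel factor x factor times."""
    return [
        [pixel for pixel in row for _ in range(factor)]
        for row in image
        for _ in range(factor)  # repeat each expanded row `factor` times
    ]

# A 2x2 "image" scaled 2x becomes 4x4; 1080p -> 4K uses the same factor of 2.
src = [[1, 2],
       [3, 4]]
for row in upscale_nearest(src, 2):
    print(row)  # [1, 1, 2, 2] twice, then [3, 3, 4, 4] twice
```

Because 3840x2160 is exactly twice 1920x1080 on each axis, the 2x case needs no interpolation at all, which is why the result stays sharp.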

I think that all goes back to the article's wrap-up. It seems as though the capabilities of the monitors are far ahead of what the GPUs can deliver. Which isn't necessarily a bad thing, because you could buy a sweet 4K monitor now, then a year down the road buy the GPU(s) that can run it properly, once they're developed, proven out, and the standards allow refresh rates higher than 60Hz.

I know that with my GPU, a Radeon HD 6870, it would be an extremely underwhelming experience in 4K, but with a "390X" or "395X2" it could be awesome.

I have read this article several (yes, several) times in my monthly subscription mag. This is one of the BEST articles on 4K I have read. I am ready to move up to 4K, but the section "The GPUs" was very depressing, not so much for the price as for the performance. Yes, if I went with R9 290X CrossFire... But I don't have room for a CrossFire setup (i7-3930K / 16GB DDR3 / Sabertooth X79 / X-Fi audio / adapter card taking up my CrossFire PCIe slot / Antec 1200 / Radeon HD 7950 / PC Power & Cooling 950W PSU).

I have tried to find information on the next generation GPUs from AMD & Nvidia, but at best I get the feeling that I am going to be on hold until sometime in 2015.

A 295X2 is a single-card solution, but I think you meant to say single GPU. Even then, I doubt an 880 will have the power needed to push games at 4K. Maybe the Maxwell Titan will finally answer the need for a single GPU that can handle 4K. Maybe. Still, that may not come out till next year.

Well, at a certain point, adding more resolution in a fixed space won't offer much benefit. Most tech publications are starting to question the need to push 1440p on 5-inch handsets. 24-inch 4K monitors are even pushing the limits of 1:1 practicality (you could use one at 1:1, if you don't mind squinting).

I have a 720p 4.5-inch handset and it still looks more impressive than the 1080p 15-inch laptop I had.

If monitor sizes remain at a 30-inch maximum, then I say 8K is going to be the stopping point for resolution bumps before they become impractical. It's like why we stopped at 24-bit audio and color. (Yes, I know there are 30-bit and higher color depths, but those mostly just prevent errors from accumulating.)

All great info! I'm about a year into the process of overhauling my "battle station" as my wallet allows. My "budget" for my babies is much higher than I should let it be, but it's worth it; it's one of the things that brings me joy in life.

The backlight on my aging single 1080p monitor took a dump on me right around the time I began pondering my next build, so I had a decision to make: one large chunk of change on a single mid-range "affordable" 4K monitor, or split it between the three best 1080p monitors I could afford. After researching and coming to many of the same conclusions MPC did, I decided that 4K gaming the way I would want to do it was just not realistically within my reach, and I went with three 1080p monitors. I'm sure glad I did, because Surround/Eyefinity gaming is freaking great! Not to mention the usefulness of having multiple monitors for many other things. I'll never go back to a single monitor unless someone makes a sweet (and large) 16:3 display ;) (take notes, monitor makers). Next up: X99, DDR4, and a couple of Maxwell GPUs, and I'll be set for a while, I hope.

What I am most looking forward to is where I hope the technology is in 4ish years when I may be looking at another complete overhaul. I'm hoping for very affordable gaming oriented 4k displays and single gpus that eat 4k for breakfast. 11520x2160 gaming WILL be GLORIOUS ;)

I agree with you. I remember buying a 16:10 1920x1200 monitor for gaming about five years ago. The problem was that a lot of the games I played didn't support that resolution and aspect ratio, so most of the time I was forced to use resolutions and aspect ratios that didn't look right.