Posted by Soulskill on Tuesday April 30, 2013 @07:38PM
from the scale-it-until-they-catch-fire dept.

Vigile writes "One of the drawbacks to high-end graphics has been the lack of low-cost and massively-available displays with a resolution higher than 1920x1080. Yes, 25x16/25x14 panels are coming down in price, but it might be the influx of 4K monitors that makes a splash. PC Perspective purchased a 4K TV for under $1500 recently and set to benchmarking high-end graphics cards from AMD and NVIDIA at 3840x2160. For under $500, the Radeon HD 7970 provided the best experience, though the GTX Titan was the most powerful single-GPU option. At the $1000 price point the GeForce GTX 690 appears to be the card to beat, with AMD's continuing problems on CrossFire scaling. PC Perspective has also included YouTube and downloadable 4K video files (~100 Mbps) as well as screenshots, in addition to a full suite of benchmarks."

4K Ultra HD television is 3840 × 2160, which is, as I'm sure you can figure out, double the resolution of current HD content in each dimension. That said, I will agree that calling it "4K Ultra HD" is kind of misleading :)
See, http://en.wikipedia.org/wiki/4K_resolution [wikipedia.org]
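Pedantically, "double the resolution" means double in each dimension, so the pixel count actually quadruples; the arithmetic:

```python
# 4K UHD doubles 1080p in each dimension, quadrupling the pixel count
uhd_pixels = 3840 * 2160
hd_pixels = 1920 * 1080
print(uhd_pixels, hd_pixels, uhd_pixels // hd_pixels)  # 8294400 2073600 4
```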

4K/8K will sell UHDTV. But the best benefit, a gem rarely mentioned: it features a hugely increased gamut [wikipedia.org] and 10 or 12-bit (10-bit mandatory minimum) component depth. The image will look more life-like than any of the common TVs available today, and it won't be relegated to photographers and graph designers: it'll be standard.

Thank you. Most people just don't seem to understand that monitors aren't done until you can't tell the difference between a monitor and a window! It's the "1920x1080 should be enough for anybody" mentality. You'd think people would learn after a while.

That's mostly because we got really good multi-monitor support a few years ago. We can already get 4, 6, or even 9 times the resolution of 1080p. And if you are willing to blow enough money, or cut apart an LCD, you can do it with almost no bezel.

"The image will look more life-like than any of the common TVs available today..."

Not because of the wide gamut it won't. Having the gamut on your output device doesn't mean you have it on your input device. Content won't exist that uses it so it WILL be "relegated to photographers and graph (sic) designers", standard or not. The value is suspect and the cost is mandatory extra bit depth leading to higher data rates.

The side effect of wide gamut displays displaying common content in non-color managed environments is that it looks worse, not better. This is television we are talking about, not Photoshop. Today's HD content won't look the least bit better on a wide gamut display, it could only look worse.

It's different for different parts of the business of course, but the graphic designers I know personally (through a family member) don't care about monitor gamut or colour fidelity at all. Sounds odd, perhaps, but there's good reason for it.

Most graphic design is not for the web, but for physical objects. Anything you see around you that's printed or patterned - kitchen utensils, tools, and household objects; clothes and textile prints; books, calendars, pamphlets; not to mention the cardboard and plastic boxes it all came in - has been designed by a graphic designer. And it's all printed using different kinds of processes, on different materials, with different kinds of inks and dyes.

A monitor, any monitor, simply can't show you what the finished design will look like, since it can't replicate the effect of the particular ink and material combination you're going to use. So they don't even try. Instead they do the design on the computer, but choose the precise colour and material combination by Pantone patches. We've got shelves of sample binders at home, with all kinds of colour and material combinations for reference. As an added bonus you can take samples into different light conditions and see what they look like there.

The finished design is usually sent off as a set of monochrome layers, with an accompanying specification on what Pantone colour each layer should use. They do make a colour version of it too, but that's just to give the client a rough idea of what the design will look like.

TVs will include colour correction as part of the up-scaling process for HD and SD video.

4K is a stop-gap on the way to 8K. NHK has said they are not going to bother with it and will go directly to 8K instead, which is a huge step up and needs a lot of special equipment to be developed. For example, you can't visually check focus on a studio monitor at 8K; you need auto-focus to stand a chance.

There is a value on being there early, though. People who pay a lot for a "4k" display will also want to pay a lot for "4k" content to try their new toy. Of course it won't be just ANY content, it has to be the type of content that interests people with the interest AND money to get "4k" hardware.

4K/8K will sell UHDTV. But the best benefit, a gem rarely mentioned: it features a hugely increased gamut [wikipedia.org] and 10 or 12-bit (10-bit mandatory minimum) component depth. The image will look more life-like than any of the common TVs available today, and it won't be relegated to photographers and graph designers: it'll be standard.

In theory yes,

In practice people can't tell the difference between 6-bit and 10-bit colour. Besides this, most people won't be able or willing to configure or manage colour on their TV set properly. Most people can't be bothered to set their monitor to the proper resolution.

It's the same with DVD and Blu-ray; most people can't tell the difference. They only think they can because they know it's Blu-ray. I can easily convince people an upscaled DVD is Blu-ray simply by telling them it's Blu-ray. They think I'm

In practice people can't tell the difference between 6-bit and 10-bit colour.

Some particularly problematic scenes involving mostly a single color should still benefit, but I tend to agree especially for movie watching. The real purpose of moving to 10-bit components is to accommodate the ~3x larger gamut without introducing banding compared to 8-bit.
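A back-of-the-envelope sketch of that banding argument (the ~3x gamut figure is the rough estimate from above, not a spec value): spreading a wider gamut over the same 256 levels per channel makes each quantization step coarser, while 10 bits quadruples the level count and more than compensates.

```python
levels_8bit = 2 ** 8    # 256 levels per channel
levels_10bit = 2 ** 10  # 1024 levels per channel
gamut_ratio = 3.0       # assumed ~3x wider gamut, per the comment above

# Relative size of one quantization step in each configuration
step_8bit_wide = gamut_ratio / levels_8bit    # 8 bits over the wide gamut: coarse
step_10bit_wide = gamut_ratio / levels_10bit  # 10 bits over the wide gamut
step_8bit_srgb = 1.0 / levels_8bit            # today's 8-bit sRGB baseline

# 10-bit steps across the wide gamut are finer than 8-bit steps in sRGB,
# so the bigger colour space doesn't introduce extra banding.
print(step_10bit_wide < step_8bit_srgb < step_8bit_wide)  # True
```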

In practice people can't tell the difference between 6-bit and 10-bit colour.

That is unfair.

It's not unfair at all.

Most people won't be able to tell. The graphs you linked to show the measurable difference, but that's using a device to measure it; we're talking about people here. People are terrible at measuring things.

And technology able to display this signal... is not really coming at all. We took a huge leap backwards in terms of the color gamut TVs can display when we made the switch to LCDs, and the only decently promising tech on the way with a good enough gamut to compete with CRTs is OLED... which doesn't seem to scale all that well to big screens and has huge problems with longevity (dimming).

What's the point of having the awesome signal being able to carry a much larger color space, when you have no m

That's not really true. The current limit on the display gamut is typically not the broadcasting spec but the display technology. Essentially all LCDs available on the market have significantly worse gamut than CRTs, and CRTs don't have enough to cover the current HDTV spec.

The breakthrough that is currently waiting to happen is OLED. It's the only technology in addition to plasma that has a decent chance of actually making use of gamut available from the signal spec. Considering the difficulties in making

A 9.7" Retina display costs $55 off the shelf for hobbyists. That's about a $40 BoM or lower. Quintuple or sextuple that for a 23" 4K panel, add a thick margin, and you end up in the $300-350 range, without taking into account how scaled-up production will make it all cheaper.

As for your 27-30" 3840x2160 desire, it's actually quite easily doable now since it's really not that dense when you consider stuff like 5" devices having 1920x1080.

I would imagine a small OEM could make an order for these right from an exist

Eh. I could see using this for a PC, but honestly, I drive all of my media through an HTPC, and I'll be darned if I'm going to have to buy something that can fit a full-height video card just to watch videos, plus the video card. My 51" 1080p plasma display at 10 feet looks crystal clear, with no discernible pixels. Maybe in 5-10 years, once the life has been zapped out of this plasma, I'll think about it. But until it is commodity hardware, no thanks. By all means though, other folks feel free. I'm more inte

1920x1200 had been around long before the Dell U2410, so it's silly they ignored this. But would you really reject a 2560x1440 display because it's 16:9? How about a 4K display? That's just silly.

People need to get over this 16:9 vs. 16:10 garbage. What matters is the number of pixels. Once you get past 1200 lines or thereabouts, it's all gravy. I'm happily using two 16:9 displays, a pair of Dell U2711, and I'm well pleased with that. The extra cost to get an additional 160 lines from a 16:10 30" sc

Those are your preferences. I think anything over 24" is useless for desktop use, as it requires too much neck panning and eyeball rolling. It's not just the number of pixels that matters, it's how many can be crammed into your visual range at a time. I'd like to see a 3840x2400 panel in 23-24", 120Hz or better, with no input lag/ghosting, and deep color support. Of course, this is unobtainium along with the GPU to drive it well, but everyone has different priorities.

I'd happily pay a few dollars more for a 2560x1600 display because it is 16:10.
16:10 displays are superior to 16:9 for almost all computing purposes. For games it gives me a taller FOV; for work it's exactly 2 A4 pages side by side and gives me a taller view (yes, an extra 4 cm really does make a difference when working on a large spreadsheet, config file or script); with video editing you can have the tools on screen without overlaying

Why? Why does 16:10 make a difference at that resolution? I mentioned the 2560x1600 displays, but you know what, they cost hundreds more and they have lower pixel density. The premium for 160 pixels is 30% or more; hell, with Dell on Amazon right now it's 50% more.

What exactly are people doing that requires 16:10? I've used 'em, I like 'em, but I'll take 2560x1440 over 1920x1200 any day of the week. Likewise I'll take 3840x2160 over 2560x1600.

There's really only one advantage that 16:9 has over 16:10, and that's smaller black borders (or no borders at all) for widescreen video content. Otherwise, the vertical real estate is very nice to have, and I've found 2560x1600 (which I've used for the last 5 years) somehow really hits the sweet spot between vertical size and widescreen.

16:10 tends to work out better for office work. Sure, the higher res makes it less important. But its the physical size that makes it less important... depending on how much space you have to push the display back.
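The physical-size point is easy to check with a bit of geometry (assuming exact aspect ratios and ignoring bezels): at the same diagonal, a 16:10 panel is taller and slightly narrower than a 16:9 one.

```python
import math

def panel_size(diagonal_in, aspect_w, aspect_h):
    """Width and height (inches) of a panel given its diagonal and aspect ratio."""
    d = math.hypot(aspect_w, aspect_h)
    return diagonal_in * aspect_w / d, diagonal_in * aspect_h / d

w_169, h_169 = panel_size(24, 16, 9)     # 24" 16:9
w_1610, h_1610 = panel_size(24, 16, 10)  # 24" 16:10
print(round(h_169, 1), round(h_1610, 1))  # 11.8 12.7
```

So a 24" 16:10 panel buys you roughly an inch of extra height over 16:9 at the same diagonal.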

But once you get up close to the holy grail of true 4K, which is 4096, why even bother with 3840? Digital cinema is shot in 4096 (and up). 3840 should be boycotted or even banned.

But once you get up close to the holy grail of true 4K, which is 4096, why even bother with 3840? Digital cinema is shot in 4096 (and up). 3840 should be boycotted or even banned.

No black bars. People hated it when we went widescreen; they won't accept another round. There is so much non-cinema 16:9 content that can't be remastered, and even if it could, all existing DVDs/Blu-rays don't have it. The only thing that could give a hint of 17:9 adoption (4096x2160 seems to be the standard for cinema monitors) is if 4K Blu-rays come with the ability to ship both 3840x2160 and 4096x2160 on the same disc, like an extra 256x2160 slice added to the picture. Or since you're probably doing a bit o

I heard at one time that 16:10 came out of the video editing industry. Basically they were working on 16:9 video, so they had displays with extra space at the bottom for controls. These displays then were adapted to the higher end computer market. However once 16:9 displays were being manufactured in large quantities for consumer TVs, I imagine that drove the price down for manufacturing 16:9 computer monitors. I'm fairly certain the decision to use 1920x1080 in the TV industry had nothing to do with co

Ratio does matter, more vertical space reduces scrolling in documents and web pages, gives more space for content creation that isn't widescreen formatted itself (like making square or portrait oriented art) and is beneficial for some games (most non-first person games). The only real benefit that I can think of for 16:9 over 16:10 is no letterboxing, I'd gladly trade that for the benefits of 16:10 (or even 4:3, I can watch movies on my TV if the letterboxing is going to be that big of a deal).

You're half right. More vertical space is great, but the ratio doesn't matter. For the work you are talking about what matters is the height.

There's a funky Dell out now that has an even wider aspect ratio, it's around 2.33:1 I believe. Now I'd like that, if it had more pixels. It's 2560x1080. That's a detriment. But imagine if that was more along the lines of 3350x1440. Would you still complain that was too wide? You could have three documents, web pages, whatever up, side by side, and still have a lot of vertical space.

But what if I'm reading ONE website instead of three? It's true that if the screen is tall enough the ratio doesn't matter, but getting that height costs more horizontal desk space that I shouldn't have to give up and in a world where both ratios were widely produced getting the height from a 16:9 display would cost more. Also if I'm gaming the higher resolution can just as easily be a detriment, requiring more or higher-end video cards to get the other quality options up.

There's a funky Dell out now that has an even wider aspect ratio, it's around 2.33:1 I believe. Now I'd like that, if it had more pixels. It's 2560x1080. That's a detriment. But imagine if that was more along the lines of 3350x1440. Would you still complain that was too wide? You could have three documents, web pages, whatever up, side by side, and still have a lot of vertical space.

And if it was twice as wide again you could have six documents open side by side, but you'd probably want it curved inwards at each end, or else you'd have to be like a table football player sliding on your chair from side to side.

I just took a 21" CRT to the recycling place. In 1995, it cost about $2200 new. In 2001, my employer gave it to me as scrap when our building was closed and they decided that a lot of that stuff was cheaper to give away than to move to some warehouse across the country. (Plus it was a tiny bit of good will that the local management could show the laid-off employees when the Big Guys were being callous pricks kicking us to the curb while we were still going to 9/11 funerals.)

I bought a Gateway open-box 21" monitor back in the late 90's for about $1k. I think it could do 1600x1200, but it wasn't real solid at that. That was one heavy beast to move around. I got rid of it some time ago, don't remember how. I got a Dell 19" Trinitron from one employer in 2000. That was sweet, although it had those two strange horizontal lines, but other than that the image was solid. Eventually its color started to go though. I also bought a 17" NEC monitor in '95 or '96 for just under $700

I got a Dell 19" Trinitron from one employer in 2000. That was sweet, although it had those two strange horizontal lines,

That was true of all Trinitron monitors. Here is what Wikipedia says [wikipedia.org] about them:

Even small changes in the alignment of the grille over the phosphors can cause the coloring to shift. Since the wires are thin, small bumps can cause the wires to shift alignment if they are not held in place. Monitors using this technology have one or more thin tungsten wires running horizontally across the g

... like 4096x1728 (digital cinema size plus a few more pixels to make it mathematically right)? Feel free to make the actual LCD pixels a bit smaller so it can all fit in a decent size (not over 80cm, please). Hell, I'd be happy even with 2048x1280 for now so I can avoid the border bumping on 1920x1200.

Seriously, whining about a few extra pixels more or less is silly. The "double HDTV" version of 4K is fine, and works well given that it makes scaling a 1920x1080 signal easy. There is nothing special about 2^12 when it comes to monitors. We also wouldn't want a computer monitor with such a wide ratio. When you are doing computer work, vertical real estate matters. 2.39:1 CinemaScope is fine for a movie. It isn't what you want when programming or writing documents.
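The "easy scaling" point is that 3840x2160 is exactly 2x 1920x1080 in each dimension, so a 1080p frame maps onto clean 2x2 pixel blocks with no interpolation at all. A minimal nearest-neighbour sketch of that doubling:

```python
def upscale_2x(frame):
    """Double a frame in each dimension by pixel replication (nearest neighbour)."""
    out = []
    for row in frame:
        doubled = [px for px in row for _ in (0, 1)]  # repeat each pixel twice
        out.append(doubled)
        out.append(list(doubled))                     # and each row twice
    return out

tiny = [[1, 2], [3, 4]]  # stand-in for a 1920x1080 frame
print(upscale_2x(tiny))  # [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Real scalers blend pixels for non-integer ratios; the 2x case needs none of that, which is exactly why 3840 was chosen over 4096.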

Two of the three most popular desktop OS families (Windows and Linux) don't have proper provisions for resolution-independent font and window sizes.

The problem is that they tend to mis-report the physical size of the viewable area of the displays, without which you can't work out the scaling factor. (The low-level font rendering engine wants pixel-based sizes for obvious reasons, though you might well not be normally working at that level.) However, a bigger problem in practice is that the non-text parts of windows are not designed with scaling in mind. The most obvious example of that is where someone uses absolute positions for all the components wit

In reality, almost no desktop software is capable of displaying at this high a DPI without messing up font size or layout. Two of the three most popular desktop OS families (Windows and Linux) don't have proper provisions for resolution-independent font and window sizes. It's really leet that you have a bazillion PPI, but if that means that a web page renders at the size of a stamp, you can't properly read it at a distance comfortable for your eyes. I work with computers. That means I have to look at a screen for 8 hours a day and then drive home in busy traffic. If my eyes, neck and back muscles are tired and sore from staring at a monitor all day, that's not going to be a huge success. I need a display that will comfortably display my 20+ application windows at a good arm's length or a bit more. Until operating systems are capable of doing that even for legacy applications, without depending on 96 PPI screens (still the standard for Windows 7), I don't want insane PPI numbers but actual screen real estate. I currently have two 30" IPS screens at home and those are comfortable. Increasing the PPI on those will not make them more comfortable, and in reality my productivity will not rise even with the added pixel count.
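For reference, the scale factor an OS would need is trivial to derive from the panel's resolution and physical size (96 PPI being the traditional Windows baseline); the hard part, as the comment says, is legacy applications actually honouring it. A sketch:

```python
import math

def scale_factor(px_w, px_h, diagonal_in, baseline_ppi=96):
    """UI scale needed so content keeps the physical size it has on a 96 PPI screen."""
    ppi = math.hypot(px_w, px_h) / diagonal_in
    return ppi / baseline_ppi

# A 3840x2160 panel at 24" is ~184 PPI, so the UI should scale ~1.9x
print(round(scale_factor(3840, 2160, 24), 2))  # 1.91
```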

Are you just trolling, or are you actually using OS X? Because Windows and (popular desktops on) Linux have had scaling for fonts, window decorations, and other UI elements (for properly built programs) for years, over a decade. It's a must-have if you're using full-HD resolution on a 15" screen, and it works just fine for most programs (it's an easy way to see if an application was coded crappily or not).

Bullshit. The text rendering engine properly uses the available resolution, and the major apps that use their own cross-platform rendering engines (MS, Adobe) have been updated. Vector graphics also get rendered properly at the actual screen resolution. Apple limits the APIs notion of screen resolution to "regular" and doubled just to make things easier on developers with regard to bitmapped graphics while avoiding crappy scaling of bitmapped graphics. (Of course applications that don't provide high-res ver

Though as someone who's been a gamer since Duke Nukem... and the Ultima games... I don't see what all the hype is about. The colors ought to be much nicer on a 4K display, but I know I won't be spending money on one until they're dirt cheap or I get one as a gift (which means they'll be dirt cheap by then).

Then again, you can make a pretty game that gets pretty boring pretty fast =) I've played some hideous monstrosities with the worst interfaces known to man just because the actual game was fun.

There are a few games where a multi-monitor setup is really good. Flight sims in particular where you want 1 or even 2 monitors to your side to display the side windows, maybe one above or one for the instrumentation.

In fact if you're learning to fly, a multi-monitor setup with HOTAS is a godsend.

But so few games actually support multi-monitor setups. So for the most part they are just e-peen extensions.

They have to do with backlight and filters. You can already get monitors with much wider gamuts than normal (normal meaning sRGB). This can lead to much more realistic colours since it can cover more of what the eye is capable of seeing, and more importantly can cover all of the colours you are likely to encounter in the world, excluding some special cases like lasers.

The issue currently is that most software isn't colour space aware, so it won't render things right, you'll get oversaturation when you use a

I'm actually in the market to replace my existing monitors (2x Dell 2408WFP) with new ones. I'm currently considering 3x 27" 2560x1440 LED IPS monitors (the Dell U2713HM or LG 27EA83-D are the top 2 choices right now), so this interests me greatly, and it's something I keep seeing pop up.

We're approaching "retina" resolution [isthisretina.com] on the desktop at 2560x1440 already. I of course mean retina in the Apple marketing sense: "at the normal viewing distance the human eye cannot resolve an individual pixel".
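That marketing definition boils down to a visual-acuity calculation: a normal eye resolves roughly 60 pixels per degree (about one arcminute per pixel), so a display counts as "retina" once a pixel subtends less than that at your viewing distance. A sketch of the math the linked calculator presumably uses:

```python
import math

def is_retina(ppi, distance_in, acuity_ppd=60):
    """True if an eye with ~60 pixels/degree acuity can't resolve pixels at this distance."""
    pixels_per_degree = 2 * distance_in * math.tan(math.radians(0.5)) * ppi
    return pixels_per_degree >= acuity_ppd

# 27" 2560x1440 (~109 PPI) just clears the bar at a roomy ~32" viewing
# distance, but not at a closer ~24"
print(is_retina(109, 32), is_retina(109, 24))  # True False
```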

True enough, but you're talking about a niche. There's niche uses for tons of technology that has no place on the average consumers desk (Like hex-core procs with hyper-threading, $1k video cards and motherboards with 10 SATA ports).

It would help if those apps had better multi-monitor support. A lot of CAD apps don't seem to have updated their interfaces since the move from DOS to Windows 3.11. Having said that, it took until Windows 8 to get proper multi-monitor task bars.

It really is. I see people talking about how anything lower than 400 ppi is unacceptable on their phones and I just shake my head. I'm sure there are people who can legitimately tell the difference, but the vast majority of people aren't going to be able to. Meanwhile upgrading to 2560x1440 (or better x1600) may make my games a little prettier, but it's also going to require a third video card and won't do anything for the internet or the DVDs (or even Blu-rays) that I watch. When I get around to upgrad

I think it comes down to screen size and viewing distance. Personally I think 4K could provide a better movie experience than 3D. Sitting close to an enormous screen is pretty damn immersive. Then again, I don't particularly want my living room dominated by the TV.

I do like a large screen to work with, although I cannot use a 1080 screen on a 15" laptop (everything's too small for my eyes). Increasing the resolution and DPI better not make things smaller!

If I could have my perfect setup, I'd have a 32" 4096x2560 main monitor, with two 27" 2560x1600 monitors to each side. And running each at 144Hz, with full AdobeRGB coverage (or better), while we're at it.

I just bought a 1440p display, and it is hands-down the best single computer component I have ever bought. Better than getting an SSD. Better than any new processor, or new video card, or new sound card, or new RAM.

True, I'm probably never going to watch video at that resolution. And it's li

That's really the important thing. We've been stuck in a rut with display sizes for a long, long time. It's time to move pixel density forward. The 27" displays that have been on the scene for the past 2 years or so are great, but so far the price hasn't dropped a great deal (disregarding the generic Korean Dell/Apple rejects).

Once 4K TV production ramps up that should lead to more higher density monitors at reasonable prices. Sadly I have to admit that it really seems like Apple was the company that pu

Oh yeah, I love using HiDPI mode on my 27" iMac to turn a 2560x1440 display into a virtual 1280x720 screen with twice the detail. This lets me sit way back, 3-5 feet or more, and have a nice readable picture. It helps avoid eye strain, and it's really nice how crisp everything is. Of course 1280x720 limits usable screen space and I occasionally have to switch it back, but I really do prefer to use it whenever possible.

It's sort of the opposite of what a true retina iMac would do for me though.

Until the monitor manufacturers can crank those puppies out for the same price they can crank 1080p screens out for, it's largely gonna be moot; it'll be one of those niches like LaserDisc back in the day that only a few of the videophiles cared about.

Until then the sweet spot seems to be 32-42 inch 1080p screens, at least that is what I'm seeing in the shop from both the gamers and those setting up home theaters, with the gamers preferring 32s and the home theaters going 40 to 42. Makes me feel like a dino

The reason I see more 40s is: 1) We have a LOT of apt dwellers, and in an apt there just isn't as big a living room, so the monsters don't fit. I've seen one TV over 42 and that is a sports nut that wanted the biggest mother he could fit so all his buds could pile in for the game. And 2) We have a LOT of families, so you get 2 sets, one for the parents and a smaller one for the kids, and that tends to cut into the finances.

But yeah, until the price drops a shitload I just don't see it happening; it'll be like 3D T

A 4K 50" display 4' or 5' away would give you a pretty damn immersive experience. Wouldn't that be nice?

I'm sitting with my eyes about 3' from my 27" 2560x1440 display with about 108ppi. I can make out some pixels as it is in the text. I'm not wearing my glasses, so that helps some. If this was a 4K 27" display, that would be 163ppi. That's a 50% increase right there.

Wasn't that long ago that running 1280x1024 on a 17" LCD was pretty damn nice, and that was about 96ppi. So for a decade we've barely improved when it comes to density. Hell, the 24" 16:10 display that so many people love so much has roughly the same density as a 17" LCD.
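All of those density figures come from one formula: diagonal pixel count over diagonal inches. Checking the numbers quoted in this thread:

```python
import math

def ppi(px_w, px_h, diagonal_in):
    """Pixel density: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(px_w, px_h) / diagonal_in

print(round(ppi(2560, 1440, 27)))  # 109
print(round(ppi(3840, 2160, 27)))  # 163
print(round(ppi(1280, 1024, 17)))  # 96
```

A 24" 1920x1200 panel works out to ~94 PPI by the same formula, essentially the 17" LCD's density, which is the decade of stagnation being described.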

Of course my very first PC games ran in CGA, and I thought VGA was a huge step up. But at no time have I ever thought to myself "Nope, more wouldn't be better". Not when it comes to graphics, RAM, harddrive size, etc. Give me more and I'll use it.

Yes, it was a decade long race to the bottom for price and performance. People seemed to just want large and cheap monitors that fit their widescreen DVDs.

NEC and Eizo kept on producing quality displays but, much as I hate to admit it, I think it wasn't until Apple came out with high resolution displays that people started to care again. They only got half way there though. The failure of 3D to really generate a lot of sales is what started the push to 4k.

Of course my very first PC games ran in CGA, and I thought VGA was a huge step up. But at no time have I ever thought to myself "Nope, more wouldn't be better". Not when it comes to graphics, RAM, harddrive size, etc. Give me more and I'll use it.

As a game programmer I have to say, pushing pixels is damn expensive. I make my 3D data visualization software and games resolution-independent (because, why not?) but some AAA game devs don't (not in the budget). That means when you get a higher-resolution monitor and throw more pixels on the screen, your UI text can shrink. Oh, just scale it up? Yep, then what's the point of more pixels if you're just going to upscale the textures?

Does it matter that much if you play on a 4K or 2K screen? The game's graphics aren't distinguishable at the single-pixel level anyway, and the textures are not optimised for 4K. If you played 2K side by side with 4K (setting aside the GPU power), would you realise the difference? 4K makes a significant difference for photography and video!

Support varies by engine; but one (reasonably) common thing that people do with games that weren't originally designed with high-resolution widescreens in mind is mess with the field of view. Some games react badly, with all sorts of distortion effects; but it can also create a nice 'peripheral vision' sense that the game originally lacked.

This would also be engine-dependent, in terms of how much it can be tweaked; but it isn't uncommon for (even comparatively low resolution) games to have decent texture as

Does it matter that much if you play on a 4K or 2K screen? The game's graphics aren't distinguishable at the single-pixel level anyway, and the textures are not optimised for 4K. If you played 2K side by side with 4K (setting aside the GPU power), would you realise the difference? 4K makes a significant difference for photography and video!

What does that matter when the game devs program for the consoles and port it to the PC? Sure, maybe on the PC you can go a bit higher resolution, but crappy textures still look like crappy textures.

Uh, wait, what? No. Bitmapped data (like an MPEG stream) will not show you more detail if simply scaled up. The detail has to be there in the first place. Some TVs and players can 'add' sharper edges and gradients, as well as add intermediate frames (temporal/motion smoothing) with filters, which run at the higher resolution, but this is fake data added after the fact. Some people like this and some don't.

But how many people can actually see that when all the shit be blowing up all over the screen? I used to always have to go for the biggest GPU I could possibly afford, but then one day it was pointed out to me: "All you are doing is letting those around you enjoy a little more eye candy; you are too busy trying not to die to notice." And he was right. When I am focusing on actually playing a game, as long as it stays above 30FPS and doesn't have obvious graphics pop-in, I couldn't pay attention to bling, I was to

Agreed. And the reason this is being pushed now is probably BECAUSE 3D is bombing. The content companies need to push something new to get people to upgrade their TVs, 3D didn't work so now they're going for 4K. Whatever, when I bought my TV I chose the 720 model because it was going to be used for Xbox and DVDs, I doubt I'll be getting a 4K for a decade or so.

Well they can push all they want; working in a little shop that does a lot of home theater installs, I see what folks are buying, and it's a mix of 720p and 1080p at anywhere from 32in to 50in (with 42in being the sweet spot, but we have a lot of apt dwellers so that makes sense) and no 3D TVs in sight. Folks would rather get a larger screen than pay for the 3D-capable screen AND pay for extra glasses for something where the content just isn't there. I hear they make a few 3D Blu-rays but since most folks here a

On a side note, I wonder how much work would be needed to get current cards rendering 4k Surround/Eyefinity.

Buy the monitors and cables, and hook them up? My 6970 has 2 DisplayPort outputs, each of which can support up to 4 monitors with the correct cable/splitter. 4K would only take two monitors on each, and the 2-way splitters are fairly easy to get your hands on. I don't even need the splitters, as I also have 2 DVI outputs on the card, so I can drive two monitors with DP, and two with DVI (and I have never seen a monitor that supports DP and doesn't support DVI).

Most people read information on computer displays: web pages, emails, Facebook updates, Twitter feeds, Wikipedia, and reference materials; and they work in word processors, spreadsheets, and programming environments. All of these activities are regularly constrained by vertical resolution.

For people watching cat videos and playing simple games (which comprises almost everybody else not doing the above), neither >1080p resolution nor fidelity matters.

For people doing high-end gaming and watching high-end media, your situation applies. However, it's a pretty tiny sliver of overall computer monitor time, all things considered.

I would take a 1080P display with a 10% improvement in contrast ratio over a 4k TV any day.

I wouldn't. I have a 1680x1050 panel at home. Why would I move to a 1920x1080 panel? That's 30 more pixels vertically. Since most of the time I'm coding, I have 2-3 code windows tiled horizontally. 1680 pixels is more than enough for slightly more than 80 cols per terminal and a usable font size.

A 1080P monitor gives me nothing except the ability to watch high-def video without scaling. Since I don't do that on my PC anyway I don't see the point. Now, if you could give me a proper 1920x1200 monitor that would be a few more vertical lines and that is better.