Sooooo... a game that is set to run at 800x600 could take up the fullscreen still instead of just the ACTUAL 800 by 600 pixels?

No, that has been possible for a long, long time already.

This is mostly about windowed applications on high-density screens, where the window would be unreadably small (not just text, but buttons, lines, all kinds of graphics) without scaling. Alongside a default scaling factor configured for everything on a screen, it also lets an application render at high density itself and bypass the default scaling, to produce finer graphics.

So in essence, applications that do not care about high density will be scaled up by user preference, and they will become readable and usable, while applications that do handle high density have the chance to produce finer graphics.

This is not about DPI, font sizes, or keeping window sizes equal in meters across monitors with different DPI. This is simply about an integer scaling factor that can make the difference between usable and unusable for a user.
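The two cases above can be sketched in a few lines. This is a hypothetical illustration, not any real toolkit's API; the function names and the `hidpi_aware` flag are invented for the example:

```python
# Illustrative sketch of integer output scaling (names are made up,
# not a real compositor API). A window has a logical size; the screen
# has an integer scale factor.

def physical_size(logical_w, logical_h, scale):
    """Physical pixels a window covers on a screen with this scale."""
    return logical_w * scale, logical_h * scale

def buffer_size(logical_w, logical_h, scale, hidpi_aware):
    """A scale-unaware app renders at 1x and gets upscaled by the
    compositor; a scale-aware app renders at full physical size."""
    if hidpi_aware:
        return logical_w * scale, logical_h * scale  # finer graphics
    return logical_w, logical_h  # compositor upscales this buffer

# A 400x300 logical window on a scale-2 screen covers 800x600 pixels
# either way; only the rendered buffer differs:
assert physical_size(400, 300, 2) == (800, 600)
assert buffer_size(400, 300, 2, hidpi_aware=False) == (400, 300)
assert buffer_size(400, 300, 2, hidpi_aware=True) == (800, 600)
```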

Which is still good; I'd love to see high-DPI displays become common on desktops. They're already there on mobile, tablets, Macs, and ChromeOS (Pixel). Windows is holding us back on desktops and general laptops, though. Hopefully, with everything on Linux supporting high DPI and the general push towards Linux and Mac after the failure of Windows 8, we can finally start to see high-DPI non-Apple PC displays become standard.

Been around since 2006. I've personally had monitors with it for three years now. No, the reason Windows is holding us back is that no display manufacturer is willing to make a panel for the PC market that Windows doesn't support, and Windows on a high-DPI panel is a horrible experience. Especially WinXP, which still has considerable market share and until recently was still #1.

Or do you really think Apple has some secret ability to make high DPI panels for Macbook Retinas that the entire monitor industry doesn't? Apple doesn't even manufacture their own panels. Clearly, it's the OS that's holding us back.

No no, of course not. As you say, Apple don't make the panels (although they do buy up most of the capacity for many of their components, so the rest of the industry is practically locked out).

But we're talking about different things. Laptop-sized panels obviously exist, but to my knowledge desktop-sized panels with these sorts of resolutions do not (except for the prototypes that get shown at CES every year but never seem to amount to anything). Even if they do exist, there certainly aren't any complete displays actually available to buy. You might be right that all this is due to a lack of support from Microsoft, but that is conjecture and it isn't obvious that this is the reason. I personally think it is much more likely that the extremely high cost of these displays would make them a non-starter. Ignoring Windows entirely, I cannot imagine there wouldn't be demand from Mac owners if these were available for anything resembling reasonable money.

I stand corrected about DisplayPort; I didn't know that v1.2 (which, it turns out, has not been around since 2006) had so much bandwidth. Very cool.

Regardless of the specific spec, they could always use multiple cables, like some 3D displays do by using two dual-link DVI-D connections to carry four links. I really do believe that if the market for high-resolution displays had been there, it could have been done before now. I find it hard to believe we have really been stuck at around 100 PPI for so many years without progress.
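As a rough sanity check on the multi-cable idea, a back-of-the-envelope bandwidth calculation (figures are approximate and ignore blanking overhead, which costs extra in practice):

```python
# Approximate bandwidth check: raw pixel data rate vs. link capacity.
# Ignores blanking intervals, which add real overhead in practice.

def gbps(width, height, hz, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s for a given display mode."""
    return width * height * hz * bits_per_pixel / 1e9

uhd_60 = gbps(3840, 2160, 60)          # ~11.9 Gbit/s of pixel data
dual_link_dvi = 2 * 165e6 * 24 / 1e9   # ~7.9 Gbit/s (2 x 165 MHz x 24 bit)
dp_1_2 = 17.28                         # DP 1.2 effective payload, Gbit/s

assert uhd_60 > dual_link_dvi      # one dual-link DVI is not enough
assert uhd_60 < 2 * dual_link_dvi  # two of them would cover it
assert uhd_60 < dp_1_2             # as would a single DP 1.2 link
```

So even before DisplayPort 1.2, ganging two dual-link DVI cables together would have had the raw bandwidth for a 3840x2160@60 panel, which supports the point that cabling wasn't the real blocker.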

And I'd be willing to bet money that the first retina-class desktop display also comes from Apple, not due to technical ability but because their entire market has an OS and app platform that can handle it properly.