According to Intel, 2013 is finally going to be the year we see truly high-resolution displays: Retina-class screens for laptops and desktops, for everyone. Considering promises of HiDPI have been thrown our way for years now, it's high time they became reality. As the article mentions, there's one interesting potential issue: Windows 8's desktop mode. How will it handle HiDPI displays?

My paranoid prediction: Windows 8 desktop mode will not support high DPI, Microsoft will take great care not to patch this, and then they will present it as an advantage of their Metro interface in order to get people to use it.

Forget Windows 8. Windows 7 doesn't even support manually entering a DPI value. You can choose normal or high; sometimes you can get it to accept a percentage.

I'd settle for just being able to manually enter the DPI so that I can reduce it on my low-res monitors (like the SD TV). "96 DPI" on a 27" TV with only 480 vertical pixels looks horrible! And "120 DPI" is worse.

Wouldn't it be nice if the OS included methods to query the monitor's physical dimensions, query the video card for the current resolution, and figure out what the *ACTUAL* DPI of the display is? And then use that figure for displaying things so that a resolution change wouldn't change the size of icons, text, images, etc? Or changing monitor sizes wouldn't change text/image/icon sizes?

Oh, wait, those capabilities already exist (EDID, for example). But none of the OSes out there actually do this.
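
For what it's worth, the arithmetic the OS would have to do with that EDID data is trivial. A minimal sketch in C, with the physical size of a hypothetical 27" 4:3 TV panel plugged in by hand just to illustrate:

/* The OS could derive the real DPI from the panel's physical size (EDID
 * reports it in millimetres) and the current mode.  Numbers below are
 * hypothetical: roughly a 27" 4:3 TV panel driven at 640x480. */
#include <stdio.h>

static double dpi(int pixels, int millimetres)
{
    return pixels / (millimetres / 25.4);   /* 25.4 mm per inch */
}

int main(void)
{
    printf("horizontal: %.0f DPI\n", dpi(640, 548));   /* ~30 DPI */
    printf("vertical:   %.0f DPI\n", dpi(480, 411));   /* ~30 DPI */
    return 0;
}

Which is a long way from the 96 DPI the OS assumes, hence the giant blurry text on that SD TV.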

X11 does do that: if the monitor returns valid information (many don't), X11 will set the DPI accordingly.
It's a pet hate of mine that systems other than X11 completely ignore this information; it makes high-DPI screens a pain in the ass to use.
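
For anyone who wants to check what their X server actually believes, the size and DPI it derives are queryable from any client. A quick sketch using plain Xlib (compile flags in the comment, error handling kept minimal):

/* Query the screen size and DPI the X server reports.
 * Build with: cc xdpi.c -o xdpi -lX11 */
#include <stdio.h>
#include <X11/Xlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }
    int scr = DefaultScreen(dpy);
    int px  = DisplayWidth(dpy, scr);       /* pixels */
    int py  = DisplayHeight(dpy, scr);
    int mmx = DisplayWidthMM(dpy, scr);     /* millimetres, from EDID if valid */
    int mmy = DisplayHeightMM(dpy, scr);

    if (mmx <= 0 || mmy <= 0) {
        fprintf(stderr, "server reports no physical size (the 'many don't' case)\n");
    } else {
        printf("%dx%d px, %dx%d mm -> %.0f x %.0f DPI\n",
               px, py, mmx, mmy,
               px / (mmx / 25.4), py / (mmy / 25.4));
    }

    XCloseDisplay(dpy);
    return 0;
}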

It's not a "hidden option", it's simply under the "Appearance -> Fonts -> Details" dialog. But yes, it is sad that in recent years the Linux desktops seem to have given way to the "lets ignore X11 dpi suggestions and just force 96 dpi" attitude. I think this attitude was adopted to simply make things like websites look like they do under Windows.

But alas, you can still override this and let X11 correctly detect the DPI value and use it. All my Ubuntu Linux systems are set up like this.

Bert64 is correct. X11 has done this for years! X11 set my DPI to 154 (I think that was the number) on my 8-year-old laptop... um, 8 years ago. Windows XP (which was included with the Dell) always hard-coded it to 96 or 120 DPI. Most Windows apps couldn't cope with 120 DPI (buttons appearing outside a non-resizable window, etc.), so I was forced to use 96 DPI on a 1920x1200 screen - making for damn small text. Luckily I haven't run Windows on that Dell laptop in years.

Anyway, the hard-coded 96 DPI (from Windows) or 72 DPI (from the Mac) is what is keeping software and hardware from moving to high-DPI displays! Maybe they can actually learn something from X11 and Linux apps (which are much, much more high-DPI friendly).
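
To make the stakes concrete: font sizes are specified in points (1/72 of an inch), so the assumed DPI directly decides how many pixels a "10pt" font gets. A tiny illustration, using 96, 120 and the 154 figure mentioned above:

/* A point is 1/72", so the on-screen pixel size of a nominally 10pt font
 * depends entirely on the DPI value in effect. */
#include <stdio.h>

static double pt_to_px(double pt, double dpi)
{
    return pt * dpi / 72.0;
}

int main(void)
{
    double dpis[] = { 96.0, 120.0, 154.0 };   /* hard-coded, "large fonts", the laptop above */
    for (int i = 0; i < 3; i++)
        printf("10pt at %3.0f DPI = %.1f px\n", dpis[i], pt_to_px(10.0, dpis[i]));
    return 0;
}

Force 96 DPI on a dense panel and that 13-pixel-tall font is what you squint at.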

XP isn't hardcoded to 120 DPI ... I dunno where you got this nonsense from.

I meant it had two pre-configured settings: 96 and 120 DPI. OEMs gave you one or the other; they NEVER set up your OEM PC or laptop with anything else - that was left to the end user to experiment with.

Is Windows' high-DPI support just about adjusting text size, a setting I do remember finding somewhere in Windows 7's control panel, or have Windows GUIs become able to resize such things as icons, fixed-size windows and toolbar buttons in a way that makes high DPI actually usable without me being aware of it?

Well, 120 DPI mode was available in XP. That's not exactly "high DPI". And there's no way to easily change it to anything higher via the GUI tools. And there are so many hard-coded pixel sizes in XP that setting it to even 120 DPI made things wonky (text labels overrunning the edges of buttons, menus overrunning the edges of windows, etc.).

Windows has fully supported high-DPI apps written with Windows Presentation Foundation since Vista; GDI/GDI+ apps have had to be made DPI-aware by the developer (going back to XP in 2001). But every occasion is a good one to bash Microsoft.
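
For the record, the GDI-side opt-in looks roughly like this - a sketch only, assuming Vista or later for SetProcessDPIAware() (an XP app could only query LOGPIXELSX/Y and scale its own layout):

/* Rough sketch of what a DPI-aware GDI app has to do itself: declare
 * awareness (Vista+), then read the system DPI and scale any hard-coded
 * pixel metrics with it. */
#define _WIN32_WINNT 0x0600   /* for SetProcessDPIAware() */
#define WINVER       0x0600
#include <windows.h>
#include <stdio.h>

int main(void)
{
    SetProcessDPIAware();                     /* Vista and later */

    HDC screen = GetDC(NULL);
    int dpi_x  = GetDeviceCaps(screen, LOGPIXELSX);   /* 96, 120, or a custom value */
    int dpi_y  = GetDeviceCaps(screen, LOGPIXELSY);
    ReleaseDC(NULL, screen);

    /* A button designed as 75x23 px at 96 DPI should really be: */
    int w = MulDiv(75, dpi_x, 96);
    int h = MulDiv(23, dpi_y, 96);
    printf("system DPI %dx%d, button becomes %dx%d px\n", dpi_x, dpi_y, w, h);
    return 0;
}

Apps that skip this and bake in 96-DPI pixel counts are exactly the ones with clipped labels at 120 DPI.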

"Retina Display" is an Apple trademark and does not refer to any specific resolution, screen size, pixel size, etc. It's just a marketing term that means different things for different displays (neither the iPhone 4S nor the iPad 3 have the same resolution, screen size, or DPI, but they both have "Retina Displays" (tm)).

Hear, hear. If not even Apple itself knows what a "retina display" actually is, since the definition changes with each new device, why should we use the term in a serious discussion? If you want to say greater than 300 DPI, say "greater than 300 DPI".
We should never use terms that came from marketing; they just cause confusion, like "is this video HD, Full HD, or Super Ultra Mega HD?"

It's crazy how these days you can't get decent 4:3 monitors. I absolutely hate widescreen monitors; I use my monitor for work, not for watching porn. If I want to watch movies, I have a TV for that. Why did they have to screw with something that was perfectly good?

With 17" displays, the 4:3 and 5:4 formats yielded the most panels per production run. With 19" displays, widescreen formats yielded more. And once monitors get larger than 20", widescreen starts getting genuinely more useful.

It's crazy how these days you can't get decent 4:3 monitors. I absolutely hate widescreen monitors; I use my monitor for work, not for watching porn.

Why is widescreen a problem for work? For me, it's perfect - I can display two documents side by side, e.g. a code diff tool, a code window with console output on one side, a spec document next to the code, etc. Sure, vertical space is important too, but the more horizontal pixels, the better...

Computer monitors should definitely not be put in the same category as TV displays.

Unfortunately the hardware manufacturers had a shortage of LCD panels back in 2009, and found that if they used 16:9 displays (instead of 16:10) they could cut eighteen 15" panels out of a sheet instead of fifteen - scoring three extra monitors for no real extra cost. So they started pushing 16:9 LCD monitors, thus giving us the absolutely hideous 1366x900 resolution laptop screens, which now cover 56% of all laptops on the market! Very sad indeed. I think only Apple still sells 16:10 laptops - but for a premium.

Give me the good old-fashioned 4:3 monitors any day!! I want more vertical space, which makes much more sense for computer use.

The sad thing is that 4:3 monitors actually have more pixels than their widescreen counterparts, but because the widescreen displays are advertised as 15.6" or 17.3", the consumer thinks they're getting a bigger monitor - which they aren't!! :-(

That again would give you too little horizontal space per window. E.g. 56% of all laptops now have 1366x900, so that would give you 1366/2 = 683 px per window with two windows tiled. Now include window borders, scrollbars, etc. and you have even less "usable space" per window.

You mean 1366x768, which is the 16:9 widescreen resolution most common nowadays. 1366x900 isn't 16:9 or 16:10 (1366x853 is 16:10).

1440x900 is the "standard" 16:10 resolution, and is way too common on 19-24" widescreens.

768 vertical pixels is not enough to do anything. Really, 1024 vertical pixels should be the minimum. And it's very hard to find widescreen monitors that are over 9" tall with over 1024 vertical pixels.

Migrating from 19" 1280x1024 4:3 screens requires 23" 16:9 screens in order to not lose vertical screen real estate (physical or logical). Which takes up a *lot* more physical desk space.

I miss my old 21" IBM CRT that supported more than "Full HD" resolutions. Unfortunately, it weighed over 80 lbs and warped my desk. But the resolutions it supported...

2013-2014 are going to be good years for GPU and CPU companies.
I can see chipmakers being behind this, because higher resolutions mean more powerful computers are needed. Gamers just love max FPS at the highest resolution, and at 3840x2160 that will drive a helluva upgrade cycle.

I can also see chip companies becoming much more involved in pushing extreme-spec PC titles, which are becoming a rarity.

I, for one, was disappointed when I found out that this "Retina" thing varies with viewing distance. I doubt we'll be seeing true 300+ PPI monitors soon (300 PPI roughly simulates 133-150 DPI printer resolution, give or take).
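
The viewing-distance dependence isn't arbitrary, though - "retina" is really an angular-resolution claim. Assuming the commonly quoted 20/20 acuity figure of about one arcminute per pixel (my assumption here, not Apple's published definition), the required PPI is simple to compute:

/* PPI needed for one pixel per arcminute at a given viewing distance.
 * Build with: cc retina.c -o retina -lm */
#include <stdio.h>
#include <math.h>

static double retina_ppi(double distance_in)
{
    const double pi = 3.14159265358979;
    double arcmin = (1.0 / 60.0) * pi / 180.0;   /* one arcminute in radians */
    return 1.0 / (distance_in * tan(arcmin));    /* roughly 3438 / distance */
}

int main(void)
{
    double d[] = { 10.0, 18.0, 24.0 };   /* phone, laptop, desktop distances in inches */
    for (int i = 0; i < 3; i++)
        printf("%2.0f\" away -> %.0f PPI\n", d[i], retina_ppi(d[i]));
    return 0;
}

At phone distance that lands around 340 PPI, at desktop distance closer to 140 - which is why the label keeps moving with the device.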

Funky monitors with 204 DPI (OK, actually four monitor inputs merged into one panel) - I would almost have dreamed of owning one, if it weren't for the lousy refresh rate.
They were pretty much ahead of their time.

The desktop display ecosystem is embarrassing. I'm still waiting for the 2560x1600 27" monitor I was promised after the Trinitron FW900 came out last fucking century. It still sells for more than the pitiful, weak LCD displays that replaced it. Embarrassing.