The implementation of multiple monitor support has some rough edges.

High-density displays in computers are becoming more common. The wave that began with the Retina MacBook Pro and continued with the Chromebook Pixel is now washing over the Windows world. Toshiba's Kirabook was the first, but there were plenty of others announced at Computex. We're sure they'll begin to trickle down into cheaper devices as the year continues.

To prepare for these new laptops and displays (and to catch up to where high-PPI display support is in both OS X and Chrome OS), Microsoft's freshly announced Windows 8.1 is making a few small but important changes to how the operating system handles them. We installed Windows 8.1 on the Kirabook to investigate.

Different displays, different densities

Windows 8.1 definitely seems to be doing some automatic display scaling here, as has been rumored based on beta builds of the OS. In both Windows 8.1 and the preview of Windows Server 2012 R2, as long as a proper graphics driver was installed, Windows would automatically scale the Kirabook's screen to around 150 percent.
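For context on where that "150 percent" likely comes from: the scale factor tracks the panel's pixel density. A back-of-the-envelope sketch in Python (the cutoff values in `suggested_scale` are our own illustrative guesses, not documented Windows behavior):

```python
import math

def panel_ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from the native resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def suggested_scale(ppi):
    """Pick a scale factor from density. The cutoffs here are invented
    for illustration; Windows' actual heuristic is not documented."""
    if ppi >= 240:
        return 2.0
    if ppi >= 180:
        return 1.5
    if ppi >= 120:
        return 1.25
    return 1.0

# The Kirabook's 13.3-inch 2560x1440 panel works out to roughly 221 PPI,
# which lands in the 150 percent bucket under these assumed cutoffs.
ppi = panel_ppi(2560, 1440, 13.3)
print(round(ppi), suggested_scale(ppi))
```

With these cutoffs a conventional ~100 PPI desktop monitor stays at 100 percent, which matches the behavior we saw with the external display.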

Dive into the manual settings and you'll find a few things that weren't here before. If the "Let me choose one scaling level for all my displays" box is checked, you'll see the same old percentage-based scaling options as before. Whatever percentage you choose will be applied universally throughout the operating system on all of your displays.

On the Kirabook, a 200 percent scaling option has now been added to the 100, 125, and 150 percent options from post-Vista versions of the operating system. You could set the scaling level to 200 percent (or any number, really) through a custom setting (as we found when we installed Windows 7 on the Retina MacBook Pro), but 8.1 exposes the scaling setting to make it easier to access.

The 200 percent scaling option is now exposed rather than hidden in a custom setting menu.

Andrew Cunningham

Things get more interesting when you choose to scale different displays independently. Make sure the "Let me choose one scaling level for all my displays" box is unchecked, and the list of percentages will change into a more user-friendly slider that makes UI elements larger and smaller. Using the same under-the-hood calculations by which it scales the UI automatically, Windows will tell you what the "recommended size" is for the display that you're on. We left the setting at the recommended size for good measure.

Windows 8.1 includes a new, more user-friendly slider for adjusting scaling settings.


We then connected a second monitor to the Kirabook, a 21.5-inch 1080p screen from Samsung. Rather than scale elements on that display to 150 percent, Windows would automatically rescale application windows as we dragged them back and forth between the screens. You could actually use Windows 8.1 as-is to drive an external display or projector from a notebook with a high-PPI screen. You wouldn't have to deal with overly large images on the external displays or overly small images on the built-in one.

This isn't to say the implementation is perfect. When you drag a window from the Kirabook's high-PPI screen to the external one, the window doesn't change from something rendered at 150 percent scaling to something rendered at 100 percent. Rather, Windows takes the 150 percent-scaled window and resizes it on-the-fly to be the same size as the same window rendered at 100 percent. The results are the same as they might be if you took a large image and shrank it down: fine detail is sacrificed, and windows on the external display were subtly blurry (enlarge the screenshots below to see the difference).
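The softness is inherent to the resampling. As a rough illustration in plain Python (nothing Windows-specific; `shrink` is just a toy one-dimensional box filter): a crisp edge rendered at 150 percent rarely lands on whole pixels once the bitmap is resized down to 100 percent size, so the resampler has to average neighboring pixels, and hard edges turn gray:

```python
def shrink(row, factor):
    """Downscale a row of pixel values by 'factor' using box averaging."""
    out = []
    step = 1 / factor          # source pixels per destination pixel
    pos = 0.0
    while pos + step <= len(row) + 1e-9:
        # Average the source pixels covered by this destination pixel.
        start, end = pos, pos + step
        total, weight = 0.0, 0.0
        i = int(start)
        while i < end and i < len(row):
            cover = min(i + 1, end) - max(i, start)
            total += row[i] * cover
            weight += cover
            i += 1
        out.append(total / weight)
        pos = end
    return out

# A sharp white/black/white edge in a window rendered at 150 percent...
rendered_150 = [255, 255, 0, 0, 0, 255, 255, 255, 255]

# ...resized to 100 percent size (factor 2/3): black bleeds into gray.
print(shrink(rendered_150, 2 / 3))
```

The gray intermediate values in the output are exactly the subtle blur visible in the screenshots; a window rendered natively at 100 percent would contain only pure black and white.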

A window rendered at about 150 percent then resized to be about the same size as a window rendered at 100 percent. Note the relative blurriness of the images and text.


This effect was even more obvious if we set the external monitor to be our primary display and then dragged windows onto the Kirabook's screen: windows rendered natively at 100 percent scaling were blown up to be the size of 150 percent scaled windows. Text and images were as pixelated as they would be if you blew up a small image. In each case, Windows seemed to use the primary display's scaling settings as the "canonical" ones and would resize windows up and down from there.

A window rendered at 100 percent scaling then resized up to be the same size as a window scaled at about 150 percent. Note the pixelated images and text.


Microsoft's solution for multiple monitors is workable, and it's definitely better than it was in Windows 8 or any prior version. Previously, the scaling settings felt more like an accessibility option than a real, fully implemented feature, something to be used to prevent eyestrain on a standard monitor rather than to drive displays with over 200 pixels per inch. At least now the settings are being tweaked with high-PPI laptops in mind.

That said, Windows 8.1 does not somehow fix the inconsistent behavior of third-party desktop apps when you're using Windows' scaling features. The Start screen and apps from the Windows Store mostly look great, but on the desktop the state of things is the same as it was when we reviewed the Kirabook. Some applications just look blurry, some are crisp but exhibit odd rendering problems, and others ignore the scaling settings entirely.

The fault here doesn't lie with Microsoft but with the third-party developers. We'll just have to hope that the coming wave of high-resolution hardware (as well as Microsoft's own attention to the issue) will convince developers that implementing proper scaling support is worth their money and time.

What has changed for the better is the way that the operating system has been tweaked to accommodate high-resolution, high-density displays. We'd like to see more granular settings for controlling the scaling on each display by the time the final version of Windows 8.1 is released later this year, but for a preview this is a not-inconsequential step forward for people interested in high-PPI laptops.

75 Reader Comments

1) The ability to provide different pre-rendered images based on the current pixel density
2) Drawing instructions that provide a canvas which does not necessarily map to pixels, but instead to screen units that can then be drawn smoothly at whatever scaling factor is present between those units and the actual screen pixels.

If they don't, I think they will continue to fail at HiDPI. If it's a matter of developers using deprecated drawing APIs or failing to provide multiple versions of their images, you can pin it on developers. If not, that's on Microsoft.
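Point 2 amounts to device-independent units: layout is expressed in units of 1/96 inch and mapped to physical pixels at draw time. A minimal sketch of that arithmetic (the 96-unit baseline matches the Windows convention; the helper name is made up):

```python
def dip_to_px(dip, dpi):
    """Map device-independent units (1/96 inch each) to physical pixels."""
    return round(dip * dpi / 96)

# The same 200-unit-wide button at three pixel densities: it stays the
# same physical size, but uses more pixels on denser screens.
for dpi in (96, 144, 192):
    print(dpi, dip_to_px(200, dpi))
```

A framework that draws in these units can rasterize crisply at any scale factor; a framework that hands apps raw pixels cannot, which is the crux of the thread below.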

I really feel there is no real excuse for not fully supporting high PPI displays at this point in time. High PPI displays are no longer ultra new or ultra high end. In the next year they're going to become much more common.

OSX does a great job with high PPI displays. Some apps end up with a slightly blurry look to them because they are being scaled (if the app does not have built-in retina display support), but it's better than squinting to see them because they are not being scaled. And at this point in time, a large majority of apps do have built-in retina display support.

1) The ability to provide different pre-rendered images based on the current pixel density
2) Drawing instructions that provide a canvas which does not necessarily map to pixels, but instead to screen units that can then be drawn smoothly at whatever scaling factor is present between those units and the actual screen pixels.

If they don't, I think they will continue to fail at HiDPI. If it's a matter of developers using deprecated drawing APIs or failing to provide multiple versions of their images, you can pin it on developers. If not, that's on Microsoft.

As for 2, isn't that called "Windows Presentation Foundation"? WinRT apps also operate this way. Unless the developer manually overrides this, of course. And there are legitimate reasons for doing so, mainly related to problems with drawing on 'normal' resolution displays, such as when a thin line has to be rendered along a pixel border.

Microsoft is in an interesting situation here because they tried to do high-DPI "right" back in the Vista days; they'd expose the (arbitrary) scaling factor to apps and apps would draw appropriately. But no apps implemented drawing correctly. Then Apple came along with a much simpler way to do things (scaling old apps by 2x) that's not pedantically correct but is much more reliable. Now MS is copying OS X while having to preserve compatibility with the old-style mess.

I really feel there is no real excuse for not fully supporting high PPI displays at this point in time. High PPI displays are no longer ultra new or ultra high end. In the next year they're going to become much more common.

OSX does a great job with high PPI displays. Some apps end up with a slightly blurry look to them because they are being scaled (if the app does not have built-in retina display support), but it's better than squinting to see them because they are not being scaled. And at this point in time, a large majority of apps do have built-in retina display support.

Every app I know of which looks blurry on an OSX retina screen is a Carbon app. Apple is (very sensibly IMHO) refusing to waste any time bothering to retrofit retina to Carbon. It's just a damn shame that there are STILL some companies that just will not get the hint that it's time to abandon the 1990s and give up Carbon --- most notably, of the apps I use, The Mathematica front end, which looks like garbage on a retina display.

Personally I think we'd all be better off if Apple had announced, at this WWDC, that while OSX 10.9 will support Carbon, the next rev will not (and the rev after that will no longer support 32-bit code). It's pretty obvious that asking developers nicely to do the right thing just doesn't work --- the ONLY way to get some of them to get their act together is an outright threat. (I think the persistent support for Carbon (and Cocoa 32bit) also puts the lie to the claim that Apple is all about change to force people to buy new HW and SW. Pretty much every time they've abandoned old machines, they've had a good reason to do so; and the times when they don't have a good reason, where abandoning the past would be simply a matter of cleaning things up, not a matter of essential consolidation, they have in fact not moved nearly as fast as I would like.)

"The fault here doesn't lie with Microsoft but with the third-party developers."

Sorry, I don't buy that. It is Microsoft's responsibility to provide an option to force scaling of older applications that don't support it. The alternative is ridiculous and unworkable. While in some ideal fantasy land, having every old Windows application modified to support scaling might seem the right approach, it is utterly worthless in practice, since the entire point of Windows is that there is an enormous universe of legacy software available for it, and there is approximately 0 probability that any significant fraction of those will get scaling support added. I use tons of older apps and tools that aren't supported any longer. Older versions of photoshop because I won't feed the Adobe monster any more cash, older versions of Office for the same reason, older versions of Acronis True Image... for the same reason. Who is going to modify those apps?

Either MS allows us to force scaling for older apps, or they have missed the main point.

I really feel there is no real excuse for not fully supporting high PPI displays at this point in time. High PPI displays are no longer ultra new or ultra high end. In the next year they're going to become much more common.

OSX does a great job with high PPI displays. Some apps end up with a slightly blurry look to them because they are being scaled (if the app does not have built-in retina display support), but it's better than squinting to see them because they are not being scaled. And at this point in time, a large majority of apps do have built-in retina display support.

OSX is helped immensely by the fact that it only has two scaling modes: 1:1 and 2:1. Integer scaling just works, and things that aren't high-DPI aware simply look like they would have on the lower DPI screen.

But the reason OSX can do this is because Apple controls the type of displays that are in all of its products. It wants to make a product retina? It doubles the resolution. This lets them always keep the same effective screen real-estate for a given monitor size and simply increase sharpness.

Microsoft cannot and will never be able to do this. It has to support every possible resolution combination under the sun, at varying real-world DPI levels. Some hardware vendors put 1080p screens in 11" laptops. This is certainly hi-DPI, but not enough that you can do just straight pixel doubling, because then you'd wind up with the effective screen space of a qHD monitor, 960x540, which is absolutely useless on an 11" screen with a full Windows desktop. But at 1:1 everything looks tiny. So you have to do 1.5 scaling, which makes everything look like crap, no matter how well the application is DPI aware.
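The arithmetic behind that trade-off, sketched out (purely illustrative):

```python
def effective_resolution(width, height, scale):
    """Desktop real estate left over after UI scaling is applied."""
    return (round(width / scale), round(height / scale))

# An 11-inch 1920x1080 panel:
print(effective_resolution(1920, 1080, 2.0))  # pixel doubling: 960x540
print(effective_resolution(1920, 1080, 1.5))  # the 1.5x compromise: 1280x720
```

Pixel doubling leaves an unusably cramped 960x540 workspace, while 1.5x yields a sane 1280x720 at the cost of non-integer scaling artifacts, which is exactly the dilemma described above.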

Really, there isn't a good way for Microsoft to fix this. The problem is hardware partners not wanting to do straight doubling of display sizes for cost reasons, and instead throwing oddball DPIs into devices to use standard resolutions. Until that changes, Microsoft is going to fight an uphill battle.

I really feel there is no real excuse for not fully supporting high PPI displays at this point in time. High PPI displays are no longer ultra new or ultra high end. In the next year they're going to become much more common.

OSX does a great job with high PPI displays. Some apps end up with a slightly blurry look to them because they are being scaled (if the app does not have built-in retina display support), but it's better than squinting to see them because they are not being scaled. And at this point in time, a large majority of apps do have built-in retina display support.

Every app I know of which looks blurry on an OSX retina screen is a Carbon app. Apple is (very sensibly IMHO) refusing to waste any time bothering to retrofit retina to Carbon. It's just a damn shame that there are STILL some companies that just will not get the hint that it's time to abandon the 1990s and give up Carbon --- most notably, of the apps I use, The Mathematica front end, which looks like garbage on a retina display.

Personally I think we'd all be better off if Apple had announced, at this WWDC, that while OSX 10.9 will support Carbon, the next rev will not (and the rev after that will no longer support 32-bit code). It's pretty obvious that asking developers nicely to do the right thing just doesn't work --- the ONLY way to get some of them to get their act together is an outright threat. (I think the persistent support for Carbon (and Cocoa 32bit) also puts the lie to the claim that Apple is all about change to force people to buy new HW and SW. Pretty much every time they've abandoned old machines, they've had a good reason to do so; and the times when they don't have a good reason, where abandoning the past would be simply a matter of cleaning things up, not a matter of essential consolidation, they have in fact not moved nearly as fast as I would like.)

Xcode already no longer supports Carbon. Any current app that's still Carbon has to be produced under an older version of Xcode and OS X.

"The fault here doesn't lie with Microsoft but with the third-party developers."

Sorry, I don't buy that. It is Microsoft's responsibility to provide an option to force scaling of older applications that don't support it. The alternative is ridiculous and unworkable. While in some ideal fantasy land, having every old Windows application modified to support scaling might seem the right approach, it is utterly worthless in practice, since the entire point of Windows is that there is an enormous universe of legacy software available for it. I use tons of older apps and tools that aren't supported any longer. Older versions of photoshop because I won't feed the Adobe monster any more cash, older versions of Office for the same reason, older versions of Acronis True Image... for the same reason. Who is going to modify those apps?

Either MS allows us to force scaling for older apps, or they have missed the main point.

There is no way to 'force' scaling of applications that don't support it. The fact that you say that means that you do not understand how the Windows heavyweight windowing kit has worked for the last two decades, which Microsoft still has to support. Applications were supposed to poll the OS for the current DPI setting and act accordingly, but the vast majority simply ignored that guideline because until 3 years ago there simply weren't displays with higher than about 110 DPI.

1) The ability to provide different pre-rendered images based on the current pixel density
2) Drawing instructions that provide a canvas which does not necessarily map to pixels, but instead to screen units that can then be drawn smoothly at whatever scaling factor is present between those units and the actual screen pixels.

If they don't, I think they will continue to fail at HiDPI. If it's a matter of developers using deprecated drawing APIs or failing to provide multiple versions of their images, you can pin it on developers. If not, that's on Microsoft.

1) They've provided that since the Windows XP days, actually. It's just that the vast majority of applications ignored it and assumed the default 96 DPI setting, because there was almost no hardware higher than ~110 DPI and supporting it wasn't worth the development resources.

2) Yes they do, and have for a long while, but again the legacy applications (because remember, Microsoft has to maintain compatibility back to at least Windows XP) do not use it and use the heavyweight windowing APIs instead.

The fault here doesn't lie with Microsoft but with the third-party developers

Bullshit. I have heard that copout too often. Like with the driver mess in Vista, the fault lies with MS.

Yeah because they completely changed the way drivers worked to vastly increase the stability of the OS, and magically that wasn't supposed to break backwards compatibility? Ok. Sure. It's their fault that hardware vendors refused to spend resources updating drivers for the new OS's driver model. Yep.

Does no one remember the move from OS9 to OSX, and then from PowerPC to Intel on the Mac side? And how, like, literally nothing worked without developers investing large amounts of resources into updating them? Anyone?

The fault here doesn't lie with Microsoft but with the third-party developers

Bullshit. I have heard that copout too often. Like with the driver mess in Vista, the fault lies with MS.

If there had been some long-term stable API planning instead of the bewildering array of choices foisted on us by Microsoft, and if they had not completely ignored high-density screens until Apple forced their hand, the situation would be much better.

Yes, in the end third-party developers need to implement it, but you can make it easier or harder for them. Microsoft definitely chose the latter. Second, you could put emphasis on high-DPI displays or completely ignore them until a competitor is showing you up. Microsoft again chose procrastination. So the fault lies with Microsoft.

The idea that "the bewildering array of choices hoisted on us" is a valid reason for poor software design, on the third-party side, is what I feel to be a copout. One of the great things about developing on Windows is that there are so many different choices and ways to do, well, practically anything. That same freedom also lays the responsibility at the feet of the developer to handle it properly. It means that if I don't bother to try to make my application PPI-aware, and ignore the settings in Windows (or even worse, use controls that do obey the settings, but then don't obey the settings myself, when programmatically manipulating those controls), then my software is going to be shitty when run on machines with high PPI screens. Microsoft isn't responsible for cleaning up my mess, I am. As long as they offer the tools in the operating system to handle it (which is what they are continuing to do here), then it's up to the developer to use them.

If you want to lay the fault at the feet of Microsoft, I am curious: what should they have done? Should they have removed the older options and simply broken compatibility with older programs that do not follow the new resolution scaling paradigm? Perhaps there is a better solution that I'm missing, so feel free to tell me I'm wrong.

The fault here doesn't lie with Microsoft but with the third-party developers

Bullshit. I have heard that copout too often. Like with the driver mess in Vista, the fault lies with MS.

Not true in any meaningful sense. Vista was released in late 2006; the driver model was basically done in 2004. It's just that NVIDIA (and only NVIDIA; other GPU vendors fared much, much better) didn't seem to care, as if they felt that WDDM operating systems would never be released.


If there had been some long-term stable API planning instead of the bewildering array of choices foisted on us by Microsoft, and if they had not completely ignored high-density screens until Apple forced their hand, the situation would be much better.

They didn't "completely ignore it". They provided the tools to do it properly since the 1990s.

"The fault here doesn't lie with Microsoft but with the third-party developers."

Sorry, I don't buy that. It is Microsoft's responsibility to provide an option to force scaling of older applications that don't support it. The alternative is ridiculous and unworkable. While in some ideal fantasy land, having every old Windows application modified to support scaling might seem the right approach, it is utterly worthless in practice, since the entire point of Windows is that there is an enormous universe of legacy software available for it. I use tons of older apps and tools that aren't supported any longer. Older versions of photoshop because I won't feed the Adobe monster any more cash, older versions of Office for the same reason, older versions of Acronis True Image... for the same reason. Who is going to modify those apps?

Either MS allows us to force scaling for older apps, or they have missed the main point.

There is no way to 'force' scaling of applications that don't support it. The fact that you say that means that you do not understand how the Windows heavyweight windowing kit has worked for the last two decades, which Microsoft still has to support. Applications were supposed to poll the OS for the current DPI setting and act accordingly, but the vast majority simply ignored that guideline because until 3 years ago there simply weren't displays with higher than about 110 DPI.

No. I understand how Windows works; I have written entire native widget APIs around Windows graphics. I didn't say perfect scaling; I said scaling. If I have an app that is rendering buttons and text too small for me to even see or select reasonably, it is trivial to do pixel doubling at the very least, which is the simplest form of scaling. I don't care if it is a bit jaggy; I want the option to FORCE that kind of view for any application I choose. And by extension, there is no reason that bicubic interpolation can't be used to FORCE any arbitrary degree of scaling on an application's pixels. It will look a little blurrier, but hey, that's the nature of scaling. It's a hell of a lot better than nothing, and they absolutely can do it.
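For what it's worth, nearest-neighbor pixel doubling really is trivial in principle, as a toy sketch shows (this says nothing about how hard it would be to bolt onto GDI's actual compositing path):

```python
def pixel_double(bitmap):
    """Nearest-neighbor 2x upscale: repeat every pixel in both axes."""
    out = []
    for row in bitmap:
        doubled = [px for px in row for _ in (0, 1)]  # duplicate columns
        out.append(doubled)
        out.append(list(doubled))                     # duplicate the row
    return out

tiny = [[1, 2],
        [3, 4]]
print(pixel_double(tiny))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

The result is jaggy but faithful, which is exactly the trade-off the poster says they would accept.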

If there had been some long-term stable API planning instead of the bewildering array of choices foisted on us by Microsoft, and if they had not completely ignored high-density screens until Apple forced their hand, the situation would be much better.

They didn't "completely ignore it". They provided the tools to do it properly since the 1990s.

Apple mostly did too; it was just less painful because they only did integer scaling. But as I've said, the only reason they can do that is because they also control the display hardware. Microsoft does not, and likely never will, have that luxury.

"The fault here doesn't lie with Microsoft but with the third-party developers."

Sorry, I don't buy that. It is Microsoft's responsibility to provide an option to force scaling of older applications that don't support it. The alternative is ridiculous and unworkable. While in some ideal fantasy land, having every old Windows application modified to support scaling might seem the right approach, it is utterly worthless in practice, since the entire point of Windows is that there is an enormous universe of legacy software available for it. I use tons of older apps and tools that aren't supported any longer. Older versions of photoshop because I won't feed the Adobe monster any more cash, older versions of Office for the same reason, older versions of Acronis True Image... for the same reason. Who is going to modify those apps?

Either MS allows us to force scaling for older apps, or they have missed the main point.

There is no way to 'force' scaling of applications that don't support it. The fact that you say that means that you do not understand how the Windows heavyweight windowing kit has worked for the last two decades, which Microsoft still has to support. Applications were supposed to poll the OS for the current DPI setting and act accordingly, but the vast majority simply ignored that guideline because until 3 years ago there simply weren't displays with higher than about 110 DPI.

No. I understand how Windows works; I have written entire native widget APIs around Windows graphics. I didn't say perfect scaling; I said scaling. If I have an app that is rendering buttons and text too small for me to even see or select reasonably, it is trivial to do pixel doubling at the very least, which is the simplest form of scaling. I don't care if it is a bit jaggy; I want the option to FORCE that kind of view for any application I choose. And by extension, there is no reason that bicubic interpolation can't be used to FORCE any arbitrary degree of scaling on an application's pixels. It will look a little blurrier, but hey, that's the nature of scaling. It's a hell of a lot better than nothing, and they absolutely can do it.

Try bicubic interpolation on 1-pixel-wide/tall elements. See how that works out for you.

You are correct that they might be able to force scaling with straight pixel doubling, but that would render the vast majority of displays in Windows devices useless, because in the Windows world right now HiDPI primarily means 1920x1080. It would also likely wreak absolute havoc with any application that mixes 2D/3D and has any kind of art assets. Bottom line: it would probably break many more things than it would fix.

The fault here doesn't lie with Microsoft but with the third-party developers. We'll just have to hope that the coming wave of high-resolution hardware (as well as Microsoft's own attention to the issue) will convince developers that implementing proper scaling support is worth their money and time.

Bullshit. The problem is the units.

MS's APIs measure everything in *pixels* (fonts excluded, as they use pt == 1/72 in.). Anyone attempting to programmatically render any image needs to jump through extra hoops to query the system DPI, then scale accordingly.

And then they need to be able to handle instant changes to that DPI, as happens when the user plugs in and switches to a new display. So, I disagree with the poster(s) who assert that "just query the DPI" is the correct solution. DPI (read: inches) is also a poor measure for most cases. Text an inch tall on a 72 inch display might look reasonable at normal distances, but ridiculously large on a 13 inch display. If you're lucky enough to know the size of your display, you're golden - but DPI is otherwise no good.

What you need to know is the amount of a person's viewing area (or retinal area) consumed by the display. You can make some fair and counterintuitive assumptions: for instance, a 13 inch display likely consumes more of a user's retinal area than a Jumbovision (which is very far away). Steradians might be a good measure, for instance, but they're difficult to intuit (at least for me), and as far as I'm aware they translate poorly to Cartesian coordinates. Latitude and longitude may also work, but aren't much better as the lines of longitude bend. So: what's a truly good measure?

(And no, I am not forgiving MS for failing to think this problem out 14 years ago when .NET was first being devised. I don't know Apple's APIs well enough to comment on the matter, but I doubt they've resolved the problem either...)
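One candidate for the measure being asked about is visual angle, computed as 2 * atan(size / (2 * distance)), which captures how much of the viewer's field of view a display occupies. A quick comparison (the sizes and viewing distances here are assumptions for illustration):

```python
import math

def visual_angle_deg(size, distance):
    """Angle subtended by an object of 'size' seen from 'distance'
    (both in the same units)."""
    return math.degrees(2 * math.atan(size / (2 * distance)))

# A 13-inch laptop at a 20-inch viewing distance subtends about 36 degrees;
# a 720-inch (60-foot) jumbotron at 3000 inches (250 feet) only about 14.
print(round(visual_angle_deg(13, 20), 1))
print(round(visual_angle_deg(720, 3000), 1))
```

This bears out the counterintuitive claim above: the nearby laptop occupies far more of the viewer's field of view than the enormous but distant screen.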

The fault here doesn't lie with Microsoft but with the third-party developers

Bullshit. I have heard that copout too often. Like with the driver mess in Vista, the fault lies with MS.

Yeah because they completely changed the way drivers worked to vastly increase the stability of the OS, and magically that wasn't supposed to break backwards compatibility? Ok. Sure. It's their fault that hardware vendors refused to spend resources updating drivers for the new OS's driver model. Yep.

Does no one remember the move from OS9 to OSX, and then from PowerPC to Intel on the Mac side? And how, like, literally nothing worked without developers investing large amounts of resources into updating them? Anyone?

OS9 -> OSX: Classic. PowerPC -> Intel: Rosetta.

This does not imply MS has done anything wrong; supporting all the hardware combinations running Windows is a Sisyphean task. Just wanted to correct your claim that "literally nothing worked" for those transitions when actually almost everything worked. Maybe not nice-looking or fast, but it worked.

I really feel there is no real excuse for not fully supporting high PPI displays at this point in time. High PPI displays are no longer ultra new or ultra high end. In the next year they're going to become much more common.

OSX does a great job with high PPI displays. Some apps end up with a slightly blurry look to them because they are being scaled (if the app does not have built-in retina display support), but it's better than squinting to see them because they are not being scaled. And at this point in time, a large majority of apps do have built-in retina display support.

Every app I know of which looks blurry on an OSX retina screen is a Carbon app. Apple is (very sensibly IMHO) refusing to waste any time bothering to retrofit retina to Carbon. It's just a damn shame that there are STILL some companies that just will not get the hint that it's time to abandon the 1990s and give up Carbon --- most notably, of the apps I use, The Mathematica front end, which looks like garbage on a retina display.

Personally I think we'd all be better off if Apple had announced, at this WWDC, that while OSX 10.9 will support Carbon, the next rev will not (and the rev after that will no longer support 32-bit code). It's pretty obvious that asking developers nicely to do the right thing just doesn't work --- the ONLY way to get some of them to get their act together is an outright threat. (I think the persistent support for Carbon (and Cocoa 32bit) also puts the lie to the claim that Apple is all about change to force people to buy new HW and SW. Pretty much every time they've abandoned old machines, they've had a good reason to do so; and the times when they don't have a good reason, where abandoning the past would be simply a matter of cleaning things up, not a matter of essential consolidation, they have in fact not moved nearly as fast as I would like.)

Xcode already no longer supports Carbon. Any current app that's still Carbon has to be produced under an older version of Xcode and OS X.

Interesting strategy on Apple's part, and probably as good a way as any to force the issue! Good for them --- this cheers me up. Now I only have to rant about 32-bit Cocoa...!

I really feel there is no real excuse for not fully supporting high PPI displays at this point in time. High PPI displays are no longer ultra new or ultra high end. In the next year they're going to become much more common.

Of the 15 best-selling laptops at Best Buy, 13 have displays in the 100-110 PPI range. The other 2 have displays in the 130-140 PPI range (one 768p 11.6" screen and one 1080p 15.6" screen). None of those resolutions require special applications coded to support high PPI displays.

Quote:

OSX does a great job with high PPI displays.

Great, but Apple makes expensive hardware that only caters to 10% of the market. Microsoft isn't interested in that niche, they want 90% of the market.

Sure, if it were trivial to support high PPI displays, then it would make sense for 3rd party devs to do so. But the vast majority of available apps are coded with Win32 and GDI, the most successful API in history. It doesn't make sense to rewrite for a new API just to support a small share of the market. Maybe in 5-10 years, but not now.

So, I'm kind of curious what happens if you connect a high-DPI external display to a Mac that is not the same resolution as their Retina displays. What kind of scaling does it do? What if it is a different aspect ratio? Can somebody comment on this?

Great, but Apple makes expensive hardware that only caters to 10% of the market. Microsoft isn't interested in that niche, they want 90% of the market.

Sure, if it were trivial to support high PPI displays, then it would make sense for 3rd party devs to do so. But the vast majority of available apps are coded with Win32 and GDI, the most successful API in history. It doesn't make sense to rewrite for a new API just to support a small share of the market. Maybe in 5-10 years, but not now.

Can you or someone explain why it's not trivial? The operating system can tell the application it's running at one resolution while knowing the display is actually running at a different one; why can't it "do the math" and display it properly?

Display tells OS, "I'm running at 2880x1800."
OS tells application, "The display is running at 1440x900."
Application says, "Thanks! Here's what I should look like then."
OS tells display, "Show the application on screen like this...."

While "there are a lot of different displays Microsoft has to support and Apple controls the hardware" is a nice-sounding argument, it's not like there are 3 different resolutions that Macs run at and six thousand million that PCs run at.

Please, assume I'm stupid, call me an idiot, whatever it takes to get a layman's explanation as to why Microsoft can't pull off what Apple has. I have a Retina MBP and I love it, and I'd love to be using Windows without having to resort to my external display so the screen doesn't look like ass!

Display tells OS, "I'm running at 2880x1800."
OS tells application, "The display is running at 1440x900."
Application says, "Thanks! Here's what I should look like then."
OS tells display, "Show the application on screen like this...."

First, my understanding is that non-integer rescalings are not trivial and often result in drawing abnormalities. Clearly that's what Apple found, given that they went with pixel doubling on both iOS and OS X.

Second, I wasn't referring to whether it's hard or easy for Microsoft to write an API that scales apps perfectly to arbitrary resolutions. What I was saying is that the existing popular API, Win32 + GDI, does not support resolution independence. Ergo, most apps would need to be rewritten against a new API, which is never a "trivial" thing to do.

Great, but Apple makes expensive hardware that only caters to 10% of the market. Microsoft isn't interested in that niche, they want 90% of the market.

Sure, if it were trivial to support high PPI displays, then it would make sense for 3rd party devs to do so. But the vast majority of available apps are coded with Win32 and GDI, the most successful API in history. It doesn't make sense to rewrite for a new API just to support a small share of the market. Maybe in 5-10 years, but not now.

Can you or someone explain why it's not trivial? The operating system can tell the application it's running at one resolution while knowing the display is actually running at a different one; why can't it "do the math" and display it properly?

Display tells OS, "I'm running at 2880x1800."
OS tells application, "The display is running at 1440x900."
Application says, "Thanks! Here's what I should look like then."
OS tells display, "Show the application on screen like this...."

While "there are a lot of different displays Microsoft has to support and Apple controls the hardware" is a nice-sounding argument, it's not like there are 3 different resolutions that Macs run at and six thousand million that PCs run at.

Please, assume I'm stupid, call me an idiot, whatever it takes to get a layman's explanation as to why Microsoft can't pull off what Apple has. I have a Retina MBP and I love it, and I'd love to be using Windows without having to resort to my external display so the screen doesn't look like ass!

The main problem is, then your applications can never actually render any features at the higher resolution. So, then you can make an API so that the OS can ask the application, "are you capable of rendering at a high DPI?" And only do what you describe if the application says "no" (or doesn't know). This is in fact what Windows does.

The next problem you run into is when an application says, "yes, I can render at high DPI, and here's what that looks like," and then renders something that looks like crap. This is the main problem today.

I suggested that Microsoft may need to actually have a whitelist of applications known to render correctly at high-DPI regardless of what the application claims to support, but it doesn't sound like they're going to do that.
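To make the virtualization decision described above concrete, here is a minimal Python sketch of it. This is a simplified model for illustration only; the function and field names are made up and are not the actual Win32 API.

```python
def presented_layout(app_dpi_aware, display_dpi, base_dpi=96):
    """Model of the DPI-virtualization decision: ask the app whether it
    can render at high DPI, and only lie to it (then bitmap-scale its
    output) when it says no or doesn't know."""
    scale = display_dpi / base_dpi
    if app_dpi_aware:
        # DPI-aware app: told the truth, draws its own UI at full size.
        return {"reported_dpi": display_dpi, "bitmap_scaled_by": 1.0}
    # Unaware app: it sees 96 DPI, and its rendered output bitmap is
    # stretched by the compositor -- correct size on screen, but blurry.
    return {"reported_dpi": base_dpi, "bitmap_scaled_by": scale}
```

On a 150 percent (144 DPI) display, an unaware app gets stretched by 1.5x, which is exactly where the blur comes from; an aware app is simply told the real DPI.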

It is not trivial, I can give you that much, but this is a side of Windows that has not progressed much in over a decade. MS has always focused its work on the engine but has left the presentation pretty much the same.

They have added some eye candy here and there, but the limitations, such as the desktop scaling explained in the article, are long-standing annoyances. MS is one of the biggest software development houses in the world; it could be argued they should be leading the pack with innovations.

The problem is hard, but don't they have the size, resources, and talent to tackle it head on? Even if it is not a high-priority item, it can only go on for so long. At some point it will make the OS look unpolished, especially if competitors come up with better solutions.

So, I'm kind of curious what happens if you connect a high-DPI external display to a Mac that is not the same resolution as their Retina displays. What kind of scaling does it do? What if it is a different aspect ratio? Can somebody comment on this?

I still haven't quite figured it out, but I will say that everything is the right size, and as far as I can tell, text on the external monitor looks exactly as it should when being rendered at 1X. (the retina display is 2X) I think they may actually have some resolution independence stuff going on. It's limited, but at least in practice it works better than the scaling 8.1 has so far. (I'm tempted to get a Mac over this.)

So, I'm kind of curious what happens if you connect a high-DPI external display to a Mac that is not the same resolution as their Retina displays. What kind of scaling does it do? What if it is a different aspect ratio? Can somebody comment on this?

I still haven't quite figured it out, but I will say that everything is the right size, and as far as I can tell, text on the external monitor looks exactly as it should when being rendered at 1X. (the retina display is 2X) I think they may actually have some resolution independence stuff going on. It's limited, but at least in practice it works better than the scaling 8.1 has so far. (I'm tempted to get a Mac over this.)

Macs do it (at least on the internal display), but never by upscaling anything. For every "apparent resolution" other than "best for Retina" (at least for those higher than "best for Retina" -- I don't know how it handles the 2 supported apparent resolutions that are lower), the actual rendered desktop image is 2x the "apparent" resolution. This is then scaled down to fit on the 2880x1800 screen.

So when you're running your screen at, say, an apparent resolution of 1680x1050, the actual "resolution" being used is 3360x2100, and then this is scaled down to 2880x1800. When you're running at an apparent resolution of 1920x1200, the machine is actually rendering a 3840x2400 desktop and scaling it down to 2880x1800. When you run your desktop at an apparent resolution of 1440x900, the rendered image is sent directly to the 2880x1800 screen.

In short, Macs don't upscale anything. How this makes things continue to look good even when running non-"native" apparent resolutions, I don't know. I do know that 1920x1200 on Retina looks better than on a Mac screen for which 1920x1200 is the native resolution. Also, I should note that I am using the 15.4" Retina MBP for this illustration. I don't have the 13" and I don't know which apparent resolutions it supports.
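The render-at-2x-then-downscale behavior described above can be modeled in a few lines of Python. This is a simplified model of the reported behavior, not Apple's actual pipeline; the default panel size assumes the 15" rMBP's 2880x1800 screen.

```python
def osx_framebuffer(apparent_w, apparent_h, panel_w=2880, panel_h=1800):
    """Model of OS X 'apparent resolution' scaling on a Retina panel:
    render at exactly 2x the apparent resolution, then scale down
    (or pass through) to the physical panel."""
    render_w, render_h = apparent_w * 2, apparent_h * 2
    # Factor by which the rendered image is shrunk to fit the panel;
    # 1.0 means the framebuffer maps 1:1 onto physical pixels.
    downscale = panel_w / render_w
    return (render_w, render_h), downscale
```

For an apparent 1680x1050 this gives a 3360x2100 framebuffer squeezed down to the panel; at the "best for Retina" 1440x900 the 2x framebuffer is exactly 2880x1800 and nothing is resampled.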

I'm sure it was possible to handle scaling well in Visual Studio 2005 but I couldn't figure it out. I'm only a hobbyist programmer though. There are scaling options built in but they are all but useless, in the sense that no matter what options I pick nothing actually scales. I assume it's better now (though figuring that out in VS2012 has not been a priority as I switch things over), but I think the tools did not make it as simple a task as it should have been.

Ironically, I suppose, I have programs that were originally written in VB6, in which the size of everything is defined in device-independent twips. In the switch to .net, everything switched over to pixels instead. So in principle, the framework for scaling was better in VB6. Now (VS2012) even the function calls for converting twips to pixels are deprecated (and not supported at all for 64-bit programs) so that code finally needs to be replaced....

Anyway, my point is that you can only put so much of the work in the lap of the developer. Microsoft also produced the tools, and if the tools couldn't handle it, they definitely share the blame.

So, I'm kind of curious what happens if you connect a high-DPI external display to a Mac that is not the same resolution as their Retina displays. What kind of scaling does it do? What if it is a different aspect ratio? Can somebody comment on this?

I still haven't quite figured it out, but I will say that everything is the right size, and as far as I can tell, text on the external monitor looks exactly as it should when being rendered at 1X. (the retina display is 2X) I think they may actually have some resolution independence stuff going on. It's limited, but at least in practice it works better than the scaling 8.1 has so far. (I'm tempted to get a Mac over this.)

Didn't Ars do this exact thing with a 4K computer display a few months back? I know I saw it somewhere...

Really, there isn't a good way for Microsoft to fix this. The problem is hardware partners not wanting to do straight doubling of display sizes for cost reasons, and instead throwing odd-ball DPIs into devices to use standard resolutions. Until that changes, Microsoft is going to fight an uphill battle.

The hardware probably won't be available for mainstream systems until Windows 10, but 4K laptop screens would make it much easier: 300-400 DPI (at 15" down to 11") gives enough integer scaling modes that most people should be able to find something that works for them. A 22" 4K monitor is only 200 DPI, dropping to 146 at 30", so we'll probably have to wait even longer for 8K screens before the desktop is fully settled.

OTOH by then most apps might actually be rewritten in WPF/Metro and be able to be scaled automatically by the OS.
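The DPI figures above follow from simple geometry; here's a small Python helper to check them (standard pixels-per-inch-from-diagonal math, not tied to any particular product):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in
```

A 4K (3840x2160) panel works out to roughly 400 DPI at 11", about 294 at 15", about 200 at 22", and about 147 at 30", matching the ballpark numbers quoted above.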

So, I'm kind of curious what happens if you connect a high-DPI external display to a Mac that is not the same resolution as their Retina displays. What kind of scaling does it do? What if it is a different aspect ratio? Can somebody comment on this?

I still haven't quite figured it out, but I will say that everything is the right size, and as far as I can tell, text on the external monitor looks exactly as it should when being rendered at 1X. (the retina display is 2X) I think they may actually have some resolution independence stuff going on. It's limited, but at least in practice it works better than the scaling 8.1 has so far. (I'm tempted to get a Mac over this.)

Macs do it (at least on the internal display), but never by upscaling anything. For every "apparent resolution" other than "best for Retina" (at least for those higher than "best for Retina" -- I don't know how it handles the 2 supported apparent resolutions that are lower), the actual rendered desktop image is 2x the "apparent" resolution. This is then scaled down to fit on the 2880x1800 screen.

So when you're running your screen at, say, an apparent resolution of 1680x1050, the actual "resolution" being used is 3360x2100, and then this is scaled down to 2880x1800. When you're running at an apparent resolution of 1920x1200, the machine is actually rendering a 3840x2400 desktop and scaling it down to 2880x1800. When you run your desktop at an apparent resolution of 1440x900, the rendered image is sent directly to the 2880x1800 screen.

In short, Macs don't upscale anything. How this makes things continue to look good even when running non-"native" apparent resolutions, I don't know. I do know that 1920x1200 on Retina looks better than on a Mac screen for which 1920x1200 is the native resolution. Also, I should note that I am using the 15.4" Retina MBP for this illustration. I don't have the 13" and I don't know which apparent resolutions it supports.

Because Macs only support two DPI scalings, 1:1 and 2:1, they can scale each display completely independently and correctly (this is helped by the fact that basically any external display is probably going to be 1:1).

As pointed out above, OSX does something clever to support arbitrary resolutions while always doing integer scaling: if the target resolution is less than the native resolution of the display, it renders at 2:1 of the target resolution, then downsamples to get to the native resolution of the display. So if your target resolution is 1920x1080 and the actual display is 2560x1200, it is rendered as 3840x2160 and then down-sampled to 2560x1200. This ensures maximum image quality, at the cost of performance. Ironically, on the rMBP, running the display at 1920x1080 is more taxing than running 1:1 at its native 2880x1800.

As to why doing that works from an image quality point of view... it's complicated. But basically, to stupidly over-simplify, if you upscale, you are attempting to create information that doesn't actually exist. This will always result in something that looks unnatural/artifacty. On the other hand, if you downsample you are systematically removing extra information. This may result in something that looks a little blurry (because there is averaging going on), but it will almost always look more natural than upscaling.
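The upscale-versus-downsample difference is easy to demonstrate on a single row of pixel values. A toy Python sketch (nearest-neighbor upscaling versus box-filter downsampling, deliberately oversimplified relative to what a real compositor does):

```python
def upscale_nearest(row, factor):
    """Upscaling invents pixels by repeating existing ones, which is
    where the blocky, unnatural artifacts come from."""
    return [v for v in row for _ in range(factor)]

def downscale_average(row, factor):
    """Downsampling averages groups of real pixels: slightly soft, but
    every output value is derived from actual image information."""
    return [sum(row[i:i + factor]) / factor
            for i in range(0, len(row), factor)]
```

Upscaling a hard edge just duplicates it into fat blocks, while downsampling a gradient produces in-between values that still look natural, which is the intuition behind rendering at 2x and scaling down.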

Great, but Apple makes expensive hardware that only caters to 10% of the market. Microsoft isn't interested in that niche, they want 90% of the market.

Sure, if it were trivial to support high PPI displays, then it would make sense for 3rd party devs to do so. But the vast majority of available apps are coded with Win32 and GDI, the most successful API in history. It doesn't make sense to rewrite for a new API just to support a small share of the market. Maybe in 5-10 years, but not now.

Can you or someone explain why it's not trivial? The operating system can tell the application it's running at one resolution while knowing the display is actually running at a different one; why can't it "do the math" and display it properly?

Display tells OS, "I'm running at 2880x1800."
OS tells application, "The display is running at 1440x900."
Application says, "Thanks! Here's what I should look like then."
OS tells display, "Show the application on screen like this...."

While "there are a lot of different displays Microsoft has to support and Apple controls the hardware" is a nice-sounding argument, it's not like there are 3 different resolutions that Macs run at and six thousand million that PCs run at.

Please, assume I'm stupid, call me an idiot, whatever it takes to get a layman's explanation as to why Microsoft can't pull off what Apple has. I have a Retina MBP and I love it, and I'd love to be using Windows without having to resort to my external display so the screen doesn't look like ass!

The main problem is, then your applications can never actually render any features at the higher resolution. So, then you can make an API so that the OS can ask the application, "are you capable of rendering at a high DPI?" And only do what you describe if the application says "no" (or doesn't know). This is in fact what Windows does.

The next problem you run into is when an application says, "yes, I can render at high DPI, and here's what that looks like," and then renders something that looks like crap. This is the main problem today.

I suggested that Microsoft may need to actually have a whitelist of applications known to render correctly at high-DPI regardless of what the application claims to support, but it doesn't sound like they're going to do that.

This is not the only option. The other option is the equivalent of probing "does this app know WTF it is doing", and reacting appropriately. Personally I think this sort of kow-towing to idiot developers is ludicrous, but it is in fact the standard MS pattern when MS wants something to happen. There is the infamous example of them doing precisely this thing for Windows 95 Direct Draw:http://blogs.msdn.com/b/oldnewthing/arc ... 71307.aspx

(The Apple equivalent of handling this sort of problem would be for an app to declare in its info plist that it supported HiDPI. If this declaration was not there, which it would not be for older apps, go down the autoscaling path; otherwise go down the app-draws-in-HiDPI path.)

People can slice and dice this however they like, but IMHO all the evidence points to the fact that MS' support for HiDPI is lousy because MS never actually cared about this feature. Like much of what they do, it was thrown in as a bullet point feature, not something of strategic interest, and it got precisely as much company love and support as one expects for bullet point features.

The fault here doesn't lie with Microsoft but with the third-party developers

Bullshit. I have heard that cop-out too often. Like with the driver mess in Vista, the fault lies with MS.

Yeah because they completely changed the way drivers worked to vastly increase the stability of the OS, and magically that wasn't supposed to break backwards compatibility? Ok. Sure. It's their fault that hardware vendors refused to spend resources updating drivers for the new OS's driver model. Yep.

Does no one remember the move from OS9 to OSX, and then from PowerPC to Intel on the Mac side? And how, like, literally nothing worked without developers investing large amounts of resources into updating them? Anyone?

OS9 -> OSX: Classic.
PowerPC -> Intel: Rosetta.

This does not imply MS has done anything wrong; supporting all the hardware combinations running Windows is a Sisyphean task. Just wanted to correct your claim that "literally nothing worked" for those transitions, when actually almost everything worked. Maybe not nice-looking or fast, but it worked.

I meant on the driver side. Although Apple doesn't really expose the whole driver thing nearly as much as Windows does.

People can slice and dice this however they like, but IMHO all the evidence points to the fact that MS' support for HiDPI is lousy because MS never actually cared about this feature. Like much of what they do, it was thrown in as a bullet point feature, not something of strategic interest, and it got precisely as much company love and support as one expects for bullet point features.

As stated previously, Microsoft has supported HiDPI in some form since the late '90s. But you're right, it was just a bullet point. Why? Because there was no hardware support.

OSX made it not just a bullet point. Why? Because Apple made a crazy display and stuck it in a laptop to show it off. After having made a crazy display and stuck it in a tablet to show it off. After making a crazy display and sticking it in a phone to show it off.

Seeing a pattern there?

Microsoft can only push an experience/strategic vision so much. They still have to rely on OEMs to actually deliver hardware that delivers on that experience. It often becomes a chicken/egg problem. Apple doesn't have such restrictions. They can build the hardware and software together to deliver exactly the experience they're looking for.

If you're using 100% scaling, wouldn't it be better to simply set the high-DPI monitor's resolution to half of its actual? So if it's a 2160p monitor, set it to 1080p. Since it's an even multiple of the monitor's resolution, you don't get the ugly stretching and blur that otherwise comes with rendering at something other than the monitor's native resolution.
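That intuition can be checked numerically: at an integer scale factor, every logical pixel boundary lands exactly on a physical pixel boundary, while a fractional factor leaves many boundaries mid-pixel and forces resampling. A small Python sketch, for illustration only:

```python
def misaligned_boundaries(scale, logical_px=100):
    """Count logical-pixel boundaries that fall between physical pixels.
    Zero means every logical pixel maps to a whole number of physical
    pixels, so no resampling blur is needed."""
    return sum(1 for i in range(logical_px) if (i * scale) % 1 != 0)
```

At 2.0x (e.g. 1080p content on a 2160p panel) nothing is misaligned, while at 1.5x half the boundaries fall between physical pixels and have to be smeared across them.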

For a bunch of people who claim to be seasoned developers, there are a lot of idiots in here.

1. A bunch of pointless arguing that Microsoft should automatically scale dumb apps if they don't somehow signal that they support it, and other people arguing that this is impossible.

Why is this pointless? WINDOWS HAS DONE EXACTLY THIS SINCE VISTA. In Vista, setting any DPI over 100 automatically did that. In 7 and up, you get it if you go over 130%, but there's a checkbox to override the behavior if you like, called "Use XP-style scaling," which turns off bilinear scaling and lets apps either scale themselves or pretend it's 1999 all over again and stuff cotton in their ears. (Going over 130% only flips off the checkbox, actually.) Windows 8 is weird; maybe it's a bug, but it just pretends you didn't uncheck the box if you don't go above 130% and checks it for you. I hope 8.1 fixes that.

So that signal already exists; it's even called dpiAware in the manifest, and apps that scale freely use it. THESE ARE NOT THE APPS THE ARTICLE WAS TALKING ABOUT. The article was talking about apps that signal that they totally support HiDPI, trust us, don't scale anything -- and then totally screw it all up: the UI is mixed up, the fonts are the wrong size, or nothing scales at all. In those cases it'd have been better if they'd just left the tag out and let the OS do the scaling.

Seriously, are all of you guys arguing about something you've never used or researched?

2. Non-integer scaling is hardly perfect, but it really doesn't look half bad compared to being too small to read. Seriously, try it sometime: set your own system to 135% and log out just to see how it looks. It'll never be crisp, and ClearType is ruined, but it's still pretty legible even on very high-res monitors.