Posted
by
timothy on Wednesday September 26, 2012 @02:33PM
from the to-be-fair-there's-not-much-room-there dept.

crookedvulture writes "High-PPI displays are becoming increasingly popular on tablets and notebooks, but Windows 8 may not be ready for them. On a 13" notebook display with a 1080p resolution, the RTM version of Win8 scales up some desktop UI elements nicely. However, there are serious issues with Metro, which produces tiles and text that are either too small or too large depending on the PPI setting used. That setting, by the way, is a simple on/off switch that tells the OS to 'make everything bigger.' Web browsing is particularly problematic, with Internet Explorer 10 introducing ugly rendering artifacts when scaling pages in both Metro and desktop modes. Clearly, there's work to be done on the OS side to properly support higher pixel densities."

1080p is just 1920x1080 - that part almost certainly works fine. The real issue is that pixel density has, for years, sat within a fairly narrow range (22-27 inch displays all maxing out at 1920x1200 or 1920x1080). Now pixel density is increasing - think Apple Retina displays - and that's a problem most of us on the software side never expected and aren't used to coping with. At least not for desktops and laptops (phones are another matter, because they are a rapidly maturing product used in a completely different way).

Besides that, different groups will have people who are more or less aware of this problem and trying to deal with it. Microsoft *should* have testing labs for all of these different configurations, and feedback about a very uneven experience should have moved up and down the chain. But as someone who writes games for a living, most of the stuff I have done in the last 5 or 6 years would look like shit on a 13 inch display at 1920x1080. Everything would be too small unless the screen is 10cm from your face. That's the catch here: we've designed for a single pixel to take up a certain fraction of your personal field of view. Suddenly higher-density displays come along, to which we initially ask: why, was there something wrong with the old pixel densities? Is this technology actually better, or is it just going to be a way to sell expensive video cards? People are physically positioning these new displays the way their old setups were, but all of the assumptions about field of view get tossed.

I recently bought one of those Korean eBay 2560x1440 27" displays and it looks great... but not *everything* looks great on it. Win7 UI elements scale pretty well with the built-in settings, but if you turn off "Use XP compatible scaling" (or similar), applications start doing really stupid things. I believe that's what they do with any application that does not report that it is dpi-scaling-aware (and unfortunately, just because it says it is, doesn't mean it

iOS and OSX. And that's all. Of course, you get the whole Apple package in which there might be elements you dislike.

Nope. Not at all. OSX and iOS suck at it just as badly as Windows.

Apple has just been careful to hide the problem by not shipping any hardware that exposes it. Their own high-dpi displays were carefully chosen to be exact multiples of the traditional resolution, so that they could scale things with pixel doubling.

But as soon as you get outside of that little box, and ask OSX to do 125% or 150% scaling of pretty much anything, you get the same mess.

They chose to build their own hardware specifically so that they don't have to solve unsolvable problems. To me, it looks like they are the smart one in this fight.

But sure, blame it on Apple for taking a highly risky path to solve important problems hardware wise because it's so much simpler.

In the meantime I am entertained to see MS struggle to tackle a problem clearly too big for them. Windows 8 will soon be there with so many stupid design decisions in it, like, I don't know, having 4 different modes fo

They chose to build their own hardware specifically so that they don't have to solve unsolvable problems.

They did no such thing. If grandma buys a mid-size iMac, can't see it well enough, and scales it 30%, it still looks like crap. Apple can't control what the customer is going to do any more than Microsoft can.

If someone buys a 1080p 13" display designed to show a lot of content in a really small space, then it works great. If the user (or idiot reviewer in this case) decides he wants the text 30% bigger th

Apple has just been careful to hide the problem by not shipping any hardware that exposes it. Their own high-dpi displays were carefully chosen to be exact multiples of the traditional resolution, so that they could scale things with pixel doubling.

But as soon as you get outside of that little box and ask OSX to do 125% or 150% scaling of pretty much anything, you get the same mess.

Perhaps, perhaps not.

Have you seen the Retina MacBook Pro running at a scaled mode? You have a variety of settings - from the 2x 1440x900 mode to the decidedly non-integral 1920x1200 mode.

And 1920x1200, despite not being a nice integral factor of the native resolution, looks practically native. As in, no scaling artifacts.

What happens is that internally, OS X creates a double-size frame buffer - 3840x2400 - renders to it "retina style" in 2x mode, then runs a scaler (custom-designed by Apple so both the 650M and Intel 4000 scale it identically) to bring it back down to 2880x1800. And the results are DAMN good - you can't tell, other than the fact that the GPU is now too underpowered to do 60fps.
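The arithmetic the parent describes can be sketched in a few lines (the numbers are the Retina MacBook Pro ones from above; the function name is just illustrative, not an Apple API):

```python
# Sketch of the scaled-mode pipeline described above. Names and structure are
# illustrative assumptions, not actual OS X internals.

def scaled_mode(logical_w, logical_h, panel_w, panel_h):
    """Render at 2x the logical size, then compute the single downscale
    factor needed to bring that buffer back to the physical panel."""
    backing_w, backing_h = logical_w * 2, logical_h * 2  # "retina style" 2x buffer
    downscale = panel_w / backing_w                      # one pass back to the panel
    return (backing_w, backing_h), downscale

# Logical 1920x1200 on a 2880x1800 panel:
backing, factor = scaled_mode(1920, 1200, 2880, 1800)
print(backing)  # (3840, 2400) -- the double-size framebuffer
print(factor)   # 0.75 -- a single clean downscale instead of many fractional ones
```

The key point is that all the fractional scaling happens exactly once, in one high-quality pass, rather than per-element.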

And this is 150% scaling (1920x1200 -> 2880x1800), and looks awesome - you definitely don't get the "non-native resolution" crap you see on other displays.

OTOH, in the low-end mode - I think there's one that runs at something like 1366x768 or so - it looks awful, because even in 2x mode the virtual framebuffer is smaller than the screen resolution, forcing the hardware to scale the image up again.

But going from logical 1440x900 to 1920x1200 on the same 2880x1800 panel? Looks damn nice for a 150% scale up.

It depends what you are looking at. You mention 60fps which suggests games, and i agree they work out pretty good.

The web doesn't tend to do nearly as well. Various web widgets and icons, which are low resolution to start with, simply don't survive scaling well. Those little shopping carts, and thumbs-ups, and stars, and sites with line art/borders... some of the lines get doubled to 2 pixels, which looks too thick, even with grayscale anti-aliasing etc.

iOS and OSX. And that's all. Of course, you get the whole Apple package in which there might be elements you dislike.

Actually, OSX doesn't handle it well at all. Sure, it's fine on the Retina display, where they know the screen size and number of pixels, but when you plug it into an external display OSX does a terrible job; there is no arbitrary scaling, so they are in no better position than Microsoft. With my Mac mini plugged into my TV I want the UI elements and text a bit bigger, but I don't want to reduce the resolution. OSX cannot do that (and Windows isn't particularly good at it either).

Yes. There is something VERY wrong with old pixel densities. Aside from the idea that I had a 1600x1200 display TWENTY YEARS AGO and since then WE'VE GONE BACKWARDS, I can offer a lot of very good reasons.

1. Clearer pictures. My phone can snap images many times larger than the effective res of my monitor. My monitor should not be the bottleneck.

2. Font smoothing - Subpixel rendering is an ugly hack and makes things look like garbage (If it looks good to you, you need glasses. Yes. You do.) You really need "

Aside from the idea that I had a 1600x1200 display TWENTY YEARS AGO and since then WE'VE GONE BACKWARDS

You are so right there... I would not have believed 10 years ago that newer large monitors I bought would feature worse resolution. Yet that has been the case for years with monitors that align to 1080p...

Thankfully we are finally starting to break free and actually get more resolution at last, as with the Korean displays people have mentioned here... going to get one of those soon I think.

What 1600x1200 display did you have 20 years ago? A 20 inch CRT? That's roughly the same pixel density as a 22 inch 1920x1200 or 1920x1080. Aspect ratios screw things up a bit. Also, part of making shit cheaper has been making it lower quality; that's a separate problem.

In 1992 we were still looking at 15 inch 1280x1024 displays, which is about the same as today, give or take an aspect ratio.

To your specific points

1. Clearer pictures. My phone can snap images many times larger than the effective res of my monitor. My monitor should not be the bottleneck.

And? How well can you tell the difference for most tasks? If you quadruple the number o

Pixel densities were chosen way back in the '90s precisely because they were pretty much good enough. Apple's "retina" display aims to be the point at which there is literally no further benefit from more pixel density, but "no further benefit" and "good enough for virtually every task" are not the same thing.

I suppose the print world should drop down to 100ppi or so, since that's clearly "good enough" for you (reference for the number: a 17-inch laptop panel at 1600x900 is roughly 109ppi, and if you bump that up to 1680x1050 you'll be getting roughly 114ppi). Some of us want our screens to look like the printed page, and until then things aren't "good enough" but merely "mediocre".

If you think a "retina" display is "slightly better looking" then you clearly haven't used one at length. I use my iPod tou

That's the catch here: we've designed for a single pixel to take up a certain fraction of your personal field of view. Suddenly higher-density displays come along, to which we initially ask: why, was there something wrong with the old pixel densities?

Yes, there was. The existing low-density displays require ugly (and often patented) hacks like hinting and subpixel rendering to display fonts at normal point sizes. When the pixel density is increased enough, this all becomes unnecessary. And it looks a lot better when it's done right. Have you ever tried using the new iPad? To me, it was a revelation: with a web page or PDF fully zoomed out, the letters were still incredibly sharp and clear, with none of the usual "cloudiness" that results from standard anti-aliasing techniques.

ClearType on Windows is very nice, but it's still just a hack compared to a real high-DPI display. I am looking forward to cheap 4K TVs in smaller sizes (32" or so) so that I can use one as a desktop monitor. We've been held back by repurposed 1080p HDTV panels for too long.

Yep. I can't fathom why anyone in apple design thought sticking on a high pixel density display was a good idea. It doesn't look appreciably better than a regular display and requires a lot more computing power to drive applications natively. Which is why no one else seriously tried something so stupid in the last 20 years.

Now they're all following along like sheep because apple is doing it, and it certainly makes sense to have standards and so on for larger displays, but there's virtually no benefit for

A bold if flawed assertion. I have glasses for vision correction, which take me to 20/20, and I do computer graphics for about 50% of my time. I can very much tell the difference; if I couldn't, my artists would murder me. And yet much higher res doesn't justify itself on the desktop. Tablets and mobile are a somewhat different problem, because it depends where exactly you're supposed to hold the device and where it's comfortable. If you're young and short you're going to have a different experience t

Which is what I do all day, and I don't need a retina display for it. Again though, mobile devices are a slightly different problem, because where they sit relative to the face varies quite a lot. Desktop displays are supposed to sit at a distance where they occupy a particular field of view.

The fix is to do what Apple did and simply double the resolution. All your scaling problems go away, just double every pixel of images and render vector stuff like fonts at full res.
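Pixel doubling as described is trivially lossless, which is exactly why it avoids scaling artifacts; a minimal sketch (illustrative code, representing an image as a list of rows of pixel values):

```python
def pixel_double(image):
    """Duplicate every pixel 2x horizontally and vertically -- the
    backwards-compatible scaling described above. No interpolation is
    involved, so nothing can blur."""
    doubled = []
    for row in image:
        wide = [p for p in row for _ in (0, 1)]  # repeat each pixel twice
        doubled.append(wide)
        doubled.append(list(wide))               # repeat each row twice
    return doubled

img = [[1, 0],
       [0, 1]]
print(pixel_double(img))
# [[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 1, 1], [0, 0, 1, 1]]
```

Vector content (fonts, XAML-style UI) skips this path entirely and just renders at the full device resolution.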

I hope we get 4k displays soon. I would prefer 4k over Apple's intermediate Retina resolution because it is only an effective 1440 pixels wide. 4k gives you an effective 1920 pixels, the same as what I have now but sharper. Panasonic do a 20" 4k display and a few manufacturers have demoed 24/27" monitors, but they are still in t

To be fair TFA is not comparing like with like. Windows is being asked to scale the display by 125% which is obviously going to lead to blurry images. Apple Retina displays are exactly 2x the previous resolution so have 200% scaling, so there is no blur when scaling images.

If you install Chrome on MacOS and scale it to 125% it will look just as bad as it does on Windows. There is no escaping the fact that scaling to 125% looks crap.

Both options have their advantages. 125% scaling on a 13" laptop with 1920x1

Microsoft has offered a UI framework with device-independent UI scaling since 2006 - that's WPF. Ironically, to do that fully, it also did the same kind of font rendering that OS X and iOS do (idealized rather than snap to pixels), but users hated it for that.

No, GDI was never about device-independent scaling; it always operated in physical device units (i.e. physical pixels when it came to the screen). You could certainly do it before, but you had to query the system for the current DPI value and do your own calculations to translate points or inches or whatever it is you were using into physical pixels. There was one exception - CreateDialog - which used "dialog units" in dialog templates, and those were designed to be device-independent.
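For what it's worth, the "do your own calculations" part was just points-to-pixels arithmetic, something like this (a sketch of the math in Python, not actual GDI code; on Windows the DPI value itself would come from a system query such as GetDeviceCaps):

```python
def points_to_pixels(points, dpi):
    """A point is 1/72 inch, so the pixel size of a font depends on the
    display's DPI -- the calculation a GDI app had to do by hand after
    querying the system DPI."""
    return round(points * dpi / 72)

print(points_to_pixels(12, 96))   # 16 px at the classic 96 DPI default
print(points_to_pixels(12, 144))  # 24 px at 150% scaling
```

Apps that skipped this step and hard-coded pixel sizes are exactly the ones that break on high-DPI screens.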

I once worked on hardware rendering with a WebKit-based browser, and these kinds of issues are very common when you're converting between floating-point layout coordinates and integer screen space. Some rendering pipelines make it harder than others to deal with, especially if you can't control the behaviour of non-integer pixels at the edge of images. To fix it, you have to visit all the conversion sites and decide how you want to handle each conversion. It gets especially tricky if you're scaling and stitching together images that you've uploaded as multiple textures to get around maximum-texture-size limits. Concatenated transformations through composition layers can be tricky too, depending on what your graphics API does.
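A toy example of the kind of conversion-site problem being described: scale a box by 125% and round each edge to integer pixels, and the same logical width comes out a device pixel different depending on where the box sits (illustrative Python; note it inherits Python's round-half-to-even behaviour):

```python
def scale_round_edges(x0, width, scale):
    """Round both edges after scaling: adjacent boxes stay seamless,
    but the resulting width drifts by a pixel depending on position."""
    left = round(x0 * scale)
    right = round((x0 + width) * scale)
    return right - left

# The same 10px-wide box at four different positions under 125% scaling:
print([scale_round_edges(x, 10, 1.25) for x in range(4)])
# [12, 13, 13, 12] -- identical boxes render at different widths
```

The alternative (round the width once, independently of position) keeps widths stable but lets seams open up between adjacent elements; every conversion site has to pick one of these trade-offs deliberately.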

I blame Apple... if they hadn't released those Retina displays, high-PPI displays would never have come along and Windows 8 would have been a huge success.
Apple, can you not leave Microsoft in peace for one second?!

I like how it fits incredibly small, readable text. Improves web browsing, email, and other reading (like Kindle app), even if the benefit is only aesthetic for games and such. I'm currently on Android, but I do like the 4s screen.

If it's PPI you're after, the iPhone 4/4s/5 isn't your best bet. If you're an Android user, you've got plenty of great displays to choose from.

The iPhone was the first smartphone with a high enough PPI that you can't distinguish individual pixels - it's what they coined the phrase "retina display" for - so there's no point in going for something higher. And the iPhone 5 apparently has the best screen among smartphones [displaymate.com] today.

The iPhone was the first smartphone with a high enough PPI that you can't distinguish individual pixels

Well, this phone [pdadb.net] doesn't quite match the "retinaness" of the latest iPhones (the definition seems to be pure marketing; my Transformer Infinity is not as high-PPI as the "new iPad", but according to Apple's official gospel, whether a display is retina depends on both DPI and viewing distance - and since I use mine almost always docked, and it has a higher PPI than the retina MacBook, arguably it's more retina), but it still manages a respectable PPI of 313. In 2007. How do you like them apples?

Reading comprehension FAIL. You seem to have missed the point of the message I replied to, which implied that Apple was the first to introduce a high-PPI phone. Which is false. And FWIW, I don't think Apple has cracked the software issue; simply doubling (or really quadrupling) the resolution is hardly an elegant solution. I think Android does this better: the apps on the Infinity (1920x1200) look pretty much the same as they do on the original Transformer (1280x800), that is, the physical dimensions are the same,

Nonsense. There were other phones with >300ppi prior to the iPhone 4 such as the Samsung S8000 (2009) and the Sharp SX862 (2008).

They just didn't give it a marketable name like "retina display", probably because the term is virtually meaningless.

(Quoting Steve Jobs on what "retina" means: "It turns out that there is a magic number right around 300 pixels per inch that, when you hold something around 10 or 12 inches away from your eyes, is the limit of the human retina['s ability] to differentiate the pixe

No, but it makes the screen really crappy - or on a crappy phone. I want a smallish Android with a good screen. The Galaxy Nexus has a nice-ish screen (from reviews, not as nice as the iPhone 5 but maybe almost as good as the 4s) but on an aircraft carrier of a phone.

If you read up the thread, I'm not the one saying that Apple was the "first", I'm the one saying it's really nice.

This is not a Windows issue but rather the way that developers support High DPI in their apps.

Way too many developers are still using MFC and Win32 for UI development, which have no concept of high DPI and require scaling to be done manually. If the app doesn't even poll for the current DPI of the OS then nothing is going to scale correctly using those antiquated APIs.

WPF automatically adjusts controls to the DPI settings of the OS; however, if you don't use vector paths to render UI elements you might see an ugly bitmap stretch here and there. I haven't fully investigated Windows RT (the framework, not the tablet), but I am sure DPI awareness is also a fundamental part of its presentation framework. If a developer throws a 16x16 icon into an app resource, you are going to get an ugly scale.

When it comes to web pages, it's anyone's guess how the web designer will support high DPI. Web pages are still mostly a bunch of static JPEGs, so scaling up something that looks like a line at regular DPI settings, only to see it smear and blur into a bar as shown in the article, is purely the fault of the web page designer.

I do agree that, as a whole, Microsoft needs to do a better job supporting high DPI across their APIs, but most of what this article mentions comes from poor app/web design more than anything.

It is up to the web browser. The only thing the webmaster should do is provide CSS for mobile screens, to strip away some complexity or change the dimensions of the text for readability.

The web browser is what takes care of integrating the images with the operating system and rendering them on screen. Windows 8 supports high DPI, but I am fairly shocked that IE 10 - which is an excellent browser, contrary to past releases of IE - does not fully support it. IE 10 needs to be patched ASAP, as it is used in Windows 8 m

This is not a Windows issue but rather the way that developers support High DPI in their apps.

It's still sort of a Windows issue. The complaint the article raises is that there are no fine-grained controls for adjusting DPI. The default setting was "too small", but flipping the switch to "make everything bigger" was too large. Also, changing this doesn't change things on the desktop side, and adjusting things on the desktop side doesn't change the Metro side. This should be reconciled sooner rather than later.

Way too many developers are still using MFC and Win32 for UI development, which has no concept of High-DPI and requires scaling to be done manually. If the app doesn't even poll for the current DPI of the OS then nothing is going to scale correctly using those antiquated API's.

This isn't entirely correct.

First of all, Win32 does have some very basic support for DPI independence. In particular, the various dialog-related APIs (CreateDialog etc.) measure things in "dialog units", which are defined in terms of the system dialog font - the size of which varies depending on your DPI setting.

Additionally, if you write an app in Win32, and you don't explicitly state in your app manifest or via an API call that you are "DPI-aware" (i.e. know how to scale), then Windows will tell your app to r

I can't believe modern devices still don't support resolution-independent layout, especially on tablets. All they would have to do is design displays to report their PPI as well as their resolution to the computer, then change the operating system to make that data available to programs.

I think you can already pull DPI from EDID information. However, it requires a lot of coding to make scaling work properly throughout the OS and apps, so it's not exactly that easy.
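The EDID part really is the easy bit: EDID reports the panel's physical size in millimetres, so density falls out of a one-liner (a sketch; the function name is made up, and the panel dimensions below are approximate):

```python
def ppi_from_edid(h_pixels, v_pixels, width_mm, height_mm):
    """Compute pixel density from the resolution and the physical panel
    size that EDID reports (25.4 mm per inch)."""
    h_ppi = h_pixels / (width_mm / 25.4)
    v_ppi = v_pixels / (height_mm / 25.4)
    return h_ppi, v_ppi

# A 13.3" 1920x1080 panel is roughly 294mm x 165mm:
h, v = ppi_from_edid(1920, 1080, 294, 165)
print(round(h))  # ~166 PPI -- the hard part is what the OS does with this number
```

As the parent says, knowing the PPI is trivial; plumbing it through every layout and drawing path is the work.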

It's not hard either. The GPU does the scaling. On Android (but not on iOS) the font manager takes care of hinting text to the native resolution. The OP talked about resolution-independent layout, which is indeed lacking across the board. It's not rocket science for anybody except Apple, who stupidly backed themselves into a corner by relying on fixed screen resolutions.

Unless there's something I missed, that article says Windows 8 doesn't support resolution-independent layout. It says Windows 8 does layout in pixels, but scales everything up for certain devices (they can have scale factors of 100%, 140% or 180%).

If your "pixels" in layout actually get scaled according to DPI, then they are in effect device-independent units (as they are in CSS today, for example).

It does say that it doesn't scale precisely to your DPI, but rather to whichever of 100%, 140% or 180% is the closest match for your actual DPI - which is done to let people use bitmaps, so that they can just provide three versions of each and there doesn't have to be any stretching, which looks awful on bitmaps. XAML stuff is all vectors, though, and it c
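The plateau selection described above amounts to a nearest-match lookup; a sketch of the idea (illustrative only, not the actual Windows selection logic):

```python
# Fixed scale plateaus from the discussion above: 100%, 140%, 180%.
PLATEAUS = (1.0, 1.4, 1.8)

def metro_scale(actual_scale):
    """Pick whichever fixed plateau is closest to the display's real scale
    factor, so apps only ever need bitmap assets at three sizes."""
    return min(PLATEAUS, key=lambda p: abs(p - actual_scale))

print(metro_scale(1.25))  # 1.4 -- a 125% display gets the 140% assets
print(metro_scale(1.1))   # 1.0
```

The trade-off is exactly the one the thread is arguing about: three plateaus keep bitmaps crisp, but the chosen plateau rarely matches the panel's true DPI.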

Honestly Android is the only OS I've seen that elegantly handles scaling to higher resolutions and varying aspect ratios. Hell, the SDK itself gives you many options to make sure your app scales well.

Owning an iPad 2 myself, I can say that iPhone apps scale horribly to the iPad, and Apple themselves even had a blunder when they changed aspect ratios (lol letterbox phone). It's no mystery why, when they increased the resolution on the iPad 3, they had to multiply evenly from 1024x768.

The Metro APIs were designed for web front-end programmers, not people who write for real GUI toolkits. You can build quite competent Metro apps in HTML and Javascript, and if you reach any limits, your web shop could hire a third party to write a module in C# or C++ to work around it.

The API for web programmers also includes rules saying that apps should be made for a finite set of fixed screen sizes. Not resolutions - screen sizes. Metro was never designed to be scalable.

This is not only a Windows problem, though. MacOS X on Retina(tm) displays is just as bad, but there the OS draws everything twice as big to begin with and scales down if needed when compositing windows. Apple never cared about hinting anyway, so all controls and labels are just as fuzzy scaled to 125% as always.

So Microsoft is being blamed for graphical limitations they had nothing to do with. From what I'm seeing the problem isn't that elements within the GUI are scaling poorly, it's that designers didn't account for the fact that some day someone might want to blow up their graphics on a much higher resolution display. It's ridiculous and unfair to blame Microsoft for this considering this would affect any high res display in any OS. What do you think happens when you run an iPhone 3 app on an iPad? By the logic displayed in this article that should also be Apple's fault.

Anyone with the most trivial experience resizing photos will understand that this is an unavoidable problem. There's no practical way to fix it unless you rebuild the app to account for wildly varied resolutions. You could use vector art, but that's not a realistic solution for a lot of things. There's no elegant solution, but hopefully the pixel density is high enough that these artifacts aren't all that obvious. This is one of those situations where third-party developers can only fix the problem after it's arisen. Microsoft can't fix it for them.

I get that it's really hard to make a browser do everything right, but if you're going to push IE as such a major competitor to other browsers, maybe make it less of a steaming pile? The web browser is basically a commodity nowadays, drawing things right is just about the only thing that matters.

The issue most likely comes from off-by-a-fraction errors when doing non-integer scaling of multiple elements.

One solution is just to get all your fractions right across the entire rendering pipeline. That is hard, and maybe impossible in some cases.

An easier solution is just to render to a canvas that is an integer multiple of the "base" or "expected" case, then finally apply a single scaling from that canvas to the display size at the last moment before the image is displayed.
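That second approach is essentially supersampling; a toy sketch using a box filter for the single final scaling pass (illustrative code, representing the canvas as a grid of intensities):

```python
def box_downscale(canvas, factor):
    """Average each factor x factor block -- one final scaling pass over an
    integer-multiple canvas, as suggested above. Every element was rendered
    at integer coordinates on the big canvas, so no per-element fractional
    rounding ever happens."""
    h, w = len(canvas), len(canvas[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [canvas[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A 4x4 canvas rendered at 2x, brought back down in one pass:
canvas = [[1, 1, 0, 0],
          [1, 1, 0, 0],
          [0, 0, 1, 1],
          [0, 0, 1, 1]]
print(box_downscale(canvas, 2))  # [[1.0, 0.0], [0.0, 1.0]]
```

This is roughly what the OS X scaled modes described elsewhere in the thread do: compose at 2x, then apply one downscale at the very end.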

Large displays too. Look how much space Windows 8 wastes on a 20 inch monitor rendering overly large tiles. People with large monitors and mice and keyboards should be able to zoom out and see more tiles at once if they so wish.

The retina display scales everything by 200%, which means that a grid of one-pixel lines will at least be displayed evenly, even with bilinear filtering. In TFA, they scaled to 125%, predictably causing this to mess up in IE. If anything I'd blame the user, although the argument can be made that IE should use a nicer scaling algorithm.

What's strange is: my work just bought me an Asus Zenbook Prime and I'm running 150% on it (I nuked the OS to get rid of the crapware and to be able to log on to the domain, so I've actually never seen it stock). I can scale web pages easily by doing a pinch-zoom on the touchpad and they look terrific, including images. (I mean, sure, the images ARE scaled up, which never looks 100% perfect, but it's just not that noticeable, and doesn't look anything like the article's screenshots.)

What they may be noticing is ASUS Splendid Video Enhancement Technology [asus.com] which is turned on by default (I'm told, I didn't install it after reading that people were trying frantically to get rid of it). Basically, it's supposed to "fix" your graphics so they look "more lifelike". But I've seen cases where people report that web graphics were getting very blurred by it, exactly like what the article is showing.

After using 150% and browser scaling for the past week, I've been pleasantly surprised by just how "arrived" high-DPI scaling was in Windows 7. I really didn't think it would work, but it's terrific so far, with ultra-readable text that's incredibly easy on the eyes and looks just as good as Apple's Retina displays.

You'd think that a company with billions of dollars in revenue could test the product or at least re-use some old perfectly functional scaling code in prior products that performed the same task./snark

You'd think that the most valuable company in the world, with billions of dollars just sitting in the bank and a(n increasingly unwarranted) reputation for polished products, could test the product or at least quickly develop a decent solution.

You'd think that a company with billions of dollars in revenue could test the product or at least re-use some old perfectly functional scaling code in prior products that performed the same task./snark

Wait, are you talking about Apple?

After all, Apple has tens of billions more cash than Microsoft, which according to Slashdot wisdom, is dying.

Your first link is a very interesting link, which kind of proves that MS doesn't understand dpi since they insist on a "minimum resolution" instead of a "minimum size" for screens for the Metro UI. Way off the point.

Both links on the Apple side of the story, however, are so stupid I have to assume you haven't even had a look at them. Basically, both complain that the UI elements have the same size on the MacBook Retina as on a normal MacBook!!! Right from your second link: "This configuration should offer amazing detail but you don't actually get any more desktop space". Well, guess what: that was the point, very precisely.

Overall you just proved that MS doesn't know what they are doing and Apple does. Nice way to illustrate an anti-MS bashfest. Wait... wasn't that your intent?

Your first link is a very interesting link, which kind of proves that MS doesn't understand dpi since they insist on a "minimum resolution" instead of a "minimum size" for screens for the Metro UI. Way off the point.

What do you mean by "minimum size"? Like 7 inches wide? So you want MS to proclaim that all Metro apps are supposed to support a minimum of 7" width regardless of the resolution? So does that mean the apps must support a device with 320x240 resolution?

Both links on the Apple side of the story, however, are so stupid I have to assume you haven't even had a look at them. Basically, both complain that the UI elements have the same size on the MacBook Retina than on a normal MacBook !!!

They're no more stupid than TFA, which I suggest you read, and which complains about the same problem with scaling up graphics on the web.

Your first link is a very interesting link, which kind of proves that MS doesn't understand dpi since they insist on a "minimum resolution" instead of a "minimum size" for screens for the Metro UI. Way off the point.

What do you mean by "minimum size"? Like 7 inches wide? So you want MS to proclaim that all Metro apps are supposed to support a minimum of 7" width regardless of the resolution? So does that mean the apps must support a device with 320x240 resolution?

Well... everyone's talking about resolution independent interface, and that's really what it should be about. Of course, it makes no sense. So there is the Apple approach that take shortcuts that work well, and the MS approach which tries to accommodate for every fucking screen on the planet. Guess who's more efficient? Guess who will end up with an average interface that will work so-so on most configs but will look clunky on not so common screens. Interestingly that's the very same approach that did lead them to the pre-ipad tablets.

Well... everyone's talking about resolution independent interface, and that's really what it should be about. Of course, it makes no sense.

So when you claimed MS doesn't understand DPI because they insist on a minimum resolution instead of a minimum size, you had no fricking clue what you were talking about (perhaps you don't understand DPI) - and now you say it makes no sense to have a minimum size instead of a minimum resolution? That's quite a flip-flop between your two posts.

So there is the Apple approach that take shortcuts that work well, and the MS approach which tries to accommodate for every fucking screen on the planet. Guess who's more efficient? Guess who will end up with an average interface that will work so-so on most configs but will look clunky on not so common screens. Interestingly that's the very same approach that did lead them to the pre-ipad tablets. Talk about learning about one's mistake.

Perhaps you should learn about the late 80s and early 90s, when MS with the help of Dell, Compaq, HP etc. etc. and the variety of hardware and screens you're now decryi

So hold on, super-high resolution displays should only be used so that the rounded corners are more rounded?

Thanks, but no thanks. The entire point of higher resolution displays is displaying more information on the same space. Apple's way of just doubling the resolution for each length is good for backwards compatibility, but it shouldn't be the standard for new applications: they should be designed to take advantage of the high resolution displays, and I don't just mean better text rendering.

You're thinking old-school, and thus are part of the problem. The point of high-DPI displays isn't to have more screen real estate or to display more information in the same space; it's to have sharper text and images at a usable size. Since this is a transition period, images are going to be worse off than text.

I'm sick and tired of displays stuck at roughly 100 DPI (the typical 17" laptop panel). Display technology has lagged behind every other improvement in technology over the last 15 years, so thank God there's finally a push for higher pixel density. The best of both worlds will be when image and video editing apps make use of the higher resolution while still displaying UI at a reasonable size.

Ever wonder why a lot of people run LCD displays at non-native resolutions? Because without resolution independence, high-res displays present UI and text that is way too small for them to read. Proper resolution independence will let people run their displays at native resolution without worrying about being able to read what's on the screen.
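The ~100 DPI figure is easy to sanity-check with the Pythagorean theorem; a quick sketch (panel sizes here are illustrative, not from the article):

```python
import math

def pixel_density(width_px, height_px, diagonal_in):
    """Pixels per inch of a display, from its resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

# A 17" 1440x900 laptop panel sits right around 100 PPI.
print(round(pixel_density(1440, 900, 17)))     # 100
# A 13.3" 1080p panel, like the notebook in the article, is much denser.
print(round(pixel_density(1920, 1080, 13.3)))  # 166
```

The jump from ~100 to ~166 PPI is exactly the range where UI drawn in raw pixels stops being readable.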

As with anything, early adopters tend to get burnt or are otherwise dissatisfied with the performance. Apple does happen to have a leg up here, with their experience on the iPhone 4 and 4th-generation iPod touch. The fact that Apple uses integer ratios for their screens (even on the MacBook Pro, where the resolution can be adjusted, the backend renders at an integer ratio) is a big plus, as it means that what we see in these screenshots won't happen. I may not like some of Apple's business practices, but they completely win when it comes to presentation and aesthetics. Microsoft would do well to learn some lessons from them.
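To make the integer-ratio point concrete, here's a sketch of how the Retina MacBook Pro's scaled modes work as I understand them (numbers for the 15" 2880x1800 panel; the function name is mine):

```python
def retina_backing(logical_w, logical_h, scale=2):
    """Apple renders everything at an integer multiple of the logical size,
    then downsamples the finished frame once to the panel's native resolution."""
    return logical_w * scale, logical_h * scale

# "Looks like 1920x1200" mode on a 2880x1800 panel:
backing = retina_backing(1920, 1200)  # (3840, 2400) backing store
panel_w = 2880
downscale = panel_w / backing[0]      # 0.75 -- one uniform scale applied to the
                                      # whole frame, never per-widget fractional layout
print(backing, downscale)
```

Every widget still draws at a clean 2x, so nothing ends up straddling pixel boundaries the way the Metro screenshots do; the only non-integer step is a single full-frame resample.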

Completely off topic, but I think Apple has another issue with their retina display--the retina is used for seeing images, not displaying them. "Retina display" makes as much sense as a tympanic speaker, jumbo shrimp, or Microsoft quality control (zing!).

You're confusing branding with technical specs. Apple merely calls it a "Retina Display" because at an average viewing distance for most people, the pixels are not discernible. Sure, there are people who will still be able to see the pixels. There will also be people who will get two inches from the screen and say "I can see the pixels durr hurr". I have been using an iPod touch 4th gen for quite a while now and think it's sad that the highest resolution display in my house is on a portable device. I can't

I think it's crazy that Windows still does not support vector icons (SVG or a similar format). Instead, Windows icon files contain about a half dozen different sizes of raster images (each at multiple color depths!), maxing out at 256x256, and Windows then scales these bitmaps as needed if there isn't an exact match.

256x256 is good enough for icons even on high-DPI displays, but this is still an incredibly clumsy and inelegant way of doing things. I can understand why you'd want a custom 16x16 icon because at that small size, scaling down a vector image usually won't work, and you need a hand-drawn substitute. But there is no good reason why two different bitmaps should be needed to render the same icon at 48x48 and 256x256. A single SVG could handle both quite nicely, and could handle even higher resolutions than that if needed.
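For the curious, the multi-image layout described above is easy to see by parsing an .ico file directly. A rough sketch in Python of the ICONDIR/ICONDIRENTRY header (point it at any icon you have on hand):

```python
import struct

def ico_entries(path):
    """List (width, height, bit_depth) for every raster image packed into
    a Windows .ico file -- one entry per size/color-depth combination."""
    with open(path, "rb") as f:
        reserved, ico_type, count = struct.unpack("<HHH", f.read(6))
        if reserved != 0 or ico_type != 1:
            raise ValueError("not an .ico file")
        entries = []
        for _ in range(count):
            (w, h, _colors, _rsvd, _planes, bpp,
             _byte_size, _data_offset) = struct.unpack("<BBBBHHII", f.read(16))
            # The one-byte size fields store 0 to mean 256 -- the format
            # literally cannot describe anything larger.
            entries.append((w or 256, h or 256, bpp))
        return entries
```

A typical system icon prints a list like `[(16, 16, 32), (32, 32, 32), (48, 48, 32), (256, 256, 32), ...]`, i.e. the same artwork redrawn over and over.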

In what ways does SVG fall short? It's a widely supported open standard, which does pretty much everything you need for 2D vector graphics. It can even be tweaked by hand, since it's XML-based. (I've done a couple of simple SVGs entirely in Notepad.)
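For illustration, a complete icon-sized SVG really is small enough to type in Notepad; this hand-written sketch (colors and shapes are just made up) renders cleanly at 48x48, 256x256, or anything beyond:

```xml
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 32 32">
  <!-- viewBox defines logical units; the renderer maps them to any pixel size -->
  <circle cx="16" cy="16" r="14" fill="#4a90d9"/>
  <circle cx="16" cy="16" r="7" fill="#ffffff"/>
</svg>
```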

Well, and significantly improved tools for producing them.

Adobe Illustrator has supported both import and export of SVG files for some time. And while Inkscape is far from perfect, it's a workable free solution for most non-professional users. Are there other, better vector editing tools that don't allow the creation of SVGs?

Being XML is one of the reasons that SVG hasn't really taken off. XML killed MathML as well. That you can't use it like other image types is also a major problem. I'd go on and on about it, but it's not important enough to me to spend the time. Just google SVG criticism or something.

I'd love to use vector images, but SVG is just a huge pile of mistakes. Had the W3C kept it simple (not XML, individual elements kept out of the DOM, and scripting ditched), we'd see it used everywhere.

The reason for multiple sizes is so elements can be perfectly aligned to the pixel grid for maximum crispness. A vector image won't help in this case. You can lose up to half the effective resolution by not designing for an exact pixel size. Of course, this becomes less important at higher PPI.
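The "half the effective resolution" claim can be demonstrated with a toy coverage calculation: a 1-pixel-wide line that straddles the pixel grid gets smeared across two pixels by any antialiasing rasterizer. A sketch in Python:

```python
import math

def coverage(line_start, line_width=1.0):
    """Fraction of each device pixel covered by a vertical line.
    An antialiasing rasterizer renders partial coverage as gray fringes."""
    first = math.floor(line_start)
    last = math.ceil(line_start + line_width)
    return {px: min(line_start + line_width, px + 1) - max(line_start, px)
            for px in range(first, last)}

print(coverage(4.0))  # {4: 1.0} -- grid-aligned: one solid pixel
print(coverage(4.5))  # {4: 0.5, 5: 0.5} -- misaligned: two half-covered pixels
```

Hand-tuned 16x16 and 32x32 icons exist precisely to keep every edge in the first case; a vector scaled to an arbitrary size tends to land in the second.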

One of the arguments for leaving XP behind is that it can't work at all beyond 100 DPI without apps breaking. I assumed Windows 8's Modern UI took care of this? This issue alone made me wary of a MacBook, since 200 DPI will not run so great under Boot Camp, with most Windows apps misrendered.

Windows 7 does DPI scaling pretty well. There are a handful of misbehaved apps that break, but unless you're unlucky enough to regularly use one of these, you'll be fine. The usual stuff will work fine.

For the desktop, it's the same as Vista/Win7: it'll bitmap-scale apps that do not declare themselves DPI-aware in the app manifest, but will let them do as they see fit if they do declare themselves such. Unfortunately, there have been a number of apps that add the declaration but ignore the requirements.
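For reference, the manifest declaration in question looks roughly like this (element and namespace names recalled from the Vista-era SDK, so worth verifying against current documentation):

```xml
<!-- excerpt of an application manifest (yourapp.exe.manifest) -->
<application xmlns="urn:schemas-microsoft-com:asm.v3">
  <windowsSettings>
    <dpiAware xmlns="http://schemas.microsoft.com/SMI/2005/WindowsSettings">true</dpiAware>
  </windowsSettings>
</application>
```

Setting it opts the app out of bitmap scaling; from then on, rendering every element at the correct size is entirely the app's problem, which is exactly where the misbehaving apps fall down.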