Intel warns of impending high-resolution explosion

Intel predicts that Ultrabooks, laptops and all-in-one systems with 'retina'-class displays will be available as early as 2013.

Intel has indicated that it expects laptops and desktops to go high-resolution as early as next year, with the PC market following Apple into the world of the 'retina-class' display.

Unlike Apple's efforts, which are so far limited to the smaller screens of its iPhone and iPad products, Intel predicts that high-resolution displays will be the order of the day across Ultrabooks, laptops and all-in-one systems.

High-resolution computer monitors are nothing new, of course, but typical laptops top out at 1920x1080. Intel's vision of the future, outlined in slides obtained by Liliputing, sees 11in Ultrabooks getting displays capable of 2560x1440, or around 250 pixels per inch.

The larger 13in Ultrabooks go a step further: Intel predicts that by 2013, these displays will offer a 2800x1800 native resolution, while 15in laptops will sit somewhere in the region of 3840x2160. Large-format 21in all-in-one systems, meanwhile, will offer a display of around 220 pixels per inch at 3840x2160.
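
For the curious, the quoted densities follow directly from resolution and panel diagonal: pixels per inch is simply the diagonal pixel count divided by the diagonal size. A minimal sketch of the arithmetic in Python; the exact diagonals used (11.6in, 13.3in, 15.6in, 21.5in) are assumed typical sizes, since the slides quote only rounded classes:

    import math

    def ppi(width_px, height_px, diagonal_in):
        """Pixels per inch: diagonal pixel count over diagonal size in inches."""
        return math.hypot(width_px, height_px) / diagonal_in

    # Assumed typical diagonals for each class named in the slides
    panels = [
        ("11in Ultrabook",  2560, 1440, 11.6),
        ("13in Ultrabook",  2800, 1800, 13.3),
        ("15in laptop",     3840, 2160, 15.6),
        ("21in all-in-one", 3840, 2160, 21.5),
    ]
    for name, w, h, d in panels:
        print(f"{name}: {w}x{h} -> {ppi(w, h, d):.0f} ppi")

The results (roughly 253, 250, 282 and 205 ppi respectively) land close to the rounded figures Intel quotes.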

The differing pixel densities, which see hand-held devices hit 300 pixels per inch, laptops hit 250 pixels per inch and all-in-one systems hit 220 pixels per inch, should all allow for a 'retina'-like experience, Intel claims. The reason is the difference in viewing distances: the further away from the display you are, the lower the pixel density required for individual pixels to disappear and a smooth image to appear in their place.
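
The viewing-distance claim can be quantified with the common rule of thumb that a pixel effectively vanishes once it subtends less than one arcminute, roughly the resolving limit of 20/20 vision. A minimal sketch, using illustrative viewing distances rather than anything from Intel's slides:

    import math

    def retina_ppi(viewing_distance_in):
        """PPI above which one pixel subtends less than 1 arcminute (~20/20 acuity)."""
        pixel_pitch_in = viewing_distance_in * math.tan(math.radians(1 / 60))
        return 1 / pixel_pitch_in

    # Illustrative viewing distances, in inches
    for device, dist in [("hand-held", 12), ("laptop", 16), ("all-in-one", 28)]:
        print(f"{device} at {dist}in: pixels vanish above ~{retina_ppi(dist):.0f} ppi")

At 12in, 16in and 28in the thresholds come out at roughly 287, 215 and 123 ppi, so Intel's 300/250/220 targets all sit comfortably above the acuity limit for their respective categories.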

While Intel isn't confirming that future products will definitely come equipped with high-resolution displays - it can't, given that it doesn't actually make any display panels itself - it is warning that the ecosystem should start preparing as if it were a given.

There's plenty to do: as owners of Apple's new iPad are finding, a high-resolution display is nothing without the infrastructure behind it. Web developers will need to ensure graphics are of a high enough resolution that they won't appear blocky or blurred, or switch to a scalable format such as SVG; games developers will need to use higher-resolution textures, which in turn means that graphics card makers will need to equip their hardware with more memory and more processing power; even streaming media may have to look beyond the usual 1080p High Definition format to keep viewers happy.

Although Intel is predicting the appearance of high-resolution devices as early as next year, it will likely be a while before the format reaches majority saturation: industry watcher StatCounter revealed this week that 1366x768 has overtaken 1024x768 as the dominant screen resolution for web use on non-mobile platforms for the first time, despite manufacturers having standardised on widescreen displays for many years.


51 Comments

The sooner 2560x1600 displays become cheaper, the quicker I'll be buying two of them. It's a joke how they are so expensive when monitors (and thus screen resolution) are such an integral part of your computer experience.

The Intel HD 4000 coming with Ivy Bridge is supposed to be a massive improvement on the current generation, with even bigger improvements planned for Haswell. I think Intel are already well-prepared for the high DPI revolution. Besides, Nvidia and AMD also stand to benefit strongly from this.

Originally Posted by fdbh96: This is good, but until integrated graphics improve, I won't be buying an ultrabook with a resolution higher than HD.

True but could you not just window it and run it at a smaller res? I don't game on a laptop, and never will, so I couldn't care less about gpu grunt in them and just want them coming with larger resolutions. If I did want to game on them though, I wouldn't be using integrated graphics.

About frickin' time. I'm sick of all these "HD" displays crowding out anything with a resolution I couldn't match in NINETEEN-NINETY-EIGHT.
1920*1080 is only a little bigger than 1600*1200, and I was running that on a smaller screen.

Yay, just as we reach a sweet spot where it's practical to run modern games at decent settings on inexpensive notebook hardware. You can run anything, and run it well, on a GT 540M @ 1366x768. That's not gonna happen with that res quadrupled. Need to get a message out to ATI & Nvidia to get their act in gear and get some more cores and wider memory interfaces going...

I imagine this will be the ceiling for some time. At 300ppi, you simply can't see the pixels any more unless you're nose-to-screen, so I can't see anyone bothering to produce anything higher than that for quite a while.

For that reason, I also don't think we'll see TVs go natively higher than Full HD for quite a while. The average punter can't see the individual pixels at even just 1080 from across a room.

^I agree, and 4K files will be huge (hundreds of GBs), so they'll have to figure out a realistic platform to push the standard. But I definitely can't wait for 4K displays to appear; they should push down the price of 1600p and 1440p ones too.
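
The file-size worry is easy to sanity-check with bitrate arithmetic: size is just average bitrate times duration. A minimal sketch; the 4K bitrate below is a hypothetical figure scaled roughly with the 4x pixel count from Blu-ray's ~40 Mbit/s ceiling, not any real spec:

    def file_size_gb(bitrate_mbps, hours):
        """Approximate size in GB of a stream at the given average bitrate."""
        return bitrate_mbps * hours * 3600 / 8 / 1000  # Mbit/s -> GB

    # Blu-ray 1080p tops out around 40 Mbit/s; the 4K figure is a hypothetical 4x
    for label, mbps in [("1080p Blu-ray", 40), ("hypothetical 4K", 160)]:
        print(f"{label} ({mbps} Mbit/s): ~{file_size_gb(mbps, 2):.0f} GB per 2-hour film")

That works out to roughly 36GB and 144GB per film, so 'hundreds of GBs' is the right ballpark once you allow for higher frame rates or lighter compression.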

I honestly don't see the appeal of high res screens that have a small dot pitch - I deliberately didn't get the U2711 because the pixel pitch is just too small for my liking.

A photographer friend of mine recently bought an iPad 3 and says he's "underwhelmed" by the crazy resolution - the last thing you'd expect to hear from somebody who spends his life working with pixels. Frankly I think it's just another gimmick. Bigger screens with high res - now that's more like it. ;)

Anyone else find it ridiculous that it's always Apple that gets technology moving? Seriously, when was the last time a ridiculously popular computer-based phenomenon happened that Apple DIDN'T start?

These high-resolution screens could have been done a very long time ago. Tablets could have been accomplished even longer ago. The really stupid thing is a lot of what apple comes up with isn't even that cool or practical, but people can't help but get their panties wet because THEY made it.

I have no problem with higher-resolution monitors. I think screen resolutions today on higher-quality monitors are good enough but I see no problem with increasing it.

Originally Posted by Madness_3d: Yay, just as we reach a sweet spot where it's practical to run modern games at decent settings on inexpensive notebook hardware. You can run anything, and run it well, on a GT 540M @ 1366x768. That's not gonna happen with that res quadrupled. Need to get a message out to ATI & Nvidia to get their act in gear and get some more cores and wider memory interfaces going...

I'd say that Nvidia and AMD have been far ahead of the curve on graphics technology (partly due to console games and the software being far behind). Eyefinity and Nvidia Surround have demonstrated that their cards are easily capable of pushing 5760x1080 pixels, so it's not as far away as you think. Besides, you always have the option of gaming at a lower resolution or disabling AA as it's not needed at high resolutions.

Hmm, I was just talking with my history of technology professor about how Intel has been pushing the HD trend to encourage consumers to buy more powerful computers.


Originally Posted by schmidtbag: Anyone else find it ridiculous that it's always Apple that gets technology moving? Seriously, when was the last time a ridiculously popular computer-based phenomenon happened that Apple DIDN'T start?

Netbooks jump to mind. There were some precursors like the OLPC, but the actual marketing trend was unquestionably pioneered by Asus. But I do agree: somehow Apple - without innovating any new hardware of its own - has become the driving force behind tech culture trends.

Originally Posted by yougotkicked: Hmm, I was just talking with my history of technology professor about how Intel has been pushing the HD trend to encourage consumers to buy more powerful computers.

Originally Posted by yougotkicked: Netbooks jump to mind. There were some precursors like the OLPC, but the actual marketing trend was unquestionably pioneered by Asus. But I do agree: somehow Apple - without innovating any new hardware of its own - has become the driving force behind tech culture trends.

Well obviously Intel would be pushing something like this, but since they don't actually make a complete product (just parts for products), it's kinda hard for them to get anywhere with their motives. That's like tire manufacturers wanting drivers to hit the brakes harder and peel out: you can't just get your customers to push their hardware to its limits.

As for netbooks, I didn't even think about them, but once again that's Apple's doing. Asus may be the one that made netbooks popular, but Apple is the one that started the idea, with the original MacBook Air. The funny thing about the Air is that it was the first (AFAIK) to use Intel's Atom CPU, but then Intel was like "wait a minute, we made the processor, why does it have to be exclusive to Apple?" and sold it to companies like Asus, which sold a similar product to the Air for a much more reasonable price. And since then, the Air has been wildly unpopular because tablets and non-Apple netbooks are better value.

I would love retina-class computer monitors, though I am more interested in the awesome black levels of OLED displays. Computer LCDs are horrible when it comes to black levels, and plasma technology can't reach high resolutions at such small screen sizes.

I doubt it, Intel. Manufacturers can barely be bothered to produce a decent 1080p notebook screen at any semi-decent price; there is no way we're getting even higher resolutions at the $1000 ultrabook level anytime soon without some SERIOUS compromise, such as no real GPU to back it.

Originally Posted by CowBlazed: I doubt it, Intel. Manufacturers can barely be bothered to produce a decent 1080p notebook screen at any semi-decent price; there is no way we're getting even higher resolutions at the $1000 ultrabook level anytime soon without some SERIOUS compromise, such as no real GPU to back it.

If Apple put a 'retina display' on a MacBook Air then I doubt manufacturers would have any choice.

Originally Posted by schmidtbag: As for netbooks, I didn't even think about them, but once again that's Apple's doing. Asus may be the one that made netbooks popular, but Apple is the one that started the idea, with the original MacBook Air.

The basic concept of a netbook had nothing to do with Apple; they originated from the OLPC [One Laptop per Child] initiative to provide low-cost systems to the third world.

All this talk of high-res displays is making me weak at the knees, but there is one major problem that needs to be addressed, particularly for desktop systems: display interface technology. As it currently stands, DVI remains the de facto standard even though DisplayPort has been around for many years now. The problem is that DisplayPort doesn't provide that much more bandwidth than dual-link DVI. Right now, if a manufacturer pushed the interface to its limit, a screen would be limited to 2560x1600 @ 120Hz or 5120x3200 @ 60Hz. It's therefore time for a new display interface technology, or a major revamp of DisplayPort, to allow for the bandwidths next-generation screens will require.
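
To put rough numbers on this, the raw bandwidth a mode needs is just pixels x refresh rate x bits per pixel, plus blanking overhead. A back-of-the-envelope sketch; the 1.2x blanking factor is an approximation and the link figures are effective (post-encoding) data rates. Under these assumptions 5120x3200 @ 60Hz actually lands beyond DisplayPort 1.2, which only strengthens the point about headroom:

    def mode_bandwidth_gbps(width, height, refresh_hz, bpp=24, blanking=1.2):
        """Raw video bandwidth in Gbit/s; 'blanking' is a rough overhead factor."""
        return width * height * refresh_hz * bpp * blanking / 1e9

    # Approximate effective data rates of common links, in Gbit/s
    links = {
        "dual-link DVI":              7.92,
        "DisplayPort 1.1 (4 lanes)":  8.64,
        "DisplayPort 1.2 (4 lanes)": 17.28,
    }

    for w, h, hz in [(2560, 1600, 60), (2560, 1600, 120),
                     (3840, 2160, 60), (5120, 3200, 60)]:
        need = mode_bandwidth_gbps(w, h, hz)
        fits = [name for name, cap in links.items() if cap >= need]
        print(f"{w}x{h} @ {hz}Hz: ~{need:.1f} Gbit/s -> {fits or 'none of the above'}")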

Originally Posted by Parge: Well, we can forget about running games in native resolution on lappys then.


Originally Posted by sandys: If it's high enough resolution and the scaler hardware is good enough, it won't matter.

Or you could, you know, let your high-end GPU handle scaling with a wee tiny bit of its rather extensive power. For the life of me I can't figure out how to fully disable scaling on my Radeon...


Originally Posted by m0zes: The basic concept of a netbook had nothing to do with Apple; they originated from the OLPC [One Laptop per Child] initiative to provide low-cost systems to the third world.

All this talk of high-res displays is making me weak at the knees, but there is one major problem that needs to be addressed, particularly for desktop systems: display interface technology. As it currently stands, DVI remains the de facto standard even though DisplayPort has been around for many years now. The problem is that DisplayPort doesn't provide that much more bandwidth than dual-link DVI. Right now, if a manufacturer pushed the interface to its limit, a screen would be limited to 2560x1600 @ 120Hz or 5120x3200 @ 60Hz. It's therefore time for a new display interface technology, or a major revamp of DisplayPort, to allow for the bandwidths next-generation screens will require.

Just improve the transducers to go for higher bandwidth and there, job = done. From an architectural perspective, it's _that_ easy. Implementation-wise, we have Thunderbolt around with the necessary bandwidth. Apply the TBolt PHY layer with DP signalling and it's done.

Remember, CAT-7 cabling (now called Class F; 4 twisted pairs, 4 "lanes") will do 100Gbit/s over 100m (with 32/22nm chips, they say), so we most definitely have the base tech for it, especially considering DisplayPort does a "mere" 17.28Gbit/s over 4 lanes.

Originally Posted by m0zes: Right now, if a manufacturer pushed the interface to its limit, a screen would be limited to 2560x1600 @ 120Hz or 5120x3200 @ 60Hz.

That's still plenty of headroom if all the display manufacturers do is double the pixel count of existing displays in both directions. Besides, DisplayPort can always be improved with faster signal clock speeds. It's very easy to boost speeds with a serial interface this way.

It's just enough headroom for a doubling, but I'd suspect there would also be a market for 300dpi screens in professional industries. Then you factor in higher refresh rates and larger screen sizes, and what headroom might have been there is very quickly gone. While it may be easy to boost speeds with a serial interface, the same limitations always apply: you may get backwards compatibility with the input controller, but no forwards compatibility with previous-generation output controllers. Will it end up requiring a new DP revision every year to cater for ever-increasing pixel densities? It all just ends up being a compatibility nightmare.

Originally Posted by m0zes: The basic concept of a netbook had nothing to do with Apple; they originated from the OLPC [One Laptop per Child] initiative to provide low-cost systems to the third world.

Well, in that case you could say Apple didn't have anything to do with touchscreens, tablets, MP3 players, online music stores, computerized TVs, smartphones, and probably a lot more things I can't think of ATM. Apple doesn't invent anything; they're one of the least original companies out there. The only difference is Apple makes a solid product that for some weird reason becomes wildly popular due to a spokesperson in straight-leg jeans and a black turtleneck, and then other manufacturers decide at the last minute that this unoriginal product is suddenly a good idea because Apple says so.

Originally Posted by m0zes: It's just enough headroom for a doubling, but I'd suspect there would also be a market for 300dpi screens in professional industries. Then you factor in higher refresh rates and larger screen sizes, and what headroom might have been there is very quickly gone. While it may be easy to boost speeds with a serial interface, the same limitations always apply: you may get backwards compatibility with the input controller, but no forwards compatibility with previous-generation output controllers. Will it end up requiring a new DP revision every year to cater for ever-increasing pixel densities? It all just ends up being a compatibility nightmare.

Posted this a little earlier (couple of edits as well for clarity's sake):


Originally Posted by ZeDestructor: Just improve the transducers to go for higher bandwidth and there, job = done. From an architectural perspective, it's _that_ easy.

Implementation-wise, we have Thunderbolt around with the necessary bandwidth. Apply the TBolt PHY layer with DP signalling and it's done.

As of now, we have CAT-7 cabling (now called Class F; 4 twisted pairs, 4 "lanes") that will do 100Gbit/s over 100m (with 32/22nm chips, they say), so we most definitely have the base tech for it, especially considering DisplayPort (currently) does a "mere" 17.28Gbit/s over 4 lanes.

As a student in electrical engineering (and general tech enthusiast), I reckon I'm speaking sense here.

Originally Posted by schmidtbag: Well, in that case you could say Apple didn't have anything to do with touchscreens, tablets, MP3 players, online music stores, computerized TVs, smartphones.

Nice overreaction, only it was Steve Jobs himself who said that Apple would never produce a netbook. All technology is interrelated, and it's not hard to draw relations between the two. But the ultimate question for me is whether one would exist if the other hadn't, and in that respect, had the Air not existed, the netbook still would have. If the MacBook Air were a netbook then yes, it would be fair to say Apple had a major role in popularizing the technology, but it isn't and was never meant to be a netbook. It was a slim, light notebook: it used the same hardware, it ran the same software and it was priced accordingly. The original netbook used customized low-power hardware, it ran different software and above all else was designed to be cheap. If you look at the PC equivalent of the Air, we now have Intel and other manufacturers pushing Ultrabooks; they are the descendants of the Air, not netbooks.


Originally Posted by ZeDestructor: As a student in electrical engineering (and general tech enthusiast), I reckon I'm speaking sense here.

Thanks ZeDestructor, I agree that the solution is already available, but I'm not worried about the technology being available so much as it actually being implemented. Dual-link DVI was always a part of the DVI standard, and it was a part of ATI's and Nvidia's reference card designs for many generations, but manufacturers only ever used single-link DVI outputs. In 2004 Apple released their first 30" 2560x1600 monitor, but due to the lack of dual-link DVI support, only a few of Apple's own cards and a small handful of professional-level cards were fully compatible. Dell released their first 30" around 18 months later, and these had to be timed to coincide with a new generation of dual-link-DVI-enabled graphics cards so that they could be used at their full spec by the consumer market. Dell probably could have released theirs earlier, seeing as they used the exact same panel as Apple's, but it was pointless doing so without the compatible output hardware. If high-res screens are about to make a massive appearance, then it's time we had a display interface with some serious bandwidth ready to go, otherwise development will be stifled all over again.

Surely we have Thunderbolt for this. Intel did just develop it (alongside Apple, of course).

Originally Posted by schmidtbag: Apple doesn't invent anything; they're one of the least original companies out there. The only difference is Apple makes a solid product that for some weird reason becomes wildly popular due to a spokesperson in straight-leg jeans and a black turtleneck, and then other manufacturers decide at the last minute that this unoriginal product is suddenly a good idea because Apple says so.

Originally Posted by LennyRhys: A photographer friend of mine recently bought an iPad 3 and says he's "underwhelmed" by the crazy resolution - the last thing you'd expect to hear from somebody who spends his life working with pixels. Frankly I think it's just another gimmick. Bigger screens with high res - now that's more like it. ;)

Yup, I don't see the use for more than "HD" on small devices.
The result is that you get interpolation when playing videos...

For big screens, sure, I understand. But do we really need more than 1920x1080 at 10"?
Remember, full-resolution videos weigh in at ~5-10GB per hour... try pulling that over wireless :D

Originally Posted by TC93: Good luck trying to play a game at those resolutions on a laptop. Changing the resolution to a lower one will just make it look awful on an LCD.

Sure, gaming at the native res is going to be a challenge, especially for the latest and greatest titles. However, there is an advantage to higher-resolution screens that needs to be considered. The reason outputting a non-native resolution looks considerably worse than the native res is that, regardless of the output resolution, the display always drives its native grid of pixels. There therefore has to be a level of interpolation to 'guess' what the missing pixels would actually be.

Depending on the screen, this can range from reasonably good to downright terrible: blurry text etc. However, if you reduce the pixel count by a factor of 4 (halving the resolution in each dimension), what you get is perfect scaling; each pixel becomes larger and interpolation isn't required. If we were to do that right now with a 1080p screen, the end result is 960x540. That certainly isn't a res you'd want to game at, especially when 1080p gaming is possible.

Boost the native res to 2560x1600 and this scales down to 1280x800. That's a res many a laptop currently ships with as native, and it provides a nice trade-off between gaming and general application usage. Aliasing will be more problematic with such systems, but a higher level of anti-aliasing is easier to deal with than 4x more pixels to render. Just to note, scaling down a current 30" 2560x1600 screen to 1280x800 yields excellent results; yes, everything is big, but the overall image quality is great. Shrink this down to a 13-15" screen and the results will be considerably better.
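
A minimal sketch of the integer-scaling idea described above: a lower mode avoids interpolation only when it divides the native resolution exactly, so each logical pixel maps onto a clean NxN block of physical pixels.

    def integer_scale_modes(native_w, native_h, max_factor=4):
        """Modes that divide the native panel exactly: each logical pixel maps
        to an NxN block of physical pixels, so no interpolation is needed."""
        return [(native_w // f, native_h // f)
                for f in range(2, max_factor + 1)
                if native_w % f == 0 and native_h % f == 0]

    print(integer_scale_modes(1920, 1080))  # [(960, 540), (640, 360), (480, 270)]
    print(integer_scale_modes(2560, 1600))  # [(1280, 800), (640, 400)]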

At least for my own use, I still question the need for such massively-high spec screens in laptops. If I want a powerhouse with very large screen resolutions, I have a desktop PC. I'm really not interested in 1080p video on my laptop; if it's very hard for most people to distinguish between 720p and 1080p video on screens smaller than around 32-40", then you sure as hell aren't going to notice the difference on a screen that's no bigger than 15-20". Having the extra resolution might be nice, but the trade off of needing more powerful hardware, and thus increased power usage, is too costly for a machine that is supposed to be portable. Besides, there aren't many tasks that I would want to achieve on my laptop that would require a massive resolution. Something higher than 1024x768 - what I currently have - would be nice, but you don't have to go mad with it.

But then, my opinion on the matter may be skewed somewhat; I already think that people (read: consumers) already buy machines far in excess of the spec that they actually need. The UberLaptop5000000 may be able to play games at full 1080p, render your 3D models in fractions of a second or speed up your photo editing by x% more than the competitor, but that doesn't make much difference if all you do with your crotch-boiling energy leech is couch surfing.

Originally Posted by dr-strangelove: I feel more and more like an old man with my 1680x1050 screen.

1920x1200 here, but only because my trusty old 1280x1024 started playing silly beggars. I still prefer 5:4 to 16:9 on the desktop, but I'll take widescreen as long as the vertical resolution is sensible.

My laptop, sadly, is 1366x768. It didn't feel cramped until I got the new monitor for the desktop...

Originally Posted by BLC: At least for my own use, I still question the need for such massively-high spec screens in laptops. If I want a powerhouse with very large screen resolutions, I have a desktop PC. I'm really not interested in 1080p video on my laptop; if it's very hard for most people to distinguish between 720p and 1080p video on screens smaller than around 32-40", then you sure as hell aren't going to notice the difference on a screen that's no bigger than 15-20".

I think 1080p is the perfect res for a laptop after having tried a Dell 15z with one. There are just so many more pixels to play with, and everything looks so much clearer.

Ack! Forgot about this thread and missed a conversation about a topic I love.

Let's try and squeeze in a word or two before this qualifies as necro.

I have actually written a detailed research paper on the development of the netbook, so I really know my stuff on this one.

The MacBook Air is an important step in the evolution of the netbook; it gave the concept of an ultraportable system a lot of publicity and interest, but Asus had coined the term 'netbook' and released the Eee PC 700 by October 2007. The Air was publicly announced in early 2008 during one of Steve Jobs' keynote addresses.

I think it's important to define what a netbook is when discussing their origins; many people will call any small-form-factor laptop a netbook, and if you go that direction there are precursors as early as 1996 with the Toshiba Libretto. I like to define the netbook as a low-cost, low-power, ruggedly built laptop with a sub-12" display. I say this because that is how the netbook started out: as a cheap, small and durable system meant for developing markets (not just the OLPC; Asus had planned to market the Eee 700 in developing markets as well, and it launched in Taiwan first).

Based on my research it's hard to attribute the netbook to anyone but Asus. The OLPC is important, but it really had far less impact than it's given credit for. The MacBook Air helped fuel the market by showing consumers something cool but expensive, making the super-cheap Asus models look all the more appealing.

Originally Posted by fdbh96: I think 1080p is the perfect res for a laptop after having tried a Dell 15z with one. There are just so many more pixels to play with, and everything looks so much clearer.

But are there really that many situations where you absolutely have to have that extra desktop space and you don't have access to an external monitor? My 5-year old ThinkPad can drive a 1080p (and probably higher) display just fine. I do quite a bit of work on my laptop and I don't find the 1024x768 res restrictive in that many cases; when I do need the extra space I can just plug in an external monitor or switch to a desktop machine.

Sure it's nice to have all that extra space - I'm not denying that - but do you really need it? Is it really worth the additional expense (and knock-on impact on battery drain)?

Originally Posted by BLC: But are there really that many situations where you absolutely have to have that extra desktop space and you don't have access to an external monitor? My 5-year old ThinkPad can drive a 1080p (and probably higher) display just fine. I do quite a bit of work on my laptop and I don't find the 1024x768 res restrictive in that many cases; when I do need the extra space I can just plug in an external monitor or switch to a desktop machine.

Sure it's nice to have all that extra space - I'm not denying that - but do you really need it? Is it really worth the additional expense (and knock-on impact on battery drain)?

Some of us don't have the ability to use a high-res screen all the time. I use my laptop to read notes during class at uni (and often take notes as well), so being able to split a big, high-resolution screen is far nicer than constantly alt+tabbing on a low-resolution screen. In fact, when I am at home, I use a dual-screen layout, with documentation/manual/notes/textbook/book on one screen and actual work/source code on the main screen. When gaming/browsing/having fun, I keep temperature monitors and IRC chat on the secondary monitor so I can easily glance back and forth.

For me, I really need a big high-resolution screen since I spend a LOT of time reading, and it's nice to have when I'm doing more relaxed tasks.

In terms of actual manufacturing costs, increasing the resolution isn't all that expensive, since all the R&D has been done already. Just because Dell and co. put a high markup on it doesn't mean it's all that much more expensive at the base.

Finally, bigger, higher-resolution screens have practically NO effect AT ALL on battery life for these reasons:

1. The lamp is almost always exactly the same (if not more efficient on the higher-end models), and the panel takes in a minimal amount of power to do the actual rendering, so no difference there.
2. During general 2D/Aero usage there is no practical difference, since the GPU won't even bother ramping clocks and voltage up (yes, I have swapped a lot of resolutions and the GPU just doesn't care; the actual GPU load remains at 2-3%), and during 3D usage the GPU will be maxed by design regardless of resolution.

So really, it's just a matter of manufacturers hurrying up deployment already.

Originally Posted by BLC: But are there really that many situations where you absolutely have to have that extra desktop space and you don't have access to an external monitor? My 5-year old ThinkPad can drive a 1080p (and probably higher) display just fine. I do quite a bit of work on my laptop and I don't find the 1024x768 res restrictive in that many cases; when I do need the extra space I can just plug in an external monitor or switch to a desktop machine.

Sure it's nice to have all that extra space - I'm not denying that - but do you really need it? Is it really worth the additional expense (and knock-on impact on battery drain)?

I'd say it's more than just a matter of space. When you use a resolution as low as 1024x768 then yes, some people will care about the space difference. But for many people who use 1680x1050 or higher, space isn't the problem anymore; it's image quality. Once you start reaching resolutions near the 2000s (in width), default fonts become hard to read and everything looks tiny, so you need to resize everything to look larger.

Currently, Windows is the only modern OS that has no way to compensate for limited space. All programs must be crammed into the same taskbar and desktop unless you have 2+ monitors, which is just an expense, whereas Linux, Mac, FreeBSD and others offer multiple workspaces/desktops that you can switch between on one monitor. This is an immensely useful feature to me.
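
The "everything looks tiny" effect is simple arithmetic: a point is 1/72 of an inch, so any glyph drawn at a fixed pixel height shrinks physically as panel density rises. A minimal sketch; the 13px glyph height and the densities are illustrative assumptions, not values from any particular OS:

    def apparent_pt(glyph_px, ppi):
        """Physical size, in points, of a glyph drawn at a fixed pixel height."""
        return glyph_px / ppi * 72

    # Illustrative: a UI that hard-codes a 13px glyph, at assorted densities
    for ppi in (96, 150, 250):
        print(f"at {ppi} ppi, a 13px glyph appears ~{apparent_pt(13, ppi):.1f}pt tall")

That gives roughly 9.8pt, 6.2pt and 3.7pt: without DPI-aware scaling, text designed for ~96 ppi becomes unreadable on a 250 ppi panel.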

Originally Posted by schmidtbag: I'd say it's more than just a matter of space. When you use a resolution as low as 1024x768 then yes, some people will care about the space difference. But for many people who use 1680x1050 or higher, space isn't the problem anymore; it's image quality. Once you start reaching resolutions near the 2000s (in width), default fonts become hard to read and everything looks tiny, so you need to resize everything to look larger.

Currently, Windows is the only modern OS that has no way to compensate for limited space. All programs must be crammed into the same taskbar and desktop unless you have 2+ monitors, which is just an expense, whereas Linux, Mac, FreeBSD and others offer multiple workspaces/desktops that you can switch between on one monitor. This is an immensely useful feature to me.

As someone who uses 8pt fonts (limited by density) regardless of resolution for high-density text (yes, even on a 2560 panel; hell, I use 9px-high text on my phone, which is even smaller at normal viewing distances), I have to disagree on your first point.

In addition, IQ (image quality) is gradually becoming a focus. Dell, for one, offers a full-blown IPS panel in their Precision laptops, so it's far from difficult to do. And the cheaper 1080p TN panels in the XPS laptops manage some pretty good colour accuracy despite being shitty TN, so again: RAISE THE BAR ALREADY!

On the subject of multiple workspaces, I have to agree that Windows is hopeless there, although KDE intends to port its whole DE to Windows at some point, which means other major DEs like GNOME and Unity might well make it across.
