
I have a 1920x1200 monitor, which I got a few weeks back, but I had to search hard and pay a little more for it. I think the deal is that 1920x1080 is the standard HD TV resolution.
– Lawrence Dol, Sep 18 '10 at 6:59


@randomguy: actually, before the advent of "HD" the standard for 24" monitors was 1920x1200, a 16:10 ratio, the standard widescreen ratio for computer monitors. However, after "HD" was invented, the people in marketing decided that dumb consumers would be more likely to buy something with the new catchphrase "high definition" tagged onto it, thus we see many, many more 1920x1080 monitors than 1920x1200 monitors. Personally, I own a 1920x1200 monitor, which I searched long and hard for because I'm more computer literate than the average consumer. 1920x1200 is better than HD!
– Faken, Sep 21 '10 at 2:36


@Faken: I agree. I really appreciate the extra space as a programmer.
– randomguy, Sep 21 '10 at 12:11


Extra vertical pixels are all well and good, but everyone knows that the main reason to have a 1920x1200 display is so that, when you're watching a full HD movie that isn't anamorphic, you can still bring up the media player controls without obscuring any of the video.
– Lukasa, Jun 18 '11 at 15:22

4 Answers

My guess... because the technology is so DANG expensive. You can get higher-resolution monitors, but they are really expensive. 1080p is pretty much the highest mainstream standard, and most 52" HDTVs look great at it, so a 24" monitor won't show the difference. There is technology, though, that can take the resolution all the way to 7680x4800!

Another thing to consider is that at those high resolutions, there isn't going to be much difference between the quality of 1920x1080 and 7680x4800 at the smaller screen sizes (19 to 24 inch). If you're cramming 4800 vertical lines onto a screen that's only 10 to 12" tall, it's going to be difficult to notice the difference. Even a comparison of 1080p and 720p will not yield major differences in the quality of gaming or playback.
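To put numbers on this, here's a minimal sketch of the pixel-density (PPI) arithmetic for a hypothetical 24" panel; the resolutions are illustrative figures, not specific product specs:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a display of the given resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)  # pixel count along the diagonal
    return diagonal_px / diagonal_in

# Compare pixel densities on an assumed 24" diagonal
for w, h in [(1280, 720), (1920, 1080), (7680, 4800)]:
    print(f'{w}x{h} @ 24": {ppi(w, h, 24):.0f} ppi')
```

At 24", 1920x1080 works out to roughly 92 ppi, while 7680x4800 would be close to 380 ppi — far beyond what the eye resolves at normal desktop viewing distances.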

(However, as mentioned in other answers, there will be a noticeable difference in resolution)

Finally, DPI scaling is barely usable in XP, and I'm guessing the same may be true of some Linux distros (not sure), which makes higher resolutions harder to manage, since text becomes harder to read.

Well, it's not about noticing a difference; one can definitely benefit from a larger, higher-resolution monitor for computing. I can see myself much happier on a 27" monitor with a higher resolution, because I'd have much more screen space for coding.
– Robert Koritnik, Oct 25 '10 at 7:30

@RobertKoritnik That is true; however, the question states the 19 to 24 inch monitor range.
– KronoS, Oct 25 '10 at 14:19


I hope you are trolling and this is not your real opinion, or I feel sorry for you. You can't seriously compare a 52" HDTV for video with a computer display, where you read text and sit much closer. It's not that important if a video loses detail here and there, but if you program or read on a monitor, you will be glad to have the higher resolution. Why do you think so many OSes, including Windows XP, resort to hacks like subpixel rendering for fonts in the first place?
– user643011, Jan 20 '13 at 13:02

Single-link DVI has a maximum throughput at 60 Hz of about 2.75 megapixels, which gives a maximum screen resolution of 1,915 × 1,436 pixels (4:3) or 2,098 × 1,311 (16:10). This is the most common.

However, DVI has provision for a second link carrying another set of RGB pairs. This allows for resolutions up to about 4 megapixels at 60 Hz.
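The single- vs. dual-link distinction comes down to pixel-clock budget: single-link DVI tops out at a 165 MHz TMDS clock, and two links double that. Here's a rough sketch; the 5% blanking overhead is an assumption standing in for reduced-blanking timings, not an exact figure from any spec:

```python
SINGLE_LINK_CLOCK_HZ = 165e6   # single-link DVI TMDS pixel clock limit
DUAL_LINK_CLOCK_HZ = 330e6     # a second link doubles the pixel budget

def required_pixel_clock(width, height, refresh_hz, blanking_overhead=0.05):
    """Approximate pixel clock a mode needs. blanking_overhead is an assumed
    ~5% on top of the active pixels (reduced-blanking timings are in this range)."""
    return width * height * refresh_hz * (1 + blanking_overhead)

for mode in [(1920, 1200, 60), (2560, 1600, 60)]:
    clk = required_pixel_clock(*mode)
    link = "single-link" if clk <= SINGLE_LINK_CLOCK_HZ else "dual-link"
    print(f"{mode[0]}x{mode[1]}@{mode[2]}Hz needs ~{clk/1e6:.0f} MHz -> {link} DVI")
```

This is why 1920x1200@60 Hz just squeaks under the single-link limit, while 2560x1600@60 Hz (around 258 MHz by this estimate) forces you onto dual-link DVI.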

Some WQXGA (2560x1600) displays, which exceed the normal maximum by utilizing dual-link DVI, come from Apple, Dell, Gateway, HP, NEC, Quinux, and Samsung.

I think you will need a graphics card that supports dual-link DVI! Give it time; the cost will come down and they will be commonplace.

Hope that helps.

EDIT:
There is a new technology coming out from Intel and Apple; Apple calls it Thunderbolt.
Essentially, it's a universal interface that supports 10 Gb/s throughput. You might soon see displays that utilize Thunderbolt and higher resolutions. Looking forward to it!

Let me clarify a few things. "Cheap" monitors (under, say, $400) are almost always TN panels, with all their attendant problems. When you move to IPS (In-Plane Switching) panels, you're entering the land of graphics professionals, who usually use 30" monitors running at 2560x1600 (16:10). This still gives a reasonably small dot pitch. Then there is the iMac 27", which runs at 2560x1440 (16:9). It's a gorgeous monitor, but has several drawbacks: a glossy screen, and LED backlighting that bleeds around the edges (like all LED backlighting).

Professional monitors avoid both of those faults. If you want the best of both worlds, have a look at the Dell U2711, a 27" IPS monitor with a matte screen and the more traditional cold-cathode backlighting. It has, believe it or not, a 0.233 mm dot pitch! As far as I know, that's the finest dot pitch available on any of today's monitors. It's available only from Dell, and costs around $1200.
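That 0.233 mm figure follows directly from the panel geometry; here's a quick sketch of the calculation for a 2560x1440 panel on an assumed 27" diagonal:

```python
import math

def dot_pitch_mm(width_px, height_px, diagonal_in):
    """Physical distance between adjacent pixels, in millimetres."""
    diagonal_px = math.hypot(width_px, height_px)   # pixels along the diagonal
    return diagonal_in * 25.4 / diagonal_px         # 25.4 mm per inch

# 27" panel at 2560x1440 (the U2711's resolution)
print(f"{dot_pitch_mm(2560, 1440, 27):.3f} mm")  # → 0.233
```
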

To add to the general confusion, the highest resolution available today in the PC world is 2560x1600. If you need higher resolutions, you'll have to go to ultra-expensive professional graphics cards.

2560x1600 is the current "maximum" of consumer-level hardware as of 2011 or so, since that is the maximum resolution of most graphics cards (you can check this yourself in the specs of many consumer and professional graphics cards; I haven't checked the ATI cards as thoroughly as the NVidia ones, though). Higher resolution would require either a specialized graphics card, or a quad-display configuration with two graphics cards and perhaps an adapter; I am not entirely sure about the specifics.

Reasons 1920x1200 may be the highest resolution produced

The reasons monitors tail off around 1920x1200 are probably economic: consumers who don't know the difference between screen size and resolution, economies of scale, the push to 120 Hz "3D" (alternating shutter glasses) using up bandwidth, new LCD technologies starting off small, and so on. However, there may also be a minor technical factor: normal (single-link) DVI supports at most WUXGA (1920x1200) at 60 Hz, and one requires dual-link DVI (a modified version of the DVI cable with extra pins) to reach higher resolutions or refresh rates.

Or, as you say, it could be a conspiracy of price fixing, as another answer suggests. There are times I have jokingly wondered that myself, given the trend of decreasing monitor resolution, especially in laptops.

1920x1080 is also "Full HD"... and we know the power of marketing.

Special high-resolution monitors

There are a few rare monitors (e.g. the IBM T220/T221, ~22" monitors) with extremely high resolution, but the DPI is so high that the pixels blur together if you don't have 20/20 vision.
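To see just how extreme those panels are, here's the pixel-density arithmetic for the T221, which by its published specs packs 3840x2400 pixels into a 22.2" diagonal:

```python
import math

# IBM T221 published specs: 3840x2400 pixels on a 22.2" diagonal
width_px, height_px, diagonal_in = 3840, 2400, 22.2
ppi = math.hypot(width_px, height_px) / diagonal_in
print(f"{ppi:.0f} ppi")  # → 204
```

That is more than twice the density of a typical ~92 ppi 24" 1080p monitor of the era.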

If one enters the realm of medical imaging or projectors, it is possible to achieve extremely high resolutions, but you have to be a millionaire (or company) to afford them.

Why resolution isn't the whole story

Even then, resolution isn't the whole story of a monitor, since there are issues of latency. Some monitors (like the T220s) cannot play movies or drag windows smoothly because the rise/fall latency of the pixels leads to ghosting. Many high-resolution monitors suffer from major issues (like the pink/blue tinted areas on the 30" Dells), and one may need to buy a stand to adjust the height. If you are reading this while choosing a monitor, I would encourage you to pay attention to the details.