
You’ve probably heard that it’s important to use your display’s native resolution – assuming you’re using an LCD flat-panel monitor rather than an ancient CRT monitor. With an LCD, using a lower resolution results in inferior image quality.

Windows generally defaults to your monitor’s native resolution, but many PC games default to lower resolutions.

Effects of Using a Non-Native Resolution

You can see the effects of using a non-native resolution yourself if you’re using an LCD monitor. Right-click your desktop and select Screen resolution. From the window that appears, click the Resolution box and select a resolution other than the one recommended for your monitor (this is your monitor’s native resolution).

After selecting a lower resolution, you’ll see the results. Fonts and images will be blurry and everything will generally look lower-quality and less sharp. This is very different from how a CRT (cathode ray tube) monitor worked. With an old CRT monitor, you wouldn’t see worse image quality when using a lower resolution.

LCD vs. CRT

In a CRT, an electron gun shoots a stream of electrons that is filtered to become the image that appears on your screen. The exact details of how a CRT monitor works are beyond the scope of this article, but the important point is that a CRT monitor can display an image at any resolution at or below its maximum resolution. When an 800×600 signal is sent to the monitor, it produces an 800×600 image that takes up the full area of the screen.

Unlike a CRT monitor, a modern LCD display contains a fixed number of individual pixels. Think of each pixel as a small light that can be one of several colors (it actually produces a color through a combination of its red, green and blue elements). The image on your screen is built from the combination of these pixels. The number of pixels in an LCD determines its native resolution – for example, a laptop with a 1366×768 resolution has 1366×768 pixels.

When an LCD monitor runs in its native resolution – 1366×768 in the example above — each pixel on the LCD corresponds to a pixel in the image sent by your computer’s video card. This produces a sharp, clear image.

What Happens When You Use a Non-Native Resolution

Now, imagine that your computer’s video card sends an 800×600 image to a 1366×768 LCD – the 800×600 image doesn’t evenly correspond to the number of pixels in the LCD. The display still uses all 1366×768 of its physical pixels, so it must interpolate (scale) the image up to fill the screen. In this example, the aspect ratios (4:3 for 800×600 and 16:9 for 1366×768) are also different – so not only will the image be enlarged, it will be distorted.
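The uneven mapping described above is easy to see in a few lines of code. This is a minimal sketch of nearest-neighbor scaling, the simplest form of the interpolation a monitor’s scaler performs (the function name is my own, and real scalers use smarter filters than this):

```python
from collections import Counter

def scale_row_indices(src_width, dst_width):
    """Map each destination column back to its nearest source column."""
    return [x * src_width // dst_width for x in range(dst_width)]

# Scale one 800-pixel-wide row of the image up to 1366 physical pixels.
indices = scale_row_indices(800, 1366)

# Count how many physical pixels each image pixel ends up covering.
coverage = Counter(indices)
print(sorted(set(coverage.values())))  # [1, 2]
```

Because 1366/800 is not a whole number, some image pixels fill one physical pixel while their neighbors fill two – that uneven mix is exactly why a non-native image looks blurry and uneven.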

This is similar to enlarging an image in an image-editing program – you’ll lose clarity and, if the image is a different aspect ratio, it will appear distorted. For example, here I’ve taken a screenshot of How-To Geek at 800×600 and enlarged it to 1366×768 (I then shrunk it, maintaining the aspect ratio, so it would fit this article.) As you can see, the image is blurry from being enlarged and distorted from being widened. This is what your LCD does when you use a non-native resolution.

When playing games on an LCD, bear in mind that using your native resolution is important for graphics quality – although other settings may be more important, as producing a larger image takes more graphics horsepower.
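To put a number on the “more graphics horsepower” point, here is a quick back-of-the-envelope pixel-count comparison (a sketch, not a benchmark – actual performance depends on far more than pixel count):

```python
# Total pixels the GPU must render per frame at some common resolutions.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080), "1366x768": (1366, 768)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["1080p"] / pixels["720p"])  # 2.25
```

1080p means rendering 2.25 times as many pixels as 720p each frame, which is why dropping resolution is such an effective (if blurry, on an LCD) way to raise frame rates.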

If you want fonts and other elements on your screen to be larger and easier to read, you should try adjusting the size of the elements in your operating system rather than changing your monitor’s resolution.

Comments (23)

If you have a 4:3 or a 16:9 monitor, you should only switch to a lower resolution with the SAME ASPECT RATIO.

Since pixel count grows quadratically with resolution, demanding games will run much faster at lower resolutions – sometimes the difference is being able to play them at 720p rather than at 1080p, now that 16:9 is the standard.

I have a 16:9 24″ screen and an AMD HD4250 integrated GPU; 720p is the only option, and 1080p blurs.

But most important: why does every OS – MS Windows and Linux alike – behave the same on this? At 1080p the fonts and icons are very small, and 720p looks better, at least for people over 40 without young eyes.

The icon and font sizes at 1080p SHOULD BE – in my opinion – the same apparent size as the defaults at 720p, and let users make them smaller if they want.

Not everybody has a 24″ screen – most are even smaller – and they use the default settings, with very small icons and font sizes.

mitcoes, I think this is all personal preference. It winds me up sometimes – when I replaced the call centre CRTs with nice shiny TFT monitors, they wanted the resolution turned down because they couldn’t see it. I could see it fine, and in my opinion they needed glasses, but people are different, so you change the settings and let them be…

We can’t change the font settings (or we couldn’t, at least, the way we used to) because the software we make and use would all get messed up – columns would shift around and some wouldn’t display at all when we did.

What I would like to know is: when I bought my TV it said it was only a 720p TV, but when I hook up my PC and Xbox they both report running at 1080i (I know that’s not 1080p, but still) – this is strange, right?

Well, native resolution may be optimal but it’s certainly not the most important thing. What’s most important is for the viewer to have long term comfort with the monitor in front of them, & this all depends on the viewer, their eyesight & the programs they use.

Yup, sorta why I still use a CRT… that and the better black levels, which are very important for graphic work. You would not believe how pissy people will get when their logo colours aren’t exactly right. Really, the thing to remember about LCD vs CRT is that CRTs scale gracefully while LCDs don’t, which gives you greater flexibility.

With games, for example, being forced to use your monitor’s native res and refresh rate may very well require more graphics power than you can muster – with a CRT you have a little more flexibility. Of course, LCDs are cheaper, lighter, smaller, and less power hungry, and do produce good image quality at native resolution.

@bobro
The reason it does that is that 1080i is roughly comparable to 720p – in fact slightly worse, imho, in quality – but uses a different technique. The “p” stands for progressive, the “i” for interlaced. Progressive is better because it draws the full frame on every refresh, while interlaced draws only alternating lines on each pass, so progressive gives better clarity.

When you select a resolution with a different aspect ratio than your screen’s physical shape, the computer attempts to fill the screen, resulting in an obviously distorted image. I have a 16:10 screen on my 27″ monitor and find that a 1680×1050 (16:10) resolution gives me the picture and text size I desire. Interestingly, this aspect ratio wasn’t even correct in the native resolution that Windows assigned to this monitor. Matching the aspect ratio of your screen resolution to the physical shape of the monitor is the most important thing to look at. 5:4, 4:3, 16:10, and now 16:9 are all ratios that either are, or have been, common in LCD monitors. 16:10 and 16:9 are commonly referred to as “wide screen” and are, in my opinion, quite poor screen shapes for anything except watching panoramic movies.
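The aspect-ratio check the commenter describes is easy to do yourself: divide the width and height by their greatest common divisor. A small sketch (the function name is my own):

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a resolution to its simplest width:height ratio."""
    d = gcd(width, height)
    return (width // d, height // d)

print(aspect_ratio(1680, 1050))  # (8, 5), i.e. 16:10
print(aspect_ratio(1920, 1080))  # (16, 9)
print(aspect_ratio(1366, 768))   # (683, 384), only approximately 16:9
```

Note the last case: 1366×768 doesn’t reduce to exactly 16:9, which is one reason resolution lists can be confusing even on “16:9” panels.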

What I find funny (or at least a humorous introspective) is that I USED TO snicker and make fun of people that would use lower resolutions, and/or purposely cling to their ‘old-school’ CRTs. I had always assumed they were just unable to let go of the old technology, or keep up with the new stuff.

Now with a few years under my belt, I’m finally beginning to appreciate what they were doing. My eyes are precious to me (even more so since I’m diabetic!), and I’d always taken for granted that they’d be all right forever. Well… now, with a bit of time, diabetes-induced damage, and humility, I am beginning to do the same things, i.e. zoom in more, wear my glasses while using the computer, and purposefully use a lower resolution to make things ‘bigger’. Live and learn, I suppose. :3

As ‘r’ said above, it’s all about individual user comfort, and what they find easiest to see. After all, these are just tools in the end, and how much benefit is a tool if it’s difficult for someone to use?

The user’s comfort is the most important thing, as has been said above. For instance, I have an aunt with a 27″ screen at a low resolution, and it’s like looking through a magnifying glass to me, but necessary for her. I’m young and have a 15.4″ laptop running 1080p, at 100% (not magnified even though the laptop OEM default is 125%). Just make sure not to strain your eyes; eyes don’t get ‘built up’ with exercise like muscles do.

I always get angry when installing Windows 7 because of display drivers. The only installation I have been happy with was on my Toshiba Satellite A500 – it managed to install all the drivers by itself! But sometimes when I install it on other computers, it’s stuck at 800×600px or 1024×768px. 1024×768px is the worst! Everything just stretches, it is so annoying!

I’ve got a 55″ plasma HDTV with 16:9 definition. Great for what’s on TV, but I HATE WIDESCREEN MONITORS!!! My monitor’s an old clunky 19″ 4:3 LCD, TFT or whatever it’s called. My PC use involves “vertical” work as opposed to “horizontal” or widescreen.

Webpages and documents are truncated so I’ve got to scroll more. Most of my image work is “tall” as opposed to “wide”, so widescreen is a nuisance and incredibly frustrating.

But I’m tricky …. I’ve got a swivel monitor, so it’s easy to change perspective for the small amount of widescreen work I do.

On occasion I have to use a laptop with a 15.6″ screen. What a pita!

So am I a dinosaur? Nope. Just comfortable, so ++ to those who go for comfort over style and grace.

Slightly off topic, but I feel better now, and the rest of the day beckons. :)

If the max CRT resolution is 1024×768 then that’s the number of holes in the mask, and it doesn’t somehow change when the video resolution changes. When the video resolution is changed to, say, 800×600, that number of pixels still fills the physical 1024×768 – not by somehow skipping shadow mask holes, but by having very smart video drivers share pixels. The image will certainly be less sharp.

Most modern video card drivers have hardware interpolation settings built in (nVidia, for example). If that doesn’t work, modern LCD screens have hardware interpolation *themselves*. It’s not like in the early days of LCDs, where (for example), an old Gateway 15″ LCD would make the text screen (BIOS POST messages) look like a blocky mess of blurry garbage (I also had to use a DOS Command Window and FoxPro 2.6 for DOS on that machine – after once or twice trying to run it full-screen, I set it to run in a window and left it that way).

I have a 40″ LCD 1080p TV, and a PS3 – the two games I’ve played on it so far run at 720p and look fine.

Interpolation mainly affects small text; for gaming or use as a second monitor, you may never notice the difference.

Having poor eyesight, I prefer to set my 24″ LCD to 1400×900 resolution despite its 1900×1200 native resolution. I find fonts and windows far too small and hard to read at the native setting, and consider readability more valuable than graphics quality – well, for me at least…

I held on to my CRTs as long as I could because of the resolution issue – very easy to change without getting distorted. But eventually I overpumped the gamma in the video card to overcome the dimming of the CRTs (CRTs get dimmer over time, and the screen begins to shrink as the coils lose efficiency), and both monitors and the video card burned out. Now my video games are distorted because they are forced to fit within the native resolution of the LCDs. A workaround would be to use a different logon account with its settings optimized for the video games, while using native resolution on the other.

You should mention that setting exactly half of the native resolution should produce a sharp image too, as the pixels will bunch into 2×2 “big pixels”. Same with 1/4, 1/16, etc. At least I assume so. Of course, as a practical matter, that usually leaves just the 1/2 option unless you have a monstrous native resolution of like 2000×something.
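The commenter’s 2×2 idea can be sketched in a few lines. Note this is the ideal case the comment assumes – in practice many monitor scalers interpolate even at exact integer ratios unless integer scaling is explicitly enabled:

```python
def block_size(native, target):
    """Integer scale factor if the target dimension divides the native one evenly, else None."""
    return native // target if native % target == 0 else None

print(block_size(1920, 960))  # 2: each image pixel maps to a clean 2x2 block
print(block_size(1366, 800))  # None: fractional scaling, so pixels get shared and blurred
```

When `block_size` returns an integer, every image pixel can be duplicated into a clean square block with no fractional sharing between neighbors.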

Why does this happen: when I use an HD TV as an external monitor for either of my laptops (via the HDMI port), even though these TVs are supposed to be 1920×1080, the image is ‘overscanned’ – too big, overflowing the limits of the screen (the taskbar sits below the bottom edge, and the title bar of a maximized window is above the top edge). I’m forced to use a lower resolution to make the image fit.

I suspect most current HD TVs do NOT have a native resolution of 1920×1080.

Everyone is missing the whole point of this article. Yeah, you can turn down the resolution on an LCD monitor, but you’re going to have a fuzzy, pixelated picture on your screen whether you notice it or not, because LCD monitors still use all their pixels regardless of whether you’re at native resolution. So if you use your computer as your TV like I do, or for gaming, and you have the resolution turned down, then your movie is going to suck. But gamers, of course, would always use native resolution. Also, it’s a total waste of money for everyone buying high-def monitors who then get them home and just turn down the resolution – you’re no longer watching high def.

I am a .net developer and I spend countless hours looking at monitors. After about two years working with computers, my eyes began to burn so badly that I thought I was going to have to look for another profession.

That was 9 years ago now and my eyes no longer burn; however, they are not what they used to be either. They strain really easily, and I have to constantly pay close attention to the time spent looking at a screen, because if I pull a couple of 10-hour days they can get overstrained again and take up to a week to heal.

I have found that if I look away every 20 minutes or so and break my work down into segments, I can get a full day in without overstraining my eyes. Because once the ol’ eyes are overstrained, they never fully get back to where they once were.

I have also found that setting the text size to 125%, turning the brightness down from the factory settings, and setting all my working environments’ backgrounds to black on 19″ LCD monitors causes less eye strain.

I’m not a gamer, so I don’t know what the best way to set up a monitor resolution would be, but if you’re a developer or programmer and have to look at one area for long periods of time, the things I’ve listed above seem to work well for me.

I have also wondered whether setting all my screens to display in black and white would help cut down on the eye strain.

@Erik: An eye doctor gave me a rule of thumb on this subject: look away from the screen every 20 minutes, for at least 20 seconds, and look at something at least 20 feet away (optical infinity – eye muscles relaxed – or close your eyes). I find that this helps a lot. Also, unrelated as it may seem, getting enough sleep is vital for healthy eyes. Using a high-contrast theme helps some people, as does optimizing the brightness of the screen.