Topic: HDMI + too much overscan... (Read 4743 times)

Short of trying to switch to a VGA cable or trying to find some setting on the TV itself, is there anything I can do about too much overscan while using HDMI out on an nVidia card? It's my understanding that nVidia + HDMI = no overscan adjustment. In LMCE it's livable, but in Myth (including watching TV, the guide, menus, etc.) the overscan is cutting off just too much. I can't even see check boxes on the side of the screen when going through setup options. I even tried changing the screen size and offset in Myth but it didn't do anything.

So modelines work with HDMI? I did a bunch of searching before asking, but I got the impression that TV-out (including HDMI) worked differently from VGA with the nVidia drivers and modelines were ignored/auto-configured (that's why I thought changing to VGA might be needed). That gives me another option.

Honestly, I have no clue what UseEDID is or where to set it, nor was I able to find any "Just Scan" type settings for the TV.

Searching for TV-specific info (Vizio VX32L) suggests I'm basically SOL with HDMI. Apparently this model has a lot of overscan over HDMI, and there are no TV settings to fix it. According to the posts, VGA does not have the same issue, but the picture is not as crisp/clear.

Not really an issue in LMCE, as I used the AV wizard to adjust the screen size (it just adjusts where the menus are drawn, right?), but trying to do the same in Myth just doesn't seem to work. Then again, I can't see whether things like separate menu/GUI options are checked or not because the check box is off the screen, but setting screen size and offset within Myth's appearance settings does nothing.

Skeptic, first, check your TV's native resolution and enable it in resolutions.conf. Then set UseEDID to true in your xorg.conf and rerun AVWizard. That should get rid of the overscan. BTW, if your resolution is not listed in resolutions.conf, check the wiki; there should be a guide on how to add the correct resolution to that file. Also make sure that the driver name is correct, as most hidden resolutions have "VIA" listed as the driver.
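For reference, UseEDID is an option for the nVidia driver that goes in the Device section of /etc/X11/xorg.conf. A minimal sketch (the Identifier string here is illustrative; keep whatever your generated file already uses):

```
Section "Device"
    Identifier "nVidia Card"      # illustrative name; keep whatever yours has
    Driver     "nvidia"
    # Let the driver read the TV's EDID and validate modes against it
    Option     "UseEDID" "true"
EndSection
```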

Itsik - that suggestion might help, but I note that if you run AV Wizard again, it tends to overwrite many of the changes you make to the xorg.conf file.

skeptic - it does seem from the modeline database that some Vizios are a problem, particularly with HDMI, however there is still hope. If you connect some consumer electronics (CE) equipment to the TV via HDMI (like a BD player or cable box) and it doesn't give you an overscanned picture, then there must be some setting that allows a valid set of timings that you can emulate; we just need to find out which. If it still gives overscan, then you have a perfectly valid reason to go back to the manufacturer and say "FIX IT!" as that is totally unacceptable for a screen. If they can't, then you have valid reason to demand a refund. TV manufacturers are often dickheads when it comes to setting these things up, but blocking a standard timing over HDMI is about as bad-faith as it gets.

So, if you do get a non-overscanned picture using some CE equipment... next you need to find the correct _native resolution_ of the panel. This should be in the specs. Just googled and found it is 1366x768... you may be better off using 720p so that the screen isn't trying to scale the image (often the cause of overscan). But if you insist on 1080, then we could try a custom modeline and slowly change the parameters to see if we can reduce the overscan and recentre the image. But consider 720p first as this will mean there is (almost) no pixel scaling, which degrades images.

I've had the TV for a couple of years, so it's too late to take it back. I also tried Itsik's suggestion, and after learning about the existence of those files I found a wiki. I had to modify Resolutions.conf, xorg.conf.in, as well as a script (it's in the wiki, but I forget which). After working through a number of things and running the A/V wizard at least half a dozen times, still no joy.

More googling and lots of reading later, it appears my only solution is to use the VGA connector instead of HDMI. Apparently the HDMI input only supports up to 720p (1280x720), whereas the VGA connector will let the TV use its max native resolution of 1366x768. I also noticed that if I enable UseEDID it kicks out errors about unsupported resolutions during boot. Unfortunately I don't have a single other HDMI device to even try. Too much overscan and no way to fix it seems to be a common complaint with this model.

I just ordered a 10' VGA cable, so hopefully that will solve the issues.

If you are happy to use the VGA port then I agree that is the best way to go. But do note that the difference between 768 and 720 is very marginal, and the typical source material in this range will always be 720 not 768 (720 is a video/media resolution, 768 is a computer resolution), thus either your video chip or TV will have to scale up the image from 720 to 768 which will reduce quality slightly. For the best possible quality, it might be better to configure 720p without scaling... this will leave very thin black bars at the top and bottom (3% of screen height each)... an option...
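The 3% figure is just the leftover lines split between top and bottom; a quick check (assuming a 768-line panel showing a 720-line image mapped 1:1, no scaling):

```python
# Where the "3% of screen height" figure comes from, assuming a 1366x768
# panel displaying a 1280x720 image pixel-for-pixel (no scaling).
native_h = 768                   # panel's vertical resolution
video_h = 720                    # 720p source material
unused = native_h - video_h      # 48 lines left over
per_bar = unused // 2            # 24 lines in each black bar
print(per_bar / native_h * 100)  # 3.125 -> the ~3% bars mentioned above
```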

Also, if you mean that the /var/log/Xorg.0.log file is generating errors, then that is perfectly normal... that is the purpose of EDID: the video card interrogates the TV, and then tests all the resolutions it is capable of against the capabilities of the TV. It then builds a table of all the valid resolutions that it will allow... the messages are simply informing you that some resolutions it tested are not valid for that TV because of restrictions on its capabilities, limitations set in the xorg.conf file, etc...

Skeptic, it just came to me that I had a TV I was struggling with, trying the same things I suggested to you with no luck, and for some strange reason 1280x800 (custom) (that's how it's written in the AVWizard...) worked. I have no idea why, but it worked, so I'd try it just in case. Can you write down the make and model of your TV, or better yet, provide a link to download its user's manual?

Quote

If you are happy to use the VGA port then I agree that is the best way to go. But do note that the difference between 768 and 720 is very marginal, and the typical source material in this range will always be 720 not 768 (720 is a video/media resolution, 768 is a computer resolution), thus either your video chip or TV will have to scale up the image from 720 to 768 which will reduce quality slightly.

I'd rather stick with 720p over hdmi, but the overscan is too extreme.

Quote

For the best possible quality, it might be better to configure 720p without scaling... this will leave very thin black bars at the top and bottom (3% of screen height each)... an option...

I don't mind a thin black bar, but what I'm seeing is overscan cutting off the edge of the picture (ok) and cutting off part of the menus in Myth (not ok). I'm not sure what you mean by not scaling though. As far as I know there is no scaling on/off option.

Quote

Also, if you mean that the /var/log/Xorg.0.log file is generating errors, then that is perfectly normal... that is the purpose of EDID: the video card interrogates the TV, and then tests all the resolutions it is capable of against the capabilities of the TV. It then builds a table of all the valid resolutions that it will allow... the messages are simply informing you that some resolutions it tested are not valid for that TV because of restrictions on its capabilities, limitations set in the xorg.conf file, etc...

Right, I'm not worried about it, just pointing out that it's kicking out error messages on screen (not in logs) that it wasn't before. This just indicates to me it's actually doing EDID checking properly.

Quote

Skeptic, it just came to me that I had a TV I was struggling with, trying the same things I suggested to you with no luck, and for some strange reason 1280x800 (custom) (that's how it's written in the AVWizard...) worked. I have no idea why, but it worked, so I'd try it just in case. Can you write down the make and model of your TV, or better yet, provide a link to download its user's manual?

Itsik

At this point I've ordered the VGA cable and I'll try that next. 1280x800 doesn't make any sense, but if I have issues over VGA I'll certainly give it a shot.

quick update - the VGA cable came in and now I remember why I'm using HDMI in the first place. I have one of the nvidia cards with the bug where X will ONLY output through DVI/HDMI. I have upgraded the nVidia driver hoping that I could get X through VGA (nope) or that I'd have more control over screen size with the new driver (nope).

I'm pretty frustrated right now: a TV that can't do HDMI without way too much overscan, and a video card that can't do VGA out. I'll get around to trying other resolutions in the wizard and maybe hunt down other modelines, but I don't have much hope. During boot, when the motherboard splash screen and the GRUB loader messages come up, it's cutting off a full line of text at the top and bottom and 3+ characters of text on the sides.

I may just see if I have another nvidia video card in the closet. The current 7300 SE is pretty weak, and I just need something to get me by until my basement is finished, 810 comes out, and I can make this a dedicated core and switch to Revos or similar ION based MDs.

If you start playing with the custom modelines, you should be able to manipulate the overscan and centring. The objective being to find a set of timings that the screen will accept, and that remove your overscan.

ModeLine "1920x1080" 148.5 1920 1960 2016 2200 1080 1082 1088 1125

The first number is the pixel clock in MHz. So here, we have 148,500,000 pixels per second. The rest of the numbers all indicate timing positions as the image is scanned out onto the screen from left to right, top to bottom, in pixels (not seconds!). They come in two groups of four: horizontal first, then vertical. The first value in each group (1920 and 1080) is the actual number of visible pixels. The second (1960 and 1082) is the start of the sync pulse that tells the beam to return to the left or top of the screen... so here, the horizontal sync starts 40 pixels after the right edge of the screen. The third (2016 and 1088) is the end of that sync pulse, so the horizontal sync pulse is 56 pixels wide. Finally, the last value in each group (2200 and 1125) is the position at the end of that scan line or frame.

You need to keep the 1920 and 1080 the same, of course. And note that the 148.5 MHz pixel clock is the entire frame multiplied by the refresh rate (2200 x 1125 pixels x 60 frames per second = 148.5 Mp/s, or MHz). Ensure you recalculate this if you change the overall size of the frame, and don't go too high with that number, as the screen will eventually not be able to comply.
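The pixel clock arithmetic is easy to check for yourself; here is the quoted modeline's frame worked through (just a sanity check, nothing nVidia-specific):

```python
# Sanity-check the 148.5 MHz pixel clock from the quoted 1080p modeline.
h_total = 2200   # total pixels per scan line (visible + blanking)
v_total = 1125   # total lines per frame (visible + blanking)
refresh = 60     # frames per second

pixel_clock = h_total * v_total * refresh
print(pixel_clock)        # 148500000 pixels per second
print(pixel_clock / 1e6)  # 148.5 -> the first number in the modeline, in MHz
```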

If you move the start of the sync pulse left or right, you will move the screen left or right (same for the vertical pulse). The width of the pulse doesn't matter all that much, but don't make it too small, just move it by the same amount to retain its size, initially. If you change the overall size of the screen, by changing the 2200 or 1125, you will scale the screen larger or smaller. But when you do that, you will need to recalculate the pixel clock again.

So you can see from here that there is no single modeline for a particular resolution. There are very many! Moreover, with those two variations I described, you can scale the screen down to rid yourself of overscan, then recentre it properly. You only need to be aware that sometimes you will hit a minimum or maximum timing value (or combination of values!) that the screen cannot handle, or a pixel rate that either the screen or the video card cannot handle. Hopefully you will not, but if you do, you can see from above that there are many other combinations of these numbers that will generate the same resolution, screen size and position, so you will likely have to tinker quite a bit to get the combination correct.

Start by changing the overall screen size (2200 and 1125) and recalculating the clock. Just to get an idea as to whether you can scale the screen small enough. If so, then you can work on recentring it.
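To make the trial-and-error less error-prone, the recalculation step can be scripted. This is just a sketch of the arithmetic described above (the function name is mine, not part of any tool):

```python
def modeline_clock_mhz(h_total, v_total, refresh_hz=60):
    """Pixel clock (MHz) needed to scan an h_total x v_total frame at refresh_hz."""
    return h_total * v_total * refresh_hz / 1e6

# The quoted 1080p frame:
print(modeline_clock_mhz(2200, 1125))  # 148.5

# Growing the total frame (more blanking around the same 1920x1080 visible
# area) scales the on-screen picture, but the clock must rise to hold 60 Hz:
print(modeline_clock_mhz(2300, 1150))  # 158.7
```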

Success!!! I'm not touching a thing! What I did was dig through my box of computer junk and found a PCI-e nVidia 6600.

I did have to use 720p instead of the native 1366x768 (custom) that I had previously enabled in the setup. The screen would go blank, and I'm not sure why when googling says it's the correct option, but I don't care. Using VGA instead of HDMI makes everything fit just like it should. Not only that, but I'm quite surprised to see the already clean, clear picture actually looks a little better. Colors are deeper, and it looks a bit more clear. Probably due more to per-input picture settings than actual connection type.

The only downside is that the $7 10-ft VGA cable I bought off eBay was defective. For now I'm stuck with a cable dangling down from my TV to my computer instead of run through the wall into the back of the media stand, but that's easy enough to resolve.

I probably should have spent more time trying to see if a custom modeline would fix HDMI, that last post from Colin would have been VERY helpful had I waited instead of throwing hardware at the google answer of switching to VGA. 'Course my TV is only 720p so I'd have to find a different modeline to start with.
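In case it helps the next person: for a 720p panel, the standard CEA-861 timing for 1280x720 at 60 Hz is the usual modeline to start from (verify it against your own panel's EDID before relying on it):

```
# Standard 720p60 timing: 74.25 MHz clock, 1650 x 750 total frame
ModeLine "1280x720" 74.25 1280 1390 1430 1650 720 725 730 750
```

The same scale-and-recentre tinkering described above applies; only the starting numbers change.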