
==NVIDIA driver requirements==


The 300-series NVIDIA driver changed this functionality and cannot be used for judder-free playback. Use 295.xx instead; see this thread for details: http://www.gossamer-threads.com/lists/mythtv/users/529400#529400

How to achieve judder-free, perfectly synced playback

==Overview==

You can make MythTV set your display mode to best match the video being played.
This is something commonly done by Blu-ray players, the PS3, etc.

If you watch a 24 fps video, it usually sets your display to 24 Hz.

There are two scenarios here. For most people the basic one will be sufficient; if you want the best possible result, you will be interested in the advanced case.

In both cases, there is only one option to set in MythTV.
Go to Utilities/Setup -> Setup -> Appearance.
Go to the "Video Mode Settings" page and check "Separate video modes for GUI and TV playback".

For GUI and Video Output, enter the native resolution of your screen; for a full HD TV, that is 1920x1080.

Rate should be set to "Any" so that Myth will select the best available one.

Select Next until you reach the end of the configuration screen.

At this stage, it is assumed that MythTV will perform the required de-interlacing of the signal; you cannot force MythTV to output an interlaced signal through this mechanism.

==Basic scenario==

MythTV can extract the frame rate from the video, and will then automatically set the screen refresh rate to best match the content being played. So when watching a 24 fps video, if your TV supports 24 Hz, that is what it will use.
If there is no matching refresh rate, it will use the highest one available (usually 60 Hz).
MythTV gives higher priority to rates that are twice the required frame rate; e.g. for a 25 fps video (50 Hz interlaced), if your TV supports both 25 Hz and 50 Hz, 50 Hz will be used.
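The selection priority described above can be sketched with a small shell function. This is only an illustration of the rule, not MythTV's actual code; `pick_rate` and the integer-only rates are assumptions for the sketch:

```shell
#!/bin/sh
# Illustrative sketch of the rate-selection priority (NOT MythTV source):
# prefer twice the frame rate, then an exact match, then the highest rate.
pick_rate() {
    fps=$1; shift          # first argument: frame rate of the video
    double=$((fps * 2))    # remaining arguments: rates the TV supports
    for r in "$@"; do
        [ "$r" -eq "$double" ] && { echo "$r"; return; }
    done
    for r in "$@"; do
        [ "$r" -eq "$fps" ] && { echo "$r"; return; }
    done
    # No match: fall back to the highest available rate.
    printf '%s\n' "$@" | sort -n | tail -n 1
}

pick_rate 25 24 25 50 60   # 25 fps content, 50 Hz available -> prints 50
```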

No additional configuration is required except when using NVIDIA graphics adapters.
You need to either run the nvidia-settings application and uncheck "Force Full GPU Scaling" in the GPU section,

or edit /etc/X11/xorg.conf
and add in the "Screen" or "Device" section:

Option "FlatPanelProperties" "Scaling = Native"

==Advanced scenario==

One problem with the scenario described above is that the Earth isn't flat and standard refresh rates aren't integers in all parts of the world.

You may have read that PAL is 50 Hz and NTSC is 60 Hz... not quite. Some PAL standards are 50 Hz, but NTSC (and PAL-M/N) is 59.94 Hz (actually 60 Hz * 1000 / 1001).
The signal being interlaced, the effective frame rate is 29.97 fps.
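The 1000/1001 factor can be checked directly:

```shell
# NTSC rates are the nominal values scaled by 1000/1001:
awk 'BEGIN { printf "field rate: %.2f Hz\n", 60 * 1000 / 1001 }'
awk 'BEGIN { printf "frame rate: %.2f fps\n", 30 * 1000 / 1001 }'
```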

The problem is that the X11 framework that allows changing resolution and refresh rates on the fly (called XRandR) only supports integer rates. So it cannot easily tell the difference between 24 Hz and 23.976 Hz, or between 60 Hz and 59.94 Hz. If you want Myth to be able to differentiate between those rates, follow the instructions below.

With Intel and ATI graphics adapters, 24, 50 and 60 Hz is the best you will get.
With NVIDIA adapters however, NVIDIA has worked around this limitation by presenting a unique integer refresh rate per screen configuration.
So the list of available rates will show as 50, 51, 52, 53, etc., where for example 50 might be 50 Hz, 51 might be 59.94 Hz, and 52 might be 60 Hz.

To get access to those "rates", you must make sure "Dynamic TwinView" is active (that's the default).

In an ideal world, whatever your TV reports through its EDID should be sufficient. Unfortunately, it is common for a TV not to properly report all of its capabilities, or for X11 to fail to work properly with the information provided.

The idea is to ignore the EDID rates and use the EIA/CEA-861B standard modelines instead. This is done with an X option:
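The specific option is not reproduced in this copy of the article. With the proprietary NVIDIA driver, ignoring EDID-provided modes is typically done with a ModeValidation entry in the "Device" section of xorg.conf; treat the exact token below as an assumption and verify it against the README for your driver version (the identifier is a placeholder):

```
Section "Device"
    Identifier "nvidia0"
    Driver     "nvidia"
    # Ignore the mode list from the display's EDID and rely on
    # explicitly defined modelines instead (assumed token; check the
    # NVIDIA driver README for your version).
    Option "ModeValidation" "NoEdidModes"
EndSection
```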

You'll see in the "Monitor" section that "1920x1080@25", "1920x1080@29.97" and "1920x1080@30" are commented out. That's because the Panasonic PT-AE4000 doesn't support those rates. To find out which ones your TV actually supports, run xrandr.

Now go through all the values using xrandr and see which ones your screen supports. If it doesn't support a rate, it will usually display nothing, or something like "signal not supported".
So type:

xrandr -r 51

You'll see the screen flicker. The TV usually has an info button that will show the refresh rate in use.
Running: tail /var/log/Xorg.0.log
yields:
Jun 07 13:29:40 NVIDIA(0): Setting mode "1920x1080@60"
So you now know that rate 51 is 1920x1080 at 60 Hz.

In the "Screen" section, as shown in the xorg.conf above, list all the modes your monitor/TV supports.

Now that X has been configured to see all the possible refresh rates, MythTV will be able to switch to the most suitable rate as required.

This is an example of an xorg.conf for a single-screen system (Sony 46X3100 and NVIDIA ION):
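The original example is not reproduced in this copy of the article. As an illustrative sketch only: the identifiers and the exact set of modes below are placeholders, while the modelines are the standard CEA-861 timings for 1080p at 24, 50 and 60 Hz. List only the modes your own TV accepts:

```
Section "Monitor"
    Identifier "TV"
    # Standard CEA-861 1080p timings (pixel clock in MHz, then
    # horizontal and vertical active/sync-start/sync-end/total).
    ModeLine "1920x1080@24" 74.250  1920 2558 2602 2750 1080 1084 1089 1125 +hsync +vsync
    ModeLine "1920x1080@50" 148.500 1920 2448 2492 2640 1080 1084 1089 1125 +hsync +vsync
    ModeLine "1920x1080@60" 148.500 1920 2008 2052 2200 1080 1084 1089 1125 +hsync +vsync
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "nvidia0"
    Monitor    "TV"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Modes "1920x1080@60" "1920x1080@50" "1920x1080@24"
    EndSubSection
EndSection
```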

==Common problems==

===Mouse pointer appearing over the video===

On some systems, once the refresh rate has changed, you will see the mouse pointer suddenly appear over the video.

In /etc/X11/xorg.conf, in the "Device" section add the line:

Option "HWCursor" "false"

The other workaround is to move the mouse pointer to a position where it cannot be seen, using a utility such as xwit.
If you have a 1920x1080 TV, before starting mythfrontend you would do:

/usr/bin/xwit -root -warp 1920 1080

==Important notes==

For TVs not using the exact 1920x1080 or 1280x720 resolution, and that instead have a screen of, say, 1366x768 pixels, you will often find that when using the native screen resolution the TV will not accept anything but a 60 Hz signal, while it will accept 50, 59.94 and 60 Hz for a 1280x720 signal.
If this is the case, you are better off using the 1280x720 resolution. The human brain is more sensitive to motion than to pure resolution, and smooth, judder-free motion is going to be a more noticeable improvement.