Clock jitter is caused by a lack of synchronization between the three playback clocks: the system clock, the video clock and the audio clock. When the speeds of the audio and video clocks differ by subtle amounts, occasional clock adjustments must be made during playback to restore audio/video synchronization. This results in a few dropped or repeated frames as the video plays through its runtime.

The system clock always runs at 1.0x. The audio and video clocks tick away independently of each other. Proper synchronization between the audio and video requires that the video clock perfectly match the audio clock. However, these two clocks will always have some imprecision, caused by subtle inaccuracies anywhere in the chain of A/V hardware, video drivers and playback software, that produces some deviation from a perfect 23.976 Hz clock speed.

The audio clock is used as the reference clock for video playback. Any difference between the video clock and audio clock in relation to the system clock is indicated by the reported display refresh rate and audio clock deviation displayed in the madVR OSD.

madVR Debug OSD:

Let's use an example:

display (video): 23.97859 Hz

With an ideal value of 23.976 Hz, the 23.97859 Hz rate of the video clock means it is faster than the system clock.

clock deviation (audio): 0.00580%

With a deviation of 0.00580% (23.976 * (1 + [0.00580 / 100]) = 23.97739 Hz), the audio clock is also slightly faster than the system clock. This would be acceptable if the audio clock happened to match the video clock. However, this is not the case:

The audio and video are out-of-sync. This small deviation would lead to a slow drift between the audio and video during playback. The video clock yields to the audio clock — a frame is repeated or dropped every few minutes to resynchronize.
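To see where "every few minutes" comes from, here is a minimal sketch using the example OSD numbers above. It estimates how long playback runs before the two clocks drift apart by one full frame period:

```python
base_hz = 23.976             # ideal frame rate
video_hz = 23.97859          # display refresh rate reported by the OSD
audio_deviation_pct = 0.00580

# effective audio clock rate, as computed in the text
audio_hz = base_hz * (1 + audio_deviation_pct / 100)   # ~23.97739 Hz

# relative drift between the video and audio clocks (seconds per second)
drift_per_second = abs(video_hz - audio_hz) / audio_hz

# time until the accumulated drift equals one frame period,
# forcing madVR to drop or repeat a frame
frame_period = 1.0 / base_hz                   # ~41.71 ms
seconds_between_corrections = frame_period / drift_per_second

print(round(seconds_between_corrections / 60, 1))  # about 13.9 minutes
```

With these particular numbers, a frame would be dropped or repeated roughly every 14 minutes, which matches the "every few minutes" behavior described above.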

Creating a custom mode is a means to improve the synchronization of the video clock in relation to the audio clock. This should result in fewer dropped or repeated frames.

The number of dropped or repeated frames estimated in the OSD should be kept in perspective. In a 23.976 fps source, each dropped or repeated frame only lasts 41.71 ms. Most viewers will never notice these occasional hiccups because they are so brief and often blend naturally into the next frame. The actual number of dropped or repeated frames also tends to be a tiny fraction of the total number of frames in the source file.

For example:

In a two-hour, 24 fps movie, there are 172,800 total frames displayed.

10 dropped frames / 172,800 total frames x 100 = 0.00579% of all frames in the source are lost.

Only 0.00579% of all frames in the video are either dropped or repeated. That is not to say a perfectionist couldn't strive to improve these numbers by creating a custom resolution with madVR!
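The arithmetic above is easy to check:

```python
fps = 24
runtime_hours = 2
total_frames = fps * runtime_hours * 3600   # 24 fps * 7200 s = 172,800 frames

dropped = 10
lost_pct = dropped / total_frames * 100

print(total_frames, round(lost_pct, 5))     # 172800 0.00579
```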

(2016-05-13, 00:00)hassanmahmood Wrote: Wouldn't a 980 Ti be better than a 950/960?

The 980 Ti doesn't support HEVC hardware decoding. Only the 950/960 include this feature.

This is a dedicated VPU (video processing unit) devoted to decoding various video codecs (H.264, VP9, etc.). It has nothing to do with the processing power of the card.

I was a little unsure about this, but according to the doc below, Warner is right. The 950/960 show as VP7 / feature set F (the latest, and the first to do dedicated HEVC decoding), whereas the 980 Ti is VP6 / feature set E.

(2016-05-14, 09:38)gotham_x Wrote: How should the new algorithms (reduce ringing and reduce dark halos) be used, and with which sources is there a benefit in activating them?

You can use anti-ringing with all sources, but not all sources will display ringing. If edge enhancement is bothersome to you, I would use anti-ringing. Sources without edge enhancement will not need anti-ringing.

Hi Warner306, thanks for your reply. I imagined as much; I just wanted some confirmation.
I'll take this opportunity to ask about something that I think creates some confusion if you are not well informed.
The new GPUs, Pascal and Polaris, are coming soon. Nvidia's Pascal specs show it will have HDMI 2.0b, while AMD's Polaris specs show it will have HDMI 2.0a.
HDMI 2.0a carries static HDR-10 metadata, so my question is very simple: if HDMI 2.0a conveys static HDR-10 tags, what is the difference with Nvidia's HDMI 2.0b? Is it crippled for carrying static HDR-10 tags, or limited to less than 18 Gbit/s of bandwidth?

(2016-05-20, 23:40)gotham_x Wrote: Hi Warner306, thanks for your reply. I imagined as much; I just wanted some confirmation.
I'll take this opportunity to ask about something that I think creates some confusion if you are not well informed.
The new GPUs, Pascal and Polaris, are coming soon. Nvidia's Pascal specs show it will have HDMI 2.0b, while AMD's Polaris specs show it will have HDMI 2.0a.
HDMI 2.0a carries static HDR-10 metadata, so my question is very simple: if HDMI 2.0a conveys static HDR-10 tags, what is the difference with Nvidia's HDMI 2.0b? Is it crippled for carrying static HDR-10 tags, or limited to less than 18 Gbit/s of bandwidth?

the french site made an "error" by calling a 10 gbit/s chip HDMI 2.0b and an 18 gbit/s chip HDMI 2.0a. neither spec existed at the time and, more importantly, that's not what those names mean.
they were using the labels to give a 10 gbit/s HDMI 2.0 device a name, but that is backfiring now...
and it is not the first time HDMI has used an a, b or even c for a new spec.

an HDMI 2.0b sticker doesn't mean a device can do everything HDMI 2.0b allows, or that the device even needs more than a 10 gbit/s connector.

you can assume all new cards can send 18 gbit/s.
all maxwell 2.0 cards are 18 gbit/s already.
all kepler/maxwell 1.0 cards can do basic HDMI 2.0 "stuff" at 10 gbit/s without HDCP 2.2, and these cards are older than HDMI 2.0. if nvidia updates them and gives them the ability to send HDR, they would technically be HDMI 2.0a, still without HDCP 2.2 and of course still at 10 gbit/s.

b is the new spec btw.

i don't see a hardware-level change from hdmi 2.0a to hdmi 2.0b, so polaris and vega could be updated to, or released with, hdmi 2.0b too, and the same goes for maxwell 2.0 cards.
hdmi 2.0b is so new it is not on the wiki yet: https://en.wikipedia.org/wiki/HDMI

and you have to look at what a device can actually do, not what number is written on it.
sorry, but the HDMI spec is not transparent at all.
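The 10 vs. 18 gbit/s figures in this thread can be derived from the HDMI TMDS link itself. A rough sketch, assuming the standard pixel clocks of 594 MHz for 2160p60 and 297 MHz for 2160p30 (both at 4:4:4, 8-bit):

```python
def tmds_gbps(pixel_clock_mhz):
    # HDMI carries 3 TMDS data channels; each transmits 10 bits
    # per pixel clock (8 data bits + 8b/10b encoding overhead)
    return pixel_clock_mhz * 3 * 10 / 1000

print(tmds_gbps(594))  # 17.82 -> needs a full "18 Gbit/s" HDMI 2.0 link
print(tmds_gbps(297))  # 8.91  -> fits within an older 10.2 Gbit/s link
```

This is why 2160p60 at full chroma requires the 18 gbit/s link, while a "10 gbit/s HDMI 2.0" device tops out at 2160p30 4:4:4 (or 2160p60 with 4:2:0 subsampling).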

Thanks for the explanation. If I have understood your reasoning correctly, in practice only Pascal with its HDMI 2.0b would need a firmware update to convey static HDR-10 metadata, whereas the competing Polaris with its HDMI 2.0a would already be able to convey static HDR-10 metadata?

my oled tv fully reproduces 4:4:4 chroma only if uhd deep color is on, the hdmi input is labeled as pc, and it is fed a 2160p, 4:4:4, 0-255 htpc signal.
is there any advantage in terms of picture quality in using this mode with everything at 0-255, or shall i stick with gpu full, madvr limited and tv limited video mode?

(2016-05-21, 10:01)gotham_x Wrote: Thanks for the explanation. If I have understood your reasoning correctly, in practice only Pascal with its HDMI 2.0b would need a firmware update to convey static HDR-10 metadata, whereas the competing Polaris with its HDMI 2.0a would already be able to convey static HDR-10 metadata?

HDMI 2.0a can send HDR so HDMI 2.0b can do the same.

but here we are again that doesn't mean that any device with HDMI 2.0a or HDMI 2.0b supports this.

nvidia said that pascal can output HDR using DP 1.4 and HDMI 2.0b.

but at the same time they are saying HDMI 2.0b isn't new on pascal, so maxwell has it too: http://i.imgur.com/zAXoCHT.jpg
at the end it is not possible to send this data yet anyway.
and i personally still believe that letting madVR handle all these conversions gives a better result.

and again, HDMI 2.0b is the newest spec, not HDMI 2.0a. AMD is "outdated", but that doesn't really matter; it is just an older spec.

Hello and thank you for explaining all the different settings in madVR, it's really helpful. I noticed that you didn't write a profile with the best settings for a 1440p monitor (I have a 144Hz 2560x1440 monitor) so I'm wondering what settings I should choose for the best image quality. The source material is 720p/1080p Blu-Ray but also some old SD movies. My CPU is a 4770K and my GPU is a 980 Ti.

(2016-05-22, 16:23)dolken Wrote: Hello and thank you for explaining all the different settings in madVR, it's really helpful. I noticed that you didn't write a profile with the best settings for a 1440p monitor (I have a 144Hz 2560x1440 monitor) so I'm wondering what settings I should choose for the best image quality. The source material is 720p/1080p Blu-Ray but also some old SD movies. My CPU is a 4770K and my GPU is a 980 Ti.

I would use the following settings:

SD Image doubling:

Chroma: NNEDI3 32 neurons

Image: Jinc3 + AR

Double Luma: 2x or greater - NNEDI3 32 to 256 neurons

Double Chroma: Off

Upscaling refinement: SuperRes (3)

Artifact removal - Debanding: Medium/High

Artifact removal - Deringing: Off

Image enhancements: Off

Dithering: Error Diffusion 2

720p Image doubling:

Chroma: NNEDI3 32 neurons

Image: Jinc3 + AR

Double Luma: 2x or greater - NNEDI3 32 to 256 neurons

Double Chroma: Off

Upscaling refinement: SuperRes (3)

Artifact removal - Debanding: Medium/High

Artifact removal - Deringing: Off

Image enhancements: Off

Dithering: Error Diffusion 2

1080p Regular upscaling:

Chroma: NNEDI3 32 neurons

Image: Jinc3 + AR

Image doubling: Off

Upscaling refinement: SuperRes (1)

Artifact removal - Debanding: Medium/High

Artifact removal - Deringing: Off

Image enhancements: Off

Dithering: Error Diffusion 2

You could also try image doubling with 1080p sources. I'm not sure whether any improvement would be seen.

(2016-05-22, 19:19)Warner306 Wrote: I would use the following settings.

Thank you, these settings work like a charm. If my understanding of NNEDI3 is correct, it won't be used on 1080p sources because 1080p -> 1440p is less than 2.0x, right? However, NNEDI3 will work on 720p sources because 720p -> 1440p is exactly 2.0x, right? The performance so far seems really good. There are no dropped frames and all the queues are working as they should.
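The scaling factors in question are easy to verify. A quick sketch, comparing vertical resolutions against the 2x doubling threshold mentioned above:

```python
def scale_factor(source_height, display_height):
    # madVR applies image doubling only when the scaling factor
    # meets the threshold configured (e.g. "2x or greater")
    return display_height / source_height

print(scale_factor(1080, 1440))  # ~1.33x -> below the 2x threshold
print(scale_factor(720, 1440))   # exactly 2.0x -> doubling applies
print(scale_factor(480, 1440))   # 3.0x for SD sources
```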
