Although, from what I've seen of non-native 4K projectors, the pixel shifting has an effect similar to analog upscaling (scanning). In that case I'd say again that NGU may not be so necessary, since the projector isn't producing the intended NGU output anyway.

Yes, you'd want a native 4K projector if you placed a premium on upscaling. Even then, many are using very high settings for image upscaling and derive a lot more visual improvement than say a 55" 4K TV.

I am not sure where to ask this, but I think it's probably a Windows or maybe Nvidia thing?

I set up an HTPC for a friend, but for some reason, whenever a video opens or closes (like starting a video in MPC-HC, or then closing MPC-HC itself) it causes his display to re-acquire its signal.

This is pretty annoying as he has an older JVC (RS600) with a near 20-second sync time.

Now, I am completely familiar with refresh rate switching and such, but I have all of that completely disabled. I have the HTPC set for 3840x2160 23Hz RGB 12-bit (tried 8-bit as well). Refresh rate switching is disabled in madVR and disabled in MPC-HC.

Any ideas of how I can have it so that the JVC doesn't need to re-sync every time I open and close a video?

Well, I figured out the screen re-sync issue.

It was a combination of software and hardware.

It happened when "Report BT2020 to display" was enabled in madVR's calibration tab and the video was sent through the Marantz 8802.

If we send the video to the JVC RS600 directly, it never blanks with "Report BT2020 to display" enabled. If we send the video through the Marantz 8802 with "Report BT2020 to display" disabled, it also never blanks.

But with "Report BT2020 to display" enabled and the video going through the Marantz, it blanks every time a video is opened or closed in madVR.

Is your Marantz set to HDMI enhanced in the HDMI config? Otherwise it reduces your bandwidth to 10Gb/s for compatibility.

I have an X8500H and HDMI Enhanced needs to be enabled to enable the full 18Gb/s bandwidth.

If you don't enable BT2020 when using 12bits, the native gamut isn't selected and your calibration is all wonky.

It needs to be enabled in 12bits to get the correct gamut in SDR BT2020.

Not a problem with 8bits.

It is set to Enhanced.

Also, to solve the problem we ended up just outputting 2 cables from the HTPC. One is run straight to the JVC for video, and the other is run straight to the Marantz for audio. Lip sync is then set manually to line up correctly.

This way we can have "Report BT2020" enabled without any issues, because we are using 2160p 23Hz 12-bit RGB on the RS600.

As you know, using 12bits on the JVCs isn't recommended because it forces YCC422 behind the renderer's back.

Worth trying 8bits to see if you still have these issues. You won't get any banding (if madVR is set to 8-bit dithering) and you'll get better chroma.
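To illustrate the "no banding with 8-bit dithering" point, here's a toy sketch (plain Python, not madVR's actual algorithm, just the basic idea): carrying the rounding error forward means the quantized signal tracks the true local average, whereas plain rounding collapses a shallow gradient into a flat band.

```python
# Toy illustration of why 8-bit + error-diffusion dithering avoids banding.
# NOT madVR's implementation; just the core idea of error diffusion.

def quantize_round(values):
    """Plain rounding: a shallow gradient collapses into flat bands."""
    return [round(v) for v in values]

def quantize_error_diffusion(values):
    """1-D error diffusion: each rounding error is added to the next sample."""
    out, err = [], 0.0
    for v in values:
        q = round(v + err)
        err = (v + err) - q
        out.append(q)
    return out

# A gradient spanning just one 8-bit code value (100.000 .. 100.999).
ramp = [100 + i / 1000 for i in range(1000)]

plain = quantize_round(ramp)
dith = quantize_error_diffusion(ramp)

# Look at the first 400 samples, where the true average is ~100.2:
print(sum(plain[:400]) / 400)  # 100.0  (one flat band, gradient lost)
print(sum(dith[:400]) / 400)   # ~100.2 (local average preserved)
```

The dithered output only ever uses the two neighbouring code values (100 and 101), but their mix preserves the gradient; that's the trade madVR makes when dithering high-bit-depth output down to 8 bits.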

Yes, but again this is a friend's HT that I am configuring and it's an RS600, so 12-bit RGB works fine.

Yes, on my own HT with my NX5 I of course use 8-bit with madVR error diffusion dithering, and I don't need "Report BT2020". But it also doesn't cause a re-sync through my Denon X4200W anyway, in case JVC ever does fix 12-bit.

12bits doesn't work fine on the RS600; it does exactly the same re chroma (forces YCC422 with RGB or YCC444 12bits). And you can't use 8bits without getting the magenta bug, so that's not an option either unless you only use 60p. You have to live with the forced YCC422 (or select YCC422 to start with).

Oh, I thought I remember you saying 12-bit RGB worked on your RS500.

So what are you saying, then: how should he run his HTPC to the RS600 for tone-mapped HDR? What should we set in the Nvidia control panel, YCC422, so that the conversion isn't happening behind madVR's back?

Yes, 12bits "works" with the rs500, and I had no choice because of the magenta bug in 8bits.

You should test chroma with various options. Given that the RS600 will force YCC422 with any colorspace in 12bits, I would suppose that selecting this in the driver would be best, but when I do this with the rs2000 it's still a worse result than using 8bits.

I just had a couple of questions I'm struggling to find answers to. Apologies if I'm not fully understanding this!

The issues I have are:

I want to use madVR to upscale TV at 50Hz to 2160p, however switching to 50Hz only works if I set the colour output in the Nvidia control panel to 4:2:0 chroma/luma sampling OR 8-bit precision (due to the bandwidth limitations of HDMI 2.0). For some reason I can only select 8/12 bit for precision and not 10 bit. My understanding is that HDMI 2.0 has enough bandwidth for full RGB/10-bit output at 50Hz, so it's annoying the option doesn't seem to be available. (Is there a reason for this?)

But I also want HDR output for films at 23p. So I guess I am forced to set the Nvidia control panel to output 12 bit and not 8 bit? If this is the case I am also forced to use 4:2:0 chroma/luma to allow 50hz to work.

So I guess my question is: are you "stuck" with what you set in the Nvidia control panel for all refresh rates, or can/does madVR change the colour precision/subsampling output dynamically along with the refresh rate? I'm running in DX11 FSE mode.

i.e. can/does madVR switch the output to 10/12-bit RGB at 2160p23 and 8-bit RGB (or even 12-bit 4:2:0) at 50Hz?

If not then is the quality difference of always outputting 4:2:0 vs RGB negligible for HTPC use?

Why is 12 bit important to you? 4:2:0 at 12 bit is much lower quality than RGB at 8 bit.

I suggest you simply use 8-bit RGB all the time. madVR cannot change the bit depth, but if you set 23Hz as 12-bit, the driver will use 12-bit when madVR switches to 23Hz.
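The rough bandwidth math behind this (my sketch, assuming the standard CTA-861 pixel clocks of 594 MHz for 2160p50/60 and 297 MHz for 2160p23/24; worth checking against your display's EDID) shows why 50Hz leaves only 8-bit RGB or 4:2:0, and why 23Hz can carry 12-bit:

```python
# Required TMDS clock vs the 600 MHz (18 Gbit/s) HDMI 2.0 ceiling.
# Pixel clocks used below are the standard CTA-861 timings (an
# assumption; verify against your display's EDID).

HDMI20_MAX_TMDS_MHZ = 600.0

def tmds_clock_mhz(pixel_clock_mhz, bit_depth, chroma):
    """Required TMDS clock for a format.

    RGB/4:4:4 scale with bit depth; 4:2:0 halves the pixel clock;
    4:2:2 is carried at up to 12 bits inside the 8-bit clock rate.
    """
    if chroma == "4:2:0":
        return pixel_clock_mhz / 2 * bit_depth / 8
    if chroma == "4:2:2":
        return pixel_clock_mhz  # fixed-rate container, no deep-color penalty
    return pixel_clock_mhz * bit_depth / 8  # RGB or 4:4:4

for clock, label in ((594.0, "2160p50/60"), (297.0, "2160p23/24")):
    for depth, chroma in ((8, "RGB"), (10, "RGB"), (12, "RGB"), (12, "4:2:0")):
        need = tmds_clock_mhz(clock, depth, chroma)
        verdict = "fits" if need <= HDMI20_MAX_TMDS_MHZ else "exceeds HDMI 2.0"
        print(f"{label} {depth}-bit {chroma}: {need:.1f} MHz -> {verdict}")
```

On these numbers, even 10-bit RGB at 2160p50 would need 742.5 MHz, so the missing 10-bit option wouldn't have helped at 50Hz anyway; 8-bit RGB (594 MHz) and 12-bit 4:2:0 (445.5 MHz) are the only ones that fit, while at 23Hz even 12-bit RGB (445.5 MHz) is fine.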

Please do some visual tests, if 12 bit looks the same as 8 bit to you please do not obsess about getting 12 bit. 12 bit really is the least important improvement and not worth sacrificing anything else for.
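A quick back-of-the-envelope (my numbers, not a madVR spec) makes the comparison concrete: 4:2:0 keeps only one chroma sample per 2x2 pixel block, so 12-bit 4:2:0 actually carries fewer bits per pixel than 8-bit RGB, on top of the lost chroma resolution:

```python
# Bits per pixel for common HDMI pixel formats, modelled simply as
# (average samples per pixel) x (bits per sample).

def bits_per_pixel(bit_depth, chroma):
    samples_per_pixel = {
        "RGB": 3.0,    # three full-resolution channels
        "4:4:4": 3.0,  # luma + two full-resolution chroma channels
        "4:2:2": 2.0,  # luma + two half-horizontal-resolution chroma
        "4:2:0": 1.5,  # luma + two quarter-resolution chroma
    }[chroma]
    return bit_depth * samples_per_pixel

print(bits_per_pixel(8, "RGB"))     # 24.0
print(bits_per_pixel(12, "4:2:0"))  # 18.0
```

And the raw bit count understates the gap, since the 4:2:0 chroma is also spatially subsampled, which is usually the more visible quality loss.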

Thanks for the reply... Sorry I was under the impression that the display needed to be configured to 10 or 12 bit for HDR to work. Is that not the case? This was the only reason I was trying to use it.

If it does require it, then it seems I must choose between having HDR available (but using 4:2:0) or full RGB, as I can't have both (due to needing to switch to 50Hz for TV).