jorimt wrote:Beyond the conceptual basics that I've already covered in my series, short of reverse engineering the G-SYNC module and G-SYNC driver component, I doubt we'll ever have the exact answer to such specific questions regarding G-SYNC's inner workings, which I'm sure Nvidia prefers; industry secrets and all that.

This is true; G-SYNC is proprietary, so my answer will be based on FreeSync / Adaptive-Sync / HDMI VRR behavior, which all behave exactly the same at the blanking-interval level (Timings & Resolution).

Portions will definitely also apply to G-SYNC, because the panel is still the same (in both G-SYNC and FreeSync) even if the monitor motherboard driving it is different. For example, the ViewSonic XG2530 (FreeSync) and XG2560 (G-SYNC) use the same panel, just with different electronics driving it. At the end of the day, the panel is accepting synchronization signals, and so the terminology still largely applies at the TCON level. Today, G-SYNC is often (but not always) built into a custom TCON programmed by NVIDIA, so the answer may or may not apply at the LCD pixel level...

However, this will cover fundamental concepts of VRR since all transmissions over video cables still have synchronization signals (Porches, Sync, etc).

KKNDT wrote:I have questions about the VBI:

Before I answer, I need to explain to readers about the purpose of VBI. (Ever wonder where "VSYNC" comes from?)

Even a 2018 DisplayPort cable transmits the same signal topology as a 1930s analog TV signal: a calendar-style sequence of imagery, top-to-bottom, left-to-right, involving 1930s-era Porch and Sync signals that are still used to this date regardless of cable format, and still apply to signals between an NVIDIA card and a G-SYNC monitor. How the card and monitor handle them is open to question, but the method of serializing two dimensions into one dimension has remained unchanged for almost a century.

This is the way it has always been done when serializing a 2D image over a one-dimensional wire or radio signal, and it has carried over to digital and packetized standards, still in use today even if the sync intervals are greatly reduced.

So, because the fundamental "serialization of a 2D image onto a 1D wire" concept has remained unchanged for the better part of 100 years -- we can easily generalize how VRR was piggybacked onto it. While G-SYNC is proprietary, it still has to adhere to the "cram 2D into 1D" concept and stick to a reasonable amount of cable-signal standardization (to remain compatible with things like amplifiers, switches, etc.), so a significant percentage of this answer will still apply to G-SYNC even if extra proprietary data is embedded in the signal.

Here's my answer:

KKNDT wrote:1. Default VBI consists of VBPD, VFPD, VSPW. Which one is to be changed during VRR operation?

KKNDT wrote:2. When buffer swap occurs, the game will trigger a new refresh cycle. Is it the VBPD who is leading the new cycle?

Correct. During VRR operation, the monitor is held in continual transmission of Back Porch scanlines (VBPD) -- metaphorically, dummy rows of blank pixels hidden above the top edge of the screen. The Back Porch is variable, while Sync and Front Porch remain fixed.

(For those who program Custom Resolutions and use FreeSync range-overriding techniques: what gets entered in ToastyX CRU simply becomes the minimum-size Back Porch, and the FreeSync driver automatically varies it to vary the time between refresh cycles, down to the minimum Hz of the VRR range.)
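To make the Back Porch padding concrete, here's a small Python sketch. All the timing numbers (pixel clock, porch sizes, sync width) are illustrative, not taken from any real EDID -- only the principle matters: stretch the Vertical Back Porch, and the frame time stretches with it.

```python
H_TOTAL = 2200          # pixels per scanline, including horizontal blanking (illustrative)
PIXEL_CLOCK = 594e6     # pixels per second (illustrative)
V_ACTIVE, VFP, VSW = 1080, 4, 5   # fixed parts of the vertical timing (illustrative)
VBP_MIN = 36            # the minimum Back Porch, i.e. what you'd enter in ToastyX CRU

SCANLINE_TIME = H_TOTAL / PIXEL_CLOCK   # seconds per transmitted pixel row

def refresh_interval(vbp):
    """Frame time for a given Vertical Back Porch line count."""
    total_lines = V_ACTIVE + VFP + VSW + vbp
    return total_lines * SCANLINE_TIME

# Minimum Back Porch -> maximum refresh rate of the VRR range...
print(round(1 / refresh_interval(VBP_MIN), 2))         # -> 240.0 Hz
# ...and the driver stretches the Back Porch to delay the next refresh cycle.
print(round(1 / refresh_interval(VBP_MIN + 5000), 2))  # -> 44.08 Hz, same signal structure
```

Note how nothing about the signal structure changes between the two cases -- Active, Sync, and Front Porch line counts stay identical; only the count of dummy Back Porch rows grows.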

Interestingly, on this topic: this adds an ultra-minor, microseconds-league granularity behavior that few people know about. During VRR, the timing of the next vertical refresh cycle has the granularity of the horizontal scanrate!

Once the buffer swap occurs (e.g. an API call such as Present() or glutSwapBuffers()), the next scanline transmitted out of the graphics output will be the first row of pixels of the frame buffer (top edge of Active), right after the current still-scanning-out Back Porch scanline completes. This cleverness allows adherence to classic video signal structure, while adding a VRR upgrade that lets display refresh cycles be software-triggered.

NOTE1: If too much time passes without a buffer swap, the graphics card (at least with open VRR standards) will begin re-transmitting a duplicate of the previous refresh cycle. This is because pixels on a panel will go stale (e.g. fade away in a glitchy way) or go blank (e.g. go black or white, due to electronics on the TCON) if too much time passes without a refresh cycle. That's why a VRR range has a minimum Hz.

NOTE2: For framerates below minimum Hz, there's a trick available. Low Framerate Compensation logic simply times the repeat-refresh cycles intelligently so they don't collide with the timing of API-triggered refresh cycles. So at 24fps, the drivers may time the repeat refresh cycles exactly between API calls, creating 48Hz out of 24fps (24 API calls per second). Done well, there's no stutter. However, random framerates (highly variable frametimes) can defeat this guessing logic, and the timings of auto-repeat refreshes will begin colliding with the timings of software-API-triggered refresh cycles -- creating microstutter for framerates below the minimum Hz of the VRR range.

If the display is currently scanning at a ~160KHz horizontal scan rate (160,000 pixel rows per second, including the blanking interval), the new refresh cycle may be delayed by something like 1/160,000sec before it finally "hits the wire". That's because it waits for the current still-scanning-out Back Porch scanline to finish before beginning pixel row #1 of the visible refresh cycle. Other driver and codec granularities (e.g. DisplayPort packetization) may add a few scanlines of delay, but practically, it's immediate.

Granularity in VRR operation that reaches the millisecond level can become human-visible as ultra-minor microstutter, so it's critical that refresh timing remains in the <100us range. At a 160KHz scanrate, that's less than 10 microseconds of granularity between the API call and the pixels hitting the cable -- far below a human's ability to detect.
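The arithmetic behind that claim, using the ~160KHz example scanrate:

```python
# Worst-case wait for the current scanline to finish at a 160KHz
# horizontal scan rate (160,000 pixel rows per second, incl. blanking).
h_scanrate = 160_000
granularity = 1 / h_scanrate   # seconds per scanline

print(granularity * 1e6)       # -> 6.25 microseconds -- far below the
                               # millisecond level where microstutter
                               # starts becoming human-visible
```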

It's an infinite repeating sequence: Sync-FrontPorch-Active-BackPorch, over and over. I usually think of it in that order, even if from a VRR perspective it is sometimes more easily diagrammed as Active-BackPorch-Sync-FrontPorch, like your diagram. But the infinite repeating sequence is still the same: [...]-Active-BackPorch-Sync-FrontPorch-Active-BackPorch-Sync-FrontPorch-[...] looping in that order every refresh cycle, with the Vertical Back Porch being used to time-pad between refresh cycles.
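You can picture that endless loop with a few lines of Python:

```python
import itertools

# The four phases of every refresh cycle; under VRR only the duration of
# BackPorch varies, never the order.
phases = ["Sync", "FrontPorch", "Active", "BackPorch"]
signal = itertools.cycle(phases)

# Pull two refresh cycles off the "wire":
print([next(signal) for _ in range(8)])
# -> ['Sync', 'FrontPorch', 'Active', 'BackPorch',
#     'Sync', 'FrontPorch', 'Active', 'BackPorch']
```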

The horizontals are full height. Note that the vertical porches will often still have the horizontals embedded in them; only the vertical sync was completely absent of the horizontal sync signal (in the analog era). Porches can essentially behave as extra virtual resolution, and data has sometimes been hidden in them (e.g. the closed-captioning signal on offscreen scan line #21, just above the top edge).

So your black left edge will extend the full height (except it'll be absent from the vertical sync), because the overscan area (the vertical porches) still has horizontals embedded in it too. So those areas are still stuck at horizontal-scanrate granularity.

From a signal-structure standpoint, it's the same for higher and lower resolutions. Most of the time, the monitor is responsible for deciding what to do with the resolution (centered or scaled) -- if the monitor is doing the windowboxing, of course.

But yes, in the analog era at least, porches are black-colored. In NTSC signals, the porches are at 7.5 IRE (7.5/100ths of the voltage between complete black and complete white), and Sync is at 0 IRE (0/100ths of that voltage). That's my understanding. Digitally, the meanings are often quite different, with the confusion of HDTV sets having a toggle for 0-to-100 IRE versus 7.5-to-100 IRE, creating a black-level difference. In the digital era that offset is wasted dynamic range, so we often use full-range signals instead.

Things can get confusing between the porches and Active when the image is being underscanned... but on analog displays you saw the image data of the porches. And you often saw the structure with ghosted signals (e.g. two NTSC signals overlapping each other, one distant station showing a ghost image) -- with that, you sometimes saw the actual structure of the porches and sync.

jorimt wrote:I actually used a -2 fps limit across the board in all those tests. I simply came to the conclusion that the absolute safe minimum would be -3, but this could vary with certain in-game limiters, which can fluctuate like crazy. -2 is perfectly safe in most instances regardless.
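For a sense of scale, here's the per-frame slack those caps buy, sketched for a hypothetical 144Hz display (the numbers are illustrative, not from the article's test setup):

```python
REFRESH_HZ = 144  # hypothetical display

def headroom_us(offset):
    """Extra frame-time slack (microseconds) from capping at REFRESH_HZ - offset fps."""
    cap = REFRESH_HZ - offset
    return (1 / cap - 1 / REFRESH_HZ) * 1e6

print(headroom_us(2))  # -> ~97.8 us of slack per frame with a -2 cap
print(headroom_us(3))  # -> ~147.8 us with the "absolute safe minimum" -3 cap
```

The idea is that this slack absorbs limiter fluctuation, keeping frametimes from momentarily exceeding the refresh rate and falling out of the VRR window.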

As for G-SYNC borderless/windowed and DWM lag, I wasn't originally going to even test for that scenario until someone suggested it, and when I did, I was surprised at the results. As I say in the article, "Further testing may be required, but it appears on the latest public build of Windows 10 with out-of-the-box settings (with or without “Game Mode”), G-SYNC somehow bypasses the 1 frame of delay added by the DWM."

It could vary by system, but on my setup in the article (which I've since upgraded from), I couldn't get it to tear in borderless or windowed without G-SYNC, where others can, so it's possible that was a factor. But that means I was testing the worst-case scenario, and there was still no more lag in G-SYNC borderless/windowed than in exclusive fullscreen.

The slight increase for the windowed mode in some instances could be because the frame update starts right at the top of the screen with G-SYNC, and with windowed, there was obviously the window title bar at the very top, which made detection of the start of the frame impossible in that area. Otherwise, yeah, everything was well within my margin of error.

Battlenonsense also got a notable reduction in input lag with Game Mode, whereas my undocumented results in this thread showed no improvement whatsoever.

System differences make direct comparison of some testing scenarios difficult.

Does this basically mean that, for all G-SYNC users, this new "Fullscreen Optimization" DWM-bypass technique makes exclusive fullscreen not relevant anymore?

Because if the theoretically lowest possible amount of input lag is now achievable in (full) windowed mode, performing basically equally versus fullscreen exclusive mode + G-SYNC / V-SYNC on... I don't see the point. (Setting aside for a sec the general performance implications of using anything but fullscreen exclusive.)

Edit: I pulled the trigger on Windows 10 (1803) to observe this new hybrid mode, and I have to say I'm impressed.

All three games I run play flawlessly with this new feature; I observed no noticeable fps drops, and in fact they all seem to show a minor increase.

To make sure the mode worked I checked a few criteria:

- As good as instant alt+tabbing (<100ms rough guess); no windows will be put in front of the game when doing so.
- With NVCP V-sync forced off I get tearing.
- Running G-sync fullscreen-only mode with the G-sync indicator enabled shows it as being active.

One game out of the box was really laggy, and I'm not sure the fullscreen optimization actually applied initially, as alt+tabbing was slower and flashed my screen twice. Further investigation showed me that the game kept putting my monitor back from 120 to 60Hz after the first alt+tab occurred. Strange, as my global setting for refresh rate in NVCP was set to "Prefer highest available" and this game in particular doesn't have a manual refresh rate option.

I fixed this issue with CRU by ripping out any resolution/refresh rate other than 120Hz, resulting in global (system-wide) enforcement of 120Hz mode.

Another thing I noticed: when the Game Bar is deactivated in Windows settings, fullscreen optimization will not apply at all.

I was really sceptical at first, as 99% of the posts on Google were negative ("Disable FSE to fix Windows 10 game fps issues"). But in all honesty it appears to be a very welcome new technology that probably isn't suited for people who have no clue how to make it work properly, let alone know what it's supposed to do. Perhaps it's partly true that people these days are so used to full window + triple buffering and smooth laggy gameplay that they switch to fullscreen just to observe tearing?

Anyway, a new era of low-latency gaming is at hand! Without the quirks.

Hi, I read the whole article about G-SYNC, and at the conclusion there is a statement that the best settings are: G-SYNC + V-SYNC ON and an fps limit. I'd like to ask why I should even bother enabling V-SYNC when it will never engage if the limit is below the max refresh rate? Am I missing something?

knypol wrote:Hi, I read the whole article about G-SYNC, and at the conclusion there is a statement that the best settings are: G-SYNC + V-SYNC ON and an fps limit. I'd like to ask why I should even bother enabling V-SYNC when it will never engage if the limit is below the max refresh rate? Am I missing something?

G-sync and V-sync under the refresh rate limit still work together to do frame time stabilization. How much you’ll notice most likely depends on your (un)optimized system, the game in particular etcetera.

I would use G-sync without V-sync for a more "raw" experience, and with V-sync for less serious games.

knypol wrote:Hi, I read the whole article about G-SYNC, and at the conclusion there is a statement that the best settings are: G-SYNC + V-SYNC ON and an fps limit. I'd like to ask why I should even bother enabling V-SYNC when it will never engage if the limit is below the max refresh rate? Am I missing something?

G-SYNC can have tearing too. Enabling V-SYNC will fix that at no additional latency cost (as long as you cap your FPS correctly). It's basically "for free."

This was originally how G-SYNC was shipped by Nvidia. The ability to disable V-SYNC was added later on, since some people wanted it. But disabling it can result in tearing even if you cap your FPS.