G-Sync Module: The G-Sync module is a small chip that replaces the display's standard internal scaler and contains enough onboard memory to hold and process a single frame at a time. The module exploits the vertical blank period (the span between the previous and next frame scan) to manipulate the display's internal timings, perform G2G (gray-to-gray) overdrive calculations to prevent ghosting, and synchronize the display's refresh rate to the GPU's render rate.

G-Sync Activation: "Enable G-SYNC for full screen mode" (exclusive fullscreen functionality only) will automatically engage when a supported display is connected to the GPU. If G-Sync behavior is suspect or non-functional, untick the "Enable G-SYNC" box, apply, re-tick, and apply again.

G-Sync Windowed Mode: "Enable G-SYNC for windowed and full screen mode" extends G-Sync support to windowed and borderless-windowed games. Introduced in a 2015 driver update, this option manipulates the DWM (Desktop Window Manager) framebuffer so that G-Sync's VRR (variable refresh rate) can synchronize to the focused window's render rate; unfocused windows remain at the desktop's fixed refresh rate until brought into focus.

G-Sync only functions on one window at a time, so any unfocused window containing moving content will appear to stutter or slow down, which is why a variety of non-gaming applications include predefined Nvidia profiles that disable G-Sync support.

G-Sync Preferred Refresh Rate: "Highest available" automatically engages when G-Sync is enabled and overrides the in-game refresh rate option (if present), defaulting to the highest supported refresh rate of the display. This is useful for games that don't include a selector, and ensures the display's native refresh rate is utilized. "Application-controlled" defers refresh rate control to the game.

G-Sync & V-Sync: G-Sync (GPU Synchronization) works on the same principle as double-buffer v-sync: buffer A begins rendering frame A and, upon completion, scans it to the display. Meanwhile, as buffer A finishes scanning its frame, buffer B begins rendering frame B and, upon completion, scans it to the display, and so on.

The primary difference between G-Sync and v-sync is the method in which rendered frames are synchronized. With v-sync, the GPU’s render rate is synchronized to the fixed refresh rate of the display. With G-Sync, the display’s VRR (variable refresh rate) is synchronized to the GPU’s render rate.
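The difference in synchronization direction can be sketched in a few lines of Python. This is a simplified timing model of my own (assuming a hypothetical 144 Hz panel), not driver code:

```python
import math

REFRESH_HZ = 144             # assumed panel maximum, for illustration only
SCAN_MS = 1000 / REFRESH_HZ  # ~6.94 ms per fixed refresh tick

def vsync_display_ms(frame_done_ms):
    """V-sync: a finished frame waits for the display's next fixed tick."""
    return math.ceil(frame_done_ms / SCAN_MS) * SCAN_MS

def gsync_display_ms(frame_done_ms):
    """G-Sync: the display refreshes the moment the frame completes."""
    return frame_done_ms

# A frame completing 10 ms into the cycle:
print(round(vsync_display_ms(10.0), 2))  # held until the next tick (~13.89 ms)
print(gsync_display_ms(10.0))            # scanned out immediately (10.0 ms)
```

With v-sync, the render rate bends to the display's fixed tick; with G-Sync, the display's tick bends to the render rate.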

On release, G-Sync’s ability to fall back on fixed refresh rate v-sync behavior when exceeding the maximum refresh rate of the display was built-in and non-optional. A 2015 driver update later exposed the option.

This update led to recurring confusion, creating a misconception that G-Sync and v-sync are entirely separate options. In reality, the "Vertical sync" option in the control panel dictates two things: one, whether the G-Sync module compensates for frametime variances (see "Upper Frametime Variances" in the "G-Sync Range" section); and two, whether G-Sync falls back on fixed refresh rate v-sync behavior. If v-sync is "On," G-Sync reverts to v-sync behavior above its range; if v-sync is "Off," G-Sync disables above its range. Within its supported range, G-Sync is the only synchronization method active, no matter the v-sync setting.

Currently, when G-Sync is enabled, the control panel's "Vertical sync" entry is automatically set to "Use the 3D application setting," which defers v-sync fallback behavior to the in-game v-sync option. This can be manually overridden by changing the "Vertical sync" entry in the control panel to "Off," "On," or "Fast" (see "G-Sync Input Latency & Optimal Settings" for suggested scenarios).

Fast Sync*: G-Sync disengages, Fast Sync engages, and 2-12ms of additional input latency is introduced. (*Fast Sync is best used with a framerate in excess of two to three times the display's maximum refresh rate, as its third buffer selects the best completed frame as the final render; the higher the sample rate, the better it functions. Do note that even at its most optimal, Fast Sync introduces uneven frame pacing, which can manifest as recurring microstutter.)

WITHIN G-SYNC RANGE

Refer to “Upper Frametime Variances" below.

UPPER FRAMETIME VARIANCES

V-Sync Off: G-Sync remains engaged, tearing may begin at the bottom of the display, and no additional input latency is introduced. The tearing seen at the bottom of the display (example: https://youtu.be/XfFG1r7Uf00) in this relatively narrow range is due to frametime variances output by the system, which vary from setup to setup and from game to game. Setting v-sync to "Off" disables the G-Sync module's ability to compensate for frametime variances: when an affected frame is unable to complete its scan before the next, instead of suspending the frame long enough to display it completely, the module displays the next frame immediately, resulting in a partial tear. Not only does v-sync "Off" offer no input latency reduction over v-sync "On" (see "G-Sync Input Latency & Optimal Settings"), it disables a core G-Sync functionality, and should be avoided.

V-Sync On: G-Sync remains engaged, the module may suspend frames, and no additional input latency is introduced. This is how G-Sync was originally intended to function (see "G-Sync & V-Sync"). With v-sync "On," the G-Sync module compensates for frametime variances by suspending the affected frame long enough to complete its scan before the next, preventing the tearing seen at the bottom of the display in the "V-Sync Off" scenario above. Since this operation is performed during the vertical blank period (the span between the previous and next frame scan), it does not introduce additional input latency (see "G-Sync Input Latency & Optimal Settings").

Fast Sync: Refer to “V-Sync On” above.

MINIMUM REFRESH RANGE

Once the framerate drops to 36 fps and below, the G-Sync module begins inserting duplicate frames to maintain the display's minimum physical refresh rate and smooth motion perception. If the framerate is 36 fps, the refresh rate doubles to 72 Hz; at 18 fps, it triples to 54 Hz; and so on. This behavior continues down to 1 frame per second.
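The multiplication behavior can be sketched as follows. The 36 fps threshold is from my observations; the 37 Hz physical floor is my own inference from the doubling and tripling points, not a confirmed spec:

```python
PHYSICAL_FLOOR_HZ = 37  # assumed minimum physical refresh rate (inferred)

def duplicate_frames(framerate):
    """Return (multiplier, physical refresh rate) for a given framerate."""
    if framerate > 36:
        return 1, framerate  # within the G-Sync range, frames map 1:1
    multiplier = 2
    while framerate * multiplier < PHYSICAL_FLOOR_HZ:
        multiplier += 1      # insert another duplicate frame
    return multiplier, framerate * multiplier

print(duplicate_frames(36))  # (2, 72): refresh rate doubles
print(duplicate_frames(18))  # (3, 54): refresh rate triples
```

The smallest multiplier that lifts the physical refresh rate back above the floor reproduces the doubling at 36 fps and tripling at 18 fps described above.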

Do note that regardless of the currently reported framerate and variable refresh rate of the display, each frame scan still physically completes (from top to bottom) at the display's maximum supported refresh rate: 16.6ms @60 Hz, 10ms @100 Hz, 6.9ms @144 Hz, and so on.
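A quick way to see the relationship (a simple model of my own, not measured data): the scanout time is fixed by the panel's maximum refresh rate, and at lower VRR framerates the remainder of the refresh interval is spent idle in the vertical blank.

```python
def scanout_ms(max_refresh_hz):
    """Time for one top-to-bottom frame scan, fixed by the panel maximum."""
    return 1000 / max_refresh_hz

def idle_ms(current_fps, max_refresh_hz):
    """Idle (vertical blank) time between scans at a given VRR framerate."""
    return 1000 / current_fps - scanout_ms(max_refresh_hz)

print(round(scanout_ms(144), 1))   # 6.9 ms scanout at 144 Hz
print(round(idle_ms(48, 144), 1))  # ~13.9 ms idle between scans at 48 fps
```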

V-Sync On: G-Sync remains engaged, and the module may suspend frames during frametime spikes/asset loads. Paired with an appropriate framerate limit (see "G-Sync Input Latency & Optimal Settings"), this scenario is recommended for a 100% tear-free G-Sync experience. The G-Sync module may suspend frames during frametime spikes (see "What are frametime spikes?" further below), but such instances last mere milliseconds, and thus have no appreciable impact on input response.

Fast Sync: Refer to “V-Sync On” above.

What are frametime spikes? Frametime spikes occur due to asset loads when transitioning from one area to the next, and/or when a script or physics system is triggered. Not to be confused with other performance issues, like framerate slowdown or v-sync-induced stutter, these spikes manifest as an occasional hitch or pause, usually lasting a fraction of a second at a time, plummeting the framerate into the single digits and momentarily raising the frametime upwards of 1000ms before it re-normalizes. The better optimized the game, and the stronger the system, the fewer there are (and the shorter they last), but no game or system can fully avoid their occurrence.

G-Sync Input Latency & Optimal Settings

Test Methodology: The methodology described in Chief Blur Buster's article (http://www.blurbusters.com/gsync/preview2/) was used for the below tests. I employed a Casio Exilim EX-ZR200 camera capable of 1000 fps video capture, and created an app to light up the onboard LED on my Deathadder Chroma's scrollwheel when clicked. All results were sampled from the middle (crosshair level) of the screen.

While I saved myself the trouble of wiring my mouse to accept an external LED, it did come at the cost of additional delay. Due to inherent mouse click latency, driver overhead, and USB polling, there is an alternating 9 to 10ms delay from mouse click to LED light-up. To calculate my error margin, I recorded my index finger depressing the mouse's scrollwheel a total of 20 times, giving an average of 9.45ms, which was added to each sample's final number.
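The correction step can be reconstructed like so. The individual sample values below are illustrative; only the 20-sample count, the alternating 9-10ms delay, and the 9.45ms average come from my tests:

```python
# 20 illustrative click-to-LED recordings with a 9.45 ms mean:
click_delay_samples_ms = [9] * 11 + [10] * 9

def corrected_latency(raw_camera_sample_ms, delay_samples_ms):
    """Add the mean click-to-LED delay back onto a camera-derived reading."""
    correction = sum(delay_samples_ms) / len(delay_samples_ms)
    return raw_camera_sample_ms + correction

# A hypothetical 30.0 ms camera reading becomes 39.45 ms after correction:
print(round(corrected_latency(30.0, click_delay_samples_ms), 2))  # 39.45
```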

V-Sync Off vs. Fast Sync vs. G-Sync (RTSS): Is there an input latency difference between v-sync off and G-Sync + v-sync on/off at the same framerate limit with RTSS?

Conclusion: No input latency difference (<1ms differences are within the margin of error). As evidenced by the above chart, RTSS adds up to 1 frame of input latency with standalone v-sync engaged, but with v-sync off, and with G-Sync + v-sync on/off, there is no input latency increase.

RTSS Update (01/24/2017): A recent video by Battle(non)sense (https://youtu.be/rs0PYCpBJjc?t=2m32s) has posed the possibility that RTSS is adding 1 frame of input latency with G-Sync. I'm currently investigating the cause of this discrepancy, and will update here when I learn more.

RTSS Update (03/26/2017): RTSS does indeed appear to introduce up to 1 additional frame of latency, even with G-Sync:

The above test results were captured with identical scenarios, rig, and the mouse I used in previous tests, but the mouse has been modified with an external LED for more consistent, accurate results.

So what was causing the discrepancy? CS:GO's quirky "Multicore Rendering" option. I had it disabled for my previous tests, as it is known to allow the lowest input latency in this specific game. Disabling it makes CS:GO run on a single core of the CPU, and since RTSS limits frames on the CPU side, the specific interaction between this setting and RTSS likely allows it (for whatever reason) to deliver frames without its usual delay, at least when running CS:GO in single-core mode on a multi-core CPU.

V-Sync Off vs. Fast Sync vs. G-Sync (Nvidia Inspector): Is there an input latency difference between v-sync off and G-Sync + v-sync on at the same framerate limit with Nvidia Inspector?

Conclusion: No input latency difference. However, as evidenced by the above chart, both Nvidia Inspector's "v1" & "v2" framerate limiting methods add up to 2 additional frames of input latency, even with v-sync disabled. As such, this method should only be used to limit frames when paired with standalone v-sync.

Nvidia Control Panel V-Sync vs. In-game V-Sync: While Nvidia v-sync has no input latency advantage over in-game v-sync (and when used with G-Sync + an fps limit, it should never engage), some in-game v-sync solutions may introduce strange frame pacing behaviors, enable triple-buffer v-sync automatically (not optimal for G-Sync's native double buffer), or simply not function at all. And as described in the "G-Sync Range" section, the G-Sync module relies on v-sync "On" to compensate for frametime variances and avoid tearing at all times. There are rare occasions, however, where v-sync will only function with the in-game solution enabled, and thus, if tearing or other anomalous behavior is observed with Nvidia v-sync (or vice versa), user experimentation may be required.

In-game vs. External Framerate Limiters: In-game framerate limiters are the superior method of capping, as they do not introduce additional input latency and (with exceptions) provide more consistent frame pacing than external methods. External framerate limiters sit too far down the rendering chain and, similar to v-sync, effectively throttle the framerate; as long as the external cap is the framerate's limiting factor, additional input latency will be introduced. Nvidia Inspector limits frames on the driver side and adds up to 2 frames of input latency (as much as double-buffer v-sync), while RTSS limits frames on the CPU side and adds up to 1 frame of input latency.

Finally, in-game framerate limiters are known to drift by as much as +/- 3 frames, while RTSS exhibits very little frame drift in comparison. As such, while an RTSS fps limit of 142 may suffice to keep the framerate below the G-Sync ceiling on a 144 Hz display, certain in-game limiters may (or may not) need a slightly lower limit to achieve the same result.
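Picking a cap that accounts for drift can be sketched as below. The formula is my own rule of thumb, not an official guideline; it simply reserves headroom for the limiter's worst upward drift:

```python
def safe_fps_limit(max_refresh_hz, limiter_drift_fps):
    """Highest cap that keeps even a drifted framerate under the ceiling."""
    return max_refresh_hz - 1 - limiter_drift_fps

print(safe_fps_limit(144, 1))  # 142: RTSS-style limiter with minimal drift
print(safe_fps_limit(144, 3))  # 140: in-game limiter drifting up to +3 fps
```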

If only a framerate limiter is required, the standalone RTSS download will suffice. MSI Afterburner itself is an excellent overclocking tool that can be used in conjunction with RTSS to inject an in-game overlay with multiple customizable performance readouts.

RTSS can limit the framerate either globally or per profile. To add a profile, click the cross button in the lower left corner of the RTSS window and navigate to the game's exe. To set a frame limit, click the "Framerate limit" box and input a number.

Do note that RTSS is currently not supported in most Windows Store games utilizing the UWP (Universal Windows Platform).

To set a frame limit in Nvidia Inspector, locate the "Frame Rate Limiter" dropdown in the "2 - Sync and Refresh" section, select a "(v1)" or "(v2)" limit (there are no official details on the differences between the two versions), and then click the "Apply Changes" button in the upper right corner of the Nvidia Inspector window.

G-Sync works differently, doubling the refresh rate and inserting duplicate frames starting at around the 37 FPS mark. This continues until the game frame rate hits 19/20 FPS where a third frame is inserted and the refresh rate is increased again. The result is the dashed line representing the effective experience of G-Sync.

I predominantly based my wording on your past comments in the other thread, and the above link. If I'm phrasing something incorrectly, by all means let me know, and I will amend it.

I'm not so sure about your 165 Hz VSYNC-Off optimal scenario. Ever since they made the VSYNC option configurable, I have had to settle on the same FPS range as your 144 Hz findings in order to fix the screen tearing, 120-125 FPS limit.

0 through 30~40 fps: Panel self refresh. Monitor repeats frames at a multiple of framerate, so that the next frame can still be displayed on time. This assumes a frame will take approximately as long as the previous frame, so individual frames may be delayed by up to 1/max refresh or tear if the frametime is changing or inconsistent. (basically about 7 out of 25ms are taken up by the self-refresh, so you have about 9ms leeway on either side of the frametime to avoid a tear/delay)

High framerates within VRR window(I would avoid putting numbers in for this, as it varies with game): If any single frame finishes faster than 1/max refresh rate while the average framerate is below the max refresh rate, it either gets delayed to the start of the next refresh(v sync/fast sync on), or it causes a tear(vsync off).

sekta wrote:I'm not so sure about your 165 Hz VSYNC-Off optimal scenario. Ever since they made the VSYNC option configurable, I have had to settle on the same FPS range as your 144 Hz findings in order to fix the screen tearing, 120-125 FPS limit.

Strange, I own the same monitor. Possibly obnoxious question incoming here, but are you sure you're in 165 Hz mode when testing the framerate limits? Otherwise, I can't think of a reason the bottom of the screen would be tearing until you set such a low limit. May be the specific game?

Sparky wrote:I might suggest a few changes to that graphic:

Thanks for the suggestions. After your first post on this thread, I did a couple hours' worth of research and dug up some specifics on the minimum refresh range. That's what I'm currently basing my graph/wording on.

Sparky wrote:0-1 fps: Remove from the graph

I see how that's inaccurate. I'll make it "< 0" instead, as that range is supposed to reflect frametime spikes only, unless of course you're saying that range doesn't actually exist the way I'm currently depicting it.

Sparky wrote:0 through 30~40 fps: Panel self refresh. Monitor repeats frames at a multiple of framerate, so that the next frame can still be displayed on time. This assumes a frame will take approximately as long as the previous frame, so individual frames may be delayed by up to 1/max refresh or tear if the frametime is changing or inconsistent. (basically about 7 out of 25ms are taken up by the self-refresh, so you have about 9ms leeway on either side of the frametime to avoid a tear/delay)

High framerates within VRR window(I would avoid putting numbers in for this, as it varies with game): If any single frame finishes faster than 1/max refresh rate while the average framerate is below the max refresh rate, it either gets delayed to the start of the next refresh(v sync/fast sync on), or it causes a tear(vsync off).

I went with "30-36" because 36 fps was the exact moment the refresh rate began to double, not 40 (I cued the below video to the proper place): https://youtu.be/VkrJU5d2RfA?t=8m7s

I'm aware in G-Sync mode, the monitor completes each frame scan at the maximum refresh rate (@144 Hz, 6.9ms), regardless of current framerate/refresh rate. So you're saying it has that 9ms time frame/padding on top/bottom of the incoming scan to compensate and display the frame without tearing with G-Sync + V-Sync on, and that's how it prevents it compared to G-Sync + V-Sync off? If so, I will work on rephrasing those sections.

I agree, the specific fps limits in those sections on the chart aren't optimal. I'll test a few more games in that range and see if the numbers can be trusted, otherwise I'll scrap them.

I appreciate your hard work and the time you put into that. While I'm not using a G-sync monitor, I'm sure sometime in the future I will so I'll bookmark this thread in case it gets overshadowed by newer posts. Request sticky?

Let's call the threshold 36fps for now, it may be monitor specific, because as far as I know we only have a sample size of one. That gives us a 20.7ms window to play with(1/36 of a second - 1/144 of a second). so 10 above, 10 below, or maybe they got clever and made the centering of it depend on how quickly and in which direction framerate is changing.

It's not quite possible to have a framerate below zero. The tearing can happen anywhere in the 0~36 range, the issue is how fast it's changing. Say one frame is 31ms(32fps), so the monitor will throw in an extra refresh, say it starts that self refresh at 14ms. If the next frame finishes between 14ms and 21ms(48~71fps), or between 41ms and 48ms(21~24fps), you'll get a tear or delay(depending on v-sync setting), but if instead the next frame is between 7 and 13ms(72~144 fps), or 49 and 70ms(14~20fps), you wouldn't get that tear/delay, because the monitor would have completed its self refresh by then. Also, if the framerate slowly creeps down over many frames, you won't get tearing because the monitor has time to adjust when it does the additional refresh.
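Sparky's scenario above can be put into a quick numeric check (all figures hypothetical, as in the post): a self-refresh occupies the panel for one full scanout, so a frame that completes inside that window must either tear or be delayed.

```python
SCANOUT_MS = 1000 / 144  # ~6.94 ms per scan on a 144 Hz panel

def collides(next_frame_done_ms, self_refresh_start_ms):
    """True if the new frame completes while a self-refresh is still scanning."""
    return self_refresh_start_ms <= next_frame_done_ms < self_refresh_start_ms + SCANOUT_MS

# Self-refresh starts 14 ms into a 31 ms (32 fps) frametime:
print(collides(16.0, 14.0))  # True: lands mid-scan -> tear (v-sync off) or delay (on)
print(collides(10.0, 14.0))  # False: finishes before the self-refresh begins
```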

Paul wrote:I appreciate your hard work and the time you put into that. While I'm not using a G-sync monitor, I'm sure sometime in the future I will so I'll bookmark this thread in case it gets overshadowed by newer posts. Request sticky?

Thanks Paul

Sparky wrote:Let's call the threshold 36fps for now, it may be monitor specific, because as far as I know we only have a sample size of one. That gives us a 20.7ms window to play with(1/36 of a second - 1/144 of a second). so 10 above, 10 below, or maybe they got clever and made the centering of it depend on how quickly and in which direction framerate is changing.

It's not quite possible to have a framerate below zero. The tearing can happen anywhere in the 0~36 range, the issue is how fast it's changing. Say one frame is 31ms(32fps), so the monitor will throw in an extra refresh, say it starts that self refresh at 14ms. If the next frame finishes between 14ms and 21ms(48~71fps), or between 41ms and 48ms(21~24fps), you'll get a tear or delay(depending on v-sync setting), but if instead the next frame is between 7 and 13ms(72~144 fps), or 49 and 70ms(14~20fps), you wouldn't get that tear/delay, because the monitor would have completed its self refresh by then. Also, if the framerate slowly creeps down over many frames, you won't get tearing because the monitor has time to adjust when it does the additional refresh.

I too thought the minimum number might vary by monitor, but after seeing that both the official and unofficial source stated an almost identical number, we're just going to have to assume it's the correct one.

I see, so the way I have it split into three sections now ("Below G-Sync Range," "Within 1ms Polling Range," "Exceeds 1ms Polling Range") is redundant, seeing as one behavior is causing all instances.

Basically, it tears when it misses the frametime window with V-Sync off, and suspends with V-Sync on. I'll have to carefully rearrange and word all of this to make it digestible to the average G-Sync user.

Just to confirm, this happens so quickly, there is no way it is adding additional input latency with G-Sync + V-Sync on, unless of course you hit the G-Sync ceiling (where this doesn't occur anyway), correct?