Preview of NVIDIA G-SYNC, Part #1 (Fluidity)

A last-minute surprise arrived at the Blur Busters Lab: NVIDIA sent us a G-SYNC display, a new gaming monitor technology, to preview. Welcome to the first part of a series of Blur Busters articles previewing G-SYNC.

For those who missed our earlier NVIDIA G-SYNC post, G-SYNC is a new hardware technology that allows a monitor to have a variable refresh rate. In short, it is a monitor hardware technology that simultaneously (1) eliminates tearing, (2) reduces input lag, and (3) eliminates stutters.

The Status Quo For 75+ Years

For more than 75 years, displays have refreshed in a top-to-bottom manner, at a fixed cadence of 60 cycles per second, as these high-speed videos illustrate:

Ever since the first televisions of the early 20th century (the Baird and Farnsworth), it has always taken a finite amount of time to transmit (scan) frames from a video source (e.g. GPU) to the display (e.g. LCD). This old standard of fixed refresh rates produces display-caused side effects during games (e.g. tearing, stutters). Before we dive into G-SYNC's benefits, we demonstrate the motion fluidity issues found on displays.

NOTE: Ensure that the animations below are running flawlessly. See web browser requirements. We recommend temporarily closing other apps and browser tabs. Re-enable Windows Aero mode (browser animations stutter more in Classic mode), and use the primary monitor only. Although most major browsers can run accurate animations, Google Chrome works most reliably.

Demonstration of Tearing

This animation demonstrates what often happens when you turn VSYNC OFF. You witness tearing, which shows up as disjointed artifacts in the moving vertical bar.

Positions of tear lines often fluctuate, as shown in the animation above. Tearing also becomes more visible during faster horizontal motion (e.g. turning, panning, strafing), especially at lower refresh rates. It shows up in many games, as parts of different frames are displayed on screen at the same time:

This screenshot (Bioshock Infinite) shows tearing, where the statue in the middle is disjointed, as well as the rail and buildings to the side. During fast horizontal panning motion, disjoints show up randomly all over the screen, similar to the above animation.

Replacing the frame mid-scan creates a tear line. The position of the tear depends on how far along the scan-out is, as seen in the high-speed videos earlier.

Demonstration of Stutters

Stutter is shaky motion. The animation below demonstrates a very bad case of stutters:

Game engines can also be the source of stutters. However, the link between a variable-framerate source (the GPU) and a fixed-refresh-rate display (the status quo!) creates stutters as well. Stutters occur whenever the GPU is forced to wait for the next scan (with VSYNC ON).

Stutters can also be caused by the display, since the graphics card has to wait for the display to finish refreshing (an old-fashioned scan-out) before it can deliver a new frame.

Sudden Frame Rate Slowdowns

During VSYNC ON operation, there can be sudden slowdowns in frame rate when the GPU has to work harder. This creates situations where the frame rate suddenly halves, such as 60 frames per second dropping to 30 frames per second:
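To see why VSYNC ON halves the frame rate rather than degrading gracefully, note that a finished frame must wait for the next fixed refresh boundary. This little Python sketch (our own illustration, with made-up function names; not anything from NVIDIA) shows the effective frame rate snapping to divisors of 60:

```python
# Sketch: with VSYNC ON at a fixed 60 Hz refresh, a finished frame waits
# for the next refresh boundary, so the delivered frame rate snaps to
# 60/n fps for whole-number n (60, 30, 20, 15, ...).

REFRESH_HZ = 60

def effective_fps(render_time_s):
    """Frame rate reaching the screen when every frame takes render_time_s."""
    refresh_interval = 1.0 / REFRESH_HZ
    # Whole refresh intervals each frame occupies (ceiling division, min 1).
    intervals_waited = -(-render_time_s // refresh_interval)
    return 1.0 / (intervals_waited * refresh_interval)

# A frame taking 17 ms (just over the 16.7 ms budget) misses the boundary,
# and the display drops straight from 60 fps to 30 fps.
print(effective_fps(0.016))  # ~60 fps
print(effective_fps(0.017))  # ~30 fps
print(effective_fps(0.034))  # ~20 fps
```

This quantization is exactly the jarring 60-to-30 transition described above; G-SYNC avoids it because the display no longer imposes a fixed boundary.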

During VSYNC ON, if your graphics card is not running flat-out, these frame rate transitions can be very jarring. Sudden changes in frame rate create sudden changes in input lag, which can throw off gameplay and aiming, especially as the action gets heavy in a first-person shooter. The red line (VSYNC ON) below illustrates this jarring effect relative to the green line (which represents VSYNC OFF and G-SYNC).

The G-SYNC Solution

G-SYNC simultaneously solves all the above problems by allowing the graphics card to drive the monitor’s timing of refreshes. The monitor’s refresh can now occur at arbitrary intervals, staying in sync with the GPU:

We have created a web-based animation that demonstrates variable frame rates with no erratic stutter effects, to simulate what a user experiences when using G-SYNC.

The web-based animation below shows a seamless transition through variable frame rates (using a software-based interpolation technique). G-SYNC does this even better: in hardware (no interpolation), in an ultra-low-latency manner, and with no tearing.

Note that a real G-SYNC monitor will have less ghosting than this. This software animation is approximate; it demonstrates that it is visually possible to have variable frame rates without the erratic stutters caused by frame rate transitions. (Make sure you're viewing this animation in a stutter-free web browser, or the animation is invalid.)

Smooth Motion With Variable Frame Rates

The first game we played with G-SYNC was Battlefield 4. With G-SYNC, you can continuously turn from complex scenery (e.g. open space) to low-detail scenery (e.g. a wall or floor) without seeing a single erratic stutter.

The series of photos below shows that you can turn back and forth from complex scenery to simple scenery, experiencing massive changes in frame rate, in an ultra-smooth manner.

You do get the “low frame-rate feel” if the frame rate goes low (e.g. 30fps); however, the monitor's refresh is now always synchronized to the frame rate, eliminating stutters.
Playing at 47 frames per second? The monitor is now at 47Hz.
Playing at 131 frames per second? The monitor is now at 131Hz.

The refresh rate of the monitor can change with every single frame – over 100 times per second. Via NVIDIA’s G-SYNC, the monitor is now always synchronized to the frame rate: the graphics card controls the monitor’s refresh dynamically, without waiting for scheduled, fixed refresh intervals. This is accomplished through the equivalent of variable-length blanking intervals between refreshes. The monitor is held in a blanking interval until the next frame is ready from the GPU, and then the monitor is immediately refreshed.
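As a rough sketch of that scheduling idea (our simplified model with assumed names and limits, not NVIDIA's actual implementation), the display can be thought of as idling in its blanking interval until a frame arrives, clamped to the panel's physical refresh range:

```python
# Simplified model of frame-driven refresh (assumption: a 30-144 Hz panel).
MIN_INTERVAL = 1.0 / 144  # panel cannot refresh faster than 144 Hz
MAX_INTERVAL = 1.0 / 30   # panel must self-refresh by 30 Hz at the latest

def next_refresh_time(last_refresh, frame_ready):
    """Moment (seconds) the panel begins scanning out the next frame."""
    earliest = last_refresh + MIN_INTERVAL  # respect the panel's max rate
    latest = last_refresh + MAX_INTERVAL    # forced repaint if the GPU stalls
    return min(max(frame_ready, earliest), latest)

# Frames arriving at irregular times are scanned out as soon as they are
# ready, instead of waiting for a fixed 1/60-second boundary.
print(next_refresh_time(0.0, 0.010))  # refresh at 10 ms (frame-driven)
print(next_refresh_time(0.0, 0.001))  # clamped to ~6.9 ms (144 Hz cap)
```

The key design point is that the frame's arrival, not a fixed clock, triggers the refresh; the min/max clamps are simply the panel's hardware limits.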

Since the game is already pre-positioning objects within each frame, random frame rates can look smooth, provided the timing of on-screen object positions within frames stays in sync with the timing of each frame's presentation to human eyes (i.e. when the frame is scanned out to the display). With variable frame rates, the only side effect is variable motion blur (as seen in the above animation). We already know that lower frame rates create more motion blur even at fixed refresh rates (on a regular LCD), as shown in the www.testufo.com 30fps-versus-60fps animation demo.
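The frame rate/motion blur relationship is simple arithmetic: on a sample-and-hold LCD, the perceived blur trail is roughly the distance your eye tracks while one frame stays lit. A back-of-envelope sketch (our own illustration, example numbers assumed):

```python
# Approximate sample-and-hold motion blur: eye-tracking distance covered
# during one frame's visibility time, i.e. speed divided by frame rate.

def blur_px(speed_px_per_s, fps):
    """Rough blur trail width in pixels for a tracked moving object."""
    return speed_px_per_s / fps

speed = 960  # example: object panning at 960 pixels/second
print(blur_px(speed, 30))   # 32.0 px of blur at 30 fps
print(blur_px(speed, 60))   # 16.0 px at 60 fps
print(blur_px(speed, 120))  # 8.0 px at 120 fps
```

Halving the frame rate doubles the blur trail, which is why the 30fps-versus-60fps demo looks so different even on a fixed-refresh display.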

As you track moving objects on a G-SYNC monitor, object positions stay in sync with the frame presentation to your eyes. Since most game engines adjust object positions based on the current instantaneous frame rate, object positions are already compensated for momentarily early or late presentation of frames. That makes it possible for random frame rates to remain miraculously smooth! This is also clearly demonstrated in NVIDIA’s pendulum demo, which has a built-in 40-60fps random-framerate feature. From our earlier article, How Does G-SYNC Fix Stutters, this diagram demonstrates the sync between eye-tracking and on-screen object motion:

Without stutters, G-SYNC feels like a major upgrade – bigger than a typical full-generation GPU upgrade! In this situation, any gamer can be more comfortable at Ultra quality settings. I also upgraded to a GeForce GTX Titan simultaneously with testing G-SYNC, for a double-whammy upgrade in gaming experience.

With G-SYNC, you gain the best of both the VSYNC ON and VSYNC OFF worlds. You get the higher frame rates of VSYNC OFF (within the G-SYNC range of 30Hz through 144Hz) without the input lag penalty, and with no tearing at all.

G-SYNC Easter Egg #1: The “fps_max” command in Source engine games ends up controlling your monitor’s refresh rate! With fps_max 111, the monitor runs at 111Hz. With fps_max 143, the monitor runs at 143Hz. This can potentially reduce input lag, if it allows the game engine to do input reads (mouse/keyboard) immediately while rendering the frame, which is then immediately sent to the G-SYNC display. This yields fresher input reads than pre-rendering the frame and then waiting for traditional VSYNC.

G-SYNC Easter Egg #2: Quake Live runs at a fully synchronized 125 fps with G-SYNC. Because the monitor is not maxed out (at its maximum 144Hz), there is actually no VSYNC ON input lag penalty with G-SYNC at this particular frame rate!

G-SYNC Easter Egg #3: Input lag in Skyrim is noticeably reduced in G-SYNC mode. Even though it only runs reliably at 60 frames per second, G-SYNC delivers each “60Hz” frame faster to the monitor, reducing input lag without needing to increase the frame rate. Each of the 60 individual frames is refreshed to the display in a 1/144-second scan, rather than the traditional 1/60-second scan (top-to-bottom refresh process).
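The scan-out saving is easy to put numbers on. A quick calculation (our arithmetic, based on the scan times quoted above):

```python
# Lag saved at the bottom of the screen when each frame is scanned out
# at 144 Hz speed instead of 60 Hz speed, even at a 60 fps frame rate.
scan_60hz_ms = 1000 / 60    # ~16.7 ms to paint a full frame at a 60 Hz scan
scan_144hz_ms = 1000 / 144  # ~6.9 ms at a 144 Hz scan-out
saving_ms = scan_60hz_ms - scan_144hz_ms
print(round(saving_ms, 1))  # ~9.7 ms less scan-out lag at screen bottom
```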

LightBoost Sequel: More information about the earlier-mentioned LightBoost sequel will be forthcoming at some future date, once NVIDIA lets us reveal all! Stay tuned.

55 Responses to Preview of NVIDIA G-SYNC, Part #1 (Fluidity)

Casual gamer here who enjoys the occasional public round of CS:GO and such. Will G-Sync be more beneficial if I have a 144Hz monitor than a 60Hz monitor, taking into account that G-Sync will only benefit you in the 30-60 FPS range as opposed to 30-144 FPS?

G-SYNC will benefit you all the way to 144fps. I see improvements in eliminating stutters even when framerates fluctuate from 100fps-144fps. I am sensitive enough to see a single frame drop (single stutter) even at triple-digit frame rates.

However, CS:GO often runs maxed out at the full frame rate. So it may not benefit much from G-SYNC’s stutter elimination. However, you still have tearing elimination! If you hate tearing, you can easily set an fps_max of 142 or 143 with CS:GO, and get tearing-free operation without the input lag of VSYNC ON. That’s because the frames never wait for the monitor; they get refreshed immediately to the screen when rendered.

Actually, I know Overlord Computer is working to create a 1440p overclockable G-SYNC monitor, see http://overlordforum.com/topic/603-nvidia-g-sync/ …Also, G-SYNC works at 60Hz too, so it will probably arrive on a 4K G-SYNC monitor (say, within the coming two years). We will have to wait for further information/confirmations to get the answers to your questions out in the open!

Awesome, Overlord makes some amazing monitors and i was worried about keeping above 120 fps steadily at 1440p, but now with g sync, fps drops below 120 wont be much of an issue anymore so i can see myself getting one in the future maybe

Any info on the cost of the module that can be added to the VG248QE?
Also how much soldering will be needed or how complicated will the upgrade be? I’m pretty sure some place will offer to upgrade for you if you pay but i’d rather just buy the g sync unit and install it myself.

- No soldering. It’s just like replacing a computer motherboard. If you can take apart a whole desktop tower and replace a motherboard, the G-SYNC upgrade is actually somewhat simpler than that. Just remember: for the VG248QE monitor specifically, the G-SYNC upgrade means you will lose the DVI and HDMI ports.

– Stay tuned for new information about board timing/sources. I’ve asked NVIDIA about the board upgrade schedule, though it seems quite possible the G-SYNC-upgraded monitors may come out before the G-SYNC upgrade boards.

I don’t care for HDMI but losing DVI . . . . so it will only have VGA? That might be a problem since most video cards only have DVI outputs nowadays . . .
My Zotac GTX 660ti for example only has DVI-I and DVI-D and maybe HDMI which i find useless anyways . . . So ill have to get an older Video card if i swap the board to a g sync one?

This G-SYNC prototype preview monitor only has DisplayPort (no HDMI, DVI, VGA).
Fortunately, it is not necessarily representative of the future: future G-SYNC monitors later in the year will likely have other ports (not necessarily in G-SYNC mode, but traditional fixed-refresh-rate operation through those other ports).

Ok, i’ve looked it up so it uses displayport which means ill just need to get a cable for it, although ill lose dual screen setup since my video card only has 1 display port : \ unless we can split it or if i can use 2 different card output in SLI maybe?

Actually, three different DisplayPorts work with surround for non-GSYNC monitors.

Technically (on the technology side of things), multi-monitor is not necessarily mutually exclusive with GSYNC. It’s probably just a limitation of the first iteration of GSYNC, and could be solved in subsequent revisions – maybe even with a driver upgrade, rather than a new model.

1: Exactly which monitor was given to you?
2: I’ve recently made a purchase without researching IPS/VA/TN/PLS. My monitor ended up being TN and I hate the awful brightness and viewing angle problems it presents, detracting from the 144hz I wanted from the monitor. Thankfully, getting a refund is easy and free. Turns out, I can’t stand TN panels. Does G-SYNC require a specific type?

With G sync the type of panel shouldnt matter as they are working on some IPS gsync monitor already and it will come on some TN as well.
Its a special hardware component so to be able to have it you will have to buy specific model that comes with it. As for existing monitor only a few will be compatible with the upgrade that will be specifically made for that particular model.
In the future maybe there will be a way to add g sync to almost all monitor but for the moment you either need to buy a g sync compatible model that you can upgrade or wait for retail version to come out sometime next year that have g sync onboard.
Hope i was clear and didnt confuse you

This is a prototype preview (test) version of a modified ASUS VG248QE with a TN panel and a G-SYNC upgrade board put into it. I must stress that NVIDIA told everyone that this is a preview (e.g. early release with limited adjustments) and that the final GSYNC monitors will have improved firmware (for example, more adjustments, etc). So for now, we are specifically focusing on testing the GSYNC technical aspects.

If you dislike TN panels, you may wish to wait for IPS G-SYNC monitors, as I have heard Overlord is working on it. You may, however, not get a strobe backlight with the first models of IPS-based G-SYNC monitors.

This LCD (frame grab from high speed video), for example, would not work well at all with strobe backlights:

Strobe backlights are likely coming to IPS, but it will take longer because of this issue. A prerequisite for effective strobing is that pixel transitions are virtually complete (approaching 99% or more) within far less than a refresh cycle. The panel clearly can’t be in a strong perpetual ghosting state (like the old LCDs of yesteryear).

There’s a third option: VA panels. They actually have a better contrast ratio than IPS, and their viewing angles are closer to IPS than TN. VA has its issues, but the pros can outweigh the cons if you have specific needs. I have the Eizo FG2421 with Turbo240, and I really enjoy it for 120fps@120Hz games; the FG2421 has a bright and colorful “LightBoost” equivalent.

I can understand the buying-proposition dilemma. Some of us are ecstatic just to play on a $100 GeForce/Radeon card, while others are able to save up for a GeForce GTX 780 or TITAN. Likewise for monitors: some of us are happy with a $100 special, while for others, it just never will do. You’re already one of the fortunate, and it’s wholly possible G-SYNC upgrade boards will become cheaper later (the current quoted price is $199).

There is a huge explosion of new choice in gaming monitors with 120Hz, 144Hz, 240Hz, overclockable, LightBoost, Turbo240, GSYNC, BENQ Blur Reduction, and more. You’d think some of the monitors now have a windshield wiper option!

Also, what sort of latency differences are we looking at here for someone who is getting 300fps in Source, using 144Hz mode, relative to G-SYNC? Are there advantages to be had other than tearing in a scenario like this?

GSYNC latency and VSYNC OFF latency scale in step with each other until the limit (144fps) is reached. After that, latency diverges.

Mathematically, I see approximately a 3.5ms difference between 144fps GSYNC and 288fps VSYNC OFF (288fps is an easier number to calculate from, as it’s double 144fps). The difference between 1/144sec and 1/288sec is 1/288sec (1/144 = 2/288, so subtract 1/288 and you end up with 1/288). 1/288sec is about 3.5ms. Add 1ms for GSYNC overhead, and you’ve got a total of about 4.5ms latency difference between 144fps GSYNC and 288fps VSYNC OFF. That’s pretty much zero for most people – only serious competitive players of older games would find this important. And most games don’t hit 144fps anyway, so in sub-144fps situations you’re not getting meaningful lag differences between VSYNC OFF and GSYNC.
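That arithmetic can be double-checked in a few lines:

```python
# Verifying the latency comparison: 144 fps G-SYNC vs 288 fps VSYNC OFF.
frame_144 = 1 / 144           # seconds per frame at 144 fps
frame_288 = 1 / 288           # seconds per frame at 288 fps
diff_ms = (frame_144 - frame_288) * 1000
print(round(diff_ms, 2))      # ~3.47 ms raw frame-time difference
total_ms = diff_ms + 1.0      # plus the ~1 ms estimated G-SYNC overhead
print(round(total_ms, 1))     # ~4.5 ms total difference
```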

Hi,
I was considering buying the ASUS VG248QE monitor.
But then I read reviews about people complaining that it gives them eyestrain. Since I get migraines and am very careful about what monitors I use to minimize any eyestrain, I decided to look further.

BenQ has a flicker free range, which I presume would be more comfortable for my eyes. Reviews I have read seem to agree with that. But I just love the idea of G-Sync.

My question is, if Lightboost introduces flickering since its strobed and thus not suitable for me, is there any chance that G-Sync will be available on flicker free monitors? Or maybe the fact that the frames are synced with the GPU means that it eliminates any flickers?

Also, G-Sync has an updated Lightboost feature. Will that be optional? Can you have G-Sync without that Lightboost?

so i’ve read that G-SYNC, just like VSYNC, does not work in window mode (fullscreen window mode/borderless) – it seems logical, but is that really the case?

Fullscreen window mode is a “must have” for most MMO players with 2+ monitor setups – basically i play all of my games in window mode, not only MMOs. For me it is a NOGO to “lose” all this additional information (skype, fb, mirc…) while playing a game.

If this is true i think this would be worth mentioning in your article – even if most of the better informed readers may know this (-> similar to vsync).

“There are some additional limitations as well, GSYNC can be used with an SLI system but G-SYNC can’t operate in Windowed Mode (for now), currently it doesn’t support NVIDIA’s Surround multi monitor technology either.”

For a lesser GPU and fluctuating variable frame rates (<100fps), the G-SYNC monitor is the preferable experience. For a powerful GPU and a solid 120fps@120Hz, I prefer strobe backlights, and the best one is Eizo's FG2421.

Meanwhile, the new Blur Busters Forums at forums.blurbusters.com have just launched! This is the best place for the “best monitor” talk; I’ll be writing much longer replies over there.