Nvidia has demonstrated a new display refresh technology that’s meant to move v-sync (vertical synchronization) out of the stone age and create a solution better suited to the needs of modern displays. In doing so, it’s exposed a fundamental problem of technology — oftentimes, our standards aren’t based on any sort of objective evaluation of what’s “good,” but simply built on what worked at a given point in time.

Film standardized on 24 frames per second because it was fast enough for the eye to perceive as continuous motion, fast enough to keep the highly combustible film stock from igniting under the projection lamp, but slow enough not to cost enormous amounts of money to shoot a movie. The 60 Hz refresh rate we’re all familiar with was standardized because vacuum tube technology needed to run at a multiple of the AC line frequency. When we moved to LCDs, we shifted away from using an electron gun to redraw the screen, yet we still redraw it a fixed number of times per second. Nvidia wants to fix that with its new G-Sync display technology.

The issue, in a nutshell, is that graphics cards don’t render at fixed speeds. While we’ve discussed this in our coverage of frame latency issues, those discussions have focused entirely on the video card side of the equation: how long it takes the GPU to draw and output each frame, and how variations in that timing can lead to suboptimal displays. The entire reason we use v-sync, for example, is that it prevents tearing. With v-sync off, you can end up with visual displays that look like this:

That’s with v-sync off, which means the video card shoves new images to the monitor as quickly as it can. The monitor, in turn, updates as quickly as it can, with no regard for whether the image being overwritten is fully synchronized from top to bottom. V-sync fixes this by synchronizing buffer swaps to the display’s refresh, which caps the frame rate at the refresh rate. You can buy a display with a 60-144 Hz refresh rate (the 144 Hz displays are “true” 144 Hz and do not use interpolation as some high-end televisions do). But a higher refresh rate doesn’t fix the underlying problem: when the frame rate dips below the refresh rate, v-sync introduces stutter and input lag as old frames are held on screen, and turning v-sync off to avoid that brings the tearing back. Nvidia has previously attempted to address this on the GPU side with what it calls Adaptive V-Sync, but G-Sync is something different. Instead of being based purely on GPU-side timing, G-Sync is a physical chip that integrates directly into a monitor. It is compatible with all Kepler-based graphics cards, and should be compatible with all Nvidia GPUs going forward.
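
To make the tearing mechanism concrete, here’s a minimal sketch (purely illustrative; the refresh rate, resolution, and swap times are assumed values, not anything from Nvidia) of why an unsynchronized buffer swap splits the screen between two frames:

```python
# Illustrative sketch only: a panel scans out rows top-to-bottom over one
# refresh interval, so a buffer swap that lands mid-scan leaves the rows
# above the swap point showing the old frame and the rows below showing
# the new one -- a visible tear line.

REFRESH_MS = 1000.0 / 60.0     # assumed 60 Hz panel (~16.7 ms per scanout)
SCREEN_ROWS = 1080             # assumed 1080p panel

def tear_row(swap_time_ms: float) -> int:
    """Approximate row where a swap lands during the current scanout."""
    t = swap_time_ms % REFRESH_MS          # position within this scan
    return int(SCREEN_ROWS * t / REFRESH_MS)

# Hypothetical swap times from a GPU pushing frames faster than 60 fps:
for swap in [3.0, 9.5, 14.2]:
    print(f"swap at {swap:4.1f} ms -> tear line near row {tear_row(swap)}")
```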

According to NV, G-Sync synchronizes the graphics card to the monitor rather than the monitor to the graphics card, and promises such smooth gameplay that internal testers “have found themselves overshooting and missing targets because of the input and display lag that they have subconsciously accounted for during their many years of gaming.” By only drawing frames once they’re ready, the display allows for variable frame rates and smooth playback at the same time. Feedback from people who have seen the system in person has been enthusiastic.
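
Here is a rough sketch of that scheduling idea as we understand it (the 30-144 Hz bounds and the single-repeat fallback are our assumptions, not Nvidia’s specification): the panel scans out a frame as soon as the GPU finishes it, clamped to the hardware’s minimum and maximum refresh intervals.

```python
# Rough sketch of variable refresh as described above (our reading, not
# Nvidia's implementation). The panel scans out when a frame is ready,
# clamped to assumed 30-144 Hz hardware bounds; if a frame takes longer
# than the floor allows, the previous frame is shown again (simplified
# here to at most one repeat).

MAX_HZ, MIN_HZ = 144.0, 30.0
MIN_INTERVAL = 1000.0 / MAX_HZ   # ~6.9 ms: the panel can't scan faster
MAX_INTERVAL = 1000.0 / MIN_HZ   # ~33.3 ms: after this, repeat the old frame

def next_scanout(frame_ready_ms: float, last_scan_ms: float) -> float:
    """When the panel can start scanning a newly finished frame."""
    earliest = last_scan_ms + MIN_INTERVAL
    if frame_ready_ms <= last_scan_ms + MAX_INTERVAL:
        return max(frame_ready_ms, earliest)   # show it as soon as legal
    # Frame was too late: the panel re-displayed the old frame at the
    # floor interval, so the new frame waits for that repeat to finish.
    return max(frame_ready_ms, last_scan_ms + MAX_INTERVAL + MIN_INTERVAL)

last = 0.0
for ready in [12.0, 25.0, 47.0, 90.0]:   # hypothetical completion times (ms)
    last = next_scanout(ready, last)
    print(f"frame ready at {ready:5.1f} ms -> scanned out at {last:5.1f} ms")
```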

Nvidia’s G-Sync includes a 768MB buffer combined with a custom FPGA

Nvidia plans to make the technology available in two different ways. First, those of you with existing monitors (otherwise known as “everyone”) will be able to upgrade, but only if that monitor is an Asus VG248QE, which can be retrofitted with a standalone kit. No word on whether any other display will be upgradeable, or whether the kit will come to the 27-inch monitors in the same Asus family.

Second, you’ll be able to buy G-Sync-equipped monitors next year, at resolutions from 1920×1080 all the way up to 4K. Given the feedback from testers, it seems like this could be a major boon for the gaming industry, and, not coincidentally, it’s an NV-only feature. If you think about it, this is damned smart of Nvidia. While a G-Sync module will presumably allow a monitor to work with any video card (just like normal), there’s no reason to think an AMD GPU will be able to hook into the feature and use the specialized capabilities.

Since most gamers tend to upgrade video cards every 2-3 years but may use a display for considerably longer, this increases the chance of a person buying several video cards from Team Green in a row. AMD will almost inevitably answer with its own project, possibly as an open-source initiative. Whether gamers will want to pay a premium for G-Sync tech is a fair question, but I suspect a number will; after all, improved image quality is ostensibly why people buy better monitors in the first place, and the boost here, according to all sources, is quite significant.

Wussupi83

I like the concept and applaud Nvidia’s effort to bring a solution to a problem that enthusiast PC gamers often experience. They’re really stepping up as a leader here. I would like to see this move in a more industry-wide direction. But I’m not Nvidia trying to run a profitable company, so I’ll just post my idealist opinion here and leave it at that.

some_guy_said

I can’t stand tearing. Vsync always. So I’m kinda interested in this.

Carlos Pedro Vasconcelos

Problem with vsync is input lag. iRacer here.

Ray C

Looks good, but I guess we will see how it works out. I agree with one of the original lines: many times tech is based on what we can do now, or what works now. We often have to come up with a workaround to do what we need to do. But many times when new technology and methods come out, we don’t go back and revisit our old ways to see if we still need to do it that way.

Mark Rejhon

nVidia’s variable refresh rate idea is an impressive technology. It’s one small step towards tomorrow’s Holodeck-quality displays, and could theoretically work with IPS / VA / OLED in the future too!

G-SYNC even includes a superior sequel to LightBoost (an optional “low-persistence” strobe backlight mode) that is available at 85Hz, 100Hz, 120Hz and 144Hz, according to Blur Busters research. John Carmack’s videos about this are very interesting.

Jimmy Lin

I might have to plan for buying a new monitor next year, unless mine gets an upgrade kit.

However, I personally still want a high-res, high-refresh-rate IPS or OLED display, and that option doesn’t exist atm. G-Sync just adds another requirement to that list.

On a side note, however, it seems G-Sync has a 30 FPS minimum, below which it will repeat frames like V-Sync does. Well, when it dips below 30 FPS, no amount of synchronization will help anyway.

I do wonder how G-Sync works with high-refresh-rate panels; someone from Nvidia mentioned that a high refresh rate is even better, but didn’t explain exactly how it would work.

shadowhedgehogz

What’s so good about it? It just seems like it tries to mask the effect of FPS going below 60 by removing tearing... and that’s all it has to offer? I wouldn’t be playing games if they ran as low as 30fps anyway, so I don’t see a point in this unless you intend to game at low frame rates.

Well, I have a 120Hz display anyway.

dan87

The entire point of this tech is that you don’t *need* to run at 60fps for a smooth experience. 30fps will be able to provide an equally smooth experience.

Do you think the movie you watched in theatres last Friday was smooth? Guess what: that was *24* fps.

Michael Vasovski

Film was standardized at 18fps. 24fps came into play because the audio track, at 18fps, wasn’t high fidelity.

alur

Nvidia better figure out how to capture the dumbed-down social media kidz that are hooked on their texting and touch screens. Gamers like me are decreasing in numbers, partly because of the bullying in MMORPGs and partly due to the rigors of the long hours required to be competitive. I don’t know the answer, but all I see in this article is very high-priced hardware for a rapidly shrinking market. It’s sad that so much power is available, but the dumbed-down kidz have no reason to take advantage of it.

Phil

Kids these days don’t game like they did back in my day. *waves cane* Why, I remember back when the Germans were invading Poland, how we played GLquake on a chalkboard. Now it’s all MMORPG this and whippersnappers bullying people that.
Where did I put my dentures?

alur

Touche. LOL, great post. I don’t wear dentures, but I’m probably one of the oldest gamers and retired from WoW.

David Blanco

So glad I own the VG248QE; I was about to get two more. Bet them kits will be overpriced like always, though. Wonder if it will work over three monitors with three kits and SLI 780 Ti’s.

Sergiy Kryvonos

Does anyone care that there is no problem for this thing to solve, and no described problem that it actually solves?

DanDustEmOff

What, like tearing with v-sync off, or stutter and lag with v-sync on? Yeah, cause those aren’t problems at all.

Sergiy Kryvonos

Those aren’t really problems, because they’re fixed by a proper implementation of the WDDM interface.

DanDustEmOff

Maybe in theory, yes; in reality, however, a lot of games do not have WDDM properly implemented. So these are real problems that can be fixed with hardware, rather than by expecting someone to do something that is often not done correctly. This tech means that devs don’t have to solve these issues within drivers. It also means we get a true frame rate of up to 144 fps.

Sergiy Kryvonos

Why do you mention games? I meant the graphics driver’s Windows Display Driver Model (WDDM) implementation. The video card sends interrupts to the CPU; in this case it’s the VSYNC interrupt, which is sent regardless of what device is on the monitor port. It makes no difference to the graphics driver when it processes vsync. All these issues were solved a long time ago. You can play a game at >100fps using any monitor that can handle such a frequency. Another question is which frequency your monitor supports.

DanDustEmOff

I have a 60 Hz IPS as a second screen and a 120 Hz TN as my main screen. I mention games because that is why this tech has been developed.
Let me put it like this: if I play a game and my GPU is outputting 90 fps but my refresh rate is 120 Hz, I will get tearing. G-SYNC solves this by altering my monitor’s refresh cycle to suit my frame rate (90 Hz, in this example); if the frame rate drops, the refresh cycle matches it, giving me the smoothest possible experience without adding latency or stutter.
The problem with VSYNC is that if the GPU fails to produce a frame in time for the refresh cycle, the system sends the last frame again, creating lag and stutter. This may or may not have been solved, but in nearly every modern game I play, VSYNC is a horrible experience and I choose to avoid it at all costs; I also find adaptive VSYNC to be only a marginal improvement.
Whatever solutions you claim are out there do not work in any game I have encountered that stresses my system. Sure, if you just play Quake or some other less demanding game with a modern card, you will not notice these issues, because your GPU smashes out frames so quickly that there is always one ready for the display.
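
To put rough numbers on the cadence problem described in this exchange (a sketch with assumed values, not measurements): a steady 90 fps feed on a 120 Hz v-synced panel can only change frames on 8.3 ms refresh boundaries, so on-screen frame spacing alternates between one and two refresh intervals.

```python
# Sketch with assumed values: steady 90 fps rendering on a 120 Hz panel
# with v-sync. Each frame waits for the next 8.33 ms refresh boundary,
# so on-screen spacing alternates between 8.3 ms and 16.7 ms -- judder,
# even though the GPU is delivering frames at a perfectly even rate.

import math

REFRESH_MS = 1000.0 / 120.0     # 120 Hz panel
FRAME_MS = 1000.0 / 90.0        # a new frame every ~11.1 ms

prev = 0.0
for i in range(1, 7):
    done = i * FRAME_MS                                       # frame i finishes
    shown = math.ceil(done / REFRESH_MS - 1e-9) * REFRESH_MS  # next boundary
    print(f"frame {i}: done {done:5.1f} ms, shown {shown:5.1f} ms "
          f"({shown - prev:4.1f} ms after the previous frame)")
    prev = shown
```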

Sergiy Kryvonos

In your configuration, the proper implementation of WDDM is as follows: a VSYNC for each monitor (though vendors use only one general VSYNC, which is not good if the monitors start desynchronizing).

On a monitor flip, the video card sends an interrupt. Within nanoseconds, WDDM passes it to the driver, which knows exactly whether the adapter has already drawn the next frame. That’s right: it’s drawn asynchronously; there is no reason to draw on vsync. If the next frame is ready, the driver asks the video card to flip, and only the buffer address being pointed to changes. That is why, in a proper implementation, the frame is flipped on the very next refresh.
There is no reason for additional devices.
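
For readers following this exchange, here’s a sketch of the flip-on-vsync flow being described (simplified, with hypothetical names; this is not the actual WDDM API): rendering happens asynchronously, and on each vsync interrupt the driver flips to the newest completed buffer by changing a pointer, or re-scans the old frame if nothing new is ready.

```python
# Simplified sketch of the flip-on-vsync flow described above. The class
# and method names are hypothetical, not the actual WDDM interfaces.

class FlipQueue:
    def __init__(self):
        self.front = "frame_0"     # buffer currently being scanned out
        self.completed = None      # newest fully rendered back buffer

    def on_render_complete(self, buf: str):
        # The GPU renders asynchronously; nothing waits for vsync to draw.
        self.completed = buf

    def on_vsync_interrupt(self) -> str:
        # The flip is just a buffer-address change, so it costs almost
        # nothing; if no new frame is ready, the old one is shown again.
        if self.completed is not None:
            self.front, self.completed = self.completed, None
        return self.front

q = FlipQueue()
q.on_render_complete("frame_1")
print(q.on_vsync_interrupt())      # frame_1: a new frame was ready in time
print(q.on_vsync_interrupt())      # frame_1 again: the GPU missed this vsync
```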

DanDustEmOff

Thing is, you seem well versed in software programming and implementation. People like myself are not.

I make the effort to learn and understand, but the solution you describe, simple as it may be to you, is beyond most of our means, simply because we lack the knowledge to resolve the issue ourselves.

So the problem is real to most people, which is where G-SYNC offers a plug-and-play solution, with the benefit of a higher frame rate cap.

Just out of interest, what is your profession? When you stated “our vendors” I got the impression that you are a software engineer of some type.

Sergiy Kryvonos

yes, I am

DanDustEmOff

I want me one of these monitors now please.

John Krisfalusci

What about an upgrade kit for the Asus PB278Q? You know, the 27 inch 1440p monitor!? CMON I NEED ANSWERS! THX IN ADVANCE!! =(

Hikari .

Indeed, very smart of nVidia. Any monitor using it will have that chip, sold only by nVidia! But I fear for AMD: this technology is proprietary, so even if it’s easy to be compatible with, it will still require paying royalties! It’s the same as with SLI, which forced AMD to create CrossFire. The issue here is that nobody will want to support two chips in a monitor, so we might end up with a monopoly. Also, if AMD creates an open alternative, they will once again lose the chance to profit on a good technology.
