Nvidia Slays The Beast, V-Sync, With G-Sync

Hello, V-Sync. Yes, thank you for meeting me here today. I invited you out because I felt the need to share some very important news: no one actually likes you. We just put up with you because, well, there’s really not a better alternative. In truth, you’re inconsistent, awkward, difficult to be around, cause obnoxious stuttering, and IT’S YOUR SURPRISE BIRTHDAY PAAAARRTY wheee everyone leap out now! OK, not really. But I figured those couple seconds of revelatory glee might help offset this falling pain piano of existential misery: you’re being replaced. By something younger, faster, and more practical. Or at least, that’s how it’ll be if Nvidia has its way. G-Sync claims to eliminate hassles like stuttering, screen tearing, and the like by synchronizing monitor refresh to the GPU render rate instead of vice versa, which is what V-Sync does. The result, apparently, is worlds better.

Monitors, you see, are fixed at 60Hz refresh rates, but modern GPUs can output so much more. So, as is, you either enable V-Sync to keep the GPU in clumsy lockstep with the monitor (which leads to response lag, stuttering, etc), or you can disable V-Sync to get better response times, but risk screen tearing when the two fall out of sync. Both methods are far from optimal. Nvidia, however, claims that it’s finally found a best-of-both-worlds solution. It explained in a blog post:

“With G-SYNC, the monitor begins a refresh cycle right after each frame is completely rendered on the GPU. Since the GPU renders with variable time, the refresh of the monitor now has no fixed rate.”

“This brings big benefits for gamers. First, since the GPU drives the timing of the refresh, the monitor is always in sync with the GPU. So, no more tearing. Second, the monitor update is in perfect harmony with the GPU at any FPS. So, no more stutters, because even as scene complexity is changing, the GPU and monitor remain in sync. Also, you get the same great response time that competitive gamers get by turning off V-SYNC.”

G-Sync-enabled displays will work with Nvidia’s Kepler series and be available early next year from the likes of Asus, BenQ, Philips, and ViewSonic. They seem rather miraculous, so we’ll have to wait and see how well they work in practice. On paper, though, this solution sounds pretty water-tight. Here’s hoping it a) holds up once we’re able to put it through its paces and b) isn’t too expensive. It is, however, pretty proprietary at the moment, which is basically a deal-breaker for those running non-Nvidia hardware. Ah, format wars. Aren’t they grand?

You shouldn’t really be able to tell, though, since flatscreens don’t “refresh” like CRTs did, with a series of scanlines that fade away, AFAIK. It shouldn’t be all flickery.

I’m guessing this is “just” a pair of framebuffers in the display, and the GPU can poke the display to tell it when to flip which one it’s holding as shown, vs which one is being written to, i.e. hardware double buffering. Which is actually pretty neat. If this works and isn’t horrendously Evil (driver hate, proprietary dickery, premium pricing, the usual), it’s a pretty nice solution.
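For what it’s worth, that guess can be sketched as a toy Python simulation. The `Display` class, its methods, and the "poke" are all invented for illustration; this is not Nvidia’s actual protocol, just the general shape of hardware double buffering:

```python
# Toy model of hardware double buffering: the GPU renders into one buffer
# while the display scans out the other, and a "flip" poke swaps their roles.
class Display:
    def __init__(self, width, height):
        self.buffers = [bytearray(width * height), bytearray(width * height)]
        self.shown = 0  # index of the buffer currently being scanned out

    def back_buffer(self):
        # The buffer the GPU is free to render into.
        return self.buffers[1 - self.shown]

    def flip(self):
        # GPU pokes the display: start scanning out the just-rendered buffer.
        self.shown = 1 - self.shown

display = Display(4, 4)
frame = display.back_buffer()
frame[:] = bytes([1] * len(frame))  # GPU finishes rendering a frame
display.flip()                      # display now shows that frame
```

The nice property is that the scanned-out buffer is always a complete frame, so the panel never reads a half-written image.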

It seems that Nvidia is talking about a low-persistence LightBoost mode with g-sync that they say will be on every g-sync equipped monitor. The strobe-backlighting would cause the monitors to act in a way analogous to a CRT and could be the reason why low refresh rates would cause flickering.

The whole point of G-Sync is that you don’t need to use buffering (which increases input latency) or deal with any tearing whatsoever, however minor it may be at 120Hz.

Of course, tearing is indeed quite a bit less noticeable at 120+ refreshes per second than 60 refreshes or less, but there’s still tearing. Some people will be very sensitive to that, some people may not be. This kind of product is aimed at the former.

Like it or not, the simple fact is that the console market beats the ever-loving shit out of PC game sales.

Yeah, certain genres rake in megabucks, but the average is weighted heavily toward the console market. PC sales do tend to reliably have a much longer tail than console sales, though. After 6 months on console, you might as well stop selling new copies and just let the used market take over. You’re especially not going to sell many new copies of console titles 1 to 1.5 years out.

I usually don’t use v-sync, instead I just cap the framerate at 120. In so many games v-sync seems to cause mouse lag. Although I am pretty picky about visual quality, I don’t notice the tearing unless I stop to look. Still if this G-sync really works I am gonna be psyched.

Tearing is one of the most jarring things for me. It’s weird how different things can seem so obvious/invisible to different people. Personally, when I need performance, AA is the first thing to go. I don’t notice the jagged lines at all.

In one of his keynotes John Carmack remarked how some people might care a lot about the latest extremely expensive shader technique that may make some pixels a tiny bit lighter, while they’re ignoring the giant tear line going across the screen…

I’ll always prefer a smooth, tear free 60 fps (or 120 fps if I could afford it) over whatever minor graphical improvement comes from the “ultra” settings compared to “very high”. And I wish all developers were the same, prioritising performance over eye-candy.

I have a dirty little secret: in my, oh, 20 years of PC gaming, I have never once checked or unchecked the v-sync option. I have no idea what it’s supposed to do or help with and I don’t seem to have ever needed it. Thank you. I feel better now that’s off my chest.

I once returned a PC to a small-business dealership because I didn’t understand what was causing screen tear. I thought there was something wrong with the graphics card, or the CPU. The business in question soon went under, and the replacement PC I bought from a large megachain cost more and had a lower spec.

G-sync seems pretty exciting, though it seems so obvious even to me that I wonder how the industry has gone so long without thinking of it.

Well, usually V-sync is on by default in many games. You would have noticed the screen tearing with it off, I’m sure. The only reason to turn it off is for competitive FPS gaming, or if a particular game has stuttering or response issues with it on.

Whether V Sync is an issue or not varies wildly and is usually a crapshoot from computer to computer.

What I meant was that I had literally no understanding of what screen tear was, or that V-sync being disabled was the reason for it. Had I known I would have been able to make peace with it either way, but I didn’t, so I saw the effects of screen tear and assumed there was a hardware defect. It was my own ignorance that was the problem.

Constant screen tearing (in Explorer etc.) suggests the drivers were never installed properly in the first place. You were right to return it. Tearing in games only would probably just be a setting somewhere; they should have pointed you in the right direction, and even demonstrated the fix to you in the store.

But both of the above require work, and a good employee who can engage customers (as we all know we can be problem customers at times, so need some help understanding things).

Search Google Images for ‘screen tearing’ and you will notice that the images look like they are cut and offset horizontally. Some games suffer from this much worse than others. The short version is that checking the v-sync option will stop this from happening, but will also cause your game to perform slightly worse in terms of frames per second.

the long version: I’m not even nearly smart enough to explain this version properly.

V-sync is supposed to stop screen tearing, where the bottom half of the screen draws a different frame than the top half.

What it actually tends to do is introduce control lag. I found the second level of Bit.Trip.Runner unplayable until I disabled V-sync in the game, and the Steam forum for the game showed others who had the same issue. KOF XIII forced V-sync on in fullscreen, which is most likely connected to why fullscreen had around twice the lag of playing windowed.

Screen tearing is a nuisance, but control lag can outright ruin a game. A sadly funny thing is that V-sync doesn’t even eliminate screen tearing sometimes, with it still happening occasionally. I’ve played a couple of games where having v-sync on actually made screen tearing worse.

Yeah, the fact he’s so excited about this is perhaps the biggest endorsement I can get. Having Carmack, Sweeney and Andersson on stage during the announcement is also a good endorsement, but trickier since you just don’t know how much they’re holding off due to being paid to be there.

This sounds great in theory. My worry is that it’s a proprietary solution. Does this mean that we will have an ATI solution as well? Will we be locking ourselves in to one graphics card company’s products when we buy our monitor? Or even worse, will this thing, even if it’s as great as it sounds, simply not take off because it requires new hardware to function? Will the various monitor companies stop producing these models if they fail to sell?

I suppose it will depend on whether the new tech will necessitate a price hike by the monitor companies. If it does, this potentially great idea could just fail regardless; at least until someone comes up with an open solution and it’s a safe enough bet to include it in gaming monitors as a matter of course. Monitors, while generally pretty cheap these days, are a piece of hardware gamers don’t count on replacing until they absolutely have to. It might be a tough sell unless it really is that great of a leap forward (and is widely adopted).

I cannot imagine monitor companies buying into a ‘one platform’ solution to this.

Either they will enable a software-driven ‘frame control’ or they will not. It needs to be a standard (as with all the other stuff which monitors send back up the cable) or it will be mostly pointless.

Which is what about 80% of Nvidia’s ideas become – sadly – they’re not team players by any means…

AnandTech’s speculating that the circuitry required for the tech (it replaces the “scaler” in a standard LCD) will cost $100 initially at least, but may be available as a standalone kit so that people can mod their own monitors.

If this delivers what is described, it could be really interesting. As it is now, games with v-sync on must deliver 30 or 60 fps, numbers like that. If you can’t deliver 60, you must do 30. If this works such that a game that can’t deliver 60 but can deliver 50 actually displays at 50, we will have smoother gameplay with the best possible framerate.

SSDs and monitors operate by emulating much older technologies. An SSD works by emulating a hard disk, with sectors and platters. And a monitor emulates a CRT TV. It’s time to stop emulating things (with all the expensive workarounds and sacrifices) and be the real thing an SSD or an LCD can be.

If you want to update all the pixels in an LCD at once, instead of scanning through and updating them one at a time, then you would need a separate signal path from each pixel in the buffer to each pixel in the monitor. That would be about 2 million channels for a 1080p image. It’s possible to send multiple (~160) signals over a single fibre cable, but that would still be a pretty thick monitor cable.
Pretty much every time a computer works with an image, it does so pixel by pixel, and I don’t know how we could get away from that.
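The arithmetic behind those numbers checks out. A quick back-of-the-envelope version (the ~160 signals per fibre is the commenter’s figure, taken at face value):

```python
# Back-of-the-envelope numbers for driving every 1080p pixel in parallel.
width, height = 1920, 1080
pixels = width * height
print(pixels)  # 2073600 -- "about 2 million channels"

signals_per_fibre = 160                          # commenter's rough figure
fibres_needed = -(-pixels // signals_per_fibre)  # ceiling division
print(fibres_needed)  # 12960 fibres: a very thick cable indeed
```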

Well that sounds neat and all but doesn’t solve the big problem:
If you’ve got a really powerful machine, you don’t use V-sync because of screen tearing or anything like that. You use it because it stops your machine from overheating and drawing too much power, because in the last console cycle developers got so incredibly lazy that they don’t stop the game from trying to render as many frames as possible even when there’s nothing going on, such as in menus.

G-Sync wouldn’t work like V-sync in this respect: it wouldn’t stop a generic old main menu from trying to draw 300 frames at max GPU and CPU load.

It’s a better solution in the intended category, but destroys the more important “protect your system from incompetent developers” use.

I used to use v-sync for that reason, and I hated any game without v-sync options, the fact that v-sync often doesn’t work in windowed mode, that driver-level v-sync options often didn’t function for all games, and so on. Then I finally discovered the RivaTuner program embedded in MSI Afterburner (I don’t have an MSI card anymore, but it works fine for all), which allows frame limiting and has also replaced FRAPS for my FPS overlay and screenshot/video capture. Afterburner also works as a fine temperature/load monitor; it’s primarily for such GPU diagnostics and for overclocking/fan speed options (which I don’t use at the moment). There are other similar solutions too, but this is the one I recommend. Great free utility.

DisplayPort/HDMI/DVI only has finite bandwidth, so it can only ever send so many frames to the display per second. That’s one hard cap. Another will be the panel’s own capabilities, given most of them max out at 60, and even 120 shouldn’t be as bad as uncapped rendering making your graphics card emit that delightful high-pitched whine of pain.
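To put a rough number on that hard cap: uncompressed 1080p video needs on the order of a few Gbit/s, before even counting blanking intervals and line-code overhead (8b/10b and the like), which this sketch deliberately ignores:

```python
# Rough uncompressed bandwidth for a video stream, ignoring blanking
# intervals and encoding overhead -- just to show where the cap lives.
def gbit_per_s(width, height, bits_per_pixel, refresh_hz):
    return width * height * bits_per_pixel * refresh_hz / 1e9

print(round(gbit_per_s(1920, 1080, 24, 60), 2))   # ~2.99 Gbit/s at 60 Hz
print(round(gbit_per_s(1920, 1080, 24, 120), 2))  # ~5.97 Gbit/s at 120 Hz
```

Doubling the refresh rate doubles the raw payload, which is why link bandwidth bounds how many frames can ever reach the panel per second.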

With v-sync off the card renders as fast as it can, producing complete frames even if it only finishes sending part of each frame to the monitor before switching to the next one. This increases power draw/heat/noise. A properly ventilated case and card should be able to handle this high load (if it can’t, then you need more airflow), but some people get nervous when their card gets hotter than normal.
As others have said, there are now easy ways to limit framerate without v-sync if that’s your preference. I like to use Nvidia Inspector to set the limiter built into the Nvidia driver.

I think it’s implicit that the render rate would be capped at the monitor’s maximum refresh rate. If it wasn’t, then you would either be overdriving the monitor’s refresh rate, building up a large queue of frames, or never letting the monitor finish displaying a full image, so the bottom lines of the screen would never be updated.
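That implicit cap can be sketched as a toy scheduler (entirely hypothetical names and timings; nothing here comes from Nvidia’s implementation): a frame that finishes while the panel is still mid-refresh simply waits, so the displayed rate never exceeds the panel’s maximum.

```python
# Toy G-Sync-like scheduler: a rendered frame is shown as soon as it is
# ready, but never before the panel finishes its previous refresh.
def present_times(render_times_ms, min_refresh_interval_ms):
    shown = []
    ready_at = 0.0  # when the panel can next begin a refresh
    t = 0.0
    for render_ms in render_times_ms:
        t += render_ms                 # frame finishes rendering at t
        start = max(t, ready_at)       # wait if panel is still mid-refresh
        shown.append(start)
        ready_at = start + min_refresh_interval_ms
    return shown

# Frames rendered every 5 ms (200 fps) on a 144 Hz-max panel (~6.94 ms):
print(present_times([5, 5, 5], 1000 / 144))
```

The first frame goes out immediately; after that, presentation settles onto the panel’s ~6.94 ms cadence, i.e. the render rate is effectively capped at 144 Hz.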

So is this why we still don’t have a driver-level triple-buffered v-sync option (basically v-sync without the performance penalty; I’m not a competitive FPS player and any input lag is imperceptible for me too) that works for both D3D and OGL, and instead have to hope the game we want to play has it as an option? Because they were busy developing more hardware to sell, and triple buffering reduces its value?

Anyway, until it’s integrated in monitors without costing any more than a monitor without it, and until the relatively new monitor I’m using now dies, I won’t really care to investigate unless I see it in person and it’s the revelation it’s hyped to be. Right now I personally don’t notice stutters unless my PC struggles to run a game, and I don’t see how better monitor syncing can fix that struggling, or somehow make low 30–40fps frame rates look smoother, or magically make text in motion more readable (obviously outside the occasions where a struggle or tear happens right then and at that part of the screen), or whatever else it’s currently hyped to do. Maybe I’m being dumb in this matter.

How come tearing is worse in some games than others, anyway? It kinda makes me think that a software solution without any real drawbacks should be possible on the engine’s rendering side.

You do realize the game has no control over screen tearing, right? The GPU doesn’t even know there’s tearing. It’s the monitor that’s reading the framebuffer just as it’s being updated by the GPU.

V-sync was implemented specifically to counter this issue. It’s the “software solution” you’re talking about. You can’t have any better without monitors getting more intelligent, which is what G-sync does.

Why yes, I do realize all that (though again, I don’t know that v-sync is as good as it can be – you’d probably have said the same before triple buffering was introduced – and you’ve failed to provide reasons for it, or even an answer to my question about why some games/engines are more affected by tearing than others, with v-sync off in all cases, obviously), and nothing I said implied otherwise, so I don’t know why you replied to me just to patronize.

I haven’t experienced tearing or stuttering in years while playing PC games. Now that I think of it, the last time was on my Asus GeForce EN7950GT. I’ve since owned an Asus Radeon EAH4870 and now a Gigabyte Radeon 7870OC. Could it be possible that AMD GPUs already do a better job at preventing tearing and stuttering?

Well, yes, if it does what it says. You could have this turned on and not have to put up with screen tearing. The majority of people who game competitively won’t use V-sync, for the reasons listed in the article.

Speaking as someone with both a U2713 and a PB278 on my desk, I can say that ASUS make monitors that easily equal Dell, though when I need wide gamut, I use a U3014 (which makes my games look all kinds of oversaturated awful!) and wouldn’t use any other brand.

As I understand it, it has to do with the mismatch between when the GPU spits out a new frame and when the monitor looks for one. If the two are out of time with one another, you get half an old frame plus half a new one. The result is a line between the two halves that looks like a “tear”. V-sync makes the two wait for each other until they are both ready for a new ‘page’, causing a delay and hogging resources. I’m sure someone will come along with a better description, however.
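That mismatch is easy to show with a minimal model: if the framebuffer is updated while the scanout is halfway down the screen, the displayed image is half old frame, half new frame.

```python
# Minimal model of tearing: the display reads the framebuffer row by row,
# and the GPU swaps in a new frame while the scanout is halfway through.
rows = 8
framebuffer = ['old'] * rows

scanned = []
for y in range(rows):
    if y == rows // 2:                # GPU finishes a frame mid-scanout...
        framebuffer = ['new'] * rows  # ...and replaces the buffer contents
    scanned.append(framebuffer[y])

print(scanned)
# ['old', 'old', 'old', 'old', 'new', 'new', 'new', 'new']
# The boundary between the halves is the visible "tear" line.
```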

I guess there’s also the fact that 1) nVidia’s tech will arguably benefit all games (as long as you have the right monitor…), and 2) AMD’s tech will arguably be unnecessary for high-powered PCs anyway, given the typical performance delta between PCs and consoles (tho it could be very helpful for mid-range or low-end PCs…).

2) is kind of silly; even a high-powered PC could simply get additional room for more AA, enhancements à la iCEnhancer and SweetFX, user mods, supersampling (if that ever becomes easy on AMD) and much more. Not to mention ~4K and/or 3D gaming severely cripples just about any current GPU, and there are plenty of games way too demanding for their own good (justified like Crysis 3 and Arma 3, or not like GTA IV) that bring high-powered PCs to their knees when maxed, even at reasonable resolutions. The point of buying the latest and greatest is to have the best performance and quality; why would an additional X percent of said performance be meaningless? Of course, the majority of even enthusiast gamers don’t have twin-GPU beasts or even a single copy of the most expensive GPU, so they will welcome additional performance gains most of all.

I think the stress is on ‘fixed’. CRTs go really high (and I love them for it), but you still run on a fixed refresh rate. This is the most ideal solution yet devised: your monitor is now literally 1:1 with what the graphics card is outputting.

Nice technology for the people who need it, but with a 120Hz screen I can just leave v-sync on and get neither tearing nor a framerate drop. Haven’t seen tearing since I got my current LCD.

Of course, most games implement triple buffering these days anyway, which means you can have no tearing and very little framerate drop with v-sync on. That does introduce some lag, but no more than everyone using an HDTV is getting (they average around 30–35ms input lag).

Yes, it’s made to take advantage of AMD’s modern architecture, even if it was open (which it isn’t) it would be pointless for others to attempt to utilize it rather than simply make their own API. The whole point is to customize development to this one architecture, like console development is customized to specific models, in order to yield better performance than you get with a generic standard any architecture then interprets like DirectX.

Really excited for what this could potentially become! I’ve been utilizing V-sync for many years to mitigate screen tearing, but always hated the noticeable input lag as well.

To get the optimal feeling from games I currently limit the framerate to 59 with DXtory and force V-sync through Nvidia control panel. This gets rid of both screen tearing and input lag, but some games absolutely hate DXtory injecting itself into them or V-sync in general.

I hope this is as good as it sounds. It all depends on the pricing and display quality though, and I have a suspicion this feature will only be available on 120/144Hz TN monitors. I’ve been thinking about getting a new monitor recently, since the one I’m using is 7 years old now. Now I think I’ll wait and see what’s available with G-sync support before I make a decision.

One, I’m pretty sure that triple buffering solved all the problems with V-sync, so I dunno why we really need this G-sync.

Two, to say that G-sync is going to eliminate stuttering is silly. It’s just that now your monitor will stutter in sync with your framerate, instead of happily continuing along at 60 Hz and repeating some frames.

It’s true that the whole fixed refresh rate thing is pretty archaic and it’s time to get rid of it. And having no tearing and immediate refresh is nifty. But while it’s true that this deserves to be the future of computer monitors, I do think the benefits are almost certainly overstated. (Although I bet there’ll be a lot of competitive gamers jumping on this ASAP, because they just can’t stand that horrible 8ms maximum response time on their 120 Hz monitors.)

Triple buffering mostly solves input lag, but doesn’t solve V-sync’s major problem: the moment the FPS drops below the refresh rate, you end up displaying at the largest whole divisor of the refresh rate below your framerate.

So 55 fps will appear as 30 fps on a 60 Hz monitor with v-sync enabled. This is why Nvidia introduced adaptive v-sync, though that re-introduces the tearing issue. In my experience triple buffering doesn’t solve tearing; I’ll have to recheck.
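That snapping behaviour follows directly from the maths of double-buffered v-sync: a frame that takes longer than one refresh interval has to wait for the next one. A small sketch (assuming the simple no-triple-buffering case described above):

```python
import math

# With double-buffered v-sync, a frame that misses a refresh waits for the
# next one, so the displayed rate snaps down to refresh_hz / n for whole n.
def effective_fps(render_fps, refresh_hz=60):
    frame_time = 1.0 / render_fps
    refresh_interval = 1.0 / refresh_hz
    # whole refresh intervals needed before the frame is ready to show
    intervals = math.ceil(frame_time / refresh_interval)
    return refresh_hz / intervals

print(effective_fps(55))  # 30.0 -- just missing 60 halves the displayed rate
print(effective_fps(61))  # 60.0
print(effective_fps(25))  # 20.0
```

This is exactly why 55 fps displays as 30 fps on a 60 Hz panel, and why a variable-refresh monitor, which has no fixed interval to miss, avoids the cliff.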

G-sync DOES eliminate VISIBLE stuttering, which is the whole point. Since the monitor is merely drawing what the GPU is putting out, at the same rate, there aren’t going to be any half-rendered or dropped frames. But if the stuttering isn’t originating from the GPU side of things (but from the CPU, for example), then I guess it’ll still be visible.

I think Nvidia kept repeating that you’d still want a minimum of 30 fps for it to work well, so I think it does need the fps to be bearable in the first place. I’d guess that the number would be closer to 24 fps for a lot of games.

I suppose this means all those monitors will have to be 120Hz then, since otherwise, for any fps above 60 that your card is able to pull, you won’t be seeing any improvement in experience. 120fps, though, is more than enough to satisfy anyone sensible.

Well, if it works, then I know what I am going to be buying as that second monitor I was planning to buy Soon™

You won’t see an “improvement in experience” for framerates above 60 anyway, with or without G-sync?

Never mind human perception (I’m sure you’re Superman), if you rendered frames more often than the monitor displayed them, you are guaranteed to just throw away frames or parts of frames (the latter introducing tearing).

Eyes and brains don’t work in frames per second. Each individual rod and cone may have a chemical refresh rate of 10 times per second, but clearly we can perceive faster motion than that. In theory, the brain can interpret an fps rate equal to the total number of rods and cones you have, if it needed to, as it can desync their firing; this may explain the “slowdown” effect people experience during moments of tremendous stress. The cost would be colour and 3D vision, and frankly your vision would be of “movement” only, with literally no details – your brain would fill those in later, when you access memories of the event!

Realistically, any human can easily tell the difference between 60 and 120 fps, but smoothness aside, it allows your eyes to double their desyncing and gather twice the amount of information, which leads to fewer headaches, less nausea and better reaction times.

Note that one of the peer-reviewed papers is a study on framerates in FPS games. Their game of choice was Quake III, not some sloppy auto-aiming 360-pad QTE-o-rama, and their participants included a chunk of 20-something gamers. The difference in perceived quality and game performance is already falling off sharply between 30 and 60 FPS.

So then what’s the minimum frame rate at which a video game should be rendered to ensure that it doesn’t suffer from jitter or choppiness? There’s no maximal rendering framerate above which aliasing effects are guaranteed to be eliminated. […] And the relevant question is really whether the choppiness bothers you or not, rather than whether it’s visible. Claypool, Claypool and Damaa (2006) report that performance saturates in a first-person shooter game at about 30 fps (above; notice that confidence intervals overlap between 30 and 60 fps). So, say it loud, say it proud: 30 fps is good enough for me!

What this means is that the minimum average tolerable framerate is 30fps, with some people requiring much more to avoid intolerable choppiness. Increasing the framerate will increase the apparent temporal smoothness, with no upper bound (“There’s no maximal rendering framerate above which aliasing effects are guaranteed to be eliminated.”). Therefore, once you’ve passed your minimum tolerable framerate, likely around 30fps, it’s a matter of how smooth you need/want/like your display to be. It’s fine to be satisfied with 60fps, but it’s disingenuous to imply there’s no perceptible difference between 60fps and 120fps. After all, 120fps monitors aren’t exactly relying on the placebo effect for sales.

Have you used a 120Hz monitor, LionsPhil? Because otherwise you’re not speaking from experience. I own one, and there’s a big jump in smoothness of movement. Just moving the mouse quickly on the desktop, or dragging a window about, you can instantly see the difference, and when a game gets up to 120 fps to match the refresh rate, there’s a liquidity to the movement which isn’t there at 60 Hz.

The actual capabilities of various parts of the eye aren’t really that relevant, because it’s about how many frames the eye has to sample from during that period. The more frames, the smoother things look, and the less likely the eye is to notice that it’s looking at a collection of still frames, rather than fluid movement.

Of course I won’t. If the monitor refresh rate is below the given frame rate, the only perceived difference can be in the controls. And since I am not really an e-sports gamer, I won’t feel that difference. For any visual difference, the monitor’s refresh rate will have to be greater than or equal to the fps pulled by the GPU.

I am not sure what the “Superhuman” thing is about there. It’s common knowledge that 120Hz monitors and framerates feel much smoother and less tiring to any human eye, with the strength of the effect varying from person to person.

G-Sync-enabled displays will work with Nvidia’s Kepler series and be available early next year from the likes of Asus, BenQ, Philips, and ViewSonic.

This is why something like this is not interesting whatsoever. Just like monitors with a 120 Hz vertical refresh rate. If companies that couldn’t produce a proper display if their life depended on it are the only ones making displays with this tech, what’s the point?

Not only do I have a degree related to this, but my work has included advising the TV industry about what equipment they should buy (including things like monitors and displays).

Now, I don’t know if you work for BenQ, or you’re just generally ignorant about display tech, but BenQ isn’t generally regarded as a producer of “very good monitors”. They, like any other producer, have access to good panels, so it’s not like they exclusively make unusable displays, but on the whole they’re considered to be the opposite of NEC and Eizo on the scale of which producers are worth their salt.

OK, this could be pretty interesting, as you could get high-frequency temporal patterns out of a monitor that have actually never happened before, thanks to prime divisors of frequency etc. It also suggests that bullet hell shooters could get even more ridiculous, and that computer stutters could start causing epileptic fits, swerving their way through the frequency spectrum and hitting the frequencies that set people off.