Frame Rating: Visual Effects of Vsync on Gaming Animation

Sleeping Dogs - You pick the winner

On this page we are doing the exact same comparisons, but leaving the guesswork on which side is which to YOU. I want you all to make your own decisions, which is better, which is worse, which is more smooth versus which is just a higher frame rate SOMETIMES.

I would recommend that you actually download the video files below each YouTube embed for the best experience.

We want to know your guesses before you answer! Just click the link below to open up the poll in a new tab/window!!

My sentiments exactly! Obviously, even with no direct comparison between the two giant GPU manufacturers, there's no doubt in my mind that Adaptive VSync really "brings the best of both worlds" in terms of smooth transitions between various frame times while under the 60 FPS barrier. Even at a constant 60 FPS I could live with the extra 16 ms of input latency, even on more demanding games in this respect (like maybe Dirt 3), provided I get state-of-the-art image quality.
Also, bear in mind that when I turn on VSync in the Crysis 3 menu on my GTX 690, it is for sure Adaptive VSync that's at work, and not standard VSync. That is with default settings in the Nvidia Control Panel! So no worries there on the Nvidia side...
Thank you, Ryan, for your dedication and professionalism in bringing us all these really fine reviews! You guys are doing a great job there, and most impressively, it's out of real passion.
Keep it up! Sky is the limit! :-)

It took me at most 5 seconds in every case to identify those; the difference was immediately obvious, and I did not need the 50% or the 20% slowdown to be able to tell. You could have done that test with considerably smaller files.

I would like to propose adding a canned benchmark to the mix. They might not be ideal for real-world benchmarking between cards, but I think they should do fine for judging smoothness. Either in split copy or half-and-half, or both =).

My picks for canned benchmark:

-Batman: Arkham City (this benchmark has a good amount of panning, especially that "hiccup" at the beginning when panning up the stairs, and a bunch of fast-shooting objects, although done by PhysX).

-AVP (lots of up close and large moving "beings")

-Lost Planet 2 Benchmark (the one with 3rd-person in-game play, and closest to what the previous poster Bryan Watson suggested)

-Catzilla (not a real game, but it uses newer capabilities (OpenGL 4.0 and DirectX 9/11), has a faster pace as opposed to Heaven, and is just plain cooler)

-there are probably better ones, but these are the ones I've played around with on my 680.

Other thoughts... The 60 FPS vs. Vsync difference is barely noticeable, but a possible next level of testing is a mouse/keyboard lag comparison (you mentioned it somewhere, but I can't remember where), which in my opinion is the icing on the cake. (Maybe that could lead to debunking "gaming" keyboards and mice vs. PS/2, standard, or even wireless ones! Very disruptive info imho.)

This time I'm a bit confused by your results. Everybody knows that enabling standard vsync in a game where the GPU can't sustain an FPS above 60 is a bad idea, because even if the FPS drops to just 59, everything is rendered at 30 FPS until the GPU can get back to 60 or above.
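If I understand the mechanics right, that drop falls straight out of the refresh math: with standard v-sync, a finished frame is held until the next refresh boundary, so its effective display time is the render time rounded up to a whole number of refreshes. A minimal sketch (my own illustration, not from the article):

```python
import math

REFRESH_MS = 1000 / 60  # ~16.67 ms per refresh on a 60 Hz display

def displayed_frame_time(render_ms: float) -> float:
    """Round a frame's render time up to the next refresh boundary."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

# A frame rendered in 17 ms (just missing 60 FPS) is held for two refreshes:
print(round(displayed_frame_time(16.0), 2))  # 16.67 -> effectively 60 FPS
print(round(displayed_frame_time(17.0), 2))  # 33.33 -> effectively 30 FPS
```

So a GPU that is only slightly too slow for 60 FPS gets quantized all the way down to 30 FPS, exactly as described above.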

The real question is: what about the case where a Crossfire configuration is capable of running a game without ever going below 60 FPS, and then, when you enable vsync, the game runs at a constant 60 FPS with no drops to 30 FPS at all? Does that solve the runt frames and stuttering issue?

"And of course, enabling Vsync in a situation where the frame rate does NOT dip below 60 FPS, as we did with our reference videos used in this article, will not result in animation inconsistencies."

So why not use Crossfire 7970s as well? The 7970 on its own doesn't have tearing issues with v-sync off.

You know what? I too think Ryan's blood runs Green, and to be perfectly honest I wouldn't mind if mine ran the same color, just as long as what's being shown comes from an editorial stance and a dedication to showing us ALL how current technologies impact various gaming experiences.

Even if he hasn't stated it clearly (and it is said from the beginning of this article that we're not comparing the two GPU giants), it is my understanding that any GPU configuration powerful enough to consistently run over 60 FPS will have no problem delivering a smooth gaming experience at 60 Hz/60 FPS. By all means, that's good news for me and my GTX 690, and it'd be the same if I owned a pair of 7970s (though for me personally, no thanks - due to greater power consumption, extra space requirements, and worse noise and temperature levels inside my CM690 II Advanced case).

I think what Ryan means is to show us ALL (and THAT means not only me or others with an ARES II in their cases) just what happens when you have a GTX670, GTX680, HD7970, or other SLI/CF GPU configurations which don't always stay above the 60 FPS barrier, and what your choices and tradeoffs would be with respect to VSync, image quality, and gaming experience.

As for his sponsors, I think that's entirely his problem and doesn't concern us, just as long as what we read, see and hear is pertinent and objectively explained on specific topics. As for what everyone understands, that's another story...

PS: Just to be clear, not taking into consideration the high-end/highly expensive GTX690 and Titan (especially this one!), I think that hardware implemented Adaptive VSync is an obvious step forward from the Green Team, and that is especially valid for those of us who don't necessarily get to buy extreme graphics cards. I can't wait for AMD to pick up the pace and give Nvidia some rough times, but for now please don't mind if "my blood runs green" too!

Thanks again, Ryan - for your dedication and professionalism!

Peace to all! May it be that we fight only on game servers and really nice forums like PC-PER.

Good comparison, but certain sections distracted me from the left-vs-right comparison, like when one scene has motion the other doesn't (a car or motorbike, say). For example, the motorcycle at the end on the left distracted me from viewing the right.

The first video was obvious at 50%.

In the second video it was not so clear to me which was better, even at 50%; by 20%, maybe a little, yes.

In the third video it was impossible for me to tell the difference between the two.

After watching the first and third videos:
In 30 FPS vs. real world V-sync--at 100% and 50% speeds--I had a hard decision to make. However, at 20% speed, the real world V-sync pulsated (at first, I described it as a heartbeat). After rewinding a couple minutes later, I could identify this quality at the higher speeds.

So after seeing the slowed-down footage, I'm on board with saying a consistent 30 FPS is better. I liked that PC Per got the real-world experience right, meaning it closely mirrors my own computer experiences. As a new builder, it is interesting to see what's gained purely from consistent frame rates (as opposed to only higher frame rates). Other than V-sync, what needs to happen between software and hardware devs to overcome the "heartbeat" users have accepted as the best that can be done (well, until retroactively running a game or app way down the road)? I'd be interested in seeing that discussion.

Great work! Keep on observing.

Intel i7-3770K
2x GTX 670s
1920x1080 (only), 60 Hz, Adaptive V-Sync

- I think an improvement for future videos of this kind would be to use built-in benchmarks to have identical split screens (as performance rating is not the focus).

Great stuff. It seems to me this article puts the many forums' assemblies of "known facts" on a much more scientific base. I guess in a sense everybody knew that when experiencing screen tearing, one of the things to try was Vsync - for some it worked, for some it didn't, and we guessed why; now we can say we know.

Vsync can still be a good solution. It could be even better if, for example, we had manual control over it. From this article I conclude the worst-case scenario for Vsync is a game whose output is between 50-70 FPS - Vsync makes the animation just jump between 33 and 16 ms; it would be nice to cap manually at 33 ms and be done with it.
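A manual cap like that can be approximated in the render loop by padding each iteration out to a fixed time budget. A rough sketch (my own illustration; `render` is a placeholder for the game's update-and-draw step, not anything from the article):

```python
import time

CAP_MS = 1000 / 30  # lock every frame to a ~33.3 ms budget (30 FPS)

def run_capped_frames(n_frames: int, render) -> list:
    """Run n_frames iterations, sleeping each one out to the CAP_MS budget."""
    frame_times = []
    for _ in range(n_frames):
        start = time.perf_counter()
        render()  # the game's update + draw work goes here
        elapsed_ms = (time.perf_counter() - start) * 1000
        if elapsed_ms < CAP_MS:
            time.sleep((CAP_MS - elapsed_ms) / 1000)  # pad out the remainder
        frame_times.append((time.perf_counter() - start) * 1000)
    return frame_times

# With a fast dummy render, every frame lands near 33 ms instead of
# oscillating between 16.7 and 33.3 ms:
times = run_capped_frames(5, render=lambda: None)
print(all(t >= CAP_MS - 1 for t in times))  # True
```

In practice `time.sleep` is too coarse for precise frame pacing (real limiters spin-wait for the last millisecond or two), but it shows the idea of trading peak frame rate for a constant frame time.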

Not shocked by the results in the least. Despite the limitations of the human eye, the jump from 60FPS to 30FPS is quite noticeable, and most people would agree the constant 30FPS is "better" to watch.

This is all very confusing (and interesting), can you tell me in your opinion how I should setup if I had the following cards (enable / disable V-Sync or Adaptive V-Sync or what) and cannot stand frame tearing but want maximum smoothness?

I'm pretty sure adaptive v-sync is your option, unless the limited tearing you get below your refresh rate still bothers you. At that point, v-sync plus lowering your settings is your only option. If you want to throw hardware at it, getting a 120 Hz monitor helps, as it adds a 25 ms frame time step between the 16.7 and 33.3 ms you get with a 60 Hz monitor.
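That extra step exists because, with v-sync, frames can only appear on refresh boundaries, so the possible displayed frame times are integer multiples of the refresh interval. A quick sketch (my own, for illustration):

```python
# The displayed frame times v-sync allows are integer multiples of the
# refresh interval, so a 120 Hz panel adds in-between steps (8.3 and 25 ms)
# that a 60 Hz panel cannot show.
def vsync_steps(refresh_hz: int, n: int = 4) -> list:
    """First n possible v-synced frame times (ms) for a given refresh rate."""
    return [round(1000 / refresh_hz * k, 1) for k in range(1, n + 1)]

print(vsync_steps(60))   # [16.7, 33.3, 50.0, 66.7]
print(vsync_steps(120))  # [8.3, 16.7, 25.0, 33.3]
```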

Triple buffering helps prevent you getting stuck at a solid 30 FPS when you cannot maintain 60 FPS, but when v-sync is on, and you are getting 45 FPS, you still get displayed times of 16.7 ms and 33.3 ms between frames. Since you cannot update more than one frame per refresh, the system will alternate between waiting one refresh and two in order to maintain 45 FPS.

Triple buffering just allows your GPU to continue rendering a new frame, on another buffer, while the next frame is waiting for the frame buffer to be writable.
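The 45 FPS case above can be made concrete with a small simulation (my own sketch, not from the article): a GPU that finishes every frame in a steady 22.2 ms on a 60 Hz display with v-sync. Over every three frames, two wait one refresh and one waits two, which averages out to 45 FPS:

```python
import math

REFRESH_MS = 1000 / 60  # ~16.67 ms per refresh on a 60 Hz display
RENDER_MS = 1000 / 45   # a GPU producing a steady 45 FPS (~22.2 ms per frame)

def simulate(n_frames: int) -> list:
    """Return the displayed frame times (ms) under v-sync."""
    shown, ready, times = 0.0, 0.0, []
    for _ in range(n_frames):
        ready += RENDER_MS  # when the next frame finishes rendering
        # the frame can only appear on the first refresh at or after 'ready'
        flip = math.ceil(ready / REFRESH_MS) * REFRESH_MS
        times.append(round(flip - shown, 1))
        shown = flip
    return times

print(simulate(6))  # [33.3, 16.7, 16.7, 33.3, 16.7, 16.7]
```

Even though the GPU delivers perfectly even 22.2 ms frames, the display shows an uneven mix of 16.7 ms and 33.3 ms frames, which is exactly the pacing judder the article's videos demonstrate.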

I'm looking forward to seeing more about using frame rating tools to empirically calculate input latency; I believe this deserves as much attention as frame delivery smoothness, since it can be equally detrimental to immersive gameplay to have slow, "soupy" controls simply because a developer can't be arsed to minimize the 3-5 frames of delay their poorly coded engine introduces!

Ryan, one very good input lag test would be simracing.
There is a Brazilian guy, an iRacing world champion called Hugo Luis, who I'm sure would be very pleased to tell you how vsync types and triple buffering affect input lag.
Or Greg Huttu, another champion.

When nVidia tried to market 3D Vision Surround years ago with the 400 series, simracers immediately detected that "something was wrong".

Unnatural changes in motion are the concern. Therefore, while examining "Battlefield 3 - 30 FPS vs Standard Vsync Comparison," the right side of the screen, with standard vertical synchronization, was smooth, while the left side of the screen, with a static 30 frames per second (Adaptive Vsync), was less smooth.

This is empirically evident when paying attention to the wall between 2 min. 5 sec. and 2 min. 11 sec., or to the yellow dumpster between 3 min. 15 sec. and 3 min. 21 sec.; jumpy motion is easily seen at 20% speed on the left side of the screen, with the right side demonstrating much smoother motion.

With "Sleeping Dogs - 30 FPS vs Standard Vsync Comparison," the left side of the screen is smoother. Although both sides are irregular, the right side is more so, with sharp regular jumps (which I attribute to Adaptive Vsync) being much more disturbing.

Please, examine this from 1 min. 50 seconds through 1 min. 54 seconds at 20%. Your thoughts about my conclusion would be appreciated.

I could tell within a few seconds for the first two videos (and then double-checked at 20%, just to be safe), but for the third video I am completely unable to tell which one is which. I can almost-sorta-kinda see it at 50%, and it is obvious at 20%.

My fix for all these issues: get a 120 Hz monitor, set graphical settings high enough that you don't go above the refresh rate and experience tearing (even downclock the GPU if the game isn't demanding enough to keep you under 120 Hz), and enjoy a nice smooth 60-80 FPS without screen tearing, additional input lag, or frame rate switching issues. Job done.

Ryan, thanks for doing this work.
I'm not sure you will believe me, but when I have vsync disabled, I can tell with my eyes when my games are hitting exactly 60 FPS, plus or minus 3 or 4. I can do it every time, all the time, and I use Fraps to check. The motion "clicks" into smoothness, often for only a second or two, but long enough for me to glance at my Fraps readout and see that the number is between 57-63 every time. Is this because the tear lines are showing up at the very top or bottom of the screen (where I'm not looking)?

Please remove the 50% and 20% from the blind test. The whole point is to see if you can see a difference between 60 and 30 FPS, so slowing it down changes the scenario completely. It just makes the file size larger for no benefit. The constant 60 FPS was best, the 30 FPS was worst and the v-sync was in between. But who runs v-sync if you cannot maintain >60 FPS anyway?

Just fuck off, pcper, and end the witch hunt against AMD. You don't even know how to use a fucking computer. You can create frame latency issues in BF3 with Nvidia or AMD cards by not knowing how to use Vsync - or, if you know what you are doing, you can reduce frame latency issues with it. If you fucking morons don't know how, just stop writing these articles, seriously. The AMD witch hunt is bullshit and needs to end.

I can assure you, your frame rating results for BF3 are wrong and you don't know how to use Vsync.

This was a subjective test. It is to show you what different refresh rates look like with v-sync on. Do you notice stuttering when you are not maintaining a constant FPS? Everyone would likely agree that solid 60 FPS with v-sync on looks the best, but do you find solid 30 FPS with v-sync to be better than 40-50 FPS with v-sync? Some people do, some people don't, but it does show the weakness of v-sync regardless.

So, I think something's wrong with the BF3 videos, or at least BF3-60v30.mp4 that I downloaded.

The phenomenon I'm referring to is easily seen between 2:03 and 2:10 of said video, when looking at the tile around the door that you're turning into on the left side and the rug on the floor in the kitchen.

On the left side of the video, from the 60 FPS test, both the tile and the rug stay crystal clear with every frame. On the right side, the 30 FPS part, there clearly is some form of motion blur applied. The tile, the rug, in fact everything but the gun is blurred while the player is turning left.

Just pause the video at 2:08~2:09 and have a direct comparison - clear tile and textures on the left, motion-blur on the right:

The download site Mega has changed its terms of service. Two days ago, I was able to download the Battlefield 3 comparison video files. Today, the download site is claiming this:

"Please update your browser. Warning: You are using an outdated browser that is not supported by MEGA. Please update your browser and make sure that you keep the default settings."

This is completely untrue since I have done nothing in the last two days in regards to my browser which is IE 10. Mega only gives me the option of downloading and installing Google chrome. I will not be forced to install that browser just to see the Sleeping Dogs comparison videos. Please inform Mega that this is completely unacceptable behavior to viewers of PC Perspective. Thank you.

Good article, but what I miss is a test of a game with v-sync enabled and triple buffering built into the game. World of Warcraft is a good example of this. It runs smoothly without tearing even with v-sync enabled, thanks to triple buffering.

Most games do not have a way to force triple buffering, but it is a pretty safe bet that if a game runs in the 40-50 FPS range with v-sync on, triple buffering is being used.

Triple buffering does not fix the problem v-sync causes, in which some frames take 16.7 ms to display and others take 33.3 ms. It can't. It is not possible. Triple buffering makes it possible for frames to be rendered while the current frame is waiting to be sent to the frame buffer. Without triple buffering, you end up forcing the GPU to wait until the previous frame is displayed before it can create a new frame, resulting in a constant 30 FPS, even if 50 is possible.

The comparison you want is the constant 30 FPS example compared to normal v-sync on the 7970. The constant 30 FPS with v-sync is like not having triple buffering, and the normal v-sync on the 7970 is like having triple buffering, because it does use it.
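As a rough sanity check of that difference, here is a simplified simulation (my own sketch; it assumes a fixed 20 ms render time and that the render rate stays below the refresh rate, so the spare buffer never fills):

```python
import math

REFRESH_MS = 1000 / 60  # 60 Hz display
RENDER_MS = 20.0        # the GPU could do 50 FPS on its own
SIM_MS = 1000.0         # simulate one second

def fps(triple_buffered: bool) -> int:
    """Frames produced in one second under v-sync, with/without a spare buffer."""
    t, frames = 0.0, 0
    while t < SIM_MS:
        done = t + RENDER_MS  # when the current frame finishes rendering
        if triple_buffered:
            t = done  # GPU starts the next frame immediately (spare buffer)
        else:
            # double buffering: GPU must wait for the flip on a refresh boundary
            t = math.ceil(done / REFRESH_MS) * REFRESH_MS
        frames += 1
    return frames

print(fps(triple_buffered=False))  # 30 (every frame costs two refreshes)
print(fps(triple_buffered=True))   # 50 (render time becomes the limiter)
```

The average frame rate recovers with triple buffering, but as the comment above explains, each individual frame is still displayed for either 16.7 or 33.3 ms.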

Assuming it's 33 ms and not 33.33 ms, isn't the best solution to simply cap at a multiple of 16.5 ms? That's what I do and it feels smooth as butter.

16.5x4 - cap at 66
8.25x16 - cap at 132

Which gives us an obvious multiple for each step. If you wanted to play at 30 FPS you'd cap at 33; to play at double that frame rate, you'd cap at 66; on a 120 Hz monitor you'd cap at 132; and so on.

The game engine would be sending you frames at a steady frame-by-frame pace. It seems to improve hit detection and all that other crap too.
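For what it's worth, one way to enumerate the caps whose frame times cover a whole number of refreshes (my own sketch, assuming a true 16.67 ms refresh interval rather than the 16.5 ms figure used above) is to take the integer divisors of the refresh rate:

```python
# FPS caps whose frame time is an exact integer multiple of the display's
# refresh interval, so every frame covers a whole number of refreshes.
def even_caps(refresh_hz: int, max_multiple: int = 4) -> list:
    return [refresh_hz // m
            for m in range(1, max_multiple + 1)
            if refresh_hz % m == 0]

print(even_caps(60))   # [60, 30, 20, 15]
print(even_caps(120))  # [120, 60, 40, 30]
```

Under that assumption a 120 Hz monitor does pick up a clean 40 FPS step that a 60 Hz monitor lacks, which matches the general idea of capping to an even multiple of the refresh.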