Frame Rating Part 2: Finding and Defining Stutter

Another update

In our previous article and video, I introduced you to our upcoming testing methodology for evaluating graphics cards based not only on frame rates but also on frame smoothness and how efficiently those frames are delivered. I showed off some of the new hardware we are using for this process and detailed how direct capture of graphics card output allows us to find interesting frame and animation anomalies using still frames in Photoshop.

Today we are taking that a step further, looking at a couple of captured videos that demonstrate "stutter" and walking you through, frame by frame, how we can detect, visualize, and even start to measure it.

This video takes a couple of examples of stutter in games (DiRT 3 and Dishonored, to be exact) and shows what they look like in real time, at 25% speed, and finally in a much more detailed frame-by-frame analysis.

Video Loading...

Obviously these are just a couple of instances of what stutter looks like, and there are often subtler in-game stutters that are even harder to see in video playback. Not to worry: this capture method is capable of seeing those issues as well, and we plan on diving into that "micro" level shortly.

We aren't going to start talking about whose card and which driver are being used yet, and I know that there are still a lot of questions to be answered on this topic. You will be hearing more from us quite soon, and I thank you all for your comments, critiques and support.

Let me know below what you thought of this video and any questions that you might have.

Yeah, total thumbs up on the video and on taking the time to point out the frames, moving back and forth, and showing the sliver frame (the second/last time you made it CLEAR, so marking that sliver frame helped).

Very good, video not too long (which is good), good explanation and walkthrough, with the green penning being very necessary.

I am so glad you are testing this way.
That still with the 5 frames alone shows why this is a superior method.

What a waste of power.

Will you have some way of pointing this kind of thing out for regular benchmarks?
Would it be possible to note how many times the game stutters, and for how long? Maybe how many times scenes are rendered with unnecessary frames?
Or would that be a pain?
I really want to know how many useless frames my computer is wasting cycles on that don't enhance my experience.

An easy way to show this is to have FRAPS record all the individual frame times instead of just recording the FPS each second. That way you get a visual picture of the drops, like in HardOCP's reviews. The green bar is the target 60 FPS line and the red bar is the least-acceptable 30 FPS line. If we experience any major drops like those seen between 281 and 301, then you have stuttering. Now, HardOCP isn't recording all the individual frames here, but fortunately it's very easy to enable in FRAPS and use.
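The commenter's idea can be sketched in a few lines: given a list of per-frame render times in milliseconds, flag every frame that falls below a chosen FPS floor. This is a minimal illustration with invented sample data, not FRAPS's actual log format.

```python
# Flag frames whose render time crosses a chosen FPS floor.
# The sample frame times below are invented for illustration.

def find_drops(frame_times_ms, floor_fps=30.0):
    """Return (index, ms) pairs for frames slower than the floor."""
    ceiling_ms = 1000.0 / floor_fps  # 33.3 ms corresponds to 30 FPS
    return [(i, t) for i, t in enumerate(frame_times_ms) if t > ceiling_ms]

frame_times = [16.7, 16.6, 16.8, 48.2, 16.7, 70.1, 16.5]
print(find_drops(frame_times))  # [(3, 48.2), (5, 70.1)]
```

A per-second FPS average would hide those two long frames entirely, which is exactly the point being made above.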

These graphs and results don't identify where and when the stutters occur. There could be 80 frames in a given period, which would sound great, yet they could be displayed in a stuttered pattern.
The reverse reasoning applies to dropped frames.

Nilbog, yes, we are working on things to do exactly what you are asking: how many stutters, how BAD the stutters are, etc. Runts = frames that are wasted and shouldn't be counted as part of a frame rate measurement.
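A rough sketch of the kind of summary hinted at here: count stutters (frames much longer than the average) and set aside runts (frames too short to contribute visible animation). Both thresholds below are illustrative assumptions, not the article's actual criteria.

```python
# Hypothetical stutter/runt summary over a list of frame times (ms).
# The 2 ms runt cutoff and 2x-average stutter ratio are assumptions.

def summarize(frame_times_ms, stutter_ratio=2.0, runt_ms=2.0):
    runts = [t for t in frame_times_ms if t < runt_ms]       # wasted frames
    useful = [t for t in frame_times_ms if t >= runt_ms]     # frames that count
    avg = sum(useful) / len(useful)
    stutters = [t for t in useful if t > stutter_ratio * avg]
    return {
        "runts": len(runts),
        "stutters": len(stutters),
        "worst_ms": max(useful),
        # useful frames per second of total elapsed time
        "effective_fps": 1000.0 * len(useful) / sum(frame_times_ms),
    }

print(summarize([16.7, 16.6, 0.5, 16.8, 90.0, 16.7]))
```

The "effective_fps" figure excludes runts from the frame count while keeping them in the elapsed time, which is one way a runt-aware frame rate could be lower than the raw FPS number.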

Well, so much for the long-standing argument that the human eye can't see more than 30 FPS... lol

What we have here is a fake "5 FPS" on one screen, cut into mega sliver jags, so the eyeball equivalent is 6 FPS (on the worst stutter frame shown).

So it certainly appears that the people claiming the difference between 30 and 60 FPS is very noticeable have a lot of facts to back themselves up with here. Of course, it would mean they don't have "supervision" or a "superfast brain", just some lousy video card output... LOL

Yes, exactly where things should wind up, a reasonable explanation... yet so funny...

Great video!
I love this type of explanatory and comparative research work!

The frame stutter you note: I wonder if this is caused by how the 3D engine generates all objects based on the camera view's perspective.
What I mean is, this project may be tapping into the relationship between the game engine, the GPU, and the drivers. Generally, resolving this may require the whole math to be rethought. It reminds me of how 3D image rendering works.

I do agree that fewer of these cuts makes the experience more enjoyable and closer to realism.

Do you have data on how long each frame takes to render? The previous anonymous commenter's link to the FPS histogram was interesting, but it might be more interesting to see a histogram of frame rendering times over the course of the benchmark rather than averages. I think stutter would show up as "tailouts" of long rendering times next to an otherwise tight grouping of times.
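The histogram idea is easy to sketch: bucket per-frame render times into bins so the tail of long frames stands out next to the tight grouping around the target frame time. The sample data here is invented for illustration.

```python
# Bucket frame times (ms) into fixed-width bins and print a text
# histogram. The bucket width and sample data are illustrative.

from collections import Counter

def frame_time_buckets(frame_times_ms, bucket_ms=5):
    """Map each frame to its bin start (ms) and count frames per bin."""
    return Counter((int(t) // bucket_ms) * bucket_ms for t in frame_times_ms)

times = [16.1, 16.4, 16.9, 17.2, 16.5, 33.8, 67.5]
buckets = frame_time_buckets(times)
for start in sorted(buckets):
    print(f"{start:3d}-{start + 5:3d} ms | {'#' * buckets[start]}")
```

In this made-up run, five frames cluster in the 15-20 ms bin while the two stutter frames show up as isolated marks far down the axis, which is the "tailout" the comment describes.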

Single stutters aren't that big of a deal, really. They happen, the causes can be multiple (and usually I/O related), and they're annoying, but they don't excessively alter the experience as long as they're not frequent.
The matter at hand, in my opinion, is smooth frame delivery at a micro level: framerates that don't equate to the same level of animation smoothness, the very thing that has always been so very apparent with multi-gpu setups and that to a lesser extent also exists on single cards. That's something that can substantially alter the experience and that needs to be focused on.
I'll never be thankful enough to Scott Wasson at tech report for finally putting a magnifying glass on this long standing issue.

That's to say:
Ryan, you have equipment that allows you to delve deeper into it and with the scientific objectivity that should convince everybody;
you already mentioned that you will take it to the micro level; I just hope that's where you will focus your work, instead of on normal stutters, which are already clearly visible to everyone without slow-motion videos.

So I guess micro stutter and tearing are things that come with being on PC. When I was strictly a console gamer, I was not very aware of these issues. Now that I am strictly a PC gamer, I have become more sensitive to them.

I have noticed stutter and tearing on console systems as well. Yes, even on Nintendo systems, although the low resolution makes it less noticeable, if not negligible. At the time, there wasn't anything better.

However, I believe no one complained about them because you could not apply any upgrades, as is possible with a PC.

This is another reason why I prefer PC.
You can tune it without a corporate organization telling you otherwise.


Micro stutter is especially a problem with CF; I wonder if you're going to do some work on that too?

Also, heavily loaded Eyefinity could be interesting to test. Of course you can't capture all the screens, but you can still capture one monitor, and from what I can see that would be enough for testing.

I also had tons of problems in the beginning, and was on the verge of selling my second 5870, when I read on the WSGF about a guy who got a third card, and it reduced the problem for him to within tolerable limits.

I then got a third 2GB Matrix, thinking that if it didn't fix my problem I'd just send it back and sell my second card. But yes, for me the problem was also reduced significantly. Then, six months ago, I got a cheap fourth 5870 Matrix to bridge me over until there are 8970s or 780s with more than 4GB of memory (as I want to use a 6000x1920 setup).

And the fourth card reduced the micro stutters even a little more; the general theory was that every card got more time to prepare for its next frame.

Now I hear that the driver for GCN 7xx0 cards is badly optimized for preventing micro stutter compared to my VLIW 5870 cards, but new optimized anti-micro-stutter drivers are in the making.
(So I'm actually glad I waited an extra generation. ;-)

Now I am wondering, once you have your testbed ready for prime time, whether you're going to do some single, CF/SLI, Tri-CF/SLI, and Quad-CF/SLI testing?

And I think it would actually be interesting to see the 5870 vs. 6970 vs. 7970 vs. 480 vs. 580 vs. 680, up to quad setups, to see if different generations act differently, though it's probably a bit much to test.

Also, what I think would make the frame-time graph easier to read is arranging the frame-time data in a graph that doesn't go up and down, but instead starts with the longest frame time, then the second longest, and so on.

Also, scale each frame in the graph by its frame time, so that a 150ms frame is as long on the graph as ten 15ms frames.

So in the example image/graph below, Graph A and Graph B show the same data; only A displays the data as it occurs, whereas in B the data is ordered neatly from long to short.
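This suggestion can be sketched as text output: sort the frame times from longest to shortest and draw each frame as a bar whose width is proportional to how long it was on screen, so one 150 ms frame occupies as much horizontal space as ten 15 ms frames. The data is invented for illustration.

```python
# Sorted frame-time bars, longest first, width proportional to the
# frame's duration (one '=' per 5 ms). Sample data is illustrative.

def ordered_frame_bars(frame_times_ms, ms_per_char=5.0):
    """Return one bar string per frame, ordered long to short."""
    lines = []
    for t in sorted(frame_times_ms, reverse=True):
        width = max(1, round(t / ms_per_char))
        lines.append(f"{t:6.1f} ms |{'=' * width}")
    return lines

print("\n".join(ordered_frame_bars([16.7, 150.0, 16.6, 15.9, 16.8])))
```

Ordering by duration like this is essentially a percentile view: the left edge of the chart immediately shows the worst frames, while a perfectly smooth run would produce bars of identical width.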

There is one game genre whose developers have been dealing with this matter for a long time (due to the nature of the genre's gameplay): racing sims.
For stutter, the game simply does not display everything it can "spit out". But this causes input lag.
For input lag, one common tweak is to decouple (up to a point) the video from the controller input and from the physics simulation.

rFactor (2005) config:

Render Once Per VSync="0" // Attempts to render once per vsync; 1 = use timer, no wait; 2 = use vblank, no wait; 3 = use vblank, wait
Max Framerate="100.00000" // 0 to disable (note: positive numbers only, we always use the 'alternate' method now)
Steady Framerate Thresh="0.00000" // Allowed threshold in seconds to try to 'catch up' when falling behind using Max Framerate (use 0 for original behavior). This helps steady the framerate but may introduce more latency.
Flush Previous Frame="0" // Make sure command queue from previous frame is finished (may help prevent stuttering)
Synchronize Frame="0" // Extrapolate graphics using estimated render time in attempt to more accurately synchronize physics with graphics, 0.0 (off) - 1.0 (full)
Delay Video Swap="0" // Whether to delay video swap if card is busy - this should only be used if framerate clearly improves - otherwise it is only delaying response time
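The decoupling those settings aim at is usually a fixed-timestep loop: physics advances in constant steps while rendering runs at its own pace and interpolates between the two most recent physics states. This is a generic sketch of that pattern, not rFactor's actual implementation; time is simulated here so the example is deterministic, whereas a real loop would read a clock.

```python
# Generic fixed-timestep sketch: physics steps at a constant rate,
# rendering interpolates between physics states. Not rFactor's code.

PHYSICS_DT = 10.0  # fixed physics step, in ms (an assumed value)

def run(frame_times_ms):
    """Simulate a loop; return the interpolation factor for each frame."""
    sim_time = 0.0   # how far the physics simulation has advanced
    clock = 0.0      # simulated wall clock
    alphas = []
    for dt in frame_times_ms:
        clock += dt
        while sim_time + PHYSICS_DT <= clock:  # catch physics up to the clock
            sim_time += PHYSICS_DT             # one fixed physics step
        # fraction of the next physics step already elapsed, 0.0..1.0;
        # the renderer blends the last two states by this amount
        alphas.append(round((clock - sim_time) / PHYSICS_DT, 2))
    return alphas

print(run([16.0, 16.0, 16.0]))  # [0.6, 0.2, 0.8]
```

Because rendering never waits for physics (and vice versa), an occasional long frame costs extra physics catch-up steps rather than stalling the simulation, which is the trade-off the config comments above are describing.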

This highlights one of the great advantages of consoles, in that developers can iron out these problems with absolute certainty prior to the game shipping.

I am a long-time PC gamer, but recently I started playing more PS3 titles simply because the overall experience (as opposed to the gee-whiz tech specs) was superior to that offered by my PC.

I have a high-powered rig: 4.8GHz i7-2600K, overclocked GTX 670, SSD, etc. And it all looks good on paper. But when you start playing a lot of these PC games, they're really glitchy, and I find myself yearning for the ability to just play the game without worrying so much. Throw in all the poorly optimized console ports with blurry textures and we've got a real problem here.

Just curious whether the frame rate limit feature of NVIDIA cards can help with screen tearing issues or make them worse?

I use a GTX 680, and enabling vsync is a pain as the input lag becomes more noticeable with my high-input-lag screen (a multi-input Catleap 27").

I've tried capping the frame rate limiter in the drivers to 59fps, but I still experience screen tearing. I'd love to know how tweaking this can free up GPU resources, lower power use, and also improve frame rates overall.

Since you seem to have some programming experience, could you develop a "benchmark" or "testing tool" that produces the issues you seek to detect (i.e., frame smoothness, frame and animation anomalies, stutter, runts, and others)?

Have the capture computer hold a copy of the "true output" and compare that to what is captured. Via USB you could send a "fault detected" signal to the benchmark computer, and your program could vary parameters in that particular subroutine to fully explore (and detect) those faults.

Sort of like the feedback loop for auto-focus in a camera: it finds the parts of your program that successfully detect faults and makes a report of what works and what fails to detect problems.

Maybe you could then develop theories and ANSWERS as to exactly which problems occur and under what circumstances (along with knowing which video card combinations are better, or worse).

It may turn out to be something simple (like moving two textures at once) or much more complicated (moving more textures than the buffers can hold while also trying to redraw the screen, etc.).
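The comparison step of this proposed feedback loop is straightforward to sketch: check each captured frame against the expected reference frame and report the indices that differ. Frames are represented here as plain byte strings for illustration; a real capture rig would compare pixel buffers, and hashing is one way to make those comparisons cheap to store and transmit.

```python
# Hypothetical fault detector: compare captured frames against a
# reference copy of the "true output" and report mismatch indices.

import hashlib

def find_faults(reference_frames, captured_frames):
    """Return the indices where the captured frame differs from the reference."""
    faults = []
    for i, (ref, cap) in enumerate(zip(reference_frames, captured_frames)):
        # compare digests so only small hashes, not full frames, need storing
        if hashlib.sha256(ref).digest() != hashlib.sha256(cap).digest():
            faults.append(i)
    return faults

ref = [b"frame0", b"frame1", b"frame2"]
cap = [b"frame0", b"frame1-torn", b"frame2"]
print(find_faults(ref, cap))  # [1]
```

The list of fault indices is exactly the "fault detected" signal the comment imagines sending back to the benchmark computer for further exploration.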