
Microstutter in latest-gen cards - examples included

Not 100% sure where to post this, so I'm putting it here because I'd like to see some results from ATI users as well.

Microstutter, for those that don't know, is the irregular output of frames - usually a result of multi-GPU setups operating in AFR mode. Since the human eye judges smoothness by the gaps between frames, rather than by the raw number of frames spat out, a game with irregular frame output will not look as smooth as one with uniform output.

It stands to reason, for example, that a game outputting frames in pairs 0.1ms apart, with a 49.9ms gap until the next pair, would look like it was running at 20fps, even though the framerate counter would read 40fps.
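That arithmetic can be sanity-checked directly. A quick sketch (using the 0.1ms / 49.9ms figures from above; the variable names are mine):

```python
# Frames arrive in pairs: 0.1 ms between the frames of a pair,
# then a 49.9 ms wait until the next pair.
frametimes_ms = [0.1, 49.9] * 20  # roughly one second's worth of frames

# The framerate counter only sees the average frametime.
avg_frametime = sum(frametimes_ms) / len(frametimes_ms)  # ~25 ms
counter_fps = 1000 / avg_frametime                       # ~40 fps reported

# Perceptually, the second frame of each pair is invisible: the eye
# sees one new image roughly every 50 ms.
perceived_fps = 1000 / max(frametimes_ms)                # ~20 fps seen

print(round(counter_fps), round(perceived_fps))  # 40 20
```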

Aaaanyway. A while back I wrote a little program to quantify microstutter from FRAPS benchmarks. You can download it here. Basically it looks at the frame-by-frame variation from the local frametime, as a percentage of the average frametime (more details in the readme). From this you get a "microstutter index", which you can think of as a "percentage microstutter", with 100% being the scenario I described above. It also shows an "effective framerate": a measure of smoothness with microstutter taken into account, i.e. the equivalent framerate if there were no microstutter.
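For anyone curious how such an index might be computed, here's a rough reimplementation based purely on the description above - the window size, the averaging details, and the name `microstutter_index` are my assumptions, so check the tool's readme for the real procedure:

```python
def microstutter_index(frametimes_ms, window=5):
    """Assumed sketch of the metric: for each frame, measure how far it
    deviates from the local average frametime, then express the mean
    deviation as a percentage of the global average frametime."""
    n = len(frametimes_ms)
    global_avg = sum(frametimes_ms) / n
    deviations = []
    for i, ft in enumerate(frametimes_ms):
        lo = max(0, i - window)
        hi = min(n, i + window + 1)
        local_avg = sum(frametimes_ms[lo:hi]) / (hi - lo)
        deviations.append(abs(ft - local_avg))
    return 100.0 * (sum(deviations) / n) / global_avg

# Perfectly regular output -> ~0% microstutter
print(microstutter_index([16.7] * 100))

# The alternating 0.1 ms / 49.9 ms case -> a very high index
print(microstutter_index([0.1, 49.9] * 50))
```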

Okay, now onto some examples. First I'll show you a snapshot of 35 consecutive frametimes I took from the middle of a Crysis benchmark, using my GTX480 SLI setup. Not all of the scene was as bad as this, but it gives you an idea of the problem:

In terms of the microstutter index, the runs that generated the plots above had the following results.

Single GPU (2560*1600, 4xAA):

SLI (2560*1600, 4xAA):

This isn't as bad as with my old 4870x2, but it certainly shows that microstutter is still with us.

Now, it's important to note that as the game becomes less and less GPU limited, the microstutter effect diminishes significantly. For example, the above benchmark had the GPUs running at near-100% load. At 1920*1200, 8xQAA, the GPUs are in the region of 85-95% load for the most part. Here the microstutter index drops significantly:

Okay, well those are my findings. What I'd love to see is some results from ATI users with multi-GPU setups, so I can compare how the two technologies handle this. Please post any results you generate, but please make sure you're really at, or close to, 100% GPU load, otherwise you will see an artificially low microstutter index. Afterburner is a good way to check this.

My hope is that maybe, just maybe, if we can generate more awareness of this problem we can get ATI / nvidia driver teams to pay attention and do something about it. A 5% performance drop (say) would be a small price to pay for regular frame output. Of course, this will not happen until review sites start taking note of microstutter when reviewing cards!

FPS in the benchmark was 116. 1680x1050, maxed, 4xAA. Funnily enough, I found I need to enable vsync to get the smoothness I desire, otherwise I can feel the microstutter. So I would think anything over 15-20% microstutter is quite noticeable, if this program is accurate

You mean less CPU dependent - GPU bound/limited, to be precise.
High resolution, high game details & AA/AF = nearly 100% GPU limited (99.5% of the stress is put on the VGA/s)

Well whatever you choose to call it :p

The point is that as the effective load on the GPUs is reduced (i.e. as their percentage utilisation over the benchmark drops), the effect of microstutter rapidly diminishes. This makes sense to a certain degree: if the GPUs are waiting for information from the CPU, then they will start to sync their output with the output from the CPU. The CPU can always be expected to complete its frame-by-frame workload at regular intervals (or at least at smoothly varying intervals).

In the case I showed above, with an overclocked GTX480 SLI setup, even Crysis at 1920*1200 with 8xAA is not completely GPU bound. The GPU utilisation is around 85-95%, and this helps reduce the microstutter index significantly (compared to the 2560 test, where GPU utilisation is near 100%). I will test even lower resolutions tonight, for a true CPU-bound scenario. I expect very little microstutter then (but who knows!).

My point was that dropping from ~100% GPU load down to even ~90% load makes a big difference to microstutter.

What are you using to measure the VGA/s utilization?
Not the nVperf kit, I assume.
Because at 1920x1200 8xAA the VGA utilization should be 100%, unless your CPU is clocked under 3.8GHz

I'm using Afterburner.

My CPU is an i7-930 clocked at 4.2GHz. I imagine that if I reduce the CPU clock by a few hundred MHz I will see 100% GPU utilisation once again (as I do with the 2560 tests), and with it a higher microstutter index. Another thing I could test...

If you're running SLi, then bumping your maximum pre-rendered frames from the default 3 to 5 can really help to steady your framerate and make games run smoother, especially in demanding games like Crysis.

Mine is always set to 5, and you and I seem to have a very similar rig, ie 30 inch monitor and GTX 480 SLi.

Thanks! I'll give this a try in a couple of hours, when I get back from work. Sounds like it might be a bit of a trade-off between framerate regularity and input lag though. I suppose that rendering more frames ahead will increase the lead time before user actions appear on-screen?

Unfortunately I'm on a 1920*1200 screen right now. My 30"-er died around six weeks ago while under warranty, and I'm trying to get in contact with the manufacturer for a replacement, but they are trying their best to ignore me (it seems to be heading towards legal action). But anyway - this is a different story!

The Crysis benchmark is odd in that it allows you to run at higher resolutions than your monitor is capable of, downscaling to native res. This makes the green in-bench statistics hard to read, but it's good enough for simulating 2560 30" performance (and, more importantly, maxing out the GPUs).

Okay, well I think I've found a candidate for "worst-case scenario" as far as microstutter goes. The Heaven benchmark seems to really make the cards sweat (~100% load), and microstutter like crazy while they're at it.

Running at 1920*1200, with 4xAA, 16xAF, and starting a 60s FRAPS benchmark right as I start the benchmark in Heaven, I get the following:

Normal tessellation, render max 3 frames ahead:

Extreme tessellation, render max 3 frames ahead:

Choosing instead to render 5 frames ahead, I see an improvement, albeit very slight:

Normal tessellation:

Extreme tessellation:

I have to say, as I watched the benchmark unfold I KNEW the value was going to be high. I mean, it seemed pretty smooth overall, but for me 60fps is usually rock-solid (by eye anyway). In this, I saw the framerate counter at ~74-75, and I could still tell that it wasn't perfectly smooth. As objects passed by, they seemed just a little jerky. Looking more closely at the frametimes file, there is some pretty poor behaviour in there. Not quite as bad as the plot I posted above from the 5870 x-fire Vantage run, but still pretty disappointing.

Demonkevy666, I'm not sure exactly how to plot the CPU load... I can take a screengrab of the GPU loads from Afterburner easily enough, but the output from CoreTemp isn't straightforward enough to allow a simple plot in Excel.

It's been a long time since I've run the Heaven benchmark, but I don't recall that much microstutter.

Anyway, benchmarks are just benchmarks. I wouldn't focus on them too much if I were you.

Another thing, use the 259.32 drivers. They are the best drivers I've used so far for my 480s; especially for Crysis.

The 200xx drivers are actually quite immature at this stage, even the official ones. Nvidia is supposed to come out with the 260xx drivers in a few days, so hopefully there'll be some more performance improvements and bug fixes.

Anyway, benchmarks are just benchmarks. I wouldn't focus on them too much if I were you.

I agree... But my main aim in doing this stuff is to create awareness of microstutter as a real "performance detracting" issue, and perhaps compile enough data to take to some review sites, who just might be in a position to press nvidia / ATI to do something to fix it. A pipedream probably, but it kind of annoys me that an issue like this exists.

I'll try out the 259.32 drivers when I get a chance. I can imagine they improve performance, but to be honest I doubt they will affect microstutter, which seems to be a native AFR-related issue.

The key problem that I see with research like this is the amount of resistance you will meet from some members here who are tied to the hardware community - *ESPECIALLY* if it can be shown that on xyz card, SLI or Crossfire yields no benefit once you weigh observed frames against the actual frames-per-second gains from adding another card.

It is quite likely that what you may in the end prove is what a lot of us m-gpu old timers say: that once you reach a certain point, adding more cards detracts from performance rather than helping it - most apparent when running Tri-SLI or Quad-Fire.

It will also be interesting to test the other theory, that microstutter is tied to PCI-E saturation, by comparing the difference between, say, an 8x slot and a 16x slot, etc. This is why you see some say that increasing your PCI-E frequency, HTT or QPI helps to reduce microstutter.

Overall very interesting and what you describe matches my personal experience.

Nice work here. Funny how nvidia's tessellation causes bad microstutter. Did you run it single-card?

For reference, single GPU result with heaven benchmark. Same settings as above (1920*1200, 4xAA, 16xAF, normal tessellation, stats on the first 60 seconds of the benchmark):

More microstutter than I've seen with any other single-GPU result, but still pretty low.

I have to say, unfortunately it isn't just nvidia cards that exhibit this behaviour. The worst microstutter results so far have come from a tri-fire 5970+5870 setup (38-48% MS index).

Sentinal: I agree that it's always hard to get a message across that people don't want to hear. Hell, I don't really want it to be true, but it is, and ignoring it won't improve the situation in the future. My hope is that with enough results gathered on enough different configurations, the evidence will become irrefutable. If I had the time and cash I'd buy a whole load of hardware and do a proper analysis and formal results writeup. But unfortunately I don't, so anyone out there who can run this test is appreciated.

As for PCI-e and QPI frequency, well, I will have a play around with these, but I really don't see how they can have a major effect. The microstutter comes from a lack of control over when frames are output in AFR mode. What is needed to fix it is some form of communication between the two GPUs to time the frame outputs more effectively, or better yet, a rendering method that has all GPUs working on a single frame (SFR, tiling etc).
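To illustrate what "timing the frame outputs" could buy, here's a toy simulation of frame metering - entirely my own sketch, not how any actual driver implements it - that holds each finished frame back until the later of its completion time and an evenly paced slot:

```python
def meter_frames(completion_times_ms):
    """Toy frame metering: present frames at evenly spaced slots derived
    from overall throughput, instead of the instant each GPU finishes.
    A frame can never be shown before it is finished."""
    n = len(completion_times_ms)
    span = completion_times_ms[-1] - completion_times_ms[0]
    pace = span / (n - 1)  # target interval between presented frames
    presented = []
    next_slot = completion_times_ms[0]
    for t in completion_times_ms:
        show = max(t, next_slot)  # wait for the frame AND for its slot
        presented.append(show)
        next_slot = show + pace
    return presented

# AFR-style output: pairs ~0.1 ms apart, ~50 ms between pairs.
completions = [0.0, 0.1, 50.0, 50.1, 100.0, 100.1, 150.0]
print(meter_frames(completions))
# -> [0.0, 25.0, 50.0, 75.0, 100.0, 125.0, 150.0]  (perfectly even)
```

The catch is that holding frames back adds a little display latency - the same sort of small price as the hypothetical 5% performance drop mentioned earlier in the thread.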

@Oztopher: which driver did you use? I am thinking 10.8 makes crossfire more interesting.

Yep, I'm on 10.8. Haven't had a single problem with drivers yet. I always make sure they're uninstalled/installed properly, i.e. express uninstall, reboot, run Driver Sweeper, reboot, delete all ATI folders etc., then run the CCleaner registry cleaner a few times (it always picks up a few ATI reg files, even after running Driver Sweeper), then install the new drivers.

This was with maxed-out settings at 2560x1600 resolution, 4xAA, 16xAF, extreme tessellation etc., CPU at 4GHz, and the 480s at 850/1700/4200.

The stutter index is high, but from what I could discern, the benchmark ran pretty smoothly.

The tool is accurate - it just applies simple statistics to a FRAPS framelog. The exact procedure applied is given in the readme, IIRC.

The reason that it looked smooth is that your framerate is so high anyway. Check the effective framerate: even taking microstutter into account, your "microstutter-free equivalent" framerate is 107fps, which should be pretty smooth. Like I said, the real-world effect of microstutter is to effectively reduce your framerate. If the framerate is sufficiently high, the game will still be smooth.

... But I have to ask - how in the hell are you getting those kinds of framerate at 2560, 4xAA, 16xAF?! Particularly with extreme tessellation. I get less than half that over the benchmark at 1920 resolution, and that seems to tie in with what others are getting...

For reference, single GPU result with heaven benchmark. Same settings as above (1920*1200, 4xAA, 16xAF, normal tessellation, stats on the first 60 seconds of the benchmark):

More microstutter than I've seen with any other single-GPU result, but still pretty low.

Which confirms my speculation. The card is dropping frames for tessellation work, which suggests an optimization. Not a bad thing, as long as the framerate is high and the microstutter is not noticeable.

Although... are you sure your formula isn't buggy? The "Global Average Timeframe" is pretty high in that single-card run. It's probably due to the low fps, but it never hurts to ask

Originally Posted by Oztopher

Yep, I'm on 10.8. Haven't had a single problem with drivers yet. I always make sure they're uninstalled/installed properly, i.e. express uninstall, reboot, run Driver Sweeper, reboot, delete all ATI folders etc., then run the CCleaner registry cleaner a few times (it always picks up a few ATI reg files, even after running Driver Sweeper), then install the new drivers.

I'd be curious to see whether microstutter is more noticeable visually on 120Hz displays. I must say I noticed it quite a bit with my old 4870x2, and to a slightly lesser extent on two 260s, but this was only at 60Hz. Interestingly enough, I personally enjoyed Crysis more on a single 260 than on a 4870x2, and I'm confident the microstuttering was behind this.

As has been said, it's not that noticeable at higher framerates, but in games which are more taxing (i.e. the types of games where multi-GPU setups would be most desirable) it is much more noticeable.