Your "micro-stutter benchmarks" have become more and more famous lately... I didn't have time to read more than 2 1/2 pages, but I'll finish it later as time permits... I wonder if they explain why Nvidia has fewer problems?

Does that make Scott THE stutterer? Or lord of stutter? The stutter king?

Smoothness tyrant?

nVidia video drivers FAIL. Disclaimer: All answers and suggestions are provided by an enthusiastic amateur and are therefore without warranty, either explicit or implicit. Basically, you use my suggestions at your own risk.

It's slowly getting out there. We should dub the 'micro stutter' the Scott Wasson Effect.

Hopefully more and more people will listen, and soon enough AMD and Nvidia will notice and try to do something about it with their driver releases. And I don't mean using tricks or cheats to minimize the effect, but actual fixes.

(\_/) (O.o)(''')(''') Watch out for evil Terra-Tron; He Does not like you!

I only read the first page (I'm eating breakfast and only have time to skim articles...and post on TR) but he did credit TR in a big way. Anandtech is a big site with lots of followers so it's great publicity. Even if all he did was slag FRAPS, Scott still gets great publicity.

Yeah, I know. Perhaps it's my bias for this site, but the whole article felt a bit like a hatchet job.

"TR is doing a great job over there, but... they're doing it wrong. Sure they found some problems, but that really doesn't invalidate that they're doing it wrong. Check out my appeal to authority when I speak to AMD and NVIDIA and they too say they're doing it wrong. Anandtech will one day do this kind of measurement, but not until we find a tool that keeps us from doing it wrong. Trust Anandtech, trust Ryan, don't trust anyone else."

"Welcome back my friends to the show that never ends. We're so glad you could attend. Come inside! Come inside!"

Yeah this was my comment over at Anandtech after finishing up the article...

I love you guys, but this article comes off a bit like sour grapes. The Tech Report dove into this issue head first and admitted from the beginning that the testing methods may not be perfect. They have continued to be clear on this, and you made no mention of the high-speed video tests that they performed. The bottom line is that The Tech Report is primarily responsible for getting AMD to get on the ball with this issue. Regardless of AMD's bag of excuses and their sudden clarity on the best methods for testing, we would not be where we are without the solid work of The Tech Report. I feel that if the FRAPS method of testing was sufficient for bringing these issues to light, then it's a job well done. The situation will only improve from here, and Scott Wasson and company deserve more praise than this sour attempt of an article to discredit the good work they have done. If that was not your intention then I apologize, but it comes off as such.

Simply comes off as Anandtech's attempt to be relevant on this issue...

It's slowly getting out there. We should dub the 'micro stutter' the Scott Wasson Effect.

I thought we don't want to use the term "micro stutter" because it is not to be confused with the SLI/CF-induced slowdowns?

Ditto.

When I think of latency problems in drivers, I think of single-GPU systems (primarily; I'm sure the issue is exacerbated in SLI/CF, though) struggling to maintain consistent frame times. When I think of micro-stuttering, I think of the timing problem of synchronizing the outputs of multiple GPUs.

Yeah, generally stuttering results in frozen frames (time/animation freezes), and the recovery is usually jarring.

Micro-stuttering, if you think in terms of FPS*, basically results in a perceived rate lower than the reported one. The FPS average will be skewed by the short frame times, but the user will perceive the rate tied to the longer ones. VSync will probably lock the rate to the longer frame time, but expect to see consistent interruptions with it off.

*I know FPS is a bad paradigm to use, but it is a good way to explain the perception in this case.
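To put numbers on that, here's a minimal sketch (the alternating 10 ms / 30 ms pattern is hypothetical, not from any real benchmark) of how a micro-stuttering frame-time trace skews the reported FPS average above what the user actually perceives:

```python
# Hypothetical micro-stutter pattern: frame times alternating 10 ms / 30 ms.
frame_times_ms = [10, 30] * 30  # 60 frames

total_s = sum(frame_times_ms) / 1000.0
reported_fps = len(frame_times_ms) / total_s   # average, pulled up by the short frames
perceived_fps = 1000.0 / max(frame_times_ms)   # cadence tied to the longer frame time

print(round(reported_fps, 1))   # 50.0
print(round(perceived_fps, 1))  # 33.3
```

The average reports 50 FPS, but the animation cadence the eye locks onto is closer to 33 FPS, which is why a plain FPS number overstates smoothness here.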

Finally finished the article. It was very in-depth and informative (even beyond Scott's initial explanations of his methodology). A few comments:

I got an underlying "crybaby excuse" vibe from Anandtech drifting through the entire article: "FRAPS isn't the perfect tool for the job of frame interval measurement, so we're not even going to try." Sure, FRAPS may not be the perfect tool, but Scott disclosed this as well. The fact that TR is using FRAPS to do a comparison between cards on the same test bench hardware (the OS was shown not to have a profound effect), while also offering first-hand subjective gameplay observations and some high-speed video capture, is still in the "plus category." Limiting variables and looking at the results from many angles is key.

Scott used simple methods that were available at the time to bring a major problem to light. (In his defense, this is stated briefly a couple of times in the Anandtech article.) Sure, as frame interval and latency issues are improved, the "FRAPS method" of benchmarking may become insufficient. (I seem to remember a TR article a while back saying Nvidia was already doing something to smooth frame times at the end of the pipeline.) That's perfectly fine; Scott never said his method was without limitations. It brought about a major benchmarking change, and further evolution is a welcome outcome.

It was interesting to hear AMD's comment that it takes very little effort to make driver fixes that reduce stuttering, seemingly less than one day's work to fix a given game. Yet here we are, two months after the first driver fix, and still with a relatively short list of "fixed" games. Let's hope this is just a result of them having their eyes pointed at future game releases.

I don't have any problem with the article except that they don't mention that Scott already knows FRAPS is not the be-all, end-all solution to frame latency benchmarking. He says it like some kind of revelation, but we've all known that for a while; TR is just using the best tool they've found so far and sharing the results the best way they know how. It might not be perfect, but it's a lot better than nothing, and the longer Anandtech takes to adopt a similar testing method of their own, the less relevant they are (and I say that acknowledging that AnandTech is firmly my #2 hardware site).

Perception may be 9/10ths of reality, but as a geek I base my decisions on that last 10th, so there's no question about what I'm perceiving in the first place.

It's surely an interesting technical article, but yeah, it comes off to me as 'AMD and NV don't like FRAPS, and it has limitations in how it shows frame times, so we won't use it (and oh, btw, that also means we don't have to update our now-outmoded GPU testing methods yet).' It kind of gets into the fact that without this less-than-ideal testing method the issue wouldn't have gained the prominence it did, but it still seems weird to go into that angle and explain that AMD acknowledged the issue and was able to get notable improvements, yet still dismiss the method as 'not good enough.' There are nuances to that 'not good enough' outlined in the article, but they are overshadowed by the overall tone.

I know Scott did some followup with AMD even though he didn't write a separate article about it, but it's also a little sour grapes-y that Anandtech makes it sound like they are the end-all-be-all of tech sites because they had a discussion with AMD...ooo, look at us, we're special!

The real "problem" with fraps is that most of the other stuff that is "better" doesn't necessarily make analysis easy, if you can compare numbers at all. PC Perspective had some GREAT video and overlay images due to catching the post-processed stuff, but those images don't really work well to analyze and compare cards. It looks like they have graphs now, but I'm assuming that the "non-fraps" graphs were generated through custom software or Excel formulas.

At least with fraps, it's a widely available tool that can easily generate data, and while that data might not capture the full or final picture, the results seem to coincide with the end experience, and latency still affects that final picture.

I am sure that someone will eventually develop a distributable tool to measure "smoothness" factors (going beyond simple times), but until then, we can make the best use of what we have.
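Until such a tool exists, simple "smoothness" summaries can already be computed from any per-frame-time log (e.g. a FRAPS frametimes dump). This is only a rough sketch; the function name, the sample trace, and the 50 ms threshold are my own illustrative choices, loosely modeled on the percentile and "time spent beyond a threshold" style of metrics TR has published:

```python
def smoothness_metrics(frame_times_ms, threshold_ms=50.0):
    """Simple 'smoothness' summaries of a per-frame-time trace (milliseconds)."""
    times = sorted(frame_times_ms)
    # 99th-percentile frame time: how bad the worst ~1% of frames are.
    p99 = times[min(len(times) - 1, int(0.99 * len(times)))]
    # Total time spent past the threshold: how long the user waited on slow frames.
    beyond = sum(t - threshold_ms for t in times if t > threshold_ms)
    return {"p99_ms": p99, "time_beyond_threshold_ms": beyond}

# A toy trace: mostly ~16 ms frames with two visible spikes.
print(smoothness_metrics([16, 17, 16, 60, 16, 18, 55, 16]))
```

Neither number shows up in a plain FPS average, which is exactly why a distinct metric is needed: the two spikes dominate the 99th percentile and the time-beyond figure while barely moving the mean.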

It was a good article and it credited Scott (by name) as finding a problem that AMD ultimately admitted to and (at least partially) fixed.

The other takeaway is that they don't think FRAPS is a good tool for measuring stuttering, but that there really aren't other tools that can do the job either. Nothing new there. They could have done two things better on this front:

1. Reference that Scott said that up front when looking at stuttering.

2. Mention that FRAPS can detect stuttering if the GPU is truly a bottleneck (because it cannot present the frame until the GPU bottleneck is cleared). In this specific instance FRAPS does the job (which is proven by AMD acknowledging this and issuing a driver fix). Stuttering can occur for other reasons (as they mentioned), but comparing multiple cards on the same hardware and doing multiple runs reduces those variances.

The comments were also mostly good and seemed to think it was a good article, with several commenters mentioning that FRAPS is the best tool we have for now, and until GPU manufacturers or someone else comes up with a better tool, it is better than nothing. There was only one comment I saw that was out of line, and it was so poorly worded that I stopped reading it.

It's surely an interesting technical article, but yeah, it comes off to me as 'AMD and NV don't like FRAPS, and it has limitations in how it shows frame times, so we won't use it (and oh, btw, that also means we don't have to update our now-outmoded GPU testing methods yet).'

LOL, yeap, that's what it sounded like... Anyway, I do agree with their statement about the practicality of such results: "If the user cannot see stuttering then stuttering should no longer be an issue, even if we can measure some small degree of stuttering still occurring. Like input lag, framerates, and other aspects of rendering, there is going to be a point where stuttering can become “good enough” for most users", even if they said this as part of an excuse for being "late to the party".

My subscription allows you people to exist on this site and makes me a better human being than you'll ever be

That AMD wasn't already looking for this is downright silly, and frankly hard to imagine. We've known for so long that no game processes each frame at the same speed (except maybe Rage), and that trying to reduce that stutter usually comes down to making sure the whole system can spit out acceptable minimum (or median low) framerates. That framerate was the 'wrong' metric for measuring smoothness was written on the wall, but it was all we had, and we're grateful that Scott has pushed this to the forefront. Even Anandtech wouldn't have come out with this without his work, and I believe Ryan is making excuses despite trying to sound like he isn't.

Hell, even HardOCP had a pretty good subjective handle on this stuff, and their work is valuable here too- they've ragged on AMD for being much less smooth with more than one GPU for almost two years now. They explicitly stated that AMD was pushing out higher framerates while Nvidia was delivering a better experience.

Depends on what you call 'in the loop'. TR is certainly aware of it though. I like how the end result of the whole testing process of PC Perspective is that the FRAPS results will indicate that there's a problem just as well as the results from their (obviously more thorough) method. That's a very important result for PC Perspective to have attained, as it enables Techreport (and other publications) to draw conclusions based upon the results from FRAPS as well as their subjective perception, knowing that if FRAPS indicates a problem, the problem will probably show up on the actual display as well.

The PCPER article is definitely more thorough, but then, Anandtech's admission that this is just a starter article and that their work on it is still incomplete is stated upfront.

Very curious what else comes up, especially given the sheer volume of data this produces and the number of sites involved. With at least two major sites working on it (maybe more?) we may see quite a bit more on it, and different opinions of the analysis never hurt...