And although I haven't had a problem with them historically, the pro-AMD crowd is starting to turn me off in this thread and its predecessor.

I wouldn't give the people on this forum that much credit. Blame it where it belongs: AMD. I've always used an ATI (later AMD) card in my personal main rig (preference, usually goes with my color scheme, whatever), but this is no one's fault but AMD's. If they can't fix this issue, then the business they lose is on them.

Thankfully for me (and AMD, I guess) I can't see the stuttering during 90% of my gameplay (due to, well, focusing on the gameplay), but it is a huge disservice to its user base.

Honestly, I've never noticed this issue (I think I'm just micro-stutter immune or something), but I'm happy to see AMD following up on these sorts of things.

Exactly. The effect is so small that I cannot see it even when I'm looking for it, nor can my "blind testers" see anything. It seems like most of the people who have a problem with the micro stuttering on the 7950 never actually owned one. But if AMD will improve the situation for those who actually have a problem, I see no reason why anybody would be complaining.

Exactly. The effect is so small that I cannot see it even when I'm looking for it, nor can my "blind testers" see anything.

Of course you can't see it. 'Stutter' manifests itself as lower perceived frame rate, and if that perceived frame rate is still over your threshold for smoothness, you'll view it as smooth.

If I had you play a game at 50FPS and told you that it was running at 60FPS, would you be able to notice? Would you call me out on it? Would anything jump out at you? Of course not, but that doesn't change the fact 60FPS is better and smoother than 50FPS. The same is true when it comes to the distribution of frames in a more even vs. less even manner. Just because you can't "see" it doesn't mean that uneven frame distribution doesn't have a detrimental effect on gameplay.

One of the best ways I have seen to show the impact of microstutter is to have two machines side by side where one of them is doing it and the other is not. It becomes immediately apparent something is wrong on the stuttering machine despite its decent frame rate. Your eyes notice the stutter comparatively. Once you have seen that you have trained your brain to recognise it.

But microstutter comes in different magnitudes. A 2ms swing I doubt anyone could notice. A 16ms swing effectively halves the frame rate on a 60Hz monitor (30 FPS), and beyond that the result looks like less than 30 FPS, so almost everyone notices swings of 20ms and beyond.
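To put rough numbers on those magnitudes: instantaneous FPS is just 1000 divided by the frame time in milliseconds, and the swing is the top minus the bottom of the oscillation. A quick Python sketch (the helper name is mine, not anything from Fraps or TR):

```python
# Instantaneous FPS implied by a single frame time, per the
# 1000/frametime rule used in this thread.
def instantaneous_fps(frame_time_ms):
    return 1000.0 / frame_time_ms

# 2 ms swing around 16.6 ms: FPS only wobbles between ~64 and ~57,
# which is hard for anyone to see.
print(round(instantaneous_fps(15.6)), round(instantaneous_fps(17.6)))  # 64 57

# 16 ms swing: the slow frames run at 1000 / (16.6 + 16) = ~30.7 FPS,
# i.e. roughly half the rate of a 60 Hz display.
print(round(instantaneous_fps(16.6 + 16), 1))  # 30.7
```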

Of course you can't see it. 'Stutter' manifests itself as lower perceived frame rate, and if that perceived frame rate is still over your threshold for smoothness, you'll view it as smooth.

If I had you play a game at 50FPS and told you that it was running at 60FPS, would you be able to notice? Would you call me out on it? Would anything jump out at you? Of course not, but that doesn't change the fact 60FPS is better and smoother than 50FPS. The same is true when it comes to the distribution of frames in a more even vs. less even manner. Just because you can't "see" it doesn't mean that uneven frame distribution doesn't have a detrimental effect on gameplay.

What you say is true. I can definitely see a difference between 40 FPS and 60 FPS, but as you might recall from my previous post, I have done my best to reproduce the exact same scenario in Skyrim as TR did, and I only get one frame in a hundred with a latency corresponding to ~43 FPS instantaneous; with Radeon Pro tweaks, the worst 1% is ~52 FPS instantaneous. That is nowhere near what TR obtained, and this is why I say they have a problem that I cannot reproduce, despite using the exact same card. Maybe there are people who can perceive the latency I measured, but if every hundredth frame drops from 60 FPS to 52 FPS (on average), I very much doubt the people who can see it are in the majority.

Edit: I should add that having 1% of frames at 50 FPS instantaneous is not as visible as a constant 50 FPS, but I think that is obvious.
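For anyone wondering how a "one frame in a hundred" figure falls out of a trace, here's a hypothetical sketch: a synthetic run where 99 frames in 100 sit at 60 FPS pacing and one lands at ~43 FPS instantaneous, with the worst 1% read off as a nearest-rank 99th-percentile frame time. The data is made up for illustration, not from my capture.

```python
# Synthetic trace matching the description above: one slow frame
# (~43 FPS instantaneous) per hundred, the rest at 60 FPS pacing.
frame_times_ms = [1000 / 60] * 99 + [1000 / 43]

def instantaneous_fps(frame_time_ms):
    return 1000.0 / frame_time_ms

# Nearest-rank 99th percentile via integer arithmetic (avoids float
# index surprises): for 100 frames this picks the single slowest one.
ordered = sorted(frame_times_ms)
p99 = ordered[(99 * len(ordered)) // 100]
print(round(instantaneous_fps(p99), 1))  # 43.0 -- the worst 1% of frames
```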

Of course you can't see it. 'Stutter' manifests itself as lower perceived frame rate, and if that perceived frame rate is still over your threshold for smoothness, you'll view it as smooth.

If I had you play a game at 50FPS and told you that it was running at 60FPS, would you be able to notice? Would you call me out on it? Would anything jump out at you? Of course not, but that doesn't change the fact 60FPS is better and smoother than 50FPS. The same is true when it comes to the distribution of frames in a more even vs. less even manner. Just because you can't "see" it doesn't mean that uneven frame distribution doesn't have a detrimental effect on gameplay.

Obviously you were gaming around during the NVDA 5 series days, not so long ago.
Having no doubt seen the huge latency spikes that series had in comparison with the AMD 6 series cards, did you find the stuttering at that level a problem?
You did see it, right?

Obviously you were gaming around during the NVDA 5 series days, not so long ago.
Having no doubt seen the huge latency spikes that series had in comparison with the AMD 6 series cards, did you find the stuttering at that level a problem?
You did see it, right?

During the Fermi generation I had a 5870 and then two 6970's in CF. My current card is a 7970. My last Nvidia card was an 8800GTS. If anything I lean slightly towards AMD, but that doesn't stop me from realizing the truth when it's staring me square in the face.

Personally, I'm looking forward to an even better gameplay experience from future drivers now that AMD is homing in on evening out frame rate distribution. I'm surprised that so many don't share my anticipation and would rather fight some stupid, uphill fanboy war.

Someone want to point me towards a guide on how to measure le latencies? I'd like to see what is happening during me WoW adventures.

It is very noticeable on my rig. If I'm not fighting or doing something that requires me to focus, I see the stuttering. I can be flying from one end of a zone to another and the stuttering is very visible.

During this time I usually Alt-Tab or rummage my bags, perhaps why I'm not ripping hair out, but it's still there.

Someone want to point me towards a guide on how to measure le latencies? I'd like to see what is happening during me WoW adventures.

It is very noticeable on my rig. If I'm not fighting or doing something that requires me to focus, I see the stuttering. I can be flying from one end of a zone to another and the stuttering is very visible.

During this time I usually Alt-Tab or rummage my bags, perhaps why I'm not ripping hair out, but it's still there.

You need Fraps. In Fraps there is an option for capturing frame times; tick it. Use F11 (or whatever key you set) to capture a trace. It will produce a .csv file in the benchmark folder. Load it into Excel/OO Calc etc., then create one additional column taking the difference of consecutive frame times (just =A2-A1 all the way down). Then graph the frames on x and the frame times on y and you'll get a chart just like TechReport's.

The ideal 60Hz picture is a straight line at 16.6ms. If it's consistently higher than that, say 33.3ms, then it's 30 FPS (1000/frametime = FPS) and it's just poor frames per second causing the poor feeling of motion. If, however, it shows consistent back-and-forth spiking around 16.6ms, then that is microstutter, and I am calling the magnitude the difference between the top and the bottom. So an oscillation from 10 to 22ms is 12ms of microstutter. If you see big jumps from a steady level, those are just momentary drops in FPS; most people call that a stutter or a judder.

So the 3 types of performance problem we can look for in frame time graphs are:
1) Microstutter - oscillations around some average frame time, going above and below that average over and over.
2) Stutter - a big swing (up or down) for a moment that causes a jump in the gameplay.
3) Poor frame time - an average frame time that does not produce a good FPS. Ideally it should read 16.6ms (1000/60); for 30 FPS it is 1000/30 = 33.3ms.

If you want to get fancy, I am also interested in the amount of variance in the variance: take the difference of consecutive frame times. If you graph that, you will get a graph that oscillates around the zero point, going positive and negative, and it shows the amount of change. What I originally thought would happen when I took the derivative of the frame times was that if I abs() the values I would get a consistent picture showing the amount of microstutter as a straight graph, as well as seeing the judders. But I don't see that; I actually see just as much variance in the variance in all the traces I have. Feel free to make a thread with your graphs and link to your data and some system specs, as I would love to add it to my collection of traces that show stutter that a user can see.
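If spreadsheets aren't your thing, the same recipe works in a few lines of Python: difference the cumulative Fraps timestamps to get per-frame times (the =A2-A1 column), then difference again for the variance-in-the-variance graph. This is only a sketch; it assumes the usual two-column frametimes layout (frame number, running time in ms), and the synthetic 10/22ms trace at the bottom stands in for a real capture.

```python
import csv

def load_timestamps(path):
    """Cumulative millisecond timestamps from a Fraps frametimes CSV
    (first column is the frame number, second the running time)."""
    with open(path, newline="") as f:
        rows = csv.reader(f)
        next(rows)  # skip the header row
        return [float(row[1]) for row in rows]

def frame_times(timestamps):
    """First difference: per-frame time in ms (the =A2-A1 column)."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

def second_difference(ft):
    """Frame-to-frame change in frame time; oscillates around zero and
    shows the amount of change ('variance in the variance')."""
    return [b - a for a, b in zip(ft, ft[1:])]

# Synthetic trace standing in for a real capture: a 10/22 ms
# oscillation, i.e. 12 ms of microstutter around a 16 ms average.
stamps, t = [0.0], 0.0
for i in range(10):
    t += 10.0 if i % 2 == 0 else 22.0
    stamps.append(t)

ft = frame_times(stamps)
print(max(ft) - min(ft))                     # microstutter magnitude: 12.0
print(round(1000 / (sum(ft) / len(ft)), 1))  # average FPS: 62.5
print(second_difference(ft)[:3])             # [12.0, -12.0, 12.0]
```

Graph `ft` against frame number for the TechReport-style chart, and `second_difference(ft)` for the oscillation around zero.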

You need Fraps. In Fraps there is an option for capturing frame times; tick it. Use F11 (or whatever key you set) to capture a trace. It will produce a .csv file in the benchmark folder. Load it into Excel/OO Calc etc., then create one additional column taking the difference of consecutive frame times (just =A2-A1 all the way down). Then graph the frames on x and the frame times on y and you'll get a chart just like TechReport's.

The ideal 60Hz picture is a straight line at 16.6ms. If it's consistently higher than that, say 33.3ms, then it's 30 FPS (1000/frametime = FPS) and it's just poor frames per second causing the poor feeling of motion. If, however, it shows consistent back-and-forth spiking around 16.6ms, then that is microstutter, and I am calling the magnitude the difference between the top and the bottom. So an oscillation from 10 to 22ms is 12ms of microstutter. If you see big jumps from a steady level, those are just momentary drops in FPS; most people call that a stutter or a judder.

So the 3 types of performance problem we can look for in frame time graphs are:
1) Microstutter - oscillations around some average frame time, going above and below that average over and over.
2) Stutter - a big swing (up or down) for a moment that causes a jump in the gameplay.
3) Poor frame time - an average frame time that does not produce a good FPS. Ideally it should read 16.6ms (1000/60); for 30 FPS it is 1000/30 = 33.3ms.

If you want to get fancy, I am also interested in the amount of variance in the variance: take the difference of consecutive frame times. If you graph that, you will get a graph that oscillates around the zero point, going positive and negative, and it shows the amount of change. What I originally thought would happen when I took the derivative of the frame times was that if I abs() the values I would get a consistent picture showing the amount of microstutter as a straight graph, as well as seeing the judders. But I don't see that; I actually see just as much variance in the variance in all the traces I have. Feel free to make a thread with your graphs and link to your data and some system specs, as I would love to add it to my collection of traces that show stutter that a user can see.

Thanks for the info. I got le Fraps; however, I don't have any office software installed. I'm sure I can use Google Docs. I'll give it a shot and see what's under the hood.

During the Fermi generation I had a 5870 and then two 6970's in CF. My current card is a 7970. My last Nvidia card was an 8800GTS. If anything I lean slightly towards AMD, but that doesn't stop me from realizing the truth when it's staring me square in the face.

Personally, I'm looking forward to an even better gameplay experience from future drivers now that AMD is homing in on evening out frame rate distribution. I'm surprised that so many don't share my anticipation and would rather fight some stupid, uphill fanboy war.

I share your anticipation, but it's hard to deal with the fanboys who don't even use an AMD GPU telling me my card is unplayable even though it's fine.

AMD working on a fix is going to make an already excellent GPU better.

I come here to get more info about what is going on and how it can be fixed, and I have to swim through a mudhole of useless posts, such as the one above. Guess I should go back to the more useful parts of the forum...

People who are interested in this research are getting annoyed by AMD fanboys who come into the thread saying "I personally can't see an issue on my PC therefore there can't possibly be an issue".

What any one person perceives is totally irrelevant. If someone did a scientific survey, that would be interesting; personal anecdotes are not.

At first I was an ardent believer that this was all BS; now I believe it comes down to only a few games and can be fixed easily. TR did find some problems, and if AMD improves its drivers as a result, then excellent. Unfortunately, some are making out that the issue is in every game and that AMD increased FPS at the expense of smoothness across the board with their "never settle" drivers.

My testing of Skyrim did indeed show serious stutter, but it was easily fixed. Other games showed no such problems. I even tested Hitman in the exact same level TR did and found no problems with frame times.