flip-mode wrote:What bothers me more than settling on either metric is the insinuation that people who prefer the FPS metric are stupid. Please don't go there. It's bad form in every way. That's the absolute worst turn this discussion can take, so I ask you: don't take that turn.

No, not at all. I'm saying that people refusing to expand their views beyond just FPS are stupid/uninformed, and it would be better that frame timing be just as much a part of the "common language" about graphics performance as FPS is; not to replace it, not to be "better than it", but to complement it.

mkenyon wrote:Therefore, frame time is supplemental testing to FPS, which is still king for measuring performance.

Zarking fardwarks, have you not grokked the last two pages? Your single-minded need to convert any GFX evaluation into an FPS number is simply maddening and ignores all of the data that Damage & Co pulled out, went public with, and used to create a new paradigm. All you care about is that if Kyle says your card has 3 FPS on its competitor your GFX manhood has been confirmed, regardless of the actual frame delivery times.

FPS is dead, gone, and pining for the fjords (and not the ones Slartibartfast designed). My guess is that the fact that GFX performance improves as frame latency numbers get smaller somehow offends your sensitivities as you clearly want measurements where the larger number is better and seem confused by measurements where the smaller number is better. Deal with it. In some cases, smaller is better.

He has erected a multitude of New Offices, and sent hither swarms of Officers to harass our people and eat out their substance.

cynan wrote:While someone might say that there is a distinction between instantaneous rates and average rates, in reality all rates are averages

No, no, no, you're not getting what I'm saying. An "instantaneous frame rate" is just another name for "frame latency converted to frame rate". If you don't like the term "instantaneous frame rate" then use the term "converted frame rate" or cFR or cFPS or whatever term you pick. But it's a mathematical fact that the delivery time of a single frame can be extrapolated and expressed as a theoretical frame rate.

I wasn't responding to your comment specifically. But since you single me out, do you mean mathematics as in calculus, which is based on the limit theorem? Which, in lay terms, basically means that once something (like an interval of time) gets sufficiently small or large, further decreases or increases in magnitude become irrelevant (because the value the calculus approach gets you will be asymptotically equivalent to the true value)?

My attempt at being a smart ass aside, I don't think I get why you want to call it instantaneous FPS. Aren't we still talking about 99th percentile frame time? If so, wouldn't it be obvious to call it 99th percentile FPS (or something a little more concise and flashy, perhaps, like FPS99)? What is instantaneous about this measure? 99th percentile frame time, as far as I understand it, is calculated by taking all of the frame times over a test run, ranking them and then taking the value that separates 99% of the faster frame times from the 1% slowest frames. If you divide 1000 by it, you get frames per second. However, this FPS value does not indicate the instantaneous performance at any specific instant during the test run. Hence it is not really instantaneous FPS.
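For what it's worth, that calculation can be sketched in a few lines of Python. The frame times below are made-up numbers, and the simple rank-and-index approach is my naive reading of the method, not necessarily Scott's exact implementation:

```python
# Hypothetical per-frame render times in milliseconds.
frame_times = [15.2, 16.1, 14.8, 16.4, 15.5, 48.0, 15.9, 16.2, 15.1, 33.5]

def percentile_frame_time(times_ms, pct=99):
    """Frame time (ms) separating the fastest pct% of frames from the rest."""
    ranked = sorted(times_ms)
    # Index of the value that pct% of frames fall at or below.
    idx = min(len(ranked) - 1, int(len(ranked) * pct / 100))
    return ranked[idx]

p99 = percentile_frame_time(frame_times)
fps99 = 1000.0 / p99  # the "FPS99" conversion mentioned above
print(f"99th-percentile frame time {p99:.1f} ms -> FPS99 {fps99:.1f}")
# prints: 99th-percentile frame time 48.0 ms -> FPS99 20.8
```

Note that, as the post says, the resulting FPS99 number summarizes the whole run; it says nothing about any particular instant during it.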

I agree that the difference in terminology between "frames per interval of time" (FPS) and "frame time" is esoteric. If Scott prefers the latter, then fine. Makes no difference to me (though FPS99 is starting to grow on me). Instantaneous FPS just makes this more confusing to me, as that's not what it really is.

Writing people off as too stupid to understand the data doesn't solve the problem; the data is still hard for people to digest in a context they can understand. For me this is the kicker:

If someone says a game will run at 60 FPS I know roughly what to expect, in general anything less than 30 frames is not great and I can imagine just how smooth a game will run based on an FPS figure. An FPS value gives me a pretty good estimate of performance that I can visualize based on my own experience. I can watch examples of games running at different FPS values and get an understanding of the respective performance.

How can we understand the frame time data in the same way? Just how much delay in rendering a frame will it take to have a real-world impact that a person would notice? How many spikes in frame rendering time, and how frequently, can occur while performance remains acceptable? This is still a very murky area; all we can do is look at the data side by side and see that lower is better. But just how much impact is there on the real-world user experience? Some of the graphs I have seen only show the render time for 5000 frames, which to me doesn't seem like a broad enough sample of data to make a reliable judgement, especially if you consider that certain scenes or maps may show completely different characteristics in games. And why are the 1% slowest frames a more relevant measure than, say, the 5% slowest frames rendered?

I can't think of an easy sentence to describe the performance of a card using frame time as the metric that a person would instantly understand and relate to. That is the challenge, and it may be a matter of education rather than changing the way the data is measured or presented. I'm not trying to say FPS is a superior measure, because obviously it isn't. But it is far easier to understand and relate to.

cynan wrote:I wasn't responding to your comment specifically. But since you single me out, do you mean mathematics as in calculus, which is based on the limit theorem? Which, in lay terms, basically means that once something (like an interval of time) gets sufficiently small or large, further decreases or increases in magnitude become irrelevant (because the value the calculus approach gets you will be asymptotically equivalent to the true value)?

I just meant algebra

I don't think I get why you want to call it instantaneous FPS. Aren't we still talking about 99th percentile frame time?

No. If you have a frame that takes 30 ms to render, then that frame was rendered at 33.3 FPS; that's the "instantaneous" FPS for the frame rendered in that instant. As to whether or not the precise term is "instantaneous" or something else, I don't care at all. Call it extrapolated, derived, calculated, theoretical, whatever.
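In code terms, the conversion being described is trivial (this is just a sketch of the arithmetic, nothing more):

```python
def frame_time_to_fps(frame_time_ms):
    """Convert one frame's render time (ms) to its 'instantaneous' FPS."""
    return 1000.0 / frame_time_ms

print(f"{frame_time_to_fps(30.0):.1f}")  # 33.3 -- the 30 ms example above
print(f"{frame_time_to_fps(16.7):.1f}")  # 59.9 -- roughly the 60 FPS target
```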

So the difference is that all of Scott's graphs could be labeled in the same metric, and that metric would be the "standard metric" that's been used. And there's nothing wrong with the metric itself; what's actually wrong is the way the metric has been used.

lilbuddhaman wrote:No, not at all. I'm saying that people refusing to expand their views beyond just FPS are stupid/uninformed, and it would be better that frame timing be just as much a part of the "common language" about graphics performance as FPS is; not to replace it, not to be "better than it", but to complement it.

Yeah, cuz English and Metric complement each other so well. We don't need two units to measure the same thing. If the world goes to frame time, that's fine by me, I'd just be surprised to see it happen.

mkenyon wrote:Therefore, frame time is supplemental testing to FPS, which is still king for measuring performance.

Zarking fardwarks, have you not grokked the last two pages? Your single-minded need to convert any GFX evaluation into an FPS number is simply maddening and ignores all of the data that Damage & Co pulled out, went public with, and used to create a new paradigm. All you care about is that if Kyle says your card has 3 FPS on its competitor your GFX manhood has been confirmed, regardless of the actual frame delivery times.

FPS is dead, gone, and pining for the fjords (and not the ones Slartibartfast designed). My guess is that the fact that GFX performance improves as frame latency numbers get smaller somehow offends your sensitivities as you clearly want measurements where the larger number is better and seem confused by measurements where the smaller number is better. Deal with it. In some cases, smaller is better.

You did see where I wrote

I'm stepping into serious speculative land here, but the way that I've seen it works something like this:

Before all of that, right?

I think you should maybe take a bit more time and read the posts thoroughly in order to contribute to this discussion. Your posts read as non sequiturs in the context of what is being discussed. It's like you're arguing against someone that isn't even here THROUGH me. I'm completely befuddled.

mkenyon wrote:Therefore, frame time is supplemental testing to FPS, which is still king for measuring performance.

Zarking fardwarks, have you not grokked the last two pages? Your single-minded need to convert any GFX evaluation into an FPS number is simply maddening and ignores all of the data that Damage & Co pulled out, went public with, and used to create a new paradigm.

Woah, that's way too far. It's not ignoring any data at all, it's just asking that the data be presented using the standard unit of measure. The unit of measure can still be used. I agree that frame time is ultimately the better unit on technical grounds, but sticking with the "standard" unit has value too.

FPS is dead, gone, and pining for the fjords (and not the ones Slartibartfast designed). My guess is that the fact that GFX performance improves as frame latency numbers get smaller somehow offends your sensitivities as you clearly want measurements where the larger number is better and seem confused by measurements where the smaller number is better. Deal with it. In some cases, smaller is better.

Dunno, I think you're being much too dramatic. It'd be fine by me, but I'm not sure that's the case at all. Do you see the marketers using "average frame time" on their packaging? I don't see that happening, and even if it did, you'd be using a term that is exactly analogous to average frames per second. In some kind of perfect world, the marketers would talk about frame time deviation, but I don't see that EVER happening.

Ned, even Scott ultimately ties everything back to FPS.

Last edited by flip-mode on Thu Feb 28, 2013 9:57 pm, edited 1 time in total.

flip-mode wrote:So the difference is that all of Scott's graphs could be labeled in the same metric, and that metric would be the "standard metric" that's been used. And there's nothing wrong with the metric itself; what's actually wrong is the way the metric has been used.

So a frame-by-frame listing of FPS per frame transmits more usable data to readers than the time to render each frame? Rate explicitly brings in time, so let's just stick with time instead of converting it to heavily misunderstood and overhyped numbers.

Sorry, but the instant I read Scott's original "inside the second" article I mentally ditched FPS forever because his new metric just made perfect sense to me. Just get used to the e-peen being ever smaller numbers instead of ever larger numbers. If anything, frame latency as a basis for comparison will force all of the noobs to actually read the review instead of heading for the conclusions page and saying "Yay. 3 FPS better so that's the one I'll buy!!!" Educating the consumer only hurts the flim-flam artists.

EDIT: I'm an old fart. Back in the late '70s/early '80s the equivalent of today's FPS marketing was stereo receiver (pre-amp, tuner, power amp in the same chassis for those who don't remember the era) watts per channel, damn the resulting Total Harmonic Distortion. I rail against FPS measurements in the same way, in that I'd rather have 60 WPC/FPS with 0.01% THD/latency than 120 WPC/FPS with 5.00% THD/latency. Coming from an audio background I'm always going to choose the solution with the lowest distortion and, to me, the endless promotion of FPS without disclosing distortion/latency is no different than all those late '70s Playboy ads promising 330 watts per channel from cheap receivers (though at an undisclosed 10% THD).


blitzy wrote:If someone says a game will run at 60 FPS I know roughly what to expect, in general anything less than 30 frames is not great and I can imagine just how smooth a game will run based on an FPS figure. An FPS value gives me a pretty good estimate of performance that I can visualize based on my own experience. I can watch examples of games running at different FPS values and get an understanding of the respective performance.

You are correct in the sense that the best, most informative and concise means of conveying latency information has probably not quite arrived.

However, at the crux of the matter is the point that what you wrote above has been made obsolete. FPS, on average, does positively correlate with performance and a smooth gaming experience. The problem is that it doesn't always, and worse, it does nothing to warn you when it doesn't. An FPS of 50 could be very fluid on a system that delivers frames evenly. But as we've been seeing recently (most egregiously with AMD's ongoing driver issues, etc.), 50 FPS could be anything but. To reiterate: when a graphics system is working as it should, high FPS and low latency are highly correlated. When it's not, all bets are off. As an aside, this might explain some of the long-held arguments over why some people consider an FPS as low as 30 adequate, while others demand 50 or even 60 FPS.
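A toy sketch makes the "50 FPS could be anything but" point concrete. The numbers here are made up, chosen only so the averages come out identical:

```python
# Two hypothetical 10-frame traces, both averaging 20 ms/frame (50 FPS),
# but one delivers frames evenly while the other stutters.
even   = [20.0] * 10        # steady 20 ms per frame
uneven = [5.0, 35.0] * 5    # alternating fast/slow frames

for name, trace in [("even", even), ("uneven", uneven)]:
    avg_fps = 1000.0 / (sum(trace) / len(trace))
    worst = max(trace)
    print(f"{name}: avg {avg_fps:.0f} FPS, worst frame {worst:.0f} ms")
# Both report an average of 50 FPS, but the uneven trace
# has a 35 ms hitch every other frame.
```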

To use a crazy analogy: a century and a half ago in Vienna, birthing mothers were terrified to go to the hospital to deliver their babies, because it was more or less known by word of mouth that infection rates, and hence death rates, were much higher there than for mothers who gave birth at home. Then along came a physician, Semmelweis, who traced the higher infection rates to the observation that medical students, who often delivered babies, would perform the deliveries after examining cadavers, without washing their hands in between. When he implemented hand washing, the infection rates plummeted. (For his efforts, Semmelweis was discredited and locked in a mental institution, because his medical colleagues essentially didn't want to admit fault; but that's another story.)

To sum up this nonsense: associating lower maternal survival with going to the hospital is like associating smooth gaming with higher average FPS. Sure, there is a correlation, but it's not the whole story, and maybe not even the part that really matters in many cases. The discovery that most of the risk had to do with contamination by cadaverous material is akin to recognizing the importance of frame latency in evaluating the smoothness of gaming performance. Back in the mid-1800s it was long before the public, or even many medical professionals, were able to embrace the idea of microbes and their involvement in infection, just as many now are struggling to wrap their heads around frame latency, and even tech journalists are struggling to find the best way to describe it. But just like germ theory, it will come, and it will probably take a lot less time.

Captain Ned wrote:So a frame-by-frame listing of FPS per frame transmits more useable data to readers than the time to render each frame? Rate explicitly brings in time, so let's just stick with time instead of converting it to heavily misunderstood and overhyped numbers.

Sorry, but the instant I read Scott's original "inside the second" article I mentally ditched FPS forever because his new metric just made perfect sense to me. Just get used to the e-peen being ever smaller numbers instead of ever larger numbers. If anything, frame latency as a basis for comparison will force all of the noobs to actually read the review instead of heading for the conclusions page and saying "Yay. 3 FPS better so that's the one I'll buy!!!" Educating the consumer only hurts the flim-flam artists.

I think I was too late editing my last post for you to see the last line: Even Scott ultimately ties everything back to FPS. Getting an "average frame time of 30 ms" is saying EXACTLY the same thing as "average of 33.3 FPS".

If the whole industry shifts to frame time, that'd be fine by me. My argument is a practical one. In the end, Scott still lists 99th percentile FPS, so he's still bringing it back to the standard unit of measure, even though all the graphs in the article up to that point use a different measure.

I just don't see what the difference is in the "real world" where products are marketed and in the language that AMD and Nvidia are going to use. So what if they say "average of 23 ms in Crysis"? What have we gained?

flip-mode wrote:So what if they say "average of 23 ms in Crysis"? What have we gained?

If they advertised 99th percentile latency for GFX card X, they'd be doing game purchasers a service and would likely drive GFX card sales.

EDIT: It must be the day job getting to me here, as I deal in statistics, deviations, and percentiles every day. I have to grok them because that's my paycheck, and there's no one single-number measurement that describes those institutions I work with. One might say that I find single-number measurements to not fully describe the measured system.


Captain Ned wrote:If they advertised 99th percentile latency for GFX card X, they'd be doing game purchasers a service

Yes, they absolutely would. 99th percentile FPS will be a smaller number than the full average FPS. Think them marketers are going to use a smaller number? Even still, we really have come back to FPS, even if only 99% of the way back.

flip-mode wrote:Yes, they absolutely would. 99th percentile FPS will be a smaller number than the full average FPS. Think them marketers are going to use a smaller number? Even still, we really have come back to FPS, even if only 99% of the way back.

Ugh. All I can say is that you'd rather advertise big numbers over small numbers even when the small number better describes what's really going on.

EDIT: At the end of the day, this is the real problem. Far too many people have a visceral and negative reaction to a measurement where the smaller number is better.


flip-mode wrote:Yes, they absolutely would. 99th percentile FPS will be a smaller number than the full average FPS. Think them marketers are going to use a smaller number? Even still, we really have come back to FPS, even if only 99% of the way back.

Ugh. All I can say is that you'd rather advertise big numbers over small numbers even when the small number better describes what's really going on.

EDIT: At the end of the day, this is the real problem. Far too many people have a visceral and negative reaction to a measurement where the smaller number is better.

In the car world people can deal with small 0-60/0-100 numbers, so why can't we? We deal with memory latencies as low numbers too. So does "time to encode X video".

Ned, I do believe the OP and a few others are not saying they want bigger numbers. They just want the term brought back somehow, even if it is d(FPS)/dt (flip, you need to brush up on your calculus). Unfortunately for game testing, you can't cut infinitely small where t -> 0. All we have are individual frames, and each frame takes a non-zero amount of time to render.

blitzy wrote:If someone says a game will run at 60 FPS I know roughly what to expect, in general anything less than 30 frames is not great and I can imagine just how smooth a game will run based on an FPS figure. An FPS value gives me a pretty good estimate of performance that I can visualize based on my own experience. I can watch examples of games running at different FPS values and get an understanding of the respective performance.

Actually that single number became somewhat obsolete about 5 years ago, as review sites began to use other numbers in an attempt to get a more complete picture:
1. min/max FPS numbers
2. a somewhat coarse "at no point did the FPS drop below 30" measurement (this is affected by the measuring interval, which can miss "slow enough" frames); it's kind of a precursor to the 99th percentile, but not quite
3. numbers from game runs using different scenes/maps

The new line graphs actually show us more interesting things in addition to "summarized" results. Contrast the following two cases:
1. You have a group of high-latency frames close to each other, but the majority of the time the line is pretty smooth.
2. You have high-latency spikes almost all the time, but each occurrence lasts shorter.
#1 will give you long lags, but fewer of them. With #2 you will be stuttering almost all the time. The number of high-latency frames, and the 99th-percentile value, are the same in both cases. Which is preferable? That may be up to the person. Personally I think #2 gives a worse experience overall because it is never smooth. That's something a mere list of a few numbers cannot show us.
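A quick sketch with made-up numbers makes the point: the usual summary statistics for the two cases can be identical even though the experiences differ (the 16 ms / 40 ms values are arbitrary):

```python
# Two hypothetical 100-frame traces with the same number of slow (40 ms)
# frames: one clusters the lag into a single burst, the other spreads it out.
clustered = [16.0] * 95 + [40.0] * 5      # one burst of lag at the end
spread    = ([16.0] * 19 + [40.0]) * 5    # a hitch every 20th frame

for trace in (clustered, spread):
    avg = sum(trace) / len(trace)         # average frame time
    p99 = sorted(trace)[98]               # 99th-percentile frame time
    print(f"avg {avg:.1f} ms, 99th percentile {p99:.1f} ms")
# Both traces print: avg 17.2 ms, 99th percentile 40.0 ms
# Only a frame-by-frame line graph reveals the difference.
```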

We need the numbers and the graphs.

The Model M is not for the faint of heart. You either like them or hate them.

Flying Fox wrote:Ned, I do believe the OP and a few others are not saying they want bigger numbers. They just want the term brought back somehow, even if it is d(FPS)/dt (flip, you need to brush up on your calculus). Unfortunately for game testing, you can't cut infinitely small where t -> 0. All we have are individual frames, and each frame takes a non-zero amount of time to render.

Though this is true, it's only in service to an issue that I see frame time currently having, and will continue to have for a very long while.

I'm also not 100% sold on it; it was just a random idea. The thread title is 'Why not display frame time data in a familiar format?', and I think there might be better solutions out there than what I'm proposing. But as it stands, it's just too cryptic for people to care about right now. Maybe that's a result of Scott's articles being *SO* in depth rather than focusing on explicitly saying something. Heck, the same can be seen in my posts here, where my points go right over people's heads and they just attack what they think is a common argument about frame time testing vs. second-based FPS polling. I had to get extremely explicit before some people even took the time to understand.

My point is not about FPS being a meaningful indicator of smooth performance, we know it's just an average over time. What I mean is that when you talk about FPS people can understand it and immediately relate to it, it is easily perceivable and demonstrable. It is true that no game will run at a constant 60fps, and FPS over time graphs were a way to show how performance is perceived over time, frame time data is a further extension that looks even deeper. The problem as I see it is that people can't look at the frame time data and relate that back to real world performance. I know what 60 fps feels like roughly, I have difficulty quantifying the effect of 99th percentile frame delays.

cynan wrote:While someone might say that there is a distinction between instantaneous rates and average rates, in reality all rates are averages - it's just that when the time interval is small enough, depending on the situation, whether or not there is variance within the interval ceases to be important.

I'd rather not assume that until we know more. People used to think that light and energy were infinitely divisible until Planck and Einstein discovered quanta. So, uhh... Planck-time?

flip-mode wrote:Yes, they absolutely would. 99th percentile FPS will be a smaller number than the full average FPS. Think them marketers are going to use a smaller number? Even still, we really have come back to FPS, even if only 99% of the way back.

Ugh. All I can say is that you'd rather advertise big numbers over small numbers even when the small number better describes what's really going on.

EDIT: At the end of the day, this is the real problem. Far too many people have a visceral and negative reaction to a measurement where the smaller number is better.

Read, man! That's not even a footnote in the argument I'm making. I have no problem with that and I don't think many other people would either.

It seems more and more websites are doing frame latency-based benchmarking. So perhaps the method has better chances at becoming standardized than I thought. The term "frame latency" really does describe the situation quite precisely. It would be great if FPS was ditched completely and entirely by everyone. I just hate having English and Metric. It's stupid.

Captain Ned wrote:Ugh. All I can say is that you'd rather advertise big numbers over small numbers even when the small number better describes what's really going on.

EDIT: At the end of the day, this is the real problem. Far too many people have a visceral and negative reaction to a measurement where the smaller number is better.

There may be a way to tie larger numbers to better performance, but we have to come up with it first. I think using the smaller numbers, at least in the short term, is very important because we are breaking from FPS to change thinking, and if we switch back to "big numbers are better," it needs to be clear that the new numbers are NOT frames per second. We might just need time for the new paradigm to set in before touching terms like "frames per second" and general "larger is better" metrics.

But again, getting to a really good smoothness metric is going to be hairy, and the best measures could still be "smaller is better." We can try % frames within standard deviation, % under a "smoothness" threshold, but other measures, like number of tears and stutters (if we can pick them out) would blow those percentage numbers out of the water (because tears and stutters are directly linked to smoothness). The best you might get in a "larger is better" scheme is a score, which could end up having different levels or units per site due to slightly different testing methods.
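For illustration only, here's a naive sketch of two of the candidate metrics mentioned above. The 16.7 ms threshold is my assumption (in the spirit of a 60 Hz target), not an established standard, and the frame times are made up:

```python
# Sketch of two candidate "smoothness" metrics.
# The 16.7 ms threshold (~60 FPS) is an assumed cutoff, not a standard.
frame_times = [15.0, 16.0, 25.0, 16.2, 40.0, 15.8, 16.1]  # hypothetical ms

threshold = 16.7
# "Larger is better": percentage of frames delivered under the threshold.
pct_smooth = 100.0 * sum(t <= threshold for t in frame_times) / len(frame_times)
# "Smaller is better": total time spent beyond the threshold.
ms_beyond = sum(t - threshold for t in frame_times if t > threshold)

print(f"{pct_smooth:.1f}% of frames under {threshold} ms")
print(f"{ms_beyond:.1f} ms spent beyond the threshold")
```

Note that neither number would catch the clustering problem discussed earlier in the thread, which is part of why a single "score" is so hard to design.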

"A life is like a garden. Perfect moments can be had, but not preserved, except in memory. LLAP"

What about measuring average fps every second, over a long enough gameplay period, and plotting some kind of distribution chart? You could see the minimum fps, the maximum, the most common fps, and you would get a good idea of whether the fps drops too often or not.

jokinin wrote:What about measuring average fps every second, over a long enough gameplay period, and plotting some kind of distribution chart? You could see the minimum fps, the maximum, the most common fps, and you would get a good idea of whether the fps drops too often or not.

The frames are delivered "inside the second", which is where Damage started from. That is why timeslices of 1 second are not enough. Humans can perceive smoothness or lag/stutter in milliseconds.

Also, as has been discussed for the last few pages, what you are suggesting tries to shoehorn the new paradigm into old/obsolete terms, which can be even more confusing and does not encourage people to break from the old, tired way of thinking. We need a clean break. Just like Intel tore their whole roadmap up and switched to performance per watt. Just like we don't look at topline wattage numbers on PSUs any more.

jokinin wrote:What about measuring average fps every second, over a long enough gameplay period, and plotting some kind of distribution chart? You could see the minimum fps, the maximum, the most common fps, and you would get a good idea of whether the fps drops too often or not.

Scott already does that with the frame latency over time graphs he puts in every GFX review since "Inside the Second".


flip-mode wrote:Yeah, cuz English and Metric complement each other so well. We don't need two units to measure the same thing. If the world goes to frame time, that's fine by me, I'd just be surprised to see it happen.

But they don't measure the same thing... and users will form a preference/tolerance for different levels of each. Take 45 fps with poor frame times vs. 30 fps with good frame times: someone who isn't sensitive to stutter might prefer the former, while other, more sensitive eyes might prefer the latter.
