I played Crysis as an FPS and NFS Hot Pursuit as a racing game, and both looked OK and very playable even though they were capped at 30 FPS. I have no issues either, so I think people are overreacting and exaggerating with this FPS crap discussion...

Have you ever played these games vsynced to 60Hz or 120Hz? I suspect you haven't. If you haven't, then that 30Hz cap might well seem ok to you.

In addition, LightBoost makes a very big difference by eliminating motion blur. The effect is nothing short of awesome. You basically get to have your cake and eat it.

That's really rubbish - laggy and juddery all the way. I'm glad I don't have a console.

So I take it that you find Blu-ray video "laggy and jittery" as well, because it's capped at 24p, 25p, 23.976p, or 29.97i and will never exceed 30 FPS (sans 29.97i, but even that might judder because you have to deinterlace the video).

I think that what people think of as "jittery and laggy" is actually the render time from frame to frame. Just because you're running at 30 FPS doesn't mean that every frame is rendered in the same amount of time. So if you render 30 frames in one second, but the first 25 are rendered in the first three quarters of that second and the last 5 take the final quarter, you have changes in frame rate that cause the jitteriness you describe, and the reduced frame rate during that last quarter introduces lag (and if it recurs often, it introduces more jitter).

So, all in all, the frame-rate argument is dumb. The only reason 60 FPS and 120 FPS feel "smoother" is that the variation in render time from frame to frame is that much smaller. So I'm willing to bet that with a 30 FPS game that has equal render times from frame to frame versus a 60 FPS game that doesn't, the 30 FPS game has the potential to feel just as smooth, if not smoother, because the instantaneous frame rate would be consistent.

The most important thing to take away from this is:
Average frame rate is not the same thing as instantaneous frame rate and the variation between frame render times. Also keep in mind that it will be hard to make this perfect: as scenes change, the render time can change as well.
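To make the distinction concrete, here's a rough sketch with made-up numbers (not from any real game) comparing two runs that both average exactly 30 FPS over one second: one with perfectly even 33.3 ms frames, and one where 25 frames land in the first 0.75 s and the remaining 5 in the last 0.25 s.

```python
# Hypothetical frame-time sketch: both runs average 30 FPS over one second,
# but their instantaneous frame rates differ.
even_run = [1000 / 30] * 30                    # 30 frames, ~33.3 ms each
uneven_run = [750 / 25] * 25 + [250 / 5] * 5   # 25 frames at 30 ms, then 5 at 50 ms

for name, frame_times in (("even", even_run), ("uneven", uneven_run)):
    avg_fps = 1000 / (sum(frame_times) / len(frame_times))
    worst_instant_fps = 1000 / max(frame_times)   # FPS implied by the slowest frame
    print(f"{name}: avg {avg_fps:.1f} FPS, worst instantaneous {worst_instant_fps:.1f} FPS")
```

Both runs report a 30 FPS average, but the uneven one dips to an instantaneous 20 FPS during its slow stretch, which is exactly the kind of variation you'd feel as jitter.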

Wrong. Wrong. Wrong. Seriously, people, I don't see what's so difficult to understand. I suspect there's a certain amount of denialism going on here.

I might just write that article on framerate sooner rather than later.

Then do it, because it seems like most people disagree with you. I'll read it if you write it. I'm not saying that 60 and 120 FPS aren't smoother. I'm just saying that the jitter people notice at 30 FPS is more likely inconsistent render times rather than the rate itself. That has nothing to do with other rates.

I doubt that "most" people disagree. There's just a few vocal ones that insist on trying to negate what I'm saying, lol.

This isn't rocket science. I've seen all this stuff for myself and the effects are all very obvious, so I know I'm right. There's no way someone can "prove" me wrong with a counter argument, as it's inevitably flawed.

Now, math is not my strong point, but I reckon, going by the presumed 85% performance of the Titan(ium) compared to the GTX 690, this GK110-based card would be 45% faster than the 7970 GE.

Aside from what other people stated about 85% of a GTX 690, I'd also point out that those are pre-Catalyst 12.11b/13.1 WHQL scores, so the difference might be less than 30%... Even so, I take most of these wide-margin calculations with a shovelful of salt.
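For what it's worth, the arithmetic behind that 45% figure only holds under one specific assumption about the GTX 690's lead over the 7970 GE. A quick back-of-the-envelope check (all ratios are the thread's guesses, not measured benchmark data):

```python
# Back-of-the-envelope check of the quoted claim. Both ratios below are
# assumptions from the thread, not benchmark results.
titan_vs_690 = 0.85             # rumoured: Titan at 85% of a GTX 690
claimed_titan_vs_7970ge = 1.45  # claimed: 45% faster than a 7970 GHz Edition

# The GTX 690 lead over the 7970 GE that the 45% claim implicitly assumes:
implied_690_vs_7970ge = claimed_titan_vs_7970ge / titan_vs_690
print(f"implied GTX 690 vs 7970 GE: {implied_690_vs_7970ge:.2f}x")
```

So the claim implicitly assumes the GTX 690 is about 1.7x a 7970 GE; with the newer Catalyst drivers narrowing that gap, the 45% figure shrinks accordingly.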

Again, don't compare a movie with a game; the frame rates are "packed" waaaay differently, Earth-to-Moon differently. Besides, I believe we all want that virtual image, that virtual experience, to be more lifelike, no? That also needs a high frame rate, in the same way a movie is more true to life close to 60 FPS than at 24.

Plus, a higher frame rate gives better control over your avatar.

Higher frame rates can certainly lower frame latencies. But as the TechReport reviews have shown, a game can go from 2 ms between frames all the way to 50+ ms between frames and still maintain a high average frame rate. There's no special "packed" way a movie does it; it's just a smooth, consistent frame rate throughout. That is why 24 FPS works in theaters and 30/60 FPS works on TVs. Nobody ever complained about that! Your brain will adjust to a consistent frame rate (e.g. you can't see fluorescent lights blinking). Inconsistent frame rates need to stay below a minimum latency for you to perceive them as smooth. I think anything below 16.7 ms (60 Hz) is hard to detect. I can tell a slight difference between 60 and 75 Hz, but not too many people can.
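That style of frame-time analysis boils down to looking at the tail of the frame-time distribution instead of the average. A minimal sketch over made-up frame times of two tail metrics of the kind TechReport reports, the 99th-percentile frame time and time spent beyond a threshold:

```python
# Sketch of tail-focused frame-time metrics over hypothetical captured data.
frame_times_ms = [16, 17, 16, 18, 55, 16, 17, 16, 60, 17]  # made-up capture

# Average FPS looks healthy despite the 55/60 ms spikes.
avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))

# 99th-percentile frame time: the latency all but the worst ~1% of frames beat.
ordered = sorted(frame_times_ms)
p99 = ordered[min(len(ordered) - 1, int(0.99 * len(ordered)))]

# "Time spent beyond 50 ms": total milliseconds contributed by slow frames.
beyond_50 = sum(t - 50 for t in frame_times_ms if t > 50)

print(f"avg {avg_fps:.0f} FPS, 99th percentile {p99} ms, {beyond_50} ms beyond 50 ms")
```

The average comes out around 40 FPS, yet the 99th-percentile frame time is 60 ms, which is the stutter you actually feel.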

I swear I'm going to shoot the next person that tries to compare movie framerates to videogame framerates.

Apparently we can try to inform these people until we're blue in the face, but they'll keep skipping over what we're saying (and backing up with facts) and go on comparing movies/TV to games...

I'm only going to say it once and be done with it: yes, there is. The frames overlap in such a way that it fools the eye. I saw that somewhere, but I can't find it any more. Believe it or don't, your choice.

Yeah, I can settle for 30 FPS locked if I have to, but I don't like it, even if it's a perfect 33.3 ms per frame. I want a lifelike experience: faster response from me and prompt response from my character. That comes with a high FPS of at least 60. That rocks my boat.

And why on God's green Earth do people think this card would be overkill? My 2500 @ 4.5 GHz and 7950 (1170/1600 MHz, somewhere around a 7970 GHz Edition / GTX 680) can't give a solid 60 FPS in all games even at 1680x1050 if I choose the highest in-game settings. There is never too much performance, it's just too expensive to get it.

This had better have all 15 SMX units enabled. This card should've launched a year ago; I doubt anybody is going to want or accept anything less than a perfect card, especially at the rumoured price of $900. Nvidia have had a year to get their shit together, so this had better be their full 2880-shader GPU.

Yes, film makers use tricks like showing the same frame 4 times. Have a read up on it and it will help you get a better understanding of both film and games. For the rest, it's sometimes better to agree to disagree: beauty is in the eye of the beholder, and the same can be said for FPS. We are all different, which is why we feel so strongly about our own view; it's just how we see things.

Last warning. Any more discussion about frame rates/movies will lead to an infraction.

I'll certainly get one, I deliberately skipped the 600 series as they weren't much of an upgrade compared to my 3 570's, and only a retard would upgrade EVERY year.

Only a retard would use such a sweeping generalisation. But seriously, I probably do upgrade once a year. I have a Sapphire Vapor-X 7950 (overclocks to 1200/1450 on stock volts).

I buy mainly mid-to-high end (last few cards: GTX 470, GTX 570, HD 7950), so to some that's a marginal upgrade every generation. But the way I look at it, I sold my 470 for 2/3 the cost of the 570, and the 570 was better at stock, able to far exceed the 470 OC, used less power, ran cooler, etc. The same was true when I sold my 570 and went with my current 7950: I paid a small upgrade fee after selling my card and benefited from a newer architecture and far greater performance once overclocking is taken into account. And of course it's the latest-gen hardware, so it will hold the same kind of resale value as the previous cards when the next generation comes out, letting me upgrade again for a marginal expense while staying on the latest hardware. It's a no-brainer to me.

Nope, erocker, the huge rip-off price of the 7xxx series kept me away from it until the magic driver and bundle came along. What I'm saying, guys, is that no matter how high the price is, there are folks that will pay... at least if the performance is right.

At the same time, if this launches at the same price point as the GTX 680, it will mean price drops for every card underneath and the killing of the GTX 680/690. Sure, that should happen if we're talking about a new series, but for now we know too little. If AMD launches at a better price/performance, then Nvidia lowers prices (they have a margin to play with); if not, not. It's a win-win for them.