So I did a quick search, didn't find anything on this, and was wondering: do you think there are huge, noticeable differences between 30, 60, and 120 fps? If so, by how much? Is 120 that much better than 60? Does the frame rate (when not tied to game logic) really affect gameplay, and by how much?

This came up because some dude was ripping on me when I said that from 60 to 120 there isn't such a huge noticeable difference, and that most people wouldn't even care to notice.

The higher the FPS, the smaller the change between frames (i.e., smoother motion). At lower FPS, each frame has to cover more movement to represent the same motion in the same amount of time. 120 FPS should have the least noticeable per-frame change, but I don't think human eyes can process differences much above 30-60.

Even if your eyes can't process 120 fps frame by frame, a higher frame rate would start to blur together, which appears more natural to your brain. Real life effectively runs at a very high frame rate; each "frame" you perceive is an amalgamation of many.

From 60 to 120 I see a slight difference. That is, I can tell, but I have to really pay attention.

The human eye can pick up light from miles away and detect flashes lasting tiny fractions of a second, so of course we can perceive more than 30/60/9999 fps. The real issue is diminishing returns: how much faster do things need to run to get a payoff in more fluid motion? I think 60 should be the standard, because it gives good results without making things unreasonably expensive to render.

What about gameplay? Does it really affect it? If the simulation runs at, say, 60 logic cycles and someone plays at 120 fps, do they have an advantage? I know input can be tied to framerate, but let's assume the rate at which frames are shown doesn't affect anything else.

Important numbers: (Some of these are from memory and may be slightly off)

- ~24 fps: the human eye sees it as a moving picture rather than a series of frames.
- ~48 fps: anything lower will strain your eyes over long periods of time.
- 60 fps: the refresh rate of most monitors. Anything above 60 fps on a 60 Hz monitor is wasted processing power, since you're writing to the framebuffer multiple times between reads.

Your eyes can't tell the difference between 60 and 120; a regular person begins to notice choppiness at around 40 fps.

I have a monitor overclocked to 90 Hz and I can easily see a single dropped frame, despite the more-than-usual motion blur my screen has. I had a 144 Hz screen for a while; frame drops were a little harder to see on it, but they sure as hell were there. You're not speaking for everyone, so please kindly shut up. I'm getting tired of this argument already. It's on the same level as claiming that everyone needs glasses.

What about gameplay? Does it really affect it? If the simulation runs at, say, 60 logic cycles and someone plays at 120 fps, do they have an advantage? I know input can be tied to framerate, but let's assume the rate at which frames are shown doesn't affect anything else.

Game logic cycles shouldn't be tied to frame rate. That was fine in the past, but with so many different machines and devices, and operating systems that can hand the CPU to other applications at any time, it just doesn't work very well anymore.
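To illustrate, here's a minimal sketch of the usual way to decouple the two: a fixed-timestep loop where logic ticks at a constant rate and rendering runs whenever there's time left over. The class and method names are mine, not from any library.

```java
// Sketch of a fixed-timestep game loop: logic runs at a constant rate,
// rendering runs as often as the hardware allows, so a slow machine
// drops frames but never changes game speed. Names are illustrative.
public class FixedTimestepLoop {
    static final double UPS = 100.0;                      // logic updates per second
    static final double STEP_NS = 1_000_000_000.0 / UPS;  // 10 ms per update

    // How many whole logic updates are due for the accumulated elapsed time.
    static int updatesDue(double accumulatorNs) {
        return (int) (accumulatorNs / STEP_NS);
    }

    public static void main(String[] args) {
        double accumulator = 0.0;
        long last = System.nanoTime();
        for (int frame = 0; frame < 5; frame++) {         // bounded demo; a real loop runs until quit
            long now = System.nanoTime();
            accumulator += now - last;
            last = now;
            int due = updatesDue(accumulator);
            for (int i = 0; i < due; i++) {
                // updateGameLogic();  // fixed 10 ms steps, regardless of FPS
            }
            accumulator -= due * STEP_NS;                 // keep the fractional remainder
            // render();  // once per pass, as fast as the machine permits
        }
    }
}
```

The accumulator carries leftover time between frames, which is what keeps the update rate steady even when frame times vary.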

You will hear a lot of hoopla about the "maximum frame rate an eye can see" being 30-40 FPS (and that number keeps changing), but those people don't have a clue what they're talking about.

Yes, recent research suggests our eyes process information at roughly 30-40 frames per second, but our eyes are NOT in perfect sync with the image we're viewing. If we could sync our eyes absolutely perfectly with our monitors *and* the frame rate stayed perfectly constant and matched our eyes (which aren't constant either), then 30-40 frames per second would be the smoothest video possible; we couldn't tell the difference between it and real life.

But in reality, since our eyes process the data at a different rate, in different ways, and out of sync with the video, we need even more frames to keep everything smooth, because we need (for lack of a better word) "filler frames". From the latest and greatest in TV/monitor tech:

- Under 30 FPS: choppy, but if consistent, still "good". Believe it or not, a lot of movies run at around this rate; a standard Blu-ray disc runs at only 24 frames per second.
- ~60 FPS: better for gaming or media whose frame rate isn't consistent, giving our eyes the additional data required to process a smooth image.
- ~120 FPS: getting to the point where our eyes can't tell the difference; there are so many extra frames that our eyes have plenty of data to pass to our brains.
- ~240 FPS: claimed to be the absolute top; anything beyond this would just be silly. But the difference between 120 and 240 is almost unnoticeable, even to trained professionals. Anyone claiming they can tell the difference between 240 and 480 is just experiencing a placebo effect. (Quite frankly, there's a good chance that's true between 120 and 240 too.)

So I say the "holy grail" frame rate for gaming is 120 FPS. But, like BurntPizza said, you need a 120 Hz monitor to get the full benefit. That said, 120 FPS on a 60 Hz monitor is still better than 60 FPS, because dips and skips won't even be visible on a 60 Hz monitor unless they drop below 60 FPS.

Regardless, I would design your games to run at 60 FPS; that's pretty much the standard. Running at 120 FPS takes a ton of extra processing power for a very minor difference.

As stated, these arguments are pointless. The demo above shows the difference between each FPS. On a 60 Hz monitor you can't see any noticeable difference between 120 FPS and 60 FPS. For gaming, 60 FPS is the standard, but do not go below 30 FPS. Hopefully this post supplements the ones above...

What about gameplay? Does it really affect it? If the simulation runs at, say, 60 logic cycles and someone plays at 120 fps, do they have an advantage? I know input can be tied to framerate, but let's assume the rate at which frames are shown doesn't affect anything else.

They may, depending on what rate the game updates at and how it's implemented. A commonly used game loop updates at even intervals and then renders when there's time left over. Let's say you have an unusually high update rate of 100 Hz (hereafter UPS, updates per second), and let's calculate the average input delay of a high-end gamer with proper gaming gear and a computer that can achieve 100 FPS, versus a low-end "casual" gamer who gets 40 FPS.

The most important factor here for FPS and input delay is what happens when the FPS drops below the UPS. At 100 UPS the game should update every 10 milliseconds. However, at 40 FPS the rendering part of the game loop takes 25 milliseconds by itself, meaning that although the game still updates at 100 UPS, the updates are not evenly spread out.

In general, a game update is often an order of magnitude or more faster than rendering a frame, so in practice the 40 FPS computer reads input 2-3 times within a very short period, which effectively means that only the first read has a reasonable chance of catching any new input. The result is that although the game speed is constant regardless of FPS, input reading effectively happens in sync with your frame rate, not your update rate: 100 FPS = 0-10 ms delay until the input is detected, 40 FPS = 0-25 ms.
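Those 0-10 ms and 0-25 ms windows average out to half a frame time, since new input arrives at a random point within the frame. A quick sanity check (the class and method names are mine):

```java
// If input is effectively only read once per rendered frame, a new input
// event lands at a uniformly random point within the frame, so the
// average wait until it is noticed is half the frame time.
public class InputDelay {
    static double avgInputDelayMs(double fps) {
        double frameTimeMs = 1000.0 / fps;
        return frameTimeMs / 2.0;
    }

    public static void main(String[] args) {
        System.out.println(avgInputDelayMs(100)); // half of 10 ms
        System.out.println(avgInputDelayMs(40));  // half of 25 ms
    }
}
```

These are the same 5 ms and 12.5 ms figures used in the breakdown below.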

                                          Gamer               Casual
Mouse/keyboard USB polling delay          1000 Hz = avg 0.5 ms   100 Hz = avg 5 ms
Avg delay until game loop reads input     100 FPS = avg 5 ms     40 FPS = avg 12.5 ms
Time needed for the GPU to render a frame 100 FPS = 10 ms        40 FPS = 25 ms
Monitor update time                       2 ms (gaming monitor)  8 ms (slow monitor)
Total                                     17.5 ms                50.5 ms

In practice, the time needed for the GPU to render a frame is most likely significantly higher, because the GPU lags behind the CPU to ensure it always has work to do. This is amplified at lower FPS, since the GPU starts throttling the CPU either when it's X frames behind or when it has Y commands queued up. Being 3 frames behind at 100 FPS is only 30 ms of extra delay, but at 40 FPS we're talking about 75 ms.
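The totals in the breakdown above are just the per-stage averages summed. As a sanity check (the class and method names are mine):

```java
// Sum the average delay of each stage in the input-to-screen pipeline.
// All values are in milliseconds and taken from the table above.
public class InputLatency {
    static double totalDelayMs(double usbPoll, double loopRead,
                               double gpuRender, double monitor) {
        return usbPoll + loopRead + gpuRender + monitor;
    }

    public static void main(String[] args) {
        // Gamer: 1000 Hz polling, 100 FPS loop, 10 ms render, 2 ms monitor
        System.out.println(totalDelayMs(0.5, 5.0, 10.0, 2.0));
        // Casual: 100 Hz polling, 40 FPS loop, 25 ms render, 8 ms monitor
        System.out.println(totalDelayMs(5.0, 12.5, 25.0, 8.0));
    }
}
```

So the casual setup carries roughly three times the average input latency of the gamer setup, before the GPU queueing effect is even counted.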

He said even just using Windows at 120 Hz with a 120 Hz monitor, the cursor is much smoother, let alone games. Anyone who says that 60 is even close to the limit we can see has never really researched this topic and is just repeating hearsay, aka being an idiot.

Quote

240~ - They claim this is the absolute top, anything beyond this would just be silly. But the difference between 120 -> 240 are almost unnoticeable, even to the trained professionals. Anyone claiming they can tell the difference between 240 and 480 are just getting a placebo effect. (Quite frankly there's a good chance thats true between 120 and 240 too)

We'll see. It's very easy to test. Show me something at 30 fps and something at 60 fps for two seconds, one time, and I will always be able to tell them apart. It's night and day. Given that, I know 60 vs. 120 will still be quite noticeable, and I wouldn't be surprised if there's more room above that. But we'll see.

I have a $3000 TV that does 240 Hz. Although it's that MotionFlow 240 Hz stuff, because there's no way to even transmit real 240 FPS to a device over HDMI at 1080p; I don't think even dual-link DVI can do it. I *feel* like I can tell the difference between 120 Hz and 240 Hz, but it's negligible. Honestly, I think it's just the placebo effect.

MotionFlow 240 Hz really only exists because it allows 120 Hz *3D* playback, and you can see the difference there, even though 3D sent over HDMI at 1080p is only 24 Hz/24 FPS. DVI, on the other hand, can manage a nice 60 FPS in 3D (though it takes a 120 Hz TV to do it). But really, using MotionFlow for 240 Hz is kind of silly; it's just a byproduct of having MotionFlow 3D at 120 Hz.

I don't know about other games, but take League of Legends, for example. I play at only 40-70 fps and it really seems choppy. My bro's PC runs 120-150 fps and it seems so much smoother. It's not all about visuals in LoL, either: more fps means more responsive controls. You can really feel the difference between 60 and 120, no matter what eye scientists tell you.

The thing about 60 vs. 120 is that it's a law of diminishing returns. 60 is as good as it needs to be for nearly everyone. Although a lot of people will see improvements at 120 Hz, they'll already be happy at 60 Hz. The main problem with rendering at 120 Hz is that you literally need twice the processing power and fancy hardware, so you're drastically limiting your audience.

Old CRT TV screens also had some natural smoothing between frames, because the previous frame faded slowly and blended with the next. The image was less sharp this way, but motion appeared smoother and screen flicker was less noticeable.

Computer screens always had less of this smoothing effect, I think intentionally, and so needed higher refresh rates.

If you shot a 24 FPS theatrical movie using a camera with an exposure of only 1/1000 of a second per frame, the movie would appear choppy.

True, and that's also why motion blur in games is so important: it increases the perceived frame rate. We still need at least 60 Hz, but 60 FPS with a small amount of motion blur looks a lot smoother. I generally keep it off in shooting games, since it's easier to see details without motion blur, but I always leave it enabled in "movie" games like Crysis.

My implementation costs:
- 0.3 ms when the screen is 100% stationary (fast path in the shader).
- 1.7 ms when the screen is 100% in motion.

1.7 ms is enough to drop you from 60 FPS to about 54 FPS, which is still well worth it in my opinion. My motion blur is also far from the fastest implementation, but it's easy to tune by clamping the maximum motion blur radius and the number of samples read for blurring. It's a feature that runs almost 100% on the GPU (some negligible CPU resources are needed to calculate motion vectors), and it scales with screen resolution since it's a postprocessing filter. If all else fails, you can just disable it.
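The 60 to 54 FPS figure follows directly from frame times: 1000/60 ≈ 16.7 ms per frame, plus 1.7 ms of blur is ≈ 18.4 ms, which is about 54 FPS. A tiny check (the class and method names are mine):

```java
// Resulting FPS after adding a fixed per-frame cost, computed via frame
// times: fps = 1000 / (1000/baseFps + extraMs).
public class FrameBudget {
    static double fpsWithExtraCost(double baseFps, double extraMs) {
        double baseFrameMs = 1000.0 / baseFps;
        return 1000.0 / (baseFrameMs + extraMs);
    }

    public static void main(String[] args) {
        // 60 FPS base, plus the 1.7 ms full-motion blur cost from above
        System.out.println(fpsWithExtraCost(60.0, 1.7)); // ~54 FPS
    }
}
```

This also shows why the same 1.7 ms hurts more at higher frame rates: it's a larger fraction of an already tight frame budget.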

I have found that the people who actually care about a perfect 120+ fps are gamers, and I mean serious gamers, not your average COD players. The lowest common denominator doesn't care what the frame rate is as long as the game plays well.

Most people aren't running monitors above 120 Hz, so why bother? It would be nice to have the game support 60+, but the people with the hardware to exceed that are even fewer. The few PC gamers (since, as we know, consoles aren't hitting 120 anytime soon) with 120 Hz monitors and pimped-out hardware expect all this fancy stuff, forgetting that they're a small part of the market.

I think it will be a while before I look into 60+ fps, as I don't have a monitor above 60 Hz.
