


But ... it also depends on what you are used to.
For example, my lil brother is barely able to play UT2k4, but he still manages to get a fair number of frags at a framerate of 10-15, go figure. It looks choppy as hell to me when I watch his screen, but he wonders why I don't find that playable.

For me ... well, as long as the FPS is above 25 (SP) or 35 (MP) and, more importantly, stable, I'm fine. I'd rather tweak my system for a low but stable framerate than have big swings.

As mentioned above, anything above 25 fps isn't noticeable in any way to the human eye. Just do a Google search on the fact ...

I play with vsync on at all times if possible, so I don't see more than 85 fps in games, multiplayer or single player. But just to make things interesting, I do turn vsync off every now and then to watch those UT99 fps shoot up to near-impossible, inconceivably high numbers ...

As always with game fps, though: the higher the better, in my opinion ...

My computer can handle pretty much any game I throw at it right now. It's latency that kills in FPS games. As long as I stay above 25-30 fps I'm happy. Usually I'm a lot higher than that, but I don't really notice the dips as long as it stays around 30. It's when my ping spikes from 10-30 to 100+ that I see a difference.

Originally posted by M4D As mentioned above, anything above 25 fps isn't noticeable in any way to the human eye. Just do a Google search on the fact ...

In fact, you should do some googling of your own and you'd find a significant number of truly scientific studies that say otherwise.

Back to the subject at hand: I voted for 35-45. Uber-fluid FPS is always nice, but I never had the money (until recently) to keep up with the newest available hardware. A P2-450, 512MB of PC100 RAM and a GF3 Ti200 went a VERY long way for me, even to the point of UT2K3... I don't need an uber-ton of framerate to be happy with a game.

Which is a good thing, because even on my rig now, I can't play Far Cry with those kinds of framerates anyway.


I have to be over 50 or else it just seems too choppy. I believe I have very good eyesight, because I could tell the difference between 75 and 80-85 FPS very easily when I increased my resolution and vsync lowered my max FPS in CS to 75. I usually run 1024x768, so I get a constant 85 FPS. Even in UT2k4 I still get 50-70 FPS with mostly everything set to highest. Guess this 9800SE is not that bad after all.

For me, a decent number of fps is at least 75, to match your monitor. 85 is even better. Getting more fps in a game than your monitor can display at that resolution causes tearing. And don't give me that "your eyes can't see more than 30 fps" bull****. I can clearly tell the difference between 40 and 100 fps. Anything below 50 fps and things just get a little choppy.
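One way to see why these differences are noticeable is to look at frame times instead of frame rates: fps is just the reciprocal of the time each frame stays on screen. A minimal sketch (the function name is mine, purely for illustration):

```python
# Frame time is what you actually perceive: the milliseconds each frame
# lingers on screen, not the frames-per-second number itself.
def frame_time_ms(fps):
    """Milliseconds spent displaying each frame at a given frame rate."""
    return 1000.0 / fps

for fps in (25, 30, 50, 75, 100):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")
```

The gap between 25 fps (40 ms per frame) and 100 fps (10 ms per frame) is a 30 ms difference in how long each image sits there, which is why a jump from 40 to 100 fps is easy to spot even if individual frames can't be consciously counted.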

I usually will change a video card if my FPS drops below 12-14 in whatever game I happen to be playing at a given point in time, unless I come across a deal I can't pass up. My last upgrade was free (GF4 Ti-4600). When this thing can't handle more than 12-14 FPS, I'll buy something new. Everyone is different. Can I see the difference between 14 FPS and 100 FPS? Of course! Some people can't play with low framerates, and some people can. I'm one who can. Low framerates don't affect how well I play; they just make me predict rather than react in games. Having good eyesight has nothing to do with it. It's what your brain can compensate for that makes the difference.

Originally posted by Valk Since the human eye can't see more than 25 images in one second, I'm comfortable with my fps at 25-30. It's really kind of ridiculous to have more, since you cannot see the minute detail changes at the higher frame rate. If you turn the camera swiftly, your eye naturally blurs the image, since it cannot track the change of image locations quickly enough to redraw it crisp and clear.

I agree that we can't see more than 25 different images per second, but if you try to play Quake 3, for example, at 25 fps, it's VERY jerky, which makes me wonder what the differences are between TVs and computers when displaying motion.

TVs run at 60 interlaced fields per second, which creates a full 30 frames per second and produces a bit of motion blur. And in most cases, the actual source media for your TV (DVDs, VHS tapes, actual over-the-air production video) has motion blur in it too (ever pause a high-speed action sequence on a DVD? Notice how it's blurry?).

Computers update the screen one entire frame at a time, which gives no sense of blur. Combined with full-frame rasterizers that don't blur either, you technically get 60 completely different frames per second. They aren't blurred together; they are all discrete in their motion. Thus, you can perceive the difference more easily on a monitor than you can on a TV.
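The field-weaving described above (two 60 Hz half-resolution fields combined into one 30 Hz full frame) can be sketched in a few lines. This is a toy illustration, not any real video API; the function and variable names are made up:

```python
# Toy sketch of interlaced "weave" deinterlacing: one field carries the
# odd scanlines, the next field carries the even scanlines, and the two
# are interleaved into a single complete frame. 60 fields/s -> 30 frames/s.
def weave(odd_field, even_field):
    """Interleave an odd-line field and an even-line field into one frame."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)   # scanlines 1, 3, 5, ...
        frame.append(even_line)  # scanlines 2, 4, 6, ...
    return frame

# Two 2-line fields become one 4-line frame.
odd = ["line1", "line3"]
even = ["line2", "line4"]
print(weave(odd, even))  # ['line1', 'line2', 'line3', 'line4']
```

Because the two fields were captured a fraction of a second apart, weaving them smears motion across the frame, which is part of why TV motion looks softer than the discrete, un-blurred frames a computer monitor draws.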

What's really interesting is watching a normal TV compared to an HDTV running in a non-interlaced video mode (480p or 720p). The demonstration I saw was Pirates of the Caribbean on DVD, both being played at the same time index on two identical DVD players plugged into two identical TVs. One was running in "standard" mode, one was in HD 480p mode...

The HD mode was incredibly smooth; it could almost make you motion sick how smoothly the skeletons were crawling over the ship deck in the final fight scene. That's because HDTV in non-interlaced mode is actually running at a true 60 fps, and you can tell the difference quite noticeably.
