
I don't think motion blur has anything to do with it, except to make the image look less sharp.

AFAIK anything above 30 fps is basically wasted. You _may_ (or may not) tell the difference between 30 fps and 50 fps if you watch two animations side by side, one at 30 and one at 50. But if you see one today and one tomorrow, even 20 fps will probably look just as smooth.

Just keep in mind that the 30 fps is only valid when the rate is constant - like in a movie theater. The problem with gaming is those annoying peaks and valleys in the frame rate. You can be running smoothly at 30 fps and then suddenly it drops to 10 fps and - stutter, stutter. So, for computer gaming you need some overhead. If I had to guess (but I don't really), I'd say that 60 fps would be a comfortable number to shoot for.
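The headroom point is easier to see in per-frame time budgets. A minimal sketch (pure Python; the function name is mine) of how large a dip from 30 to 10 fps really is:

```python
# How much time the renderer gets per frame at a given rate.
# A drop from 30 fps to 10 fps means one frame hogs 100 ms
# instead of 33 ms - that's the stutter you feel.
def frame_time_ms(fps: float) -> float:
    """Time available to render one frame, in milliseconds."""
    return 1000.0 / fps

for fps in (60, 30, 10):
    print(f"{fps:3d} fps -> {frame_time_ms(fps):5.1f} ms per frame")
```

An "average 30 fps" run can hide individual 100 ms frames, which is why you aim for overhead rather than the average.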

As you can all see, the demos from, e.g., Q3A are just standardized runs, and the final results are an average across the whole demo. The actual speed may differ depending on when you play. This means that when there are more things on the screen, like more characters or moving objects, the fps will certainly slow down to some extent.

To keep the game fairly smooth most of the time, I'd recommend 50 fps on both demos, since that leaves headroom for the more intensive scenes.

By the way, refresh rate: the higher the better. A low refresh rate will get your eyes tired really fast. And you can tell the difference between 60 Hz and 72 Hz; if you think you don't see it, don't look at the monitor directly, look at it from one side... (and see). Anyway, a refresh rate above 72 Hz will do a better job, but you will not be able to tell the difference.

Based on this, I'd say the top fps the eye can see is more than 60 but less than 75.


OK, this is a 3dfx explanation, because I can't find a better one at present. Forget the 3dfx sales pitch and read the bold part, as that contains the important facts.

Motion blur can both remove the jerkiness from a computer-generated animation and create the illusion of enhanced speed and motion. Have you ever noticed that there is no jerkiness in a movie, but there is plenty of jerkiness in a computer animation? Computer animation and movies are created in completely different ways. When shooting a movie, the camera's shutter opens, stays open for 1/24th of a second, then closes instantaneously before opening again to capture the next frame. All of the motion that happens in that 24th of a second is captured on the film. In fact, if you were to look at fast-moving objects in a single frame of a movie film you'd see that they are actually blurred because they are moving while the shutter is open. The result when you string together a series of such frames is the appearance of very smooth, continuous motion from frame to frame.

Now with regard to creating computer animation, suppose you're running at a frame rate of 60 frames per second (fps). A single "frame" of computer animation is displayed on the computer screen, held there for 1/60th of a second, then instantaneously the next frame of animation, which occurs 1/60th of a second in time later, is displayed. In virtually all of today's computer-generated animation, the objects in each frame are sharply rendered, motionless in the frame. When the next frame is displayed and an object, such as a vehicle or ball, has moved in the 60th of a second that has elapsed between the two frames, the object suddenly appears in a new location. Our eyes are so sensitive to motion that no matter how high the frame rate, even 100fps (frames per second), most people will notice the jerky motion.

3dfx's motion blur feature simulates an object's motion during the period of time that each frame is displayed on the screen. Moving objects are blurred, just as they are in real film, to enable very smooth and continuous motion. But we can do even more! By exaggerating an object's motion blur, T-Buffer can create the illusion of tremendous speed and make a scene much more visually appealing.
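For what it's worth, the general idea behind accumulation-style motion blur can be sketched in a few lines: render the object at several sub-frame positions while the virtual "shutter" is open, then average the renders. This is a toy 1-D illustration of the technique, not 3dfx's actual T-Buffer implementation; all names are made up:

```python
# Toy accumulation-buffer motion blur on a 1-D scanline.
def render(pos: int, width: int = 16) -> list:
    """Render a 1-pixel 'ball' at integer position pos."""
    frame = [0.0] * width
    frame[pos] = 1.0
    return frame

def blurred_frame(start: int, end: int, samples: int = 4, width: int = 16) -> list:
    """Sample the ball's position at several instants during the
    shutter interval and average the renders together."""
    frames = []
    for i in range(samples):
        t = i / (samples - 1)                  # 0.0 .. 1.0 across the shutter
        pos = round(start + t * (end - start)) # where the ball is at instant t
        frames.append(render(pos, width))
    return [sum(col) / samples for col in zip(*frames)]

# The ball moves from x=2 to x=5 during one frame: its energy is
# smeared along the path instead of sitting in one sharp pixel.
print(blurred_frame(2, 5))
```

Each pixel along the path ends up with a quarter of the ball's brightness, which is exactly the "blurred, oval" look the film-camera analogy describes.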

If I see one more person say you can't see beyond 30 fps I'm going to go psycho. It is simply not true. People, get your heads out of your asses and stop spouting such absurd things. If you believe that 30 fps is the limit, God help you, because you're dumber than a doorknob. <this is where I slam my head against a hard concrete wall in sheer amazement at the general stupidity in this forum>

3dfx's motion blur only blurs objects, not the entire screen. When you turn your character's head in a game, the world will stutter like it always does. Moving objects will appear to blur. A far cry from simulating a real shutter.

Watching a movie with real motion blur, the limit MAY be about 60 fps. Things like this will vary not only from person to person, but also with how hard a subject concentrates, and whether it is an A/B/X or side-by-side comparison.
But in a game where you do 180-degree turns in a fraction of a second with no blur (besides ghosting of the phosphors in the monitor), I'm positive the average human can see the difference between 60 and 100 fps. (Of course, assuming the refresh rate on the monitor is set to 100+ Hz.)

coolguy867: Real life is analog. This is not the Matrix.
Some light is given off discretely, though. Wave your hand right in front of a TV or your computer monitor. It will look like your hand is blinking at whatever speed your monitor is set at (60 Hz for a TV). Now go outside on a sunny day and wave your hand around. It will just look blurry.

I'm sure there is some blur associated with the time it takes for a cell in your retina to distinguish the speed of changes in brightness and color. If you've ever looked at one of those inverse American flag optical illusions, you can see that there is a delay of sorts. Just remember that anything to do with the eye is all analog. There is no hard-set limit, no hard on/off, etc. Your eye doesn't sample at a fixed rate. It is a constant signal. It is possible there is a rise and fall time for changes in that signal.
It's the same argument as for vinyl records vs. CDs. A vinyl record is NOT samples, but a continuous signal, the same way your ear receives it. A CD, even though it samples very fast, is NOT a continuous signal, but a digital, blocky version of the sound. Vinyl is like an infinite-kHz CD. (IMHO, vinyl sucks because it scratches and is not very portable, and a CD samples fast enough that it is almost impossible to tell the difference.) In the same way, real light in the real world is infinite frames per second. The soft limit is how accurate and how well trained your ear or eye is.

My personal opinion is that anything past 100 fps is very hard to distinguish. 30 fps isn't even close and gives me headaches.

I run QIII on a crappy K6-2 500 and Voodoo3 2000 PCI, watching real-time fps. It ranges from 18 to 40 or so. At 18 I can sort of tell, but it doesn't stay at 18 for more than a split second before it's back to 25 or so. After a few hours, I do notice some fatigue... maybe that's the difference.

With lower fps you don't notice it... but you find yourself fatigued... tired, irritable, maybe even with a headache?

I can't believe you guys think that 30 fps is good for a game, like a first-person shooter! Don't believe that ****. Above 50 is acceptable, and you would need much more as an average to never drop below 50.

Take a good look at a movie. A wide camera pan looks choppy enough, even with the natural motion blur.

Refresh rates have nothing to do with FPS. You really can't match a monitor's Hz with a video card's FPS. Totally different measures. Our eyes can only notice up to 30 FPS, and that's on the keen side.

As Izomorph said, refresh rate is totally different from frame rate in games. A monitor at 60 Hz flickers, and that has nothing to do with motion, but with the fact that each pixel is lit for a very short time, and then quickly decays back to black. Briefly, each pixel is sometimes much brighter and sometimes much darker than the average, and that tends to annoy your eyes, which can only adjust to the overall average. That's why you can see the flickering on a monitor at 60 Hz non-interlaced, but not on a TV at 50 Hz interlaced (effectively 25 Hz refresh). The TV's phosphor has much higher latency.

On the other hand, you can set your monitor to 85 Hz or even higher, so the image is stable, even if your game does 20 fps.

For the record, the first movies were made at 16 fps, which shows that the brain already starts interpolating the movement at that rate. Again, the transition was to 24 fps (not 60 or 75) and also had a lot to do with flickering, not just with smoothness.

As for motion blur, excuse me, when was the last time you saw motion blur IN REAL LIFE? So how's a movie's blur supposed to help? The motion blur in movies is more of a side effect, than something that makes them cool. Also note that unless we're talking car races, there is not enough blur in movies, either. When you see someone walk or even run in a movie, they are not blurred at all. When you film something you generally have the shutter set to hopefully minimize it, so the image looks crisp, not to throw in more blur. (Except in some very special effects, like time warps, flashbacks, etc, which are artificially heavily blurred, by combining several frames into one.)

Ditto for the background being out of focus. It's more of a limitation of physical cameras, than something neat. It doesn't make it more realistic. It's something that's _acceptable_ as long as I focus on the same thing as the camera, and pay no attention to the background. But if I, say, try focusing on some other detail, all of a sudden it feels very different from what I'd experience In Real Life. In Real Life, my eyes would adjust to that detail, and I'd see it crisp and clear, not fuzzy and out of focus.

So, 3DFX marketing BS notwithstanding, why would I want those in a game? Movie producers would probably sell their soul to get rid of those limitations of real cameras. Why would you want those problems added to a medium that luckily doesn't have them?

I don't think you're understanding the point of that explanation. Forget motion blur for a second: what it's trying to say is that a moving object is captured every 1/24th of a second, so if it were a ball, it could have moved while the camera's shutter was open, and it would appear oval or "blurred". This creates a smoothness that you will not get from a sharply rendered computer image, unless you have a high enough fps (on the order of the ball's movement) that the blurring effect isn't required.

3dfx are just putting in artificial effects to simulate this, together with depth of field, to make the whole experience more "real".

If I were less politically correct I'd say exactly what Freon said. 30fps is not even close to the limit of what the eye/brain can notice.

Also, no movie director would want to lose the motion blur. 24 fps is really very little, and extensive post-production must be done before any movie appears smooth. Even the motion blur alone isn't enough to compensate for the low frame rate. And high-speed-shutter cameras are nothing new; any film director could get one if he wanted to for some reason.

BTW, NTSC TV is actually 60 fps; it's just interlaced to save bandwidth and make the first TVs possible. Try looking at a TV show at 60 fields per second and then convert it to 30 fields per second and tell me you don't see the difference. I'll get you a good doctor if you can't.
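Interlacing itself is simple to picture: each 1/60 s field carries only half the scanlines, and two consecutive fields weave into one full frame. A toy sketch (hypothetical names, lines stand in for pixel rows):

```python
# Weave deinterlacing: combine an odd-line field and an even-line
# field into one full-height frame. Two fields per frame is why
# 60 fields/s of motion costs only 30 frames/s of bandwidth.
def weave(odd_field: list, even_field: list) -> list:
    """Interleave two half-height fields into one full frame."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)
        frame.append(even_line)
    return frame

odd = ["line1", "line3"]
even = ["line2", "line4"]
print(weave(odd, even))  # ['line1', 'line2', 'line3', 'line4']
```

Since the two fields are captured 1/60 s apart, the woven frame still carries 60 samples of motion per second, which is the poster's point.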

"No doubt the truth, as usual, would be somewhere between the extremes" -Arthur C Clarke

Well, as I've heard, the eye can see 25-30 fps in real life, but I think that rule bends on computers. Television, for example, is a steady 30 fps and we see it in fluid motion, but I can see differences at 30 fps in games. Average fps is no good, because you won't be certain that you'll be at that mark all the time. When a firefight is going on, it drops waaay down. For me, anything above 40 is good enough. I really don't see why I should get 100+ fps, since I hardly ever, if ever, notice it. You'd have to be a bug (a fly, perhaps) to see every frame when you hit 100+ fps.