
Don't get me wrong, I'm not against 3DFX making these effects available. As I've said, for some well-defined situations, hell, you may well need motion blur as a special effect. But I seriously doubt I'd want it applied wholesale to an entire game.

Let's face it, people, if I wanted my graphics blurred and out of focus, I'd still be playing on an ancient composite colour monitor. Or on an old LCD screen from the DSTN era or earlier, whose latency blurred anything that moved anyway. Including the mouse cursor. But I'm playing on a relatively expensive 17" monitor with a small dot pitch, in 1024x768, 32 bit colour.

All these years, the whole computer industry has struggled for sharper quality graphics. Now we're told that's all wrong, and I'd need some expensive card to make my image blurred and out of focus again. Why? Is it just me that sees something wrong about this picture?

Again, some 3DFX features I agree with. That FSAA looks great. And I'm happy even with the idea of motion blur, if it's used very sparingly, in situations where it's actually needed, as a special effect. But that I'd need all my games blurred... Sorry, folks, I just don't buy it. To me it's just marketing BS.

It's not that the picture itself should be blurred. On the contrary, you should try to get it as sharp as possible. It's only the motion that needs to be blurred, if the fps isn't high enough for your eyes themselves to do the blurring part. And not too much motion blur either, just enough to fill in the gaps between individual frames (i.e. about 1/24 of a second at 24 fps).
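That "just enough blur to fill the gaps" idea amounts to averaging several sub-frame samples across one frame's interval. Here's a minimal sketch; the `motion_blurred_frame` and `scene` functions and all the numbers are invented for illustration, not taken from any real engine:

```python
# Sketch: motion blur as averaging sub-frame samples over one
# frame's interval (1/24 s at 24 fps). Illustrative only.

def motion_blurred_frame(render, t, fps=24, subsamples=8):
    """Average several renders spread across one frame time.

    render(t) must return the scene at time t as a list of
    pixel intensities (floats). Static pixels stay sharp;
    only moving ones get smeared.
    """
    dt = 1.0 / fps
    samples = [render(t + i * dt / subsamples) for i in range(subsamples)]
    width = len(samples[0])
    return [sum(s[p] for s in samples) / subsamples for p in range(width)]

# Toy scene: a 1-pixel object moving right at 48 px/s on an 8-pixel row.
def scene(t):
    pos = int(t * 48) % 8
    return [1.0 if p == pos else 0.0 for p in range(8)]

frame = motion_blurred_frame(scene, 0.0)
# The object's energy gets split between pixels 0 and 1 (0.5 each);
# every other pixel stays black, so only the motion is blurred.
```

The point of the sketch: the still background comes out pixel-for-pixel identical, while anything that moved during the frame's shutter interval gets smeared along its path.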

"No doubt the truth, as usual, would be somewhere between the extremes" -Arthur C Clarke

I can hardly believe I am participating in this flame bait. Freon, FS, Bladerunner are correct - not by opinion, but by fact.

The human brain does not even see in FPS, so the whole discussion is asking the wrong question. The human brain EXPECTS motion blur; requires it. Motion blur is not a "special effect" but a consequence of how we perceive motion. "Frames" are a discrete sample, and are the best current technology can do to simulate moving images.

As a game player, 30-60FPS is way too low. My brain tells me this by giving me a headache after 15-60 minutes. Just sit still in Q3A, and turn left/right. At 60fps, if the turn takes 1/10 of a second, that is 6 frames in the turn - not enough to tell when to stop turning and fire accurately. We must interpolate, and that is hard to do without the blur.

Perhaps we can do what 3DFX wants - but not with today's technology. We would have to be rendering at 60*5 FPS and then blurring 5 frames at a time to get real motion blur. Only then will things look "real" and touch on what our vision is capable of.
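That "60*5 FPS, then blur 5 at a time" scheme is temporal supersampling: fully compute five sub-frames per displayed frame, then average them. A toy sketch, with a made-up `supersample_frames` helper and invented numbers:

```python
# Sketch: render at display_fps * group, then average each group
# of `group` frames into one displayed frame. Illustrative only.

def supersample_frames(rendered, group=5):
    """Collapse consecutive groups of rendered frames (lists of
    pixel floats) into averaged, motion-blurred display frames."""
    out = []
    for i in range(0, len(rendered) - group + 1, group):
        chunk = rendered[i:i + group]
        width = len(chunk[0])
        out.append([sum(f[p] for f in chunk) / group for p in range(width)])
    return out

# 10 "rendered" frames of a 1-pixel object stepping right each frame.
frames = [[1.0 if p == t else 0.0 for p in range(10)] for t in range(10)]
blurred = supersample_frames(frames)
# Two display frames; in each, the object is smeared over 5 pixels
# at 0.2 brightness apiece -- a crude but real motion trail.
```

Note the cost implied by the averaging loop: every sub-frame in `rendered` had to be fully produced first, which is exactly why this needs 5x the rendering throughput.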

Mobydisk, do you at least realize the monstrosity of what you're saying? To assume that 60 fps is what's limiting your turning and aiming is to assume you could react in less time than 1/60 of a second.

THINK about it. Your brain has not only to assess the situation, but also to transmit the information all the way to your arm, hand and fingers, via relatively slow nerves. (No, you're not running on copper wire, nor on optic fiber.) Said muscles have to actually contract, which is a relatively slow process, and you have the mass of the whole arm and hand slowing you down.

Sorry, THAT I really don't buy. Saying that your eyes can even tell the difference at that speed is one thing. Who knows, maybe your eyes really are a miracle. But claiming that the whole eye-to-brain-to-hand process can happen at THAT kind of speed is just ludicrous.

To give you an example, the common fly may well be the fastest reacting creature, needing only approx 1/10 of a second to change direction in flight. For that it has eyes with approx 1/100 sec time resolution, and a sort of bypass mechanism that connects straight to the flight control sensors, which in turn have a direct (and very short) link to the wing muscles. It also only has 1-2 grams to move around, unlike your arm's weight. Now you're telling me you're at least 6 times faster than that? Gee, now THAT would be a miracle.

Umm... RoadWarrior, if you'll read this thread, you will see that the issue of screen refresh rate versus game frames per second has been addressed already. Your strobe light example is an example of flickering, not of motion smoothness. And flickering has to do with screen refresh rate, not with the game's FPS. It's a good example, but for something completely different from what was discussed. 'Nuff said.

LMSAO thanks for lightening up a discussion that was heading into the usual tedium

It's not that the picture itself should be blurred. On the contrary, you should try to get it as sharp as possible. It's only the motion that needs to be blurred

originally posted by FS

.....sums it up very well, I think. This feature will hopefully remove the jerkiness of fast moving objects without blurring everything. You also must realise that this is a "new" graphics feature and will probably need time to develop to its best, in the same way the basic forms of anti-aliasing have led to FSAA.

Movies at 24fps, eh? Some movies... like I was watching 'My Best Friend's Wedding' the other day... when they panned across and followed an actor, it seemed more 'floating' than usual. It seemed like every frame was there as it moved across a room... (not that I can tell, truthfully)... being something out of the ordinary, I just assumed it was bad filming... I guess it's just good quality filming, after reading this thread?

In real life I hear the eye sees like it was 128 fps in a game!! 128 is the total for the eyes!! Maybe I am wrong but that is what I heard!! You will never be able to see a game faster than 128 fps!!

Izomorph: Actually refresh rate and frame rate used to be locked together back in the early Voodoo 2 days before drivers let you disable vsync.
Why was the "vsync disable" option added? To increase frame rate! Voodoo 2 cards with vsync on were basically stuck at about 50 fps.
Vsync causes performance to trail off near the refresh rate. i.e. if your refresh rate is set to 60hz, it is very hard for your video card to display at 60 fps. The closer it gets to 60 fps internally, the more performance it loses purely due to vsync. If it doesn't get 100% done writing a frame in that 1/60th of a second, it has to wait until the next refresh to finish up and display it, leaving a lot of dead processor time. Syncing the timing of devices is one of the greatest challenges in computer architecture from a broad perspective, but I digress...
Try running a Geforce vsync-locked at 60hz (and subsequently 60 fps). It will STILL look choppy. If you set your refresh rate to a gazillion hz, 60 fps will still be identifiable. Nice try, Moraelin.
Discrete does not equal (and never will) realistic or transparent.
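The way vsync makes performance "trail off" near the refresh rate is that it rounds every frame time up to a whole number of refresh intervals. A quick sketch of that quantization (the `vsynced_fps` helper and its numbers are illustrative, not a model of any actual driver):

```python
# Sketch: with vsync on, a frame that misses the refresh deadline
# waits for the next one, so its display time rounds UP to a
# multiple of the refresh interval. Illustrative only.

import math

def vsynced_fps(render_fps, refresh_hz=60):
    """Effective displayed fps when vsync forces frame times to
    align with the monitor's refresh interval."""
    refresh_interval = 1.0 / refresh_hz
    render_time = 1.0 / render_fps
    # Refresh intervals the frame occupies, rounded up (tiny epsilon
    # so an exact fit doesn't spill into an extra interval).
    intervals = math.ceil(render_time / refresh_interval - 1e-9)
    return 1.0 / (intervals * refresh_interval)

# A card rendering 59 fps internally displays only 30 fps: its
# 1/59 s frames just miss the 1/60 s deadline every time.
```

This is the cliff the early Voodoo 2 drivers hit: internal rates just under the refresh rate fall all the way to the next divisor (60, 30, 20, 15...), which is what the "vsync disable" option was added to escape.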

The problem you don't see is that even when refresh rate = frame rate, a computer still displays DISCRETE images. Its only saving grace is that if you get refresh rate and frame rate high enough, the ghosting and fall time of the phosphors on the monitor themselves could create enough blurring to fool you.
30fps is NOT enough to fool the eye, and at 60hz with a computer CRT the fall time of the phosphors is (thank god) not that long either. If you've ever played using TV-out you can actually see how the fall time of the phosphors can blur up the image over the time domain.
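Phosphor persistence can be pictured as an exponential fall-off that blends one refresh into the next. A toy model; the decay constant `decay_tau` here is invented for illustration, not a measured CRT spec:

```python
# Sketch: phosphor brightness as exponential decay between hits
# of the electron gun. decay_tau is a made-up illustrative value.

import math

def phosphor_brightness(hit_times, t, decay_tau=0.010):
    """Brightness at time t of a phosphor struck at each time in
    hit_times, assuming exponential decay with time constant
    decay_tau (seconds)."""
    return sum(math.exp(-(t - h) / decay_tau)
               for h in hit_times if h <= t)

# At a 60hz refresh (a hit every 1/60 s), with this tau the glow
# from the previous pass is still roughly 19% of full brightness
# when the gun comes back around -- some blending across frames,
# but nowhere near a full frame of persistence.
leftover = math.exp(-(1 / 60) / 0.010)
```

Whether that leftover glow counts as "enough blurring to fool you" depends entirely on the actual phosphor's fall time, which is exactly the point being argued in this thread.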

Ok, next topic.
I guess the whole point of the T-Buffer is "Hollywood on the desktop". In real life your own eye can only focus on one depth plane (good luck trying to tell what the user is TRYING to focus on *snicker*). A sniper rifle's scope has trouble at under 25m, etc. It's for immersion. I'm not totally sold on it either. I guess it would be kinda cool if in Counter-strike people would look blurred if they were close and you were zoomed in two clicks with a sniper rifle. Eh?

Ok back to refresh rate, fps, blah blah blah.
Moraelin: Wave your hand in front of your face in natural light. It's called motion blur. Your retina and the brain attached to it are analog devices and have their own latency blur effect. Try it, it works. And again, try waving your hand in front of a bright computer monitor. Doesn't look natural. Turn your monitor refresh up to 200hz and it STILL won't look natural. That is a damned close simulation of 200fps vs. real life. This example bypasses your inane refresh rate argument, because refresh rate = frame rate.
Also, do you have any idea of the fall time of the phosphors on a CRT? They stay lit well PAST the next screen refresh, Captain Ignorant. At 60hz, the phosphors stay lit LONGER than 1/60th of a second. I suggest you go do some more reading on the subject. The heart of your argument says that a monitor's phosphors pop on and off BEFORE the next electron gun pass (like a strobe light). It's not true! Haven't you ever waved your mouse cursor around a black screen? Do I need to go quote technical specifications for you? Post some fall-off graphs?

On a computer with discrete frames, whether vsync is enabled or not, whether refresh rate = frame rate or not, the human eye and brain can see the difference WELL past 30 fps, and even WELL past 60 fps. Given the right test and a fast enough moving object, probably past 200.

FS: Yes, I was going to try to be PC about it but then thought there was no way because some people just have their heads crammed SO far up their asses that they'll never see the light. And I've done the single field vs interlaced test on my ATI AIW card (where I can change it back and forth on the fly). Big difference. Single field clearly looks choppy. Excellent example. And it has NOTHING to do with refresh rate.

A little more on TV for those interested. Yes, TV is 60hz. Every other line is drawn 60 times per second. That equates to having the ENTIRE screen redrawn at only 30hz, true. It is like running 60hz at half the resolution, NOT 30hz at full resolution.
Motion pictures clearly do use motion blur. Frame-step a DVD or pause a VHS tape (best on a good 4 or 6 head VCR). Toy Story, A Bug's Life, etc. even use motion blur! But I'll bet you never noticed... You would have noticed if they didn't, though.
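The interlacing arithmetic above (60 half-resolution fields per second, weaving pairwise into full frames) can be sketched directly. The `split_fields`/`weave` helpers are hypothetical names for illustration:

```python
# Sketch: interlaced TV. Each 1/60 s pass draws every other
# scanline (a field); two consecutive fields weave into one full
# frame -- i.e. 60hz at half resolution, not 30hz at full.

def split_fields(frame):
    """Split a full frame (list of scanlines) into even/odd fields."""
    return frame[0::2], frame[1::2]

def weave(even_field, odd_field):
    """Interleave two half-resolution fields into a full frame."""
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.extend([even_line, odd_line])
    return frame

lines = ["line%d" % i for i in range(6)]
even, odd = split_fields(lines)
# Each field carries half the scanlines; weaving restores the frame.
```

Since the two fields of a moving scene are captured 1/60 s apart, weaving them shows comb artifacts on moving edges, which is why single-field vs. interlaced playback looks so visibly different.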

JM: "see 25-30 fps in real life" Real life is not discrete. It is 'infinite frames per second.' The only limit is the soft limit of the human eye and brain.

mody****: Ahh yes. Full screen motion blur. Multiple per-pixel sampling over the time domain rather than the space domain. Same performance problem as FSAA, but also the geometry, physics, AI, etc. must be calculated for all the in-between frames, too. Big ouch for performance. T&L anyone? T&L&AI&TR&P (transformation, lighting, artificial intelligence, translation, and physics?) LOL
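The "big ouch" can be put in rough numbers: unlike FSAA, each sub-frame needs the whole pipeline run (geometry, physics, AI), so the cost multiplies straight through. A back-of-envelope sketch with invented numbers:

```python
# Back-of-envelope: temporal supersampling multiplies the cost of
# the ENTIRE pipeline per displayed frame. All numbers invented.

def blur_workload_ms(display_fps, subframes, frame_cost_ms):
    """Milliseconds of simulation+render work needed per second of
    wall-clock time if every displayed frame averages `subframes`
    fully computed sub-frames."""
    return display_fps * subframes * frame_cost_ms

# 60 displayed fps, 5 sub-frames each, 3 ms per full frame:
# 60 * 5 * 3 = 900 ms of work every 1000 ms -- it barely fits,
# with almost no headroom left for anything else.
```

In other words, a machine that comfortably sustains 60 fps has to be nearly five times faster before this kind of blur becomes free.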

There is no way you can give the eye an fps figure, especially anything accurate to three figures. And 128 is a frickin' binary number!! (2^7) To the best of my knowledge, most humans don't operate on binary code

That number might be a good estimate though, maybe something in the 100-150 fps range will start to get completely indistinguishable from Real Life(tm).

"No doubt the truth, as usual, would be somewhere between the extremes" -Arthur C Clarke