DOOM 3 Capped
A story on IGN (thanks Frans) relates a tidbit about DOOM 3 gleaned at a recent
NVIDIA editor's day, where they learned that the game will be capped at a
maximum framerate of 60 FPS. They quote id's John Carmack on the reason for this.

"The game tic simulation, including player movement, runs at 60hz, so if
it rendered any faster, it would just be rendering identical frames. A fixed tic
rate removes issues like Quake 3 had, where some jumps could only be made at
certain framerates. In Doom, the same player inputs will produce the same
motions, no matter what the framerate is."
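Carmack's determinism point can be sketched in a few lines. This is purely illustrative (not id's code, and the velocity model is made up): the simulation advances in fixed 1/60 s steps, so the same inputs always yield the same motion, no matter how fast frames are drawn.

```python
# Minimal sketch of a fixed 60 Hz tic simulation. The step size is
# constant, so identical inputs always produce identical motion,
# independent of the rendering framerate.

TIC_RATE = 60           # simulation updates per second
TIC_DT = 1.0 / TIC_RATE

def run_simulation(inputs, total_time):
    """Advance a 1-D player position tic by tic. `inputs` maps a tic
    index to the velocity applied during that tic (a made-up model)."""
    position = 0.0
    for tic in range(int(total_time * TIC_RATE)):
        velocity = inputs.get(tic, 0.0)
        position += velocity * TIC_DT   # deterministic fixed step
    return position

# Same inputs, same result, whether the renderer runs at 30 or 300 fps:
print(run_simulation({0: 320.0, 1: 320.0, 2: 320.0}, 1.0))
```

This is exactly the property Carmack describes: with a fixed tic, the Quake 3 problem of framerate-dependent jumps can't occur, because the physics never sees the framerate.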

In a mostly unrelated
note, we'd like to thank IGN for doing a story today on DOOM 3 and NVIDIA (who
prefer their name be shown that way in print) to help celebrate INTERNATIONAL
CAPS LOCK DAY.

Anyone who has played Diablo II should know exactly what JC is talking about.

Diablo II runs at 25 tics per second. Every aspect of the game is updated every 25th of a second: projectiles, damage calcs, you name it. The only element of the game that does not run at that speed is the cursor.

Since data is only being thrown at you every 25th of a second, there is no reason to attempt to render visuals at a greater speed. That's why the frame rate of Diablo II will never exceed 25 per second.
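That logic can be made concrete with a toy calculation (the function is invented for illustration; it is not Blizzard's code): if the world only changes 25 times a second, a renderer attempting 100 fps still produces only 25 distinct frames.

```python
# Sketch of why rendering above the tic rate is wasted work: frames
# drawn between two updates show exactly the same game state.

TIC_RATE = 25   # Diablo II's update rate, per the post

def frames_worth_rendering(render_timestamps):
    """Count how many attempted frames would actually show new state,
    given game updates every 1/25 of a second."""
    shown = 0
    last_tic = -1
    for t in render_timestamps:
        tic = int(t * TIC_RATE)   # which game tic this frame falls in
        if tic != last_tic:       # state has changed since last frame
            shown += 1
            last_tic = tic
    return shown

# Attempting 100 fps for one second yields only 25 distinct frames:
attempts = [i / 100.0 for i in range(100)]
print(frames_worth_rendering(attempts))  # 25
```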

In the same way, Doom 3 will never render visuals faster than 60 fps. Well, perhaps the HUD and console could get away with higher render speeds, but they aren't connected with the game logic that matters in this case.

60 fps is certainly an acceptable speed, and it isn't like many people are actually going to achieve that performance when it comes to the visuals.

Every game should be capped at 60fps. Anything higher than that starts to look stupid because it's way *too* smooth. Half-Life is capped at 60fps also, btw.

Actually, I would like to cap all my games at 15fps. Yes, you heard me right, 15fps. No, I am not being sarcastic. I think it's because my mind gets the impression that the game's graphics must be uber awesome if my computer can only run it at 15fps; thus I think the graphics are better than they really are.

Since 3d scenes are rendered, and not mapped (like 2d) why would it be outrageous to think that the same data couldn't be re-rendered from a different perspective? (Mostly independent of the game engine).

The thing is that would require the card to be coded with some kind of engine of its own for representing the data on the screen. There are multiple ways to display 3D data on a 2D surface (the monitor) and to interpret 2D input (mouse) in a 3D world. Different 3D engines use different approaches. Without the game engine telling the card what to do, the card doesn't "know" how to respond to input by rendering a new scene. In other games the game would tell the card to interpolate frames between updates, but in Doom 3, it seems that this will not occur. Without receiving an impulse from the game engine, the scene will not be re-rendered.
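The interpolation idea mentioned above, which Doom 3 reportedly skips, is simple to sketch (hypothetical code, not any engine's actual implementation): the renderer blends the last two simulated positions by how far it is into the current tic, so frames drawn between updates still show smooth motion.

```python
# Frame interpolation between fixed updates: the renderer draws a
# blend of the previous and current simulated positions, weighted by
# how much of the current tic has elapsed.

def interpolate(prev_pos, curr_pos, alpha):
    """alpha in [0, 1]: fraction of the current tic already elapsed."""
    return prev_pos + (curr_pos - prev_pos) * alpha

# Two successive 60 Hz updates put the player at x=10, then x=16.
# A frame rendered halfway through the tic would draw x=13:
print(interpolate(10.0, 16.0, 0.5))  # 13.0
```

Without a step like this, there is nothing for the card to draw between updates, which is why capping the framerate at the tic rate costs nothing.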

If the game locks up, you may continue seeing an image on the screen because the monitor is still being refreshed, but you won't see an updated scene. The rendering is separated from any kind of game logic, which encompasses updating the positions of objects (including the camera), making AI and physics calculations, creating real-time shadows, etc.

Xtro: Yes, that is correct. The thing is that Carmack is saying that in D3 the framerate will be capped at the game's tic rate, meaning the framerate won't go over 60 FPS and thus outrun the game engine updates.

Indiv: I didn't make up any facts, I made up a theory. And yes, you still don't understand it, but it is apparent that I don't have the ability to explain it without provoking your animosity.

JediLuke: Thanks for responding without the hostilities that others are burdened with. I appreciate the information. 2D rendering works like this: the CPU/program/etc. writes data to a memory buffer, and the video card reads the memory buffer to discover what needs to be displayed. If 3D rendering requires data to be sent directly from the CPU/program to the rendering path, then that is news to me. Since a 3D scene continues to render even when a game 'locks up', I was thinking that it worked the same way. Since 3D scenes are rendered, and not mapped (like 2D), why would it be outrageous to think that the same data couldn't be re-rendered from a different perspective? (Mostly independent of the game engine.)

At any rate, I think I'll go post in a less controversial subject somewhere, like religion or politics.

The FlightSim vs FPS example misses the point. The game engine of each interprets the mouse movement differently, and each can send a new view vector to render the scene. The 'mouse movement' had nothing to do with the POINT of trying to differentiate the tics from the framerate.

No it doesn't miss the point. If the game engine isn't running, it isn't sending anything to the video card. In Doom 3, the game engine will not send new scenes to the card to render more than 60 times a second. As has been said, the video card doesn't interpret input; it just displays the data the game code gives it. What if moving the mouse sideways is supposed to strafe and not look around? That has nothing to do with the card: the game engine receives the input, processes it, updates the gameworld (60 times/sec in Doom 3), and then sends the new scene to the card to render.
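That division of labor can be sketched as a toy pipeline (the mode names and scaling factors here are invented for illustration): the engine, not the card, decides what a sideways mouse motion means, and the card just draws whatever state it is handed.

```python
# Toy input pipeline: the *engine* interprets raw mouse input, and
# the "card" merely renders whatever state the engine produces.

def process_input(mode, mouse_dx, state):
    """Engine step: decide what sideways mouse movement means."""
    if mode == "look":
        state["yaw"] += mouse_dx * 0.1   # turn the view
    elif mode == "strafe":
        state["x"] += mouse_dx * 0.05    # slide the player sideways
    return state

def render(state):
    """Stand-in for the video card: it draws what it is given."""
    return f"scene(yaw={state['yaw']}, x={state['x']})"

# The same raw input means different things -- entirely up to the engine:
print(render(process_input("look", 10, {"yaw": 0.0, "x": 0.0})))
print(render(process_input("strafe", 10, {"yaw": 0.0, "x": 0.0})))
```

In Doom 3's case, the `process_input`-style step runs at most 60 times a second, so there is never a reason to call the render step more often than that.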

You'd rather insult people and pretend that you are all-knowing rather than have an interesting discussion about 3D rendering.

Keep in mind that your knowledge was totally incorrect, yet you said, in a later post, AND I'M COPYING AND PASTING THIS: "Just because you can't understand it doesn't mean it's wrong." You're insulting our intelligence because we don't believe what you just made up about real-time 3D rendering?!?

If you wanted to have a conversation about real-time 3D rendering, there are many people who'd have been happy to oblige (myself included). However, if you think for a second that anyone (well, besides a sadist) wants to talk to someone who spouts the wrong information as FACT, and then DEFENDS THAT INFORMATION BY INSULTING THE INTELLIGENCE OF THE PERSON REFUTING IT, then you're very, very mistaken.

YOU'RE the one who ruined the conversation, buddy. Not the rest of us.

Anyway, MeatFarts says, "Wrong. A movie shown in a theatre is shot at 24fps, but each frame is actually projected twice in the theatre, or you get ridiculous flicker. So it's 48fps playback... but half the time you're seeing the same frame you just saw. "

To which I say, and your point is???? I said that 24FPS looks ok. If it is 48FPS, then 60 for Doom3 should still be quite sufficient.

More than that, I've been reading that Quake 3 had a tic rate two and a half times lower than what Doom 3 will have, so I don't understand the bickering.

60Hz tic rate FPS will be incredible!!!

LOL. Well, I think my point was to say that 24fps does NOT look ok - you need at least 48fps for that. The example of "24fps looks ok in a movie theatre" that so often gets trotted out is intrinsically flawed, and I was attempting to explain why. There are a number of reasons why film cameras do shoot at 24fps, and the motion blur effect that someone mentioned is one reason they can get away with it, but 24fps truly does not look ok. Ok?

But this whole conversation is pointless and tiresome. I totally agree that a 60Hz tic rate for D3 will be swell. It seems like some folks in here are quite solidly confused about a few terms... but aside from that, I really don't see why everyone has gotten so worked up over this.


-----I'm not even angry. I'm being so sincere right now, even though you broke my heart and killed me.

You guys are so full of yourselves. You'd rather insult people and pretend that you are all-knowing rather than have an interesting discussion about 3D rendering. My knowledge of 3D rendering comes from what I've seen from these games and from about 10 years of doing AutoCAD design. If I'm wrong, that's fine, but try not to behave like pompous assholes.

In a 3D rendering environment, you have vertices that hold x/y/z positional references. These vertices can reference one another to form lines or planar faces in 3D space.

Once you have the 3D environment mapped to memory, you define a vector (direction) that can be called your view. If you want to re-render the same geometry, only from a different angle, all you have to do is update the view vector, and the GPU on your rendering device can re-render the scene independent (mostly) of the CPU.
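That AutoCAD-style mental model can be sketched with a little vector math (illustrative only; a real engine builds a full view/projection matrix rather than rotating vertices one by one): the geometry stays fixed in memory, and a new view angle is just a new transform applied to the same vertices.

```python
import math

def rotate_y(vertex, degrees):
    """Rotate a 3-D point around the Y axis -- the kind of transform
    an updated view vector implies for every vertex in the scene."""
    x, y, z = vertex
    a = math.radians(degrees)
    return (x * math.cos(a) + z * math.sin(a),
            y,
            -x * math.sin(a) + z * math.cos(a))

scene = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]  # same geometry every frame
# "Turning the view" 90 degrees re-renders identical data differently:
for v in scene:
    print(tuple(round(c, 6) for c in rotate_y(v, 90.0)))
```

The catch, as the replies point out, is that in a game the engine still has to tell the card when and with which transform to redraw; the card does not decide that on its own.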

Now is 3d rendering in games the same as it is in auto-cad? Of course not, but I was thinking that some of the same principles applied.

The FlightSim vs FPS example misses the point. The game engine of each interprets the mouse movement differently, and each can send a new view vector to render the scene. The 'mouse movement' had nothing to do with the POINT of trying to differentiate the tics from the framerate.

Step 3, if you move your mouse sideways to change your view by 10 degrees, for example... I'm assuming that the rendering device can re-draw the scene without any help from the game engine because it already has the geometry and textures in the memory buffer. The resulting view on the screen has changed, of course, and is what -I- call a new 'frame'.

Wrong. To put it simply, the video card knows about and draws exactly what you're looking at (actually, exactly what the game tells the card you're looking at), no more, no less. As a simple example that can disprove everything you said, let's compare two 3D games.

1) A first-person shooter. You move the mouse left, your view turns left because your player turns left.

2) A flight simulator. You move the mouse left, your plane banks left, because that game interprets the very same input differently.

How on earth would the video card know what you wanted to happen when you move the mouse? It doesn't know that it's rendering a first-person shooter--all it knows is that it's drawing a set of points on the screen (and doing other stuff to them).

So... when you move the mouse, the mouse tells the operating system that the mouse moved, the operating system tells the game that the mouse moved, and then the game recomputes what should be drawn and tells the video card to draw it.

What you said is completely made up, although I did enjoy how you presented it as fact. I love the internet.

The one thing I would like to comment on is the occasional declaration that the human eye can only perceive 24hz. Really now, we must look at the definition of your word "perceive". Because you'd have to admit that every person reading these words can tell (or... "PERCEIVE") the difference between 60hz and 85hz. Now, we know this is a universal truth because we can all TEST it. RIGHT NOW. Easy. And as surely as we can TELL THE DIFFERENCE between 60hz and 85hz... Can we say the human eye PERCEIVES MORE than 24hz? Is that not an accurate definition for our purposes? Or.... not? Am I coming in clear?

And just so someone doesn't post this after me:

per·ceive (pər-sēv′) tr.v. per·ceived, per·ceiv·ing, per·ceives: To become aware of directly through any of the senses, especially sight or hearing. To achieve understanding of; apprehend. See Synonyms at see.

Yes, we surely can BECOME AWARE OF the DISCREPANCIES between 24hz, and 60hz, and 85hz. So, in closing, all you 24hz believers are MOOORONS. Oh, and if all that 24hz crap was just a joke, then forget this post.


j450n, good point. Although I was trying to concentrate on just the two elements of game data vs rendering speed. The mouse movement would require the engine to send a new x/y axis direction to the display buffer to inform it as to which vector to interpret the geometry data for.