I am attempting to implement the deWitters game loop (the one where the game update rate is constant but the FPS is independent), but I'm having no luck! For one reason or another I have a tiny stutter in my image as it moves across the screen, even with vsync enabled.

If you guys could take a look at it and tell me what Im doing wrong that would be great.

//Bind the texture to the quad to draw
glBindTexture(GL_TEXTURE_2D, playerImage);

//Begin the simple render
//All of the player values (player.x, player.y, player.viewX, etc.) are doubles
//player.width / player.height is the size in pixels of the image (in this case a 32 x 32 image)
glPushMatrix();
glBegin(GL_QUADS);

Anyway, could it be a problem with SDL_GetTicks()? A similar function in Java had very bad precision on Windows, which could lead to heavy stuttering. Other than that I have no idea...

I know this is a Java forum, but code is still code. There is one fatal thing I forgot, though, and that was to post my draw function!

Also, what kind of timing precision should I be aiming for? Nanoseconds (is that even really needed)?

Well, at least the interpolation needs millisecond precision to be smooth. The reason why I mentioned the timing problem is because the standard way of measuring time had a very bad precision in some OSes, more to the tune of over 10ms or so. Since many important functions like sleep() and just basic FPS calculation rely on being able to accurately measure time you could get very bad stuttering. If that's not the problem, it's most likely some logic problem, but be sure to measure the FPS to see that you're actually getting exactly 60 FPS (assuming you have a 60Hz screen). You also didn't post your draw function?
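To see the coarse-clock problem described above in practice, you can probe how big one "tick" of the millisecond clock actually is: spin until the reported time changes and measure the jump. This is a sketch in Java; the class and method names are mine, not from the thread.

```java
// Probe the granularity of the millisecond clock. On OSes with a coarse
// system timer, currentTimeMillis() can advance in steps of 10ms or more,
// which is far too coarse for frame timing.
public class TimerGranularity {
    static long probeMillisStep() {
        long t0 = System.currentTimeMillis();
        long t1;
        // Busy-wait until the clock value changes, then report the jump.
        do { t1 = System.currentTimeMillis(); } while (t1 == t0);
        return t1 - t0; // size of one clock tick in milliseconds
    }

    public static void main(String[] args) {
        System.out.println("millis clock step: " + probeMillisStep() + " ms");
        // System.nanoTime() is the usual high-resolution alternative for deltas.
        System.out.println("nanoTime sample: " + System.nanoTime() + " ns");
    }
}
```

If the probe reports steps well above 1 ms, that alone can explain visible stutter in a loop timed with the millisecond clock.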


I edited my original post to show the draw function

My monitor is at 60Hz and my FPS counter with vsync on fluctuates between 59.98xx and 60.03xx FPS. Is that too much fluctuation?

Quote: "The reason why I mentioned the timing problem is because the standard way of measuring time had a very bad precision in some OSes [...]"

You mixed that up a little. In java

System.currentTimeMillis();

returned good values. The problem was

Thread.sleep(...);

, as that method didn't sleep exactly as long as specified. Sometimes it was off by more than +/- 4 ms.
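You can measure that oversleep directly: record the time, sleep, and compare the actual elapsed time against what was requested. A minimal sketch (class and method names are illustrative):

```java
// Measure how much Thread.sleep() actually oversleeps. On some platforms
// the scheduler only wakes sleeping threads on a coarse timer interrupt,
// so sleep(1) can easily take several milliseconds.
public class SleepDrift {
    static long measureOversleepNanos(long sleepMillis) {
        long start = System.nanoTime();
        try {
            Thread.sleep(sleepMillis);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        long actual = System.nanoTime() - start;
        // How far past the requested duration we actually woke up.
        return actual - sleepMillis * 1_000_000L;
    }

    public static void main(String[] args) {
        System.out.println("oversleep: "
                + measureOversleepNanos(1) / 1_000_000.0 + " ms");
    }
}
```

This is why loops that rely on sleep() for pacing tend to stutter: the wake-up error can be a large fraction of a 16.7 ms frame.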

Quote: "My monitor is at 60Hz and my FPS counter with vsync on fluctuates between 59.98xx and 60.03xx FPS. Is that too much fluctuation?"

Oh, no! That fluctuation is really negligible.

Something I just noticed: you should probably output "loops". I'd guess the while-loop sometimes runs several times per frame and sometimes not at all.

Quote: "I edited my original post to show the draw function [...] Is that too much fluctuation?"

Oh, okay. Must have missed that edit. And that kind of FPS fluctuation is normal. It's caused by the CPU working out of sync with the GPU. The monitor is even further "away", so a little variation is expected.

It's a bit weird to extrapolate the player position. If the "player" is a ball and bounces off a wall, it'll continue through the wall until the next update discovers the collision and snaps it to its new position. Rather, I'd interpolate between the previous and current position. The new code for line 37 in that case will be:


player.viewX = player.prevX + alpha * (player.x - player.prevX);
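A tiny sketch of what that formula does, with alpha sweeping from 0 (the previous update's state) to 1 (the current update's state). Class and method names here are illustrative:

```java
// Interpolation blends between the two most recent *simulated* positions,
// so the rendered position never leaves the range the logic has already
// validated -- unlike extrapolation, which can overshoot through a wall.
public class Interp {
    static double interpolate(double prev, double current, double alpha) {
        return prev + alpha * (current - prev); // alpha in [0, 1]
    }

    public static void main(String[] args) {
        double prevX = 100.0, x = 110.0;
        System.out.println(interpolate(prevX, x, 0.0)); // 100.0 (previous update)
        System.out.println(interpolate(prevX, x, 0.5)); // 105.0 (halfway between)
        System.out.println(interpolate(prevX, x, 1.0)); // 110.0 (current update)
    }
}
```

The cost is that the rendered state lags up to one update behind the simulation, which is why it can't cause the popping that extrapolation does.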

I don't see how that would help though...

You could also try to run the game fullscreen. If you're running in windowed mode VSync might not work (though it seems to be working considering your FPS...). Frankly I'm running out of ideas here...

If I am using this type of game loop, should I be converting my times into seconds? I don't think it matters as long as everything uses the same units (milliseconds or seconds), but at this point I'm willing to try anything.

See, what this game loop is doing is "render as many frames as needed to fit an X FPS scenario". The problem is, there is no consistency in frame rate, and because the loop depends on the render loop for motion, it will never be smooth. Also, you cannot have true 60 FPS because you cannot divide 1000 milliseconds evenly into 60 frames (you get 16.6 repeating), so it will flip around a bit. I don't care how many tutorials try to do otherwise; as far as I'm concerned, the most accurate way to process motion is to base it on time.

Sure, you can use this loop if you want, but NEVER simply + or - a value. Always consider motion as a vector that is applied to a unit of time.


PlayerPos += MoveVec * TimeDiff

That way your code will attempt to skip frames and multi-update without drawing if rendering is too slow, but motion will still be based on time, which will work regardless of frame rate. Or you can just have a typical draw/update loop, do time-based motion that way, and call it a day.
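The "motion as a vector applied to a unit of time" idea can be sketched like this (Java, with illustrative names): the same speed in units per second produces the same distance regardless of how many frames it's split across.

```java
// Variable-timestep motion: position advances by velocity times elapsed
// time, so movement speed is independent of frame rate.
public class TimeBasedMotion {
    static double step(double pos, double unitsPerSecond, double dtSeconds) {
        return pos + unitsPerSecond * dtSeconds;
    }

    public static void main(String[] args) {
        // 60 frames of 1/60 s and 30 frames of 1/30 s cover (almost exactly)
        // the same distance at 120 units/second.
        double a = 0.0, b = 0.0;
        for (int i = 0; i < 60; i++) a = step(a, 120.0, 1.0 / 60.0);
        for (int i = 0; i < 30; i++) b = step(b, 120.0, 1.0 / 30.0);
        System.out.println(a + " " + b); // both ~120.0
    }
}
```

The trade-off, as the rest of the thread argues, is that the per-frame delta varies from run to run, which costs determinism.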

@LunarEdit Wow! How did you solve the determinism problem with floating point numbers? And the problem of physics simulations exploding after a half-second freeze due to anti-virus scans or just plain old GC pauses? Amazing!

...

Nowhere does it say that we're trying to achieve 60 FPS. We're trying to achieve as high FPS as possible. If we want 60 FPS, we'll use the sync() method or VSync. Neither of them suffers from the "cannot divide 1000 by 60" problem you made up.

Processing motion based on delta time sucks because the game isn't deterministic. The exact values and timings of collisions will depend on the computer it's running on and also what other programs are competing for CPU time. If the game freezes up for a few milliseconds you'll get a huge time skip which might cause things to tunnel through each other or bullets to miss. If we instead have an extremely high FPS, we'll get floating point precision problems. The Call of Duty games are a prime example of this. When you had above a certain FPS you suddenly started jumping a lot higher. The other problem is that you won't be able to exactly repeat the calculations since the delta-times will be completely different on each run. That makes it impossible to do lockstep synchronization over network (useful for strategy games) or record a replay and play it back later, though those features are usually not needed.

Quote: "Wow! How did you solve the determinism problem with floating point numbers? [...]"

It's called having a tolerance. If the delta time is smaller than the smallest time slice, you skip processing for that frame. If it's bigger than a certain granularity, you THEN can loop to chew it back down (each loop taking out MAX_GRANULARITY time from the difference). Honestly.. it's 3 lines of code.
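The scheme being described (skip updating when less than one slice of time has passed, and chew large deltas down in fixed-size steps) might be sketched like this. MAX_GRANULARITY and all names are illustrative, not from the post:

```java
// Accumulate real elapsed time and consume it in fixed-size slices, so a
// freeze can never produce one huge physics-breaking timestep.
public class GranularityLoop {
    static final double MAX_GRANULARITY = 1000.0 / 60.0; // one 60Hz slice, in ms

    // Returns how many fixed updates to run now; any remainder smaller than
    // one slice stays in the accumulator for the next frame.
    // (double[] is used as a cheap mutable holder, since Java passes
    // primitives by value.)
    static int consume(double[] accumulatorMs) {
        int updates = 0;
        while (accumulatorMs[0] >= MAX_GRANULARITY) {
            accumulatorMs[0] -= MAX_GRANULARITY;
            updates++;
        }
        return updates;
    }

    public static void main(String[] args) {
        double[] acc = {40.0};            // e.g. a 40ms hitch just happened
        System.out.println(consume(acc)); // 2 fixed updates to catch up
        System.out.println(acc[0]);       // ~6.67ms carried over
    }
}
```

In a real loop you would also cap the number of catch-up iterations (like MAXUPDATES later in this thread) so a long freeze doesn't trigger a spiral of death.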

Quote: "Nowhere does it say that we're trying to achieve 60 FPS. We're trying to achieve as high FPS as possible. [...]"

You are mixing two different thoughts together and getting confused. What I'm saying is that your 'update' loop will not be hit in any even sort of way, which raw addition and subtraction would require.

Quote: "Processing motion based on delta time sucks because the game isn't deterministic. [...]"

Now you're just ranting, see one of your previous quotes where you say the exact same thing in a slightly different way.

Quote: "The other problem is that you won't be able to exactly repeat the calculations since the delta-times will be completely different on each run. [...]"

This is why you don't lockstep synchronize over a network - it implies that your client actually has control over what's going on. It shouldn't. In a network scenario, you send commands complete with timestamps, and the client will go ahead and start acting on the command, but the server is going to process and respond as it pleases (maybe at a lagged interval) with the true state of the game - again, with timestamps to synchronize properly. If it's client-to-client, there will still be one client acting as the 'server' for the session. The same goes for recording. If AI and interactions occur over time, then it is worry-free. Simply record when the game started up, then what actions were taken and at what offset to the game start time. Then it will always be replayable. I've done this before, and you'll see it again in the game I'm doing now (I'll have a demo screen) - it works. Every time.

@Granularity: Not entirely sure how that solves anything. If a collision isn't detected, it would ruin determinism anyway. At least I would not want my collision detection to work better for players with better computers.

Quote: "You are mixing two different thoughts together and getting confused. [...]"

This does not make sense. The whole point of the fixed-delta-with-interpolation game loop is to completely separate rendering from logic. Using a variable delta value implies that the updating is in some way tied to the render loop, specifically to the performance of the render loop. Besides, without interpolation the render speed cannot go over the updating speed, but I won't "rant" about that anymore. -_-'

@Lockstep: I'm pretty sure lockstepping is standard for strategy games, in which case the client rarely acts without the server's consent, since they're much less reliant on input responsiveness compared to, for example, first person shooter games. I assume your "it works. Every time." means that you aren't using a variable delta for that game. If you are, please explain how you're doing that, since to me it means you are using magic floating point numbers. =P

I have tried changing from extrapolating to interpolating the character position, but like you said, it seems to do nothing for me. I am still working on finding a more accurate timer to use.

But in the meantime

In order for my game loop to give me perfect smoothness, I need to be "stuck" in the update loop for 16.666667 milliseconds (because my monitor is 60Hz and vsync is on). Let's also assume I never want to skip frames, there is no interpolation going on, and my computer can always keep up.

So the following code :


//Set up the updates per tick; or the amount of time to spend on each game logic update
double updates = 1000.0 / 60.0; //16.66667 milliseconds (perfect update rate for a 60Hz monitor)

//Set up the start time of the game
double gameClock = GetTicks(); //returns milliseconds

What, this is a Java forum? I thought this was a C++ forum all this time... That would explain all of the Java threads and the name of the site... I know this is not a C++ forum! I'm using OpenGL, and coding is coding! Do I really need to explain more?

Yes, but running the game logic at 60Hz might not be possible if you have heavy physics or simply lots of objects, etc. That's why you may want to run the logic at a slower rate and use interpolation. Also keep in mind that not all screens update at 60Hz. Some refresh at 59.<something> Hz, others at 120Hz. Since we can't really assume anything about the screen's refresh rate, it's better to just pick an update rate that's as high as possible but still has good performance. The interpolation then allows you to actually render the game at any FPS. Vsync in turn can provide stutter-free synchronization (well, in theory I guess since it's not helping here...?).

Okay, here's my last card: For now, keep the update rate at 60Hz and enable VSync. Now, for each frame, output the interpolation value to the console. If the game really is stutter-free the interpolation value should be relatively constant. It will undoubtedly drift around a bit, and if something disturbs it (other programs hogging CPU time or so) it might jump randomly, but when it's not stuttering it should be constant. If your interpolation value is jumping around a lot, it could indicate a problem with the timing method. If it however actually is smooth even though it visually stutters (and you're not imagining it =P) I'd just conclude that it's unfixable and that it's probably a problem with your graphics drivers or something like that.

Quote: "Okay, here's my last card: For now, keep the update rate at 60Hz and enable VSync. Now, for each frame, output the interpolation value to the console. [...]"

I did this and I am pretty good on the interpolation values! I really only fluctuate between .034 and .036. My interpolation calculation (I tried basing it on player.prevX instead of player.x, but I seem to jump forward) is:


//updateTime : 1000.0 / 60.0 or 16.66667
//gameClock : the value added on to during the loop by updateTime; originally this value is set to GetTicks() (which returns a clock value in milliseconds) before the start of the main game loop
double interpolation = (GetTicks() - gameClock) / updateTime;
player.viewX = player.x + interpolation * (player.x - player.prevX);

Now since this is based on deWitters 4th game loop like the title implies

He has his interpolation calculation as (copied directly from the article I was basing this post off of) :


/* SKIP_TICKS (time spent updating the game loop) : 1000.0 / 60.0

next_game_tick : the value added on to during the loop by SKIP_TICKS; originally this value is set to GetTicks() (which returns a clock value in milliseconds) before the start of the main game loop */

//The max amount of times the game logic loop can be updated before forcing the render call
const int MAXUPDATES = 5;

//Times we have updated the game logic loop
int loops = 0;

//The amount of time (in milliseconds) spent updating the game loop, in this case 33.33333 milliseconds
double updateTime = 1000.0 / 30.0;

//Used for the interpolation calculation
double interpolation;

/* Get the current time before we start the main game loop.
   timeGetTime() returns the amount of time since the "system" has started, in milliseconds.
   In this case the system is Windows. */
double gameClock = timeGetTime();

while(runGame){

    //Reset the loop count
    loops = 0;

    /* The GAME LOOP!
       Continue to update the game until the game clock is greater than the current time ( timeGetTime() ) OR
       until we have hit the max amount of times we are allowed to update before we are forcing the game to render
       ( the loops < MAXUPDATES part ) */
    while( (timeGetTime() >= gameClock) && (loops < MAXUPDATES) ) {

It seems that the inclusion of updateTime in the interpolation calculation greatly affects this. If I did add it in the first part of the equation, I started jumping!

If you look at your updating code, you can see why you need to add updateTime to the time you calculate. Since you loop until

(timeGetTime() >= gameClock)

, the value of gameClock will be more than the current time. If you don't add updateTime to the value, the value

(timeGetTime() - gameClock)

will be negative! To be precise it will be between -updateTime and 0. If the update was instant, gameClock will be exactly one frame ahead, so the interpolation value becomes -1. If updating took a long time, it'll approach 0. Therefore you will want to add updateTime to it before dividing to bring it to 0.0-1.0 instead.
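Following that reasoning, the mapping can be checked numerically: right after the catch-up loop, gameClock sits ahead of the current time, so (now - gameClock) lies in [-updateTime, 0], and adding updateTime before dividing shifts the result into [0, 1]. A sketch with illustrative names:

```java
// interpolation = (now - gameClock + updateTime) / updateTime
// maps the post-update-loop clock difference from [-updateTime, 0] to [0, 1].
public class InterpValue {
    static double interpolation(double nowMs, double gameClockMs, double updateTimeMs) {
        return (nowMs - gameClockMs + updateTimeMs) / updateTimeMs;
    }

    public static void main(String[] args) {
        double updateTime = 1000.0 / 60.0; // ~16.667 ms
        // gameClock a full frame ahead (instant update): we're right at the last state.
        System.out.println(interpolation(1000.0, 1000.0 + updateTime, updateTime)); // ~0.0
        // gameClock barely ahead (slow update): almost at the next state.
        System.out.println(interpolation(1000.0, 1000.0 + 0.01, updateTime));       // ~0.999
    }
}
```

Without the "+ updateTime" term the same inputs would yield -1.0 and ~0.0, which matches the negative range described above.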

A note about the interpolation though: Remember that the update loop only updates 5 times at max. If the game freezes for a second or so, it'll behave a bit weirdly. In short, the game will attempt to show an extremely extrapolated view of the game since

(timeGetTime() - gameClock)

will be extremely large and the "interpolation" value will be over 1.0. Since this might make the game look extremely weird while it catches up (things continue through walls and so on; basically the same popping you got before when movement changed, but 10x worse), it might be a good idea to clamp this value to 1.0 to prevent potentially inaccurate extrapolation. Just an idea though: depending on the game, the extrapolation could, in the case of linear movement, completely hide the fact that it's still catching up, since the extrapolation is accurate. I guess this is kind of a minor detail, since if your game freezes up for a long time you have more severe problems than the interpolation not being 100% accurate. =S
