I have an XNA game using a basic lockstep update loop.
The client sends all commands to the server,
and updates only if it has received the next turn's commands from the server.

When does a turn end? When the Server says so: using a timer class, it sends a "Turn Done" message to all clients every 200ms. If a client hasn't received the turn-done message from the Server yet, it does not update; it just waits until that message arrives.
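To make the gating concrete, here is a minimal sketch of that client-side logic in Python (the actual game is C#/XNA; all names here, like `LockstepClient` and `simulate_turn`, are hypothetical, not from the real code):

```python
# Sketch of lockstep gating: the client only advances its simulation
# once the server has declared the next turn done.

class LockstepClient:
    def __init__(self):
        self.current_turn = 0       # last turn we have simulated locally
        self.latest_turn_done = 0   # highest "Turn Done" received from the server

    def on_turn_done(self, turn_number):
        # Called whenever the server's 200ms timer message arrives.
        self.latest_turn_done = max(self.latest_turn_done, turn_number)

    def update(self):
        # Advance only if the server has ended the next turn;
        # otherwise do nothing and keep waiting.
        if self.latest_turn_done > self.current_turn:
            self.current_turn += 1
            self.simulate_turn(self.current_turn)

    def simulate_turn(self, turn):
        pass  # one fixed turn of game logic would run here
```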

So, this works fine for about 20 seconds. After that, the game starts stuttering. I'm not hosting for other players; it's just my client and my server, both on the same machine.

The issue seems to be the XNA game loop itself. While the Server uses a System.Timers.Timer to keep track of time, the game itself uses gameTime.ElapsedGameTime.TotalSeconds. And guess what: XNA's GameTime races ahead. The game reaches the end of the turn before the Server does, so it ends up waiting for the Server to respond. Since GameTime is consistently faster than the timer, the delay just keeps getting bigger.

So, my question is: Which is the most accurate way to measure time and synchronize the game and Server in this context?
Also, even if I do synchronize two processes on the same machine (which seems feasible), is it realistic to expect two different PCs to stay in sync? I don't mean that both show the same time at the same instant, but that 200ms on one machine is exactly 200ms of real time on the other; otherwise one of them will sooner or later fall behind. If it's the client that falls behind, he will lag, and if it's the Server that falls behind, the client will stutter while waiting for the Server to end the turn.

PS: Thanks for any ideas/insight.

PPS: Thanks, but I am well aware of the 1500 Archers article. Either it doesn't contain info helpful to this question, or I am too dim to see how it does, so some extra explanation is what I need.

2 Answers

First of all, for two-player lockstep, using a server only introduces more delay. You should be using P2P for that.

Don't use real time. Use frames. It's much easier to track integral frame numbers than XNA's concept of "gameTime". Mark each command with a frame number a fixed amount in the future (say 10 frames, representing the average latency plus some buffer time); that is the frame the command will be executed on.

Say it is frame 80 and you want to move UP.

Send the command to the remote player and schedule it to execute locally 10 frames later, at frame 90. If you don't get a response within those 10 frames, wait at frame 89 until the remote player responds with "yes, I will execute it at frame 90"; then you can execute the command too. Or he might respond with "I've already passed frame 90, no can do", in which case you discard the player input.

When the game is running smoothly, all messages will be received and responded to early enough that the game won't stutter.
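The scheduling described above can be sketched as follows (Python rather than C#, and all names such as `FrameScheduler` are hypothetical; a real implementation would also send each command over the network when it is issued):

```python
# Sketch of frame-delayed command scheduling with a fixed 10-frame delay.
COMMAND_DELAY = 10

class FrameScheduler:
    def __init__(self):
        self.frame = 0
        self.pending = {}  # execute_frame -> list of commands

    def issue(self, command):
        # Schedule a local command COMMAND_DELAY frames in the future;
        # the same (command, execute_frame) pair would be sent to the peer.
        execute_frame = self.frame + COMMAND_DELAY
        self.pending.setdefault(execute_frame, []).append(command)
        return execute_frame

    def on_remote_command(self, command, execute_frame):
        # Accept a remote command only if its frame hasn't passed yet.
        if execute_frame > self.frame:
            self.pending.setdefault(execute_frame, []).append(command)
            return True   # "yes, I will execute at that frame"
        return False      # "I've already passed that frame, no can do"

    def step(self):
        # Advance one frame and return the commands due on it.
        self.frame += 1
        return self.pending.pop(self.frame, [])
```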

What I would do is re-synchronize the client with the server every turn: on the 'turn end' signal from the server, both reset their timers to 0. Since one 200ms turn is 1/100th of the 20 seconds it took your drift to become noticeable, the time difference accumulated over a single turn is minimal, and the stutter shouldn't ever be detected by a player.
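A minimal sketch of that idea, in Python (the `TurnClock` name and 0.021s tick are illustrative assumptions, not from the original game):

```python
# Per-turn re-synchronization: the client drives its turn clock from its
# own (possibly slightly fast) elapsed time, but snaps it back to zero on
# every "turn done" from the server, so drift never outlives one turn.
TURN_LENGTH = 0.200  # seconds per turn

class TurnClock:
    def __init__(self):
        self.elapsed = 0.0

    def tick(self, dt):
        # dt comes from the local clock (e.g. gameTime.ElapsedGameTime)
        # and may run slightly fast or slow relative to the server.
        self.elapsed += dt
        return self.elapsed >= TURN_LENGTH  # locally "ready" to end the turn

    def on_turn_done(self):
        # Server declared the turn over: reset, discarding accumulated drift.
        self.elapsed = 0.0
```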