
So I've been mulling over a few ideas for the past few days, mostly dealing with decoupling game logic and game state from the rendering loop. An obvious way to do this is with the client/server pattern: the server maintains game state and executes game logic, communicating with the clients. Messages from the clients supply input to the server, and the server broadcasts game-state updates to the clients. This, however, leads me to several questions:
First, my (naive) idea of an implementation would require the rendering loop to keep duplicate information around about the current game state it is rendering. When it receives an updated game state from the server, rendering would have to pause momentarily while internal pointers were switched to point at the new data describing the world objects. Is there a better (perhaps somewhat standardized) way to handle this?
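The pointer-switch idea described above is essentially double buffering. A minimal sketch (class and field names are illustrative, not from any particular engine): the renderer reads the "front" state while updates are staged into the "back" state, so the pause is a single reference swap rather than a deep copy.

```python
class StateBuffer:
    """Double-buffered game state: render from front, stage into back."""

    def __init__(self):
        self.front = {}   # state the renderer is currently drawing
        self.back = {}    # state being filled from the latest server update

    def apply_update(self, update):
        # Stage the new game state without disturbing the renderer.
        self.back = update

    def swap(self):
        # The only moment the renderer must pause: one reference exchange.
        self.front, self.back = self.back, self.front

buf = StateBuffer()
buf.apply_update({"player_x": 10})
buf.swap()
print(buf.front)  # {'player_x': 10}
```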
Second, how frequently should these game-state updates be generated? Across a real network the answer is fairly simple: at the data rate requested by the connecting client. But when a local client is connected (via localhost, with its performance overhead, or some other internal communication channel), how often should game-state updates be sent? One could conceivably send updates at a fixed interval fast enough for the local client to be responsive, and send updates to remote clients at different intervals depending on their data rates. Would this approach lead to smooth gameplay, though?
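The per-client-interval idea could be sketched like this (the `Client` class and `due_clients` helper are hypothetical, just to show the bookkeeping): each client carries its own update interval, so a local client can be ticked much faster than remote ones.

```python
class Client:
    """Hypothetical client record with its own broadcast interval."""

    def __init__(self, name, interval):
        self.name = name
        self.interval = interval    # seconds between game-state updates
        self.next_send = 0.0        # time the next update is due

def due_clients(clients, now):
    """Return the clients whose update interval has elapsed, and reschedule them."""
    ready = [c for c in clients if now >= c.next_send]
    for c in ready:
        c.next_send = now + c.interval
    return ready

# A local client at 60 updates/sec, a remote one at 20 updates/sec.
clients = [Client("local", 1 / 60), Client("remote", 1 / 20)]
first = due_clients(clients, now=0.0)    # both are due on the first tick
later = due_clients(clients, now=0.01)   # neither interval has elapsed yet
```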
Lastly, on the low-level networking end, what generally comprises a game-state update sent across the network? Two strategies come to mind immediately: sending a complete game state with every network broadcast, or sending a delta against the previous game state. Sending a complete game state is more robust, since it is unaffected by packet loss. Sending deltas, however, means transmitting less data, and thus more updates per second and smoother gameplay. The synchronization problems with that approach are many: it requires detecting packet loss and either re-requesting the lost packets to reconstruct the synchronized game state, or requesting a fresh complete game state from which to base further deltas. Is it really worth all that trouble, or is the difference between the two too small to notice?
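For concreteness, here is a minimal sketch of the delta approach, with game state modeled as a flat dict (the field names are made up): only changed keys are sent, and the receiver applies them to its last known state.

```python
def make_delta(old, new):
    """Return only the keys whose values changed between two snapshots."""
    return {k: v for k, v in new.items() if old.get(k) != v}

def apply_delta(state, delta):
    """Merge a delta into a baseline state, producing the new snapshot."""
    merged = dict(state)
    merged.update(delta)
    return merged

base = {"x": 0, "y": 0, "hp": 100}
nxt  = {"x": 5, "y": 0, "hp": 100}
delta = make_delta(base, nxt)              # {'x': 5} -- far smaller than nxt
assert apply_delta(base, delta) == nxt     # receiver reconstructs the snapshot
```

Note that `apply_delta` is only correct if the receiver's baseline matches the sender's, which is exactly the synchronization problem raised above when packets are lost.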
Thanks for your input!


Some of these questions are harder to answer depending on the type of game you are writing. However, I can offer some input based on what I designed for my MMORPG.

First, in regard to your first question (and relating also to your second): the best way I found for displaying game-state changes and dealing with network lag is to provide a "game-state buffer," so to speak. The client should always have some information loaded into memory ahead of what it is displaying. For example, if a character moves in a certain direction, the client should have enough in the buffer to display what is coming up on the next screen. This helps with lag while moving around the game, because the client can process and display some things immediately without always having to wait for a response from the server.

However, changes that must occur live on screen still need to be dealt with. The way I handle these on the client is to have a separate thread reading data from the network while the other thread sits in a display loop, constantly drawing the current state of the game. When the network thread receives data, it can preprocess it, then "lock" the display thread through a mutex for a millisecond or so and update the data the display thread uses to draw the screen. The network thread never draws the screen itself.
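The two-thread pattern described above can be sketched as follows (a minimal illustration, not the poster's actual code): the network thread updates shared state under a lock, and the render loop takes the same lock only long enough to copy out what it needs to draw.

```python
import threading

shared_state = {"frame": 0}          # data the display loop draws from
state_lock = threading.Lock()        # the "mutex" guarding that data

def network_update(new_state):
    """Run on the network thread: briefly lock and update shared state."""
    with state_lock:
        shared_state.update(new_state)

def render_snapshot():
    """Run on the display thread: copy state under the lock, draw outside it."""
    with state_lock:
        return dict(shared_state)

t = threading.Thread(target=network_update, args=({"frame": 1},))
t.start()
t.join()
print(render_snapshot())  # {'frame': 1}
```

Copying under the lock and drawing outside it keeps the critical section to the "millisecond or so" the post mentions, so the network thread never blocks rendering for long.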

In regard to your second question, the way I handle game-state updates is simpler. Essentially, if something would trigger a change on a player's screen or in the buffer area, I immediately send what has changed in real time. I never send a full game state unless the whole area changes.

Finally, in regard to your third question: whether you use a TCP- or UDP-based protocol makes all the difference. In my game I decided to use TCP over UDP because you do not need to worry about losing packets; they are automatically resent if lost. The only things that need to be handled are a player disconnecting or their receive buffer filling up. To read more about these protocols, search for connection-oriented protocols (TCP) and connectionless protocols (UDP). Also look up sockets.


I'm planning to write a multiplayer action game, so responsive gameplay is important. How noticeable is being a few frames behind for real-time events?

I've also decided to use UDP, partly for the speed benefits it grants over TCP and partly for personal edification. I've written a lot of apps using TCP (a multithreaded web server, a chat server, etc.), and I'd like to try a larger project using UDP.


TCP is slower overall, but there are ways to increase speed while keeping the advantage of reliability, such as disabling the Nagle algorithm used by standard sockets, which delays sending small packets so they can be coalesced.
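Disabling Nagle is a one-line socket option, `TCP_NODELAY` (shown here in Python, though the option works the same on BSD-style sockets in any language):

```python
import socket

# Create a TCP socket and disable the Nagle algorithm so small packets
# are sent immediately instead of being coalesced into larger segments.
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)

assert s.getsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY) != 0
s.close()
```

The trade-off is more, smaller packets on the wire, which is usually what a latency-sensitive game wants anyway.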

But if you decide to stay with UDP, you must realize that your only real means of packet-loss detection are a message counter and a response timeout. Additionally, UDP holds no session, which brings up issues such as sender identification and security. These can be dealt with, but you end up programming much more detailed components yourself, and TCP handles those two issues better than UDP.
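The message-counter idea mentioned above amounts to stamping each datagram with a sequence number and flagging gaps on the receiving side. A minimal sketch (the `detect_loss` helper is illustrative):

```python
def detect_loss(last_seq, new_seq):
    """Given the last sequence number seen and a newly arrived one,
    return the sequence numbers that were skipped (i.e. presumed lost)."""
    return list(range(last_seq + 1, new_seq))

assert detect_loss(4, 5) == []        # arrived in order, nothing lost
assert detect_loss(4, 7) == [5, 6]    # datagrams 5 and 6 never arrived
```

A real implementation would also need to handle counter wraparound and out-of-order arrival, which is part of the "much more detailed components" the post warns about.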

Ultimately it's your decision what to use; I'm just giving you my input from experience.


Thanks for the advice. I think I've got a clearer picture of what I'm going to implement now, though I'm still waffling about which transport protocol to use... I hadn't thought about disabling the Nagle algorithm to get more performance out of TCP.