Prez wrote on Nov 28, 2012, 19:58:
Help me out here - I think I understand the basics of 'ping' but I may be oversimplifying it. A ping of 250 is actually 250 milliseconds, or one full quarter of a second. In other words, as I understand it, with that ping you would have to conceivably be aiming where your opponent will be in a quarter second from when you fire. Is that correct? A bit of a noob question I know but being a singleplayer gamer my whole life I have never really paid much attention to multiplayer idioms. Never really paid much attention to networking at all until only recently.

Sort of. I'm really rusty on this stuff but...

The ping is usually the round-trip time: how long it takes your machine to send a packet, the packet to reach the server, the server to process it, the reply packet to get back, and your machine to process it.

However, that does not mean your effective latency is only half the ping.

Remember that what you see on screen is already out of date by the time it took the data to reach you from the server. So by the time the server receives your response, the total gap is indeed the full ping time.
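To make that concrete, here's a toy Python sketch (all the numbers are made up for illustration) of why the gap between "what you saw" and "what the server acts on" is the full round trip, not half of it:

```python
# Toy round-trip breakdown; all values are illustrative, not measured.
downstream_ms = 125  # server snapshot -> your screen
upstream_ms = 125    # your input -> server

# What you see is already downstream_ms old, and your reaction takes
# another upstream_ms to reach the server, so the total gap between the
# state you saw and the state the server acts on is the full ping.
total_gap_ms = downstream_ms + upstream_ms
print(total_gap_ms)  # 250
```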

This is based on Quake-style netcode, where everything is decided on the server. I'm not going to try to explain, or even pretend to understand, other approaches, because whenever I read anything about them I get annoyed - IMHO the server should be king and arbiter; any other way is stupid and unfair.

It also assumes there is no timenudge/prediction/hocus-pocus going on. Some games try to reduce the effect of latency by putting information on screen that is a guess about the near future. When the guess is accurate, it does effectively negate the latency and you can aim exactly where you want to hit. But when it is wrong, it can double or treble your effective latency. Say your one-way latency is 100 ms: the prediction showed the player 100 ms worth of movement to the left of where he actually was when the packet was sent, but by the time you receive it he has actually moved 100 ms worth to the right, so the error is 200 ms of movement. And by the time that is worked out, the client is already making another prediction a further 100 ms of movement to the right, so the total on-screen adjustment is 300 ms worth.
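That worst case works out like this - a toy 1D calculation in Python (positions and speed are hypothetical, just to show the arithmetic):

```python
# Worst case for client-side extrapolation, 1D toy example.
latency_ms = 100.0  # one-way delay, server -> client
speed = 1.0         # player speed, units per ms (hypothetical)

# The client extrapolates, assuming the player keeps moving LEFT
# for latency_ms after the snapshot was taken.
predicted = -speed * latency_ms  # shown 100 units to the left

# But he actually reversed and moved RIGHT during that time.
actual = +speed * latency_ms     # really 100 units to the right

error = actual - predicted       # 200 units = 200 ms worth of movement
print(error)  # 200.0

# By the time the correction arrives, the client is already
# extrapolating another latency_ms to the right, so the total
# on-screen swing is three latencies' worth of movement.
total_swing = error + speed * latency_ms
print(total_swing)  # 300.0
```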

Again, I favour the Quake approach here, because the default timenudge was 0 but players could set it to their preference. It's more common for games to force some prediction on you without even telling you they're doing it.

The average casual gamer wants to shoot where he wants to hit, and therefore wants prediction. The average "skilled" player wants to control leading the target and so wants minimum error, i.e. no prediction (well, okay, maybe a small amount of timenudge if he has a highish ping; a small amount of timenudge should be quite accurate, but I would think the errors scale up dramatically with the level of timenudge applied, probably exponentially rather than linearly).

This all gets a lot more complicated once you factor in the server calculating the gamestate at fixed intervals, the time it takes to do so, and the latency in your CPU, graphics card, mouse, etc.
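To give a feel for how those extra pieces stack up, here's a rough latency budget in Python. Every number here is hypothetical (a 20 Hz server tick is assumed, so you wait half a 50 ms tick on average); the point is just that the network is only part of the total:

```python
# Rough end-to-end latency budget; all values are hypothetical.
components_ms = {
    "input device / USB polling": 5,
    "client frame + render": 16,
    "upstream network": 125,
    "server tick wait (avg, 20 Hz)": 25,  # half a 50 ms tick on average
    "server processing": 2,
    "downstream network": 125,
    "display": 10,
}

total_ms = sum(components_ms.values())
print(total_ms)  # 308
```

So a "250 ping" connection can easily feel like 300+ ms once the whole pipeline is counted.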
