Some thoughts on Jitter in BZ
Posted: Wed Jan 17, 2007 10:10 pm
I hate to be the naysayer here, but we really need some straight talk about network jitter that isn't based on anecdote and player frustration. Several other threads and player comments identify jitter as the ultimate bane of BZ's existence. So let me throw out a couple of thoughts that are both factual and politically incorrect.
We need to look at latency (lag) and variance (jitter) to understand that what we call jitter is not the problem we think it is. Is it a challenge at times? Yes. Can it be a problem? Yes. BUT... Do we usually exaggerate jitter's impact on the game? Absolutely.
Here's an example... let's say that my numbers are 100ms lag, and +/- 20ms. And yours are 160ms lag, +/- 2ms. That means my WORST case lag is 120, and your BEST case lag is 158.
The truth is that the position report of a moving tank that arrives 120ms after the tank was actually in that position is always more accurate than the report that arrives 158ms after the tank was in a position. ALWAYS, period. Just because one of my reports took a little longer in transit than my previous one, it still arrived with more current data than yours did.
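The numbers above can be sketched in a few lines (the figures are just the hypothetical ones from the example):

```python
# Compare the example links above: even the worst-case arrival on the
# low-lag, high-jitter link beats the best case on the steadier link.
def arrival_window(lag_ms, jitter_ms):
    """Return (freshest, stalest) possible data age in ms for a link."""
    return lag_ms - jitter_ms, lag_ms + jitter_ms

mine = arrival_window(100, 20)   # 100 ms lag, +/- 20 ms jitter -> (80, 120)
yours = arrival_window(160, 2)   # 160 ms lag, +/- 2 ms jitter  -> (158, 162)

# My stalest possible report is still fresher than your freshest one.
print(mine[1] < yours[0])  # True: 120 ms < 158 ms
```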
I work extensively with real-time network applications, chiefly VOIP and video conferencing. In these applications, I may structure buffers and application settings based on average latency because packet interval may be very important to me. If variance creeps up in my applications, I have to discard out-of-spec packets, and if it is chronic, then reset my application buffers on the fly. But gaming does not require the same tolerances. In our world, jitter that is 10, 20, or even 30% of lag is almost negligible in its effect on playability.
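To make the "discard out-of-spec packets" idea concrete, here is a toy sketch of that logic (my own illustration, not any particular VOIP stack's code; the interval and tolerance values are assumptions):

```python
# Toy jitter filter: voice frames are expected at a fixed interval, and
# packets whose inter-arrival gap strays too far from it get discarded.
EXPECTED_INTERVAL_MS = 20   # e.g. one voice frame every 20 ms (assumed)
TOLERANCE_MS = 10           # discard packets more than 10 ms off (assumed)

def filter_out_of_spec(arrivals_ms):
    """Keep packets whose gap from the previous arrival is within spec."""
    kept = [arrivals_ms[0]]
    for prev, t in zip(arrivals_ms, arrivals_ms[1:]):
        if abs((t - prev) - EXPECTED_INTERVAL_MS) <= TOLERANCE_MS:
            kept.append(t)
    return kept

# The packet at 75 ms arrived 35 ms after its predecessor (15 ms out of
# spec), so it is dropped; the rest are within tolerance.
print(filter_out_of_spec([0, 21, 40, 75, 95]))  # [0, 21, 40, 95]
```

A game client, by contrast, can simply use the freshest position it has; it has no fixed playback cadence to protect, which is why these tolerances don't carry over.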
OK, so, I know what you're going to say. That jitter still requires you to adjust your lead when shooting at moving targets.
Again, let's go to the whiteboard for a fact-based analysis:
Let's assume that I am shooting at a moving tank. Our combined lag is 100ms (his half is 50ms), and his jitter is +/- 25ms, a full 50% of his lag. At 50%, that makes this a very extreme illustration. Assume that he is 2 seconds of bullet flight time away from me.
What this means is that I must apply the following lead:
To compensate for his travel speed: 2,000 ms
To compensate for my lag: 50 ms
To compensate for his lag: 50 ms
To compensate for his jitter: 25 ms
That means that the jitter portion of my lead (25 ms) is barely more than one percent (about 1/85th) of my total lead of 2,125 ms.
Even if I am shooting a laser, and say the travel lead is only 250 ms... the jitter portion of my total lead is 25/375, or 1/15th of my total lead. And that is assuming that the other player's jitter is a whopping 50% of his total lag, which is almost unheard of.
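The whiteboard arithmetic above, written out (all figures are the hypothetical ones from the example):

```python
# Lead components from the example above, all in milliseconds.
travel = 2000        # bullet flight time to the target
my_lag = 50          # my half of the combined 100 ms lag
his_lag = 50         # his half
his_jitter = 25      # a (deliberately extreme) 50% of his lag

total_lead = travel + my_lag + his_lag + his_jitter   # 2125 ms
jitter_share = his_jitter / total_lead                # ~0.012, about 1/85th

# Laser case: far shorter flight time, jitter is still a small slice.
laser_total = 250 + my_lag + his_lag + his_jitter     # 375 ms
laser_share = his_jitter / laser_total                # ~0.067, i.e. 1/15th
```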
My point is that we blame jitter for all our shooting woes, when in fact jitter's effect on lead is typically smaller than the pixel granularity of our screens and pointing devices. The kill-zone of a tank simply doesn't move that far on the screen in 20ms--or 1/50th of a second.
And let's "get real"... if your trigger-finger timing on the mouse or keyboard is consistent to within typical jitter tolerances of 10 or 20 ms (one or two hundredths of a second), then you have better synaptic coordination than the likes of Johan Santana or Barry Bonds and possess world-class athletic ability. Download a stopwatch application that starts and stops with mouse clicks, start it, and see how consistently you can stop the stopwatch within .02 seconds of an even second. Not .2, but .02.
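If you don't want to hunt down a stopwatch app, here's a terminal sketch of the same test (my own toy script, not a specific application):

```python
# Reaction-consistency test: press Enter as close to each whole second
# as you can, then see how far off each press landed.
import time

def timing_error_ms(target_s, actual_s):
    """How far a press landed from its target instant, in milliseconds."""
    return abs(actual_s - target_s) * 1000.0

def reaction_test(rounds=5):
    """Interactive: try to hit each whole second on the nose."""
    start = time.monotonic()
    errors = []
    for i in range(1, rounds + 1):
        input(f"Press Enter at t = {i}.00 s ... ")
        errors.append(timing_error_ms(i, time.monotonic() - start))
        print(f"  off by {errors[-1]:.0f} ms")
    print(f"mean error: {sum(errors) / len(errors):.0f} ms")

# Call reaction_test() from a terminal and see how often you land
# within 20 ms of a whole second.
```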
I won't deny that jitter plays a role in the quality of gaming experience. But until jitter creeps up to be a large fraction of lag, then it is a tiny fraction of the overall challenge. Packet loss has a brutal effect on playability, and processing latency on both clients and the server also play a huge role that is difficult to measure or understand. And these sources of error far overshadow network jitter. When you see a tank jumping around and "jittering" on screen, you can be almost assured that packet loss or processing lag are to blame and that network jitter has nothing to do with it in all but the rarest cases.
Interestingly, the numbers work out about the same across many time-sensitive applications. In VOIP and video conferencing, total "spoken-to-heard" latency under about 200 ms is still workable, and latencies over about 200 ms start to decay the communications experience, becoming unworkable between 250 and 300 ms. Sound familiar? The same numbers seem to hold true for BZ.
BTW, for you technical enthusiasts/geeks out there... Jitter is actually a misused term. It used to mean bit-to-bit framing errors, like on a T1 line used by telephone companies. Jitter was the term used to describe timing errors WITHIN a stream of data, whereby the voltage rise/drop that designated either a one or a zero arrived just early or just late enough to be misinterpreted. Only recently did the VOIP community glom onto the word jitter and re-purpose it to mean the variance between packet arrival times in fixed-packet-interval applications.