Some thoughts on Jitter in BZ

Help with Setup, Hardware, Performance or other Issues...Or just pimp your rig.
Sky King
Private First Class
Posts: 166
Joined: Mon Jun 05, 2006 8:07 pm
Location: Twin Cities, Minnesota, USA

Some thoughts on Jitter in BZ

Post by Sky King »

I hate to be the naysayer here, but we really need some straight talk about network jitter that isn't based on anecdote and player frustration. A few other threads, and player comments, often identify jitter as the ultimate bane of BZ's existence. So let me throw out a couple of thoughts that are factual, even if they're politically incorrect.

We need to look at latency (lag) and variance (jitter) to understand that what we call jitter is not the problem we think it is. Is it a challenge at times? Yes. Can it be a problem? Yes. BUT... Do we usually exaggerate jitter's impact on the game? Absolutely.

Here's an example: let's say my numbers are 100 ms lag with +/- 20 ms of jitter, and yours are 160 ms lag with +/- 2 ms. That means my WORST-case lag is 120 ms, and your BEST-case lag is 158 ms.

The truth is that the position report of a moving tank that arrives 120 ms after the tank was actually in that position is always more accurate than the report that arrives 158 ms after the tank was in a position. ALWAYS, period. Even when one of my reports takes a little longer in transit than the previous one, it still arrives with more current data than yours does.
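If you prefer code to prose, here is that comparison spelled out in a tiny C++ snippet; it uses nothing but the example numbers above.

[code]
#include <iostream>

int main() {
    // Example figures from above: average lag +/- jitter, in milliseconds.
    const int myLag = 100, myJitter = 20;     // my connection
    const int yourLag = 160, yourJitter = 2;  // your connection

    int myWorstCase  = myLag + myJitter;      // 120 ms: the oldest my data ever gets
    int yourBestCase = yourLag - yourJitter;  // 158 ms: the freshest your data ever gets

    std::cout << "My worst-case report age:  " << myWorstCase  << " ms\n";
    std::cout << "Your best-case report age: " << yourBestCase << " ms\n";
    // Even my slowest report is still newer than your fastest one.
    return 0;
}
[/code]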

I work extensively with real-time network applications, chiefly VOIP and video conferencing. In these applications, I may structure buffers and application settings based on average latency because packet interval may be very important to me. If variance creeps up in my applications, I have to discard out-of-spec packets, and if it is chronic, then reset my application buffers on the fly. But gaming does not require the same tolerances. In our world, jitter that is 10, 20, or even 30% of lag is almost negligible in its effect on playability.
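For the curious, here is a rough sketch of the kind of out-of-spec discard I'm talking about. The packet interval and tolerance are made-up illustration numbers, not settings from any real VOIP product.

[code]
#include <cstdint>
#include <cstdio>

// Hypothetical playout check for a fixed-interval stream (say, one packet
// every 20 ms). A packet whose arrival deviates from its expected slot by
// more than toleranceMs is discarded instead of queued. The interval and
// tolerance are illustration numbers only.
struct PlayoutBuffer {
    int64_t streamStartMs    = 0;   // arrival time of the first packet
    int     packetIntervalMs = 20;
    int     toleranceMs      = 40;

    bool accept(int sequence, int64_t arrivalMs) const {
        int64_t expected  = streamStartMs + int64_t(sequence) * packetIntervalMs;
        int64_t deviation = arrivalMs - expected;
        if (deviation < 0) deviation = -deviation;
        return deviation <= toleranceMs;   // out-of-spec packets get dropped
    }
};

int main() {
    PlayoutBuffer buf;
    buf.streamStartMs = 1000;
    // Packet 3 is expected at 1060 ms; pretend it shows up at 1130 ms.
    std::printf("late packet is %s\n", buf.accept(3, 1130) ? "kept" : "discarded");
    return 0;
}
[/code]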

OK, so I know what you're going to say: that jitter still requires you to adjust your lead when shooting at moving targets.

Again, let's go to the whiteboard for a fact-based analysis:

Let's assume that I am shooting at a moving tank. Our combined lag is 100 ms (his half is 50 ms), and his jitter is +/- 25 ms, a full 50% of his lag. At 50%, this makes the illustration a very extreme example. Assume that he is 2 seconds of bullet flight time away from me.

What this means is that I must apply the following lead:
To compensate for the bullet's flight time (where he'll be in 2 seconds): 2,000 ms
To compensate for my lag: 50 ms
To compensate for his lag: 50 ms
To compensate for his jitter: 25 ms

That means that the jitter portion of my leading (25 ms) is roughly 1/85th, barely over 1%, of my total lead (2,125 ms).

Even if I am shooting a laser, and say the travel lead is only 250 ms... the jitter portion of my total lead is 25/375, or 1/15th of my total lead. And that is assuming that the other player's jitter is a whopping 50% of his total lag, which is almost unheard of.
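If you want to check the arithmetic yourself, here is the same calculation for both cases in a few lines of C++, using the same example numbers as above.

[code]
#include <cstdio>

int main() {
    const double myLag = 50, hisLag = 50, hisJitter = 25;   // ms, the example values

    const double flightsMs[] = {2000.0, 250.0};             // normal shot vs. the laser case
    for (double flightMs : flightsMs) {
        double totalLead = flightMs + myLag + hisLag + hisJitter;
        std::printf("flight %4.0f ms: total lead %4.0f ms, jitter is 1/%.0f of it\n",
                    flightMs, totalLead, totalLead / hisJitter);
    }
    // Prints roughly 1/85 for the 2-second shot and 1/15 for the laser case.
    return 0;
}
[/code]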

My point is that we blame jitter for all our shooting woes, when in fact jitter's effect on lead is typically smaller than the pixel granularity of our screens and pointing devices. The kill-zone of a tank simply doesn't move that far on the screen in 20ms--or 1/50th of a second.
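To put a rough number on that, assuming the stock tank speed of about 25 world units per second (your server's settings may differ):

[code]
#include <cstdio>

int main() {
    // Assumed stock tank speed of 25 world units per second; servers can change it.
    const double tankSpeedUnitsPerSec = 25.0;
    const double jitterSec            = 0.020;   // 20 ms of timing error

    double driftUnits = tankSpeedUnitsPerSec * jitterSec;
    std::printf("In 20 ms, a tank moving %.0f units/s covers %.1f world units.\n",
                tankSpeedUnitsPerSec, driftUnits);
    // Half a world unit is a small slice of a tank; at long range that works
    // out to roughly a pixel or two on a typical screen.
    return 0;
}
[/code]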

And let's "get real": if your trigger-finger timing on the mouse or keyboard is consistent to within typical jitter tolerances of 10 or 20 ms (that's one or two one-hundredths of a second), then you have better synaptic coordination than the likes of Johan Santana or Barry Bonds and possess world-class athletic ability. Download a stopwatch application that starts and stops with mouse clicks, start it, and see how consistently you can get the stopwatch to stop within .02 seconds of an even second. Not .2, but .02.
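If you don't feel like hunting down a stopwatch app, here is a crude console version of the same test: press Enter to start the clock, then try to hit Enter again exactly on each whole second.

[code]
#include <chrono>
#include <cstdio>
#include <iostream>
#include <string>

// Crude reflex-consistency test: the clock starts on your first Enter press;
// after that, try to press Enter exactly on each whole second and see how
// many milliseconds you miss by.
int main() {
    using clock = std::chrono::steady_clock;
    std::string line;

    std::puts("Press Enter to start, then press Enter as close to each whole second as you can.");
    std::getline(std::cin, line);
    auto start = clock::now();

    for (int attempt = 1; attempt <= 5; ++attempt) {
        std::getline(std::cin, line);
        long long elapsedMs = std::chrono::duration_cast<std::chrono::milliseconds>(
                                  clock::now() - start).count();
        long long nearestSecMs = ((elapsedMs + 500) / 1000) * 1000;
        std::printf("attempt %d: off by %lld ms\n", attempt, elapsedMs - nearestSecMs);
    }
    // Staying inside +/- 20 ms press after press is much harder than it sounds.
    return 0;
}
[/code]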

I won't deny that jitter plays a role in the quality of the gaming experience. But until jitter creeps up to be a large fraction of lag, it is a tiny fraction of the overall challenge. Packet loss has a brutal effect on playability, and processing latency on both clients and the server also plays a huge role that is difficult to measure or understand. These sources of error far overshadow network jitter. When you see a tank jumping around and "jittering" on screen, you can be almost assured that packet loss or processing lag is to blame, and that network jitter has nothing to do with it in all but the rarest cases.

Interestingly, the numbers work out about the same across many time-sensitive applications. In VOIP and video conferencing, total "spoken-to-heard" latency under about 200 ms is still workable, and latencies over about 200 ms start to decay the communications experience, becoming unworkable between 250 and 300 ms. Sound familiar? The same numbers seem to hold true for BZ.

BTW, for you technical enthusiasts/geeks out there... jitter is actually a misused term. It used to mean bit-to-bit framing errors, like on a T1 line used by telephone companies. Jitter was the term used to describe timing errors WITHIN a stream of data, whereby the voltage rise/drop that designated either a one or a zero arrived just early or just late enough to be misinterpreted. Only recently did the VOIP community glom onto the word jitter and re-purpose it to mean variance between packet arrival times in fixed-packet-interval applications.
Retired Army--Proud to have served
Armored Cavalry Crewman, 1981-1984 (M60A5)
Infantry Officer & Paratrooper, 1984-1986
US Army Ranger & Sniper, 1986-1989 (LRSD)

Water Cooled 8-Core Ryzen 7 2700x @ 3.7GHz | Radeon RX590 GPU | 43" 4K Monitor
anomaly
Private First Class
Posts: 220
Joined: Tue Jul 26, 2005 10:32 pm
Location: Gainesville Florida

Post by anomaly »

I have worked with carrier systems (DS1, DS3, OC3 to OC48, etc.) for about 15 years, including VOIP, ATM, Ethernet, and packet networks. Jitter has always been the difference between the actual arrival times of pulses, frames, packets, etc. and a fixed time interval. I'm sure jitter can mean a lot of things for different applications. For instance, DS1 (T1) pulses occupy fixed 324 ns time slots. With the appropriate test equipment, jitter shows up as the difference between pulse arrival times: the edges of the pulses can be seen jumping back and forth, like on an oscilloscope. Too much movement and the pulses can be misinterpreted by the differential receivers. Jitter is rarely an issue on stratum-timed network equipment; stratum 2e clocks referencing GPS and stratum 1 sources are commonly used.
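In packet terms the measurement is the same idea: compare each arrival against its nominal slot. Here is a rough sketch; both the 20 ms interval and the arrival times are made-up examples.

[code]
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

// Jitter as the deviation of actual arrival times from a fixed nominal
// interval. The interval and the arrival times are made-up examples.
int main() {
    const double intervalMs = 20.0;
    const std::vector<double> arrivalsMs = {0.0, 21.5, 39.0, 62.3, 80.1};

    double worst = 0.0, sum = 0.0;
    for (int i = 0; i < static_cast<int>(arrivalsMs.size()); ++i) {
        double deviation = std::fabs(arrivalsMs[i] - i * intervalMs);
        worst = std::max(worst, deviation);
        sum  += deviation;
    }
    std::printf("mean deviation %.2f ms, worst %.2f ms\n",
                sum / arrivalsMs.size(), worst);
    return 0;
}
[/code]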

The jitter in BZFlag can affect the dead-reckoning mechanism and make players jump around on the screen. Packet loss and packet reordering will cause this as well. As with any real-time application on an IP network, packets or frames that arrive late or out of order cause problems; VOIP has a real issue with dropped and out-of-order packets, as does BZFlag. Packets don't always take the same path through the network: load-sharing algorithms may send packets along different paths, and some paths may be longer or shorter than others. BZFlag locally computes (predicts) the players' positions, only changing a player's position when necessary. Player updates are sent at least once per second, sometimes more often depending on what has changed. The jumping around, I think, is usually related to the difference between the predicted position and the new computed position.
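Here is a toy one-dimensional version of that prediction idea, just to show the concept (it is not BZFlag's actual code): the client extrapolates from the last report, and when the next real report disagrees with the prediction, the tank visibly snaps.

[code]
#include <cmath>
#include <cstdio>

// Toy one-dimensional dead reckoning: extrapolate from the last update,
// then compare against the next real update. Concept only, not BZFlag's
// actual implementation.
struct Update {
    double timeSec;    // when the report was generated
    double position;   // reported position
    double velocity;   // reported velocity
};

double predict(const Update& last, double nowSec) {
    return last.position + last.velocity * (nowSec - last.timeSec);
}

int main() {
    Update last = {0.0, 10.0, 25.0};   // t = 0 s, x = 10, moving at 25 units/s
    Update next = {1.0, 33.0, 25.0};   // the next real report, arriving about 1 s later

    double predicted = predict(last, next.timeSec);           // where the tank was drawn
    double error     = std::fabs(next.position - predicted);  // size of the visible snap
    std::printf("predicted %.1f, reported %.1f, snap of %.1f units\n",
                predicted, next.position, error);
    // Late or out-of-order updates make that snap bigger, which is the
    // on-screen jumping described above.
    return 0;
}
[/code]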

Also, one thing I have noticed is that players are rendered, positions computed, etc., in order from player 0 (zero) to player 'n'. If there are a lot of players and your graphics processor or CPU is heavily loaded (low FPS), then a player's 'id' plays a role in when that player is updated: player zero is updated first, and player 20 is the 21st to be updated. Player 20 has probably moved in those few milliseconds. Normally that would not be an issue, but lag, jitter, and player id accumulate to impact playability.
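A rough illustration of that ordering effect, with a made-up frame time and player count:

[code]
#include <cstdio>

// Rough illustration: if a slow frame takes 50 ms (20 FPS) and per-player
// work is spread evenly across it, a player's id decides how stale that
// player's update is within the frame. All numbers are made up.
int main() {
    const double frameMs     = 50.0;   // 20 FPS
    const int    playerCount = 40;
    const double perPlayerMs = frameMs / playerCount;

    const int ids[] = {0, 20, 39};
    for (int id : ids) {
        std::printf("player %2d is processed about %4.1f ms into the frame\n",
                    id, id * perPlayerMs);
    }
    // A tank moving 25 units/s covers about 0.6 world units in the ~25 ms
    // between player 0 and player 20 being handled.
    return 0;
}
[/code]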

I think all this just adds to the gameplay, though. We have to learn to predict a player's ever-changing position mentally. It's a challenge, and fun! :D