Subject: [Wireshark-users] Comparing RTP stream Jitter and Max Delta

From: Jaakko Hakalahti <e0201091@xxxxxx>
Date: Fri, 04 May 2007 18:46:59 +0300

Hello,
I have been struggling to understand the connection between the Max Delta (ms) and the Max Jitter (ms) columns in the RTP stream analysis. If I use the G.711 codec (PCM) for a VoIP call, one packet should be sent every 20 milliseconds, so under ideal conditions the Max Delta (ms) value should be pretty close to that. In that case the jitter value should also be close to zero.

I ran some tests with heavy traffic on the same network as the VoIP call, and I got some wild Max Delta values, up to 360 milliseconds. I expected the jitter value to follow the Max Delta and climb just as high, but that is not the case. The highest jitter Wireshark shows for that test is 42 milliseconds. I ran several other tests as well; for example, when I got the Max Delta up to 160 ms, the Max Jitter was only 16 ms.

Jitter is supposed to be the variation in the time between packets arriving at the receiver, right? Then why, when there is such a huge gap between packets (like that 360 ms), is the jitter value only 42 ms? How is it actually calculated?

I searched the web and found that interarrival jitter is defined as: "The interarrival jitter J is defined to be the mean deviation (smoothed absolute value) of the difference D in packet spacing at the receiver compared to the sender for a pair of packets." (RFC 3550)

Though that jitter refers to the "Interarrival Jitter" field carried in RTCP packets, not the jitter measured from the RTP packets themselves.
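
Reading further, RFC 3550 also gives a running formula: for packets i-1 and i, the spacing difference is D(i-1,i) = (R_i - R_(i-1)) - (S_i - S_(i-1)), and the jitter estimate is updated as J(i) = J(i-1) + (|D(i-1,i)| - J(i-1))/16. To check my understanding I tried a small Python sketch of that estimator (the function name update_jitter and the millisecond units are my own choice; the RFC itself computes in RTP timestamp units):

    # Sketch of the RFC 3550 interarrival jitter estimator, in
    # milliseconds for readability.

    def update_jitter(prev_jitter_ms, arrival_delta_ms, expected_delta_ms=20.0):
        # D is how far the actual packet spacing deviates from the spacing
        # implied by the RTP timestamps (20 ms for G.711 at 50 packets/s).
        d = abs(arrival_delta_ms - expected_delta_ms)
        # RFC 3550: J(i) = J(i-1) + (|D(i-1,i)| - J(i-1)) / 16
        return prev_jitter_ms + (d - prev_jitter_ms) / 16.0

    # Fifty perfectly paced packets, one 360 ms gap, then normal pacing again.
    deltas = [20.0] * 50 + [360.0] + [20.0] * 10

    jitter = 0.0
    max_jitter = 0.0
    for delta in deltas:
        jitter = update_jitter(jitter, delta)
        max_jitter = max(max_jitter, jitter)

    print("max jitter: %.1f ms" % max_jitter)   # ~21 ms, not 340 ms

With a single 360 ms gap in an otherwise perfect 20 ms stream, the estimate only climbs by 340/16, about 21 ms, before decaying again, so the 1/16 smoothing would explain why Max Jitter stays far below Max Delta. But I don't know whether this is what Wireshark's RTP stream analysis actually computes for RTP packets.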

Help me understand the connection between the Max Delta and the Max Jitter, because I don't really get it.
Thanks.