
UDP latency and loss
The UDP latency test measures the round-trip time of small UDP packets between the router and a target test server. Each packet consists of an 8-byte sequence number and an 8-byte timestamp. If a packet is not received back within two seconds of sending, it is treated as lost. The test records the number of packets sent each hour, their average round-trip time and the total number of packets lost. The test uses the 99th percentile when calculating the summarised minimum, maximum and average results on the router.
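As a concrete illustration, here is a minimal sketch of this probe format and loss rule in Python. The field order, byte order and timestamp units are assumptions for illustration; the production wire format is not specified here.

```python
import struct
import time

PACKET_FORMAT = "!QQ"  # assumed layout: 8-byte sequence number, 8-byte timestamp
LOSS_TIMEOUT_S = 2.0   # an echo not received within two seconds counts as lost

def build_probe(sequence: int) -> bytes:
    """Build a 16-byte probe carrying a sequence number and a send timestamp."""
    return struct.pack(PACKET_FORMAT, sequence, time.monotonic_ns())

def round_trip_ms(echoed_payload: bytes) -> float:
    """Round-trip time in milliseconds, recovered from the echoed timestamp."""
    _, sent_ns = struct.unpack(PACKET_FORMAT, echoed_payload)
    return (time.monotonic_ns() - sent_ns) / 1e6
```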
The test operates continuously in the background. It is configured to randomly distribute the sending of the echo requests over a fixed interval, reporting the summarised results once the interval has elapsed.
By default the test is configured to send a packet every 1.5 seconds, giving a maximum of 2,400 packets sent per hour. The actual number is usually lower, typically around 2,000 packets, because by default the test will not send packets while other tests are in progress or while cross-traffic is detected. A higher or lower sampling rate may be used if desired, with results reported at anywhere from a one-minute to a 24-hour aggregation level.
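A sketch of one way the randomised scheduling described above could work, assuming probe send times are drawn uniformly at random across the reporting interval (the test's actual distribution is not documented here):

```python
import random

def schedule_probes(interval_s: float = 3600.0, spacing_s: float = 1.5) -> list[float]:
    """Spread probe send times randomly across one reporting interval.

    The probe count follows the configured rate (one per spacing_s on
    average), but individual send times are randomised rather than evenly
    spaced, so probes do not synchronise with periodic network events.
    """
    count = int(interval_s / spacing_s)  # 2,400 probes for a one-hour interval
    return sorted(random.uniform(0.0, interval_s) for _ in range(count))
```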
The following key metrics are recorded by the test:
- Round-trip latency (mean)
- Round-trip latency (minimum)
- Round-trip latency (maximum)
- Round-trip latency (standard deviation)
- Round-trip packet loss
- Number of packets sent and received
- Hostname and IP address of the test server
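One plausible reading of the 99th-percentile rule above is that samples beyond the 99th percentile are discarded before the summary statistics are computed. A sketch under that assumption:

```python
import statistics

def summarise_rtts(rtts_ms: list[float], sent: int) -> dict:
    """Summarise an hour of round-trip samples, trimming above the 99th percentile.

    Assumes the 99th-percentile rule means samples above that point are
    excluded before the minimum, maximum, mean and stddev are computed.
    """
    cutoff = statistics.quantiles(rtts_ms, n=100)[98]  # 99th percentile
    kept = [r for r in rtts_ms if r <= cutoff]
    return {
        "mean_ms": statistics.fmean(kept),
        "min_ms": min(kept),
        "max_ms": max(kept),
        "stddev_ms": statistics.stdev(kept),
        "sent": sent,
        "received": len(rtts_ms),
        "loss": (sent - len(rtts_ms)) / sent,
    }
```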
Example
In our Spotlight article about the Metaverse we looked at our data to see whether current network infrastructure and technology could support a real-time virtual world with no cap on the number of concurrent users. The Metaverse would need networks with very low and consistent latency. This chart shows that latency is a much bigger issue for mobile networks, especially older 3G and 4G networks.

Latency on different mobile network generations
Contiguous latency and loss
This test is an optional extension to the UDP Latency/Loss test. It records instances when two or more consecutive packets are lost to the same test server. Alongside each event we record the timestamp, the number of packets lost and the duration of the event.
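A sketch of how such events could be extracted from a per-packet log, assuming a list of booleans ordered by sequence number and a fixed inter-packet gap (the names and data structure are illustrative):

```python
from dataclasses import dataclass

@dataclass
class LossEvent:
    start_time: float  # send timestamp of the first lost packet
    packets_lost: int
    duration_s: float

def find_loss_events(received: list[bool], send_times: list[float],
                     gap_s: float = 1.5) -> list[LossEvent]:
    """Record every run of two or more consecutively lost packets."""
    events: list[LossEvent] = []
    run_start = None
    for i, ok in enumerate(received):
        if not ok and run_start is None:
            run_start = i
        elif ok and run_start is not None:
            run = i - run_start
            if run >= 2:  # a single lost packet does not count as an event
                events.append(LossEvent(send_times[run_start], run, run * gap_s))
            run_start = None
    if run_start is not None and len(received) - run_start >= 2:
        run = len(received) - run_start
        events.append(LossEvent(send_times[run_start], run, run * gap_s))
    return events
```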
By executing the test against multiple diverse servers, a user can begin to observe server outages (when multiple probes see disconnection events to the same server simultaneously) and disconnections of the user's home connection (when a single probe loses connectivity to all servers simultaneously).
Typically, this test is accompanied by a significant increase in the sampling frequency of the UDP Latency/Loss client, to one packet every 1.5 seconds, or approximately 2,000 packets per hour given reasonable levels of cross-traffic. This provides a resolution of 2-4 seconds for disconnection events.
Latency and loss under load
The latency and packet loss under load test seeks to characterise the effect of ‘Bufferbloat’ on the internet connection. Latency under load may also be referred to as ‘working latency’. Broadly speaking, this test reports how latency is impacted when the internet connection is heavily utilised. It may be useful to compare this with the idle latency measurement results.
This test measures downstream latency and loss under load independently from upstream latency and loss under load.
In the downstream direction, UDP echo packets are sent at a pre-defined rate, usually configured as once every 100 milliseconds, whilst the TCP download speed test is running. In the upstream direction, UDP echo packets are sent at the same rate whilst the TCP upload speed test is running.
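A simplified sketch of the downstream case, with hypothetical names: UDP probes are sent every 100 ms while a caller-supplied load generator (standing in for the TCP speed test) saturates the link. A production client would send and receive asynchronously; this blocking version stalls briefly on each lost probe.

```python
import socket
import threading
import time

def probe_under_load(server: tuple[str, int], run_load,
                     duration_s: float = 10.0, interval_s: float = 0.1) -> list[float]:
    """Send UDP echo probes every 100 ms while a load generator runs.

    run_load is any callable that drives the bulk TCP transfer; it runs in
    a background thread for the duration of the probing.
    """
    threading.Thread(target=run_load, daemon=True).start()

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(2.0)  # echoes not received within two seconds count as lost
    rtts: list[float] = []
    deadline = time.monotonic() + duration_s
    seq = 0
    while time.monotonic() < deadline:
        sent = time.monotonic()
        sock.sendto(seq.to_bytes(8, "big"), server)
        try:
            sock.recvfrom(64)
            rtts.append((time.monotonic() - sent) * 1000.0)  # RTT in ms
        except socket.timeout:
            pass  # lost probe; loss rate = probes sent minus RTTs recorded
        seq += 1
        time.sleep(max(0.0, interval_s - (time.monotonic() - sent)))
    return rtts
```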
This test captures the following key metrics:
- Round-trip latency under downstream load (mean, minimum, maximum, stddev)
- Round-trip latency under upstream load (mean, minimum, maximum, stddev)
- Round-trip packet loss under downstream load
- Round-trip packet loss under upstream load
- Hostname and IP address of the test server
Example
We used data from our latency and loss under load test to show how a relatively unknown but common cause of high latency can badly affect video streaming, online gaming, and teleconferencing. Find out more about bufferbloat in our Spotlight article.

Chart showing latency under load vs connection speed
UDP jitter
This test uses a fixed-rate stream of UDP traffic running between the client and a test node. A bi-directional 64 kbps stream is used with the same characteristics (packet sizes, delays, bitrate) as the G.711 codec. The test is commonly used to simulate a synthetic VoIP call where a real SIP server is unavailable.
The client initiates the connection, thus overcoming NAT issues, and informs the server of the rate and characteristics with which it would like to receive the return stream.
The standard configuration uses 500 packets upstream and 500 packets downstream.
The client records the number of packets it sent and received (thus providing a loss rate), and the jitter observed for packets it received from the server. The server does the same, but with the reverse traffic flow, thus providing bi-directional loss and jitter.
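For reference, a G.711-style stream at the standard 20 ms packetisation (an assumed but typical value for this codec) works out as follows:

```python
# G.711-style stream: 64 kbit/s of payload at an assumed 20 ms packetisation
PAYLOAD_BYTES = 160          # 64,000 bit/s * 0.020 s / 8 = 160 bytes per packet
PACKET_INTERVAL_S = 0.020    # one packet every 20 ms (50 packets per second)
PACKETS_PER_DIRECTION = 500  # standard configuration: a 10-second stream each way

def send_schedule(start_s: float = 0.0) -> list[float]:
    """Nominal send times for one direction of the synthetic VoIP stream."""
    return [start_s + i * PACKET_INTERVAL_S for i in range(PACKETS_PER_DIRECTION)]
```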
Jitter is calculated using the PDV approach described in Section 4.2 of RFC 5481. The 99th percentile is recorded and used in all calculations when deriving the PDV.
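A sketch of the PDV calculation, following RFC 5481 Section 4.2: each packet's one-way delay is referenced to the minimum delay observed in the stream, and here the 99th percentile of the resulting distribution is taken as the reported value (one reading of the sentence above):

```python
import statistics

def pdv_99(one_way_delays_ms: list[float]) -> float:
    """Packet Delay Variation per RFC 5481 Section 4.2.

    Each packet's delay is referenced to the minimum delay observed in the
    stream; the 99th percentile of that distribution is reported.
    """
    d_min = min(one_way_delays_ms)
    pdv = [d - d_min for d in one_way_delays_ms]
    return statistics.quantiles(pdv, n=100)[98]  # 99th percentile
```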