How to Reduce Latency for ISPs and Subscribers


Hot take: Low latency is more important than throughput when it comes to giving subscribers a superior quality of experience (QoE). 🔥 At Preseem, we like to say that throughput matters, but end users feel latency. When advertising their services, however, most ISPs only market plan speeds to consumers without ever mentioning low network latency as a selling point.

On one hand, this makes sense—consumers are more likely to understand what 100 Mbps download speed means, whereas promising them latency under 30ms probably won’t resonate right away. However, there was a time when consumers didn’t know what “Mbps” meant either. Educating subscribers on the importance of low latency and its impact on their experience could be a good way for regional operators to separate themselves from the competition and grow their business.

What Causes High Latency

Before we look at the benefits of low latency, it’s important to understand what causes high latency in the first place. For network operators, packet buffering is the main culprit. More specifically, on networks that use first-in, first-out (FIFO) queues with large buffers, bandwidth-intensive traffic like large downloads and video streaming can fill those buffers during peak busy times, causing the bufferbloat that makes latency-sensitive applications such as online gaming and video calls suffer.

Excessively large buffers allow too many packets to sit in the queue waiting to be transmitted. When a link saturates, the queue fills, packets wait longer and are eventually dropped, and the result is high latency under load that makes the connection feel slow for the subscriber.

For example, some physical interfaces can have queues that are thousands of packets deep. As you can imagine, this usually translates into more “my internet is slow” calls for ISP support teams to troubleshoot.
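To put rough numbers on it (the figures below are illustrative assumptions, not measurements from any particular network), the extra delay a full FIFO buffer adds works out to roughly queue depth × packet size ÷ link rate:

```python
# Rough illustration: queueing delay added by a full FIFO buffer.
# All numbers are illustrative assumptions, not measurements.

queue_depth_packets = 1000      # e.g. an interface queue 1,000 packets deep
packet_size_bits = 1500 * 8     # a typical full-size Ethernet frame
link_rate_bps = 100e6           # a 100 Mbps bottleneck link

# Time for the whole backlog to drain = bits queued / link rate
queueing_delay_s = (queue_depth_packets * packet_size_bits) / link_rate_bps
print(f"Added delay when the buffer is full: {queueing_delay_s * 1000:.0f} ms")
# -> ~120 ms on top of the path's base latency
```

Even that modest example adds more than 100 ms of delay on its own, which is plenty to make gaming or video calls feel laggy.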


For a deep dive on bufferbloat, latency in networking, and queue management, check out the recap of our podcast discussion with latency expert Bjørn Ivar Teigen, or listen to the episode here.

How to Measure Latency

The first step in reducing latency is being able to measure it accurately. At Preseem, we do this by measuring TCP Round Trip Time (RTT) latency using actual subscriber traffic. Here’s how it works: When ISPs use Preseem, all customer traffic goes through one of our inline devices, typically deployed in the core of the network. Each packet has an IP address that’s mapped to a subscriber and relevant network elements (e.g. access points).

We measure how long it takes for a TCP packet to travel from the inline device all the way downstream: through the backhaul, access point, CPE, and in-home network to the end-user device (such as a phone, TV, or laptop). The TCP stack on that device sends back an acknowledgement packet, and when it reaches Preseem, the elapsed time gives the RTT (one latency sample). This repeats continuously, producing potentially thousands of latency samples per subscriber in each 10-second logging interval.
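As a minimal sketch of the general technique (passive RTT measurement that pairs a downstream data segment with the ACK that covers it), something like the following captures the idea; the function names and data structures here are illustrative, not Preseem’s actual implementation:

```python
from collections import defaultdict

# Minimal sketch of passive TCP RTT sampling at an inline device (illustrative).
# A real implementation must also handle retransmissions, SACK, delayed ACKs,
# sequence-number wraparound, and per-flow state expiry.

pending = defaultdict(dict)   # flow -> {expected ack number: send timestamp}
samples = defaultdict(list)   # subscriber_id -> list of RTT samples (seconds)

def on_downstream_segment(flow, seq, length, now):
    """Record when a data segment left the inline device toward the subscriber."""
    if length > 0:
        pending[flow][seq + length] = now

def on_upstream_ack(flow, ack, subscriber_id, now):
    """Match a returning ACK to the segment it acknowledges to produce one RTT sample."""
    sent_at = pending[flow].pop(ack, None)
    if sent_at is not None:
        samples[subscriber_id].append(now - sent_at)
```

Aggregating those samples per subscriber, access point, and backhaul link is what makes it possible to pinpoint where in the network the delay is being added.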

By measuring latency in this way, Preseem provides regional operators with the ability to see where and why network latency is occurring. For example, is it a congested AP, or is it an issue with an end-user’s in-home Wi-Fi? This knowledge makes it easier for support teams to understand the causes of high latency issues and resolve them. It also allows ISPs to proactively address those issues and achieve low network latency moving forward.


How to Reduce Latency in Your Network

It stands to reason that if packet buffering and oversized queues are the main cause of high latency, then keeping those queues short will improve latency for subscribers.

Luckily, this solution already exists. Active Queue Management (AQM) proactively drops packets before buffers fill up. This keeps queues short, meaning lower latency and a better experience for subscribers, even when multiple devices in the home are online at the same time.

Preseem uses the FQ-CoDel (Fair/Flow Queuing + Controlled Delay) AQM algorithm to separate traffic into per-flow queues, so that interactive flows (e.g. online gaming, VoIP) aren’t stuck waiting behind bulk flows (e.g. system updates). This ensures the internet “feels fast” for end users, even during peak usage times. Created by Eric Dumazet and Dave Taht, FQ-CoDel makes it possible to “reduce bottleneck delays by several orders of magnitude.” The result is fewer support calls and reduced churn for regional operators.

Graphic showing the difference between FIFO queues and AQM.

The Internet Engineering Task Force (IETF) calls FQ-CoDel “a powerful tool for fighting bufferbloat and reducing latency” thanks to its ability to isolate low-rate traffic, keep queue lengths short, and run on a wide range of hardware.
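As a toy illustration of the flow-queuing half of the idea (not the actual FQ-CoDel algorithm, which also applies CoDel’s sojourn-time-based dropping to every queue), packets are hashed into per-flow queues and newly active, sparse flows are served ahead of long-running bulk flows:

```python
from collections import deque, defaultdict

# Toy sketch of the flow-queuing idea behind FQ-CoDel (illustrative only).
# Real FQ-CoDel (RFC 8290) uses deficit round robin and also runs CoDel on
# each queue, dropping packets whose queueing delay stays above a small target.

NUM_BUCKETS = 1024

queues = defaultdict(deque)   # flow bucket -> queued packets
new_flows = deque()           # buckets that just became active (served first)
old_flows = deque()           # buckets with a long-running backlog
scheduled = set()             # buckets currently waiting in either service list

def enqueue(packet):
    bucket = hash(packet["flow"]) % NUM_BUCKETS
    queues[bucket].append(packet)
    if bucket not in scheduled:
        new_flows.append(bucket)       # sparse/interactive flows jump the line
        scheduled.add(bucket)

def dequeue():
    for service_list in (new_flows, old_flows):
        while service_list:
            bucket = service_list.popleft()
            if queues[bucket]:
                packet = queues[bucket].popleft()
                if queues[bucket]:
                    old_flows.append(bucket)   # still backlogged: bulk rotation
                else:
                    scheduled.discard(bucket)  # drained: treated as new next time
                return packet
            scheduled.discard(bucket)
    return None
```

The practical effect is that a short gaming or VoIP flow gets its packets out almost immediately, no matter how many bulk downloads are queued behind it. (On Linux, fq_codel is widely available and can typically be enabled per interface with the `tc qdisc` command.)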

Side note: Preseem’s Automatic AP Capacity Management feature can also help with AP congestion, packet buffering, and high latency issues. The ‘easy button’ for AP capacity management, this feature uses automatic bandwidth control to ensure a great experience for subscribers at all times. Find out more here.

How Subscribers Can Improve Internet Latency at Home

So far, we’ve looked at ways that ISPs can reduce network latency, but what about the end users themselves? Is there anything they can do within the home to improve internet latency? Spoiler alert: Yes, there is 🙂

If you’ve got online gamers in the house or you’re working remotely and doing a lot of video meetings, then internet latency should be an important priority. Laggy games, slow system updates, and dropped or jittery video calls are extremely frustrating, to say the least. Below are a few things to think about if you’re experiencing any of these issues.

Example of FIFO queuing compared to FQ-CoDel bulk and interactive flow queues

Optimize Your Router Placement

Conventional wisdom has it that a central location in the home is best for your router, so that signal strength is distributed evenly throughout the house. If you place your router next to an outer wall, by contrast, some of that signal is essentially absorbed by the wall. If your signal is poor and you have active users inside or outside the house, this will negatively impact the speeds that each user can achieve.

Also, make sure the router is elevated and in a “low-tech” area where other devices are unlikely to interfere. For example, kitchens are not ideal because their large metal appliances can block the signal from reaching other parts of the house. As a general rule of thumb, try to avoid having the signal pass through more than two walls to reach an access point. When in doubt, contact your internet service provider and ask for advice; many offer consultation services to help you figure out the ideal router placement.

Check Your Equipment

To help ensure high performance and low latency, it’s best to check you have a router that can handle what you need. Perhaps your router is a little older and needs a firmware update or should be replaced with a newer model. For gamers, there are routers designed specifically for gaming that you might want to consider. As well, some routers now come with smart queue management built right in.

If you’re using a router/modem combo provided by your ISP, check in with them to make sure you have the proper model based on your needs and internet usage. Bonus: if they’re using Preseem, they’ll be able to tell you whether any issues you’re experiencing are network-wide or isolated to your in-home equipment.

Run a Speed Test

End users can easily measure their network latency by running a speed test. CloudFlare is a recommended option, as it shows not only upload and download speeds, but also measurements for latency under load, jitter, and packet loss. It’ll even rate your current network quality for online gaming and video streaming.

Netflix’s fast.com is another good option that shows loaded latency as well as internet speed. Loaded latency (or latency under load) measures latency when the network is busy (e.g. during peak time or when multiple users in the home are online at the same time). A big jump in loaded latency compared to unloaded latency is an indicator of bufferbloat.

Side-by-side screenshots of speed tests showing latency under load.
CloudFlare (above left) and fast.com (right) are recommended options for home latency tests.
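For technically inclined readers, the same loaded-latency check can be approximated with a few lines of Python. This is a rough sketch only; the target host and download URL below are placeholders, and the browser-based tests above will give more reliable numbers:

```python
import socket, threading, time, urllib.request

# Rough, illustrative sketch of "latency under load": compare TCP connect times
# to a nearby host while an unrelated bulk download saturates the connection.
# TARGET and BULK_URL are placeholders; substitute a nearby server and a large file.

TARGET = ("example.com", 443)
BULK_URL = "https://example.com/large-file"

def connect_latency_ms():
    start = time.monotonic()
    with socket.create_connection(TARGET, timeout=5):
        pass
    return (time.monotonic() - start) * 1000

idle = [connect_latency_ms() for _ in range(5)]

# Start a bulk download in the background to load the link
loader = threading.Thread(
    target=lambda: urllib.request.urlopen(BULK_URL).read(), daemon=True)
loader.start()
time.sleep(2)  # give the download time to ramp up and fill buffers

loaded = [connect_latency_ms() for _ in range(5)]

print(f"idle latency   ~{min(idle):.0f} ms")
print(f"loaded latency ~{min(loaded):.0f} ms")
# A large jump under load is the classic signature of bufferbloat.
```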

Ask Your Provider About Latency

This is an often-overlooked tip when it comes to improving internet latency at home, but it’s an important one. Choosing an internet service provider that consistently delivers low latency matters more than download/upload speeds, because latency is directly related to how the internet feels for users.

A 100 Mbps plan may be more than most users will ever need. However, if your provider’s network has latency issues caused by large buffers, then the internet is still going to feel slow at heavy usage times. So, internet users: make sure you do your homework and choose an ISP with low latency! And ISPs: make sure to educate your audience on the benefits! It’s a true win-win situation.

For more insights on network latency, the impact of packet buffering, and the changing face of the internet, check out our webinar on the subject or read the recap blog here.
