What Is Latency? Definition, Impact & How to Reduce
Latency is a measurement of the time between a user's request and a server's response.
The value is measured in milliseconds, and in some cases, users don't notice it. But high latency scores can translate into irritated customers and low sales.
You can't eliminate latency. Devices and servers need time to connect and trade data. But you can keep your scores low.
What is latency?
A user sees a link on social media and taps it. The page loads while the user waits. Latency represents the time between the tap and the loaded page.
The average loading time for a mobile landing page is about 15 seconds, a figure experts agree companies need to improve.
After all, loading a page is just the first step towards a user’s actual goals, which could include:
- Logging in
- Asking a question
- Making a purchase
- Updating preferences
Each one of these tasks involves querying a server and waiting for a response. Latency measures wait time for all of these actions.
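At its simplest, latency is just the elapsed wall-clock time around one of these request/response round trips. As a rough sketch (the timed action and the 50-millisecond delay are illustrative stand-ins for a real server call), you could measure it in Python like this:

```python
import time

def measure_latency_ms(action):
    """Return the round-trip time of a request/response action in milliseconds."""
    start = time.perf_counter()
    action()  # e.g. log in, submit a question, complete a purchase
    return (time.perf_counter() - start) * 1000.0

# Simulate a server that takes roughly 50 ms to respond.
print(f"round trip: {measure_latency_ms(lambda: time.sleep(0.05)):.1f} ms")
```

The same wrapper works for any of the tasks above: pass in the operation, get back the wait time the user experienced.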
People often confuse latency with other networking terms, including:
- Bandwidth. How much data can pass through your network at a specified moment? Answer that question with bandwidth. Latency measures time, not data, so these are different items.
- Throughput. How much data can you transfer over a specific time period? Answer that question with throughput. Again, latency measures time, not data, so these terms aren't synonyms.
Latency can still influence your throughput: on a high-latency connection, less data completes its journey in any given time period. But these terms aren't interchangeable.
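To make the distinction concrete, here is a toy Python calculation (all of the numbers are illustrative): bandwidth is a capacity, throughput is data actually moved per unit of time, and latency is a delay.

```python
def throughput_mbps(megabits_transferred, seconds):
    """Throughput: how much data actually moved over a time period."""
    return megabits_transferred / seconds

link_bandwidth_mbps = 100.0   # bandwidth: the most the link *could* carry
round_trip_latency_ms = 80.0  # latency: how long one request/response takes

# 400 megabits transferred in 8 seconds -> 50 Mbps of achieved throughput,
# well under the 100 Mbps capacity, partly because of that 80 ms delay.
print(throughput_mbps(400.0, 8.0))
```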
How to reduce latency
Devices and servers need time to connect, verify, and trade information. Latency is a necessary evil as we allow these systems to do their work. But lowering latency scores is critical.
Reducing latency can deliver higher revenues. One analysis tied a 2-second increase in latency to a 4.3 percent decline in revenue per user. Faster speeds may also lead to happier customers and users.
These five factors can raise (or lower) your latency scores:
- Design: Loading pages with large images or resource-heavy assets can increase your latency. Adding in content from third-party vendors can worsen the problem. Agile, optimised pages load quickly with fewer delays.
- Distance: The farther users are from your servers, the longer they must wait. Move servers close to the edge of your network and deploy multiple versions to stay connected with far-away customers.
- Medium: Data travelling through copper cables moves more slowly than data travelling through fibre-optic lines. You can't replace the cables across your entire town or state, but you can push for legislative action.
- Routers: Every router that processes and alters a packet adds a small delay. Again, keeping servers close to your users means hopping through fewer routers and improving latency scores.
- Storage: Tables and other data resources may form the backbone of your website or app. Each time a device must read that information, delays occur.
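One common mitigation for the storage factor is caching: keep recently read records in memory so repeat reads skip the slow lookup. A minimal Python sketch, with a 50-millisecond sleep standing in for a slow table read (the record format is hypothetical):

```python
import functools
import time

@functools.lru_cache(maxsize=128)
def load_record(record_id):
    """Simulate a slow read from a backing table."""
    time.sleep(0.05)  # stand-in for the real storage delay
    return {"id": record_id}

load_record(1)  # first read pays the full storage delay
load_record(1)  # repeat read is answered from the in-memory cache
```

The first read of each record still pays the delay; every repeat read is served from memory, shaving that delay off the user's wait.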
Measure the impact of your changes to ensure you're on the right track. Tools built on the Internet Control Message Protocol (ICMP), such as ping and traceroute, can measure speed and latency. Find out how to use those resources on our blog.
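For example, ping sends ICMP echo requests and reports round-trip times. A small Python helper to build the right ping invocation per platform (the host name is a placeholder, and the count flag differs between Windows and Unix-like systems):

```python
import platform
import subprocess

def ping_command(host, count=4):
    """Build a ping invocation; ping measures latency via ICMP echo requests."""
    count_flag = "-n" if platform.system() == "Windows" else "-c"
    return ["ping", count_flag, str(count), host]

# Run it and read the reported round-trip times:
# result = subprocess.run(ping_command("example.com"), capture_output=True, text=True)
# print(result.stdout)
```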
Find Out How You Stack Up to New Industry Benchmarks for Mobile Page Speed. (February 2018). Google.
How to Improve Application Performance and Reduce Latency. (September 2012). CIO.