There are many factors that affect the speed of a web resource. One of them is network latency. Let’s take a closer look at what latency is, how it affects application performance, and how it can be reduced.
What is latency?
Broadly speaking, latency is any delay in the execution of an operation. There are different types of latency: network latency, audio latency, video latency during livestreams, storage latency, and so on.
Fundamentally, every type of latency stems from the same limitation: no signal can be transmitted instantaneously.
Most, but not all, latency types are measured in milliseconds. Latency between a CPU and an SSD, for example, is measured in microseconds.
This article will focus on network latency, hereinafter referred to as “latency”.
Network latency (response time) is the delay that occurs when information is transferred across the network from point A to point B.
Imagine a web application deployed in a data center in Paris. This application is accessed by a user from Rome. The browser sends a request to the server at 9:22:03.000 CET (UTC+1), and the server receives it at 9:22:03.174 CET. The delay in delivering this request is 174 ms.
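The delay can be computed directly from the two timestamps. A minimal sketch (the date itself is an arbitrary placeholder):

```python
from datetime import datetime, timedelta, timezone

# Timestamps from the example above; the date is illustrative.
cet = timezone(timedelta(hours=1))                              # CET (UTC+1)
sent = datetime(2024, 3, 1, 9, 22, 3, 0, tzinfo=cet)            # 9:22:03.000
received = datetime(2024, 3, 1, 9, 22, 3, 174000, tzinfo=cet)   # 9:22:03.174

# Dividing one timedelta by another yields a plain number.
latency_ms = (received - sent) / timedelta(milliseconds=1)
print(f"Latency: {latency_ms:.0f} ms")   # → Latency: 174 ms
```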
This is a somewhat simplified example. Note that data volume is not taken into account when measuring latency: transferring 1,000 MB of data obviously takes longer than transferring 1 KB. However, latency measures only the delay before data starts arriving, so at the same transfer rate the latency in both cases is the same.
The concept of network latency is mainly used when discussing interactions between user devices and a data center. The lower the latency, the faster users will get access to the application that is hosted in the data center.
It is impossible to transmit data with no delays since nothing can travel faster than the speed of light.
What does network latency depend on?
The main factor that affects latency is distance. The closer the information source is to users, the faster the data will be transferred.
For example, a request from Rome to Naples (a little less than 200 km) takes about 10 ms. And the same request sent under the same conditions from Rome to Miami (a little over 8,000 km) will take about 120 ms.
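As a sanity check, the speed of light in optical fiber (roughly 200,000 km/s, about two-thirds of the speed of light in a vacuum) sets a theoretical floor on one-way latency. Real routes are longer than straight lines and add router processing, which is why the observed figures above are several times higher than this floor. A minimal sketch:

```python
# Light in optical fiber travels at roughly 200,000 km/s,
# i.e. about 200 km per millisecond.
FIBER_SPEED_KM_PER_MS = 200.0

def min_one_way_latency_ms(distance_km: float) -> float:
    """Lower bound on one-way latency over a straight fiber path."""
    return distance_km / FIBER_SPEED_KM_PER_MS

print(f"Rome -> Naples (~200 km):  {min_one_way_latency_ms(200):.1f} ms minimum")
print(f"Rome -> Miami (~8,000 km): {min_one_way_latency_ms(8000):.1f} ms minimum")
```

Actual latency is always above this bound: cables follow roads and seabeds rather than great circles, and every router hop adds processing time.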
There are other factors that affect network latency.
Network quality. At speeds above 10 Gbps, copper cabling suffers from significant signal attenuation even over short distances, sometimes just a few meters. As interface speeds increase, fiber-optic cables are used instead.
Route. Data on the Internet usually crosses more than one network: information passes through several autonomous systems. At each transition from one autonomous system to another, routers process the data and forward it toward its destination, and that processing takes time. Therefore, the more networks and internet exchange points (IXs) there are on a packet's path, the longer the transfer takes.
Router performance. The faster the routers process data, the faster the information will reach its destination.
In some sources, the concept of network latency also includes the time the server needs to process a request and send a response. In that case, the server's configuration, capacity, and processing speed also affect latency. However, we will stick to the definition above, which covers only the time it takes the signal to reach its destination.
What is affected by network latency?
Latency affects other parameters of web resource performance, for example, the RTT and TTFB.
RTT (Round-Trip Time) is the time it takes for sent data to reach its destination, plus the time to confirm that the data has been received. Roughly speaking, this is the time it takes for data to travel back and forth.
TTFB (Time to First Byte) is the time from the moment the request is sent to the server until the first byte of information is received from it. Unlike the RTT, this indicator includes not only the time spent on delivering data but also the time the server takes to process it.
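The difference is easy to see in code. The sketch below (hypothetical helper names; a throwaway local server simulates server-side processing so the script runs offline) measures TTFB, which includes the simulated processing delay on top of the network round trip:

```python
import socket
import threading
import time

def serve_once(srv: socket.socket, delay: float) -> None:
    """Accept one connection, wait `delay` seconds (simulated server-side
    processing), then send a minimal HTTP response."""
    conn, _ = srv.accept()
    conn.recv(1024)                      # read the request
    time.sleep(delay)                    # processing time before the first byte
    conn.sendall(b"HTTP/1.0 200 OK\r\n\r\nhello")
    conn.close()

def measure_ttfb(host: str, port: int) -> float:
    """Seconds from sending a request until the first response byte arrives."""
    s = socket.create_connection((host, port))
    start = time.monotonic()
    s.sendall(b"GET / HTTP/1.0\r\nHost: localhost\r\n\r\n")
    s.recv(1)                            # blocks until the first byte
    ttfb = time.monotonic() - start
    s.close()
    return ttfb

# Bind the server socket before the client connects, so there is no race.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))               # port 0: let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]
threading.Thread(target=serve_once, args=(srv, 0.05), daemon=True).start()

print(f"TTFB: {measure_ttfb('127.0.0.1', port) * 1000:.1f} ms")
srv.close()
```

Here the measured TTFB is dominated by the 50 ms of simulated processing; against a real remote server it would also include the full network latency.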
These indicators, in turn, affect the perception of speed and the user experience as a whole. The faster a web resource works, the more actively users will use it. Conversely, a slow application can negatively affect your online business.
What is considered optimal latency and how to measure it?
The easiest way to determine your resource’s latency is by measuring other speed indicators, for example the RTT, which is the parameter closest to latency. In many cases it equals twice the latency (when the outbound travel time equals the return travel time).
It is very easy to measure using the ping command: open a command prompt and type ping followed by the resource’s IP address or domain name.
Let’s try to ping www.google.com as an example.
C:\Users\username>ping www.google.com

Pinging www.google.com [216.58.207.228] with 32 bytes of data:
Reply from 216.58.207.228: bytes=32 time=24ms TTL=16
Reply from 216.58.207.228: bytes=32 time=24ms TTL=16
Reply from 216.58.207.228: bytes=32 time=24ms TTL=16
Reply from 216.58.207.228: bytes=32 time=24ms TTL=16
The time parameter is the RTT. In our example, it is 24 ms.
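If you want to automate this, the standard English-language Windows ping output can be parsed with a short script. A sketch (it assumes the `time=NNms` field; very fast replies print `time<1ms` instead):

```python
import re
from statistics import mean

# Sample Windows-style ping output, as in the example above.
PING_OUTPUT = """\
Reply from 216.58.207.228: bytes=32 time=24ms TTL=16
Reply from 216.58.207.228: bytes=32 time=24ms TTL=16
Reply from 216.58.207.228: bytes=32 time=24ms TTL=16
Reply from 216.58.207.228: bytes=32 time=24ms TTL=16
"""

# Matches both "time=24ms" and "time<1ms".
rtts = [int(v) for v in re.findall(r"time[=<](\d+)ms", PING_OUTPUT)]
print(f"Average RTT: {mean(rtts):.0f} ms")   # → Average RTT: 24 ms
```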
The optimal RTT value depends on the specifics of your project. On average, most specialists consider less than 100 ms to be a good indicator.
| RTT value | Meaning |
|-----------|---------|
| <100 ms | Very good; no improvements required |
| 100–200 ms | Acceptable, but can be improved |
| >200 ms | Unsatisfactory; improvements are required |
How to reduce latency?
Here are some basic guidelines:
- Reduce the distance between the data origin and the users. Try to place servers as close to your clients as possible.
- Improve network connectivity. The more peering partners (networks you exchange traffic with) and route options you have, the better the routes you can build and the faster data will be transferred.
- Improve traffic balancing. Distributing large amounts of data over different routes will help reduce the network load. In that way, information will be transferred faster.
A CDN (Content Delivery Network), a network of interconnected servers that fetch content from the origin, cache it, and deliver it over the shortest route, helps with the first two points. A global network with good connectivity can significantly reduce latency.
However, keep in mind that latency is only one factor affecting users’ perception of application performance. In some cases, the latency is very low, but the website still loads slowly. This happens, for example, when the server is slow in processing requests.
As a rule, complex optimization is required to significantly speed up the application. You can find the main acceleration tips in the article “How to increase your web resource speed”.
Summary
- Latency is the time it takes to deliver data across the network from one point to another.
- The main factor it depends on is distance. It is also affected by the network quality and the route (number of networks and traffic exchange points).
- Latency affects other parameters of the web resource performance, such as RTT and TTFB. They, in turn, affect conversion rates and search engine rankings.
- The easiest way to determine the latency of a resource is to measure the RTT. This can be done using the ping command. An optimal RTT is less than 100 ms.
- The most effective way to reduce latency is to enable a CDN. A content delivery network reduces the distance between the client and the data origin and improves routing, so information is transferred faster.
Gcore CDN provides excellent data transfer speed. We deliver heavy files with minimal delays anywhere in the world.
We have a free plan. Test our network and see how your resource will speed up.
More about Gcore CDN
FAQs
What does it mean to reduce latency?
Reducing latency means minimizing the delay in the processing of data over a network connection. The lower the latency, the closer the connection approaches real-time access.

What is meant by latency?
Latency measures the delay in a packet's arrival at its destination, in time units such as milliseconds. Packet loss, by contrast, is the percentage of packets that never arrive: if 91 out of 100 packets arrive, packet loss is 9%.

What is latency and what should it be?
Latency is the time it takes for a packet of data to travel from its origin to its destination, measured in milliseconds. Just 50 milliseconds of latency, less than one-tenth of a second, can result in poor network and application performance.

Is there any way to reduce latency?
Restarting your router can refresh your internet connection and improve latency.

What is an example of latency?
Latency can be measured one way, for example the time it takes to send a request for a resource, or as the entire round trip from the browser's request to the moment the requested resource arrives at the browser.

Does latency mean slow?
Sometimes slow network performance is caused by issues on the user's side, not the server side. Consumers can purchase more bandwidth if slow performance is a consistent issue, although bandwidth alone does not guarantee website performance.

What is latency for dummies?
Latency is the time it takes for a data packet to travel from the sender to the receiver and back. High latency can bottleneck a network and reduce its performance. You can make web applications less latent by using a CDN and a private network backbone to transfer data.

What is another word for latency?
Synonyms include latent period, reaction time, and response time; latency is a type of time interval.

Why is high latency bad?
High latency decreases communication bandwidth and can be temporary or permanent, depending on the source of the delays. Latency is measured in milliseconds; in speed tests it is referred to as the ping rate. The lower the ping rate, the better the performance.

What causes latency?
The main cause of latency is distance. The longer the distance between the browser making the request and the server responding, the more time the requested data takes to travel there and back.
What causes internet latency?
In most situations, latency is caused by your network hardware, your remote server's location and connection, and the internet routers between your server and your device, whether that is a gaming console, smartphone, or tablet.

What are the two types of latency?
Latency can be measured one way or round trip, depending on the use case. One-way latency is the time for data packets to travel from a source to a destination. Round-trip latency also includes the time for the acknowledgement to return from the destination to the source.

How do you measure latency?
Record a start time when the request is sent and an end time when the response arrives, then subtract the start time from the end time. For example, if the start time was 10:00:00 AM and the end time was 10:00:01 AM, the latency would be one second. Repeat the process several times and average the results, since latency can vary.

Does latency mean speed?
Latency and speed are different things. Latency refers to how quickly your device can communicate, while speed measures how much data it can download or upload at a time. Bandwidth and speed are not synonyms either: bandwidth is your highest possible download rate, while speed is your actual download rate.

What is latency and why is it bad?
Latency is the more technical term for lag: response delays you experience while gaming. High latency causes lag and makes gaming far less enjoyable; low latency means smoother gameplay.

What has the best latency?
For the most part, DSL, cable, and fiber internet tend to have lower latency, while satellite internet tends to have higher latency.

What is latency on the internet?
Latency is a time delay: how long it takes data to travel between the sender and the receiver, or between a specific user action and the response. Network latency is a significant connectivity issue that can dramatically impact a user's internet experience.

Why do you need low latency?
Low latency is important because customers expect to interact with technology in real time. High latency and time delays can cause users to stop engaging with a platform and move to another application permanently.
What can affect latency?
Latency is affected by the type of connection, the distance between the user and the server, and the available bandwidth. The connection itself depends on the type of service you use to access the internet.

What are high latency issues?
Latency measures how long it takes a specific piece of data to reach your computer. High latency can offset the effectiveness of a high-bandwidth connection, because packets take longer to arrive and clog the network. Many people try to offset this problem by purchasing more bandwidth.

Why is my latency so high even though my internet is good?
Some reasons your ping might be high include: your router (where it is placed and whether its firmware is up to date), your computer (whether it is outdated, unoptimized for gaming, or needs cleaning), and full caches on your router or modem.
How do I fix sudden high latency?
- Try moving closer to your router.
- Close background programs and websites that are in use.
- Connect fewer devices to the Wi-Fi network.
- Reboot your router.
- When gaming, use local servers.
- Check with your internet service provider for any issues.
- Connect your device directly to your router.

Internet connection speed also matters: in addition to the quality of your ISP, your connection speed can affect your ping (latency). A higher connection speed lets you send and receive data faster, thus lowering your ping.
Does low latency mean faster?
Bandwidth and latency affect everything you do online. High bandwidth and low latency translate to the best speeds and fastest response times. Low bandwidth and high latency mean slow downloads, choppy streams, and delayed responses.
What is good latency for the internet?
A passable network should have latency of 200 ms or below, depending on the connection type and travel distance, and packet loss below 5% within a 10-minute timeframe.

What is the latency rate?
Latency rate is a network performance metric: the round-trip time for a packet of data to travel from a sending node to the nearest receiving server in each country and back. It is collected by Measurement Lab from a high number of tests performed across networks every day.

What are the benefits of low latency?
Low latency provides a stable connection that enables websites and apps to load quickly for users. A reliable connection is particularly important for cloud-hosted and mission-critical applications.

What is low latency?
Low latency describes a computer network optimized to process a very high volume of data messages with minimal delay. Such networks are designed to support operations that require near real-time access to rapidly changing data.
What is the best latency?
Low latency is ideal, as it means smoother gameplay. Generally, an acceptable latency (or ping) is around 40–60 milliseconds or lower, while over 100 ms usually means noticeable lag in gaming.

What affects latency the most?
Distance is usually the main cause of latency: in this case, the distance between your computer and the servers it requests information from.

What affects my latency?
No matter the network, three primary factors contribute significantly to latency: propagation delay, routing and switching, and queuing and buffering.

How much latency does Wi-Fi add?
Typically, today's home and mobile internet users experience 5 to 10 milliseconds of latency added by their home Wi-Fi and last-mile connection, depending on the particular technology.
What is the best latency for live? ›Less than 15 Seconds of Latency
That's why, when comparing streaming solutions, it's best to ask the sales representative how many seconds of latency their platform has. You want to aim for less than 15 seconds of latency for professional broadcasts.