
When it comes to running a fully functional website, there are certain dos and don'ts. Two of the biggest don'ts are certainly low bandwidth and high latency. But which one is more fatal and crippling for your site? Let's start by explaining what high latency and low bandwidth actually mean and how they affect your website's performance.

Bandwidth is just one element of what a person perceives as the speed of a network. People often mistake bandwidth for internet speed, mainly because internet service providers (ISPs) advertise a fast ‘50Mbps connection’ in their campaigns. True internet speed is actually the amount of data you receive every second, and that has a lot to do with latency too.
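To see what that advertised figure actually buys you, here is a quick back-of-the-envelope sketch (the numbers are purely illustrative): divide the megabits by eight to get megabytes per second, then divide a file's size by that rate to get a best-case download time that still ignores latency entirely.

```python
# Rough arithmetic only: an advertised "50 Mbps" plan moves at most
# 50 million bits per second, i.e. about 6.25 megabytes per second.
advertised_mbps = 50                             # the marketing figure (megabits/s)
bytes_per_second = advertised_mbps * 1_000_000 / 8

file_size_bytes = 25 * 1_000_000                 # a hypothetical 25 MB download
best_case_seconds = file_size_bytes / bytes_per_second

print(f"{bytes_per_second / 1e6:.2f} MB/s best case")   # 6.25 MB/s
print(f"{best_case_seconds:.1f} s to fetch 25 MB")       # ~4.0 s, latency not included
```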

Latency is another element that contributes to network speed. The term latency refers to any of several kinds of delays typically incurred in the processing of network data, the most obvious delay being the time it takes for a packet of data to go from a user’s computer to the website server they’re visiting and back (round-trip time, or RTT). A so-called low-latency connection is one that generally experiences small delay times, while a high-latency connection generally suffers from long delays. Latency is also referred to as ping rate and is typically measured in milliseconds (ms).
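If you want to see your own round-trip time, one rough approach (a minimal sketch, not a substitute for proper tooling) is to time a TCP handshake, which costs roughly one round trip; the host below is just a placeholder.

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int = 443) -> float:
    """Approximate RTT by timing a TCP handshake (roughly one round trip)."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        pass                                     # connect, then immediately close
    return (time.perf_counter() - start) * 1000

# Example: prints something like "~25 ms", depending entirely on your network.
print(f"~{tcp_rtt_ms('example.com'):.0f} ms round trip to example.com")
```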

Simply put:

Tweet this: Latency refers to the time it takes to initiate a communication

Tweet this: Bandwidth describes how fast you can get information across to the user

Excessive latency creates bottlenecks that prevent data from filling the network pipe, thus decreasing effective bandwidth. The impact of latency on network bandwidth can be temporary (lasting a few seconds) or persistent (constant), depending on the source of the delays. Think of latency in terms of a road: the longer the road, the longer it takes to travel.

Now substitute “higher” for “longer” and you have latency. The higher the latency, the bigger the impact on load times. From a pizza-delivery standpoint, high latency can have you impatiently tapping your toes, wondering when the pizza guy is going to arrive. If we stick with the road analogy, bandwidth is the width of the road: the wider the road, the more traffic can travel on it at once. Unlike latency, which we want to be low, high bandwidth is exactly what we want. Low bandwidth means clogged traffic and cold pizza.

Tweet this: Think of latency in terms of a road. The longer the road, the longer it takes to travel
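Returning to the bottleneck point above: one concrete way latency caps effective bandwidth is that a TCP sender can only keep about one receive window of data in flight per round trip, so throughput is bounded by window size divided by RTT no matter how wide the pipe is. A small sketch with illustrative figures:

```python
# Effective TCP throughput is bounded by window_size / RTT,
# regardless of the link's raw capacity (figures are illustrative).
window_bytes = 64 * 1024                  # a common default receive window: 64 KB

for rtt_ms in (10, 50, 150):              # low, medium and high latency paths
    cap_mbps = window_bytes * 8 / (rtt_ms / 1000) / 1_000_000
    print(f"RTT {rtt_ms:>3} ms -> at most {cap_mbps:5.1f} Mbps")

# RTT  10 ms -> ~52 Mbps; RTT 150 ms -> ~3.5 Mbps, even on a gigabit link.
```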

What Affects Latency?

The type of connection, the distance between the user and the server, and the amount of available bandwidth.

Connection - latency is impacted by the type of service you use to access the internet.

Think about load time and how much slower site browsing gets; that's the ultimate damage.
Let’s say you are browsing the web on different types of connections. Here’s how latency would affect your browsing:

  • Satellite Internet Connection (High Speed / Bandwidth, High Latency)

You would click a link on a web page and, after a noticeable delay, the web page would start downloading and show up almost all at once.

  • Theoretical Connection (Low Speed / Bandwidth, Low Latency)

You would click a link on a web page and the web page would start loading immediately. However, it would take a while to load completely and you would see images load one-by-one.

  • Cable Internet Connection (High Speed / Bandwidth, Low Latency)

You would click a link on a web page and the web page would appear almost immediately, downloading almost all at once.

Distance - the closer you are to the server, the faster information gets to you. This is typically improved by using Content Delivery Networks, or CDNs. CDNs place content on servers close to your users and are typically used by websites that receive large amounts of traffic.

Bandwidth - if you have little bandwidth available, you’re more likely to experience congestion, which means a slower internet experience.
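To make the distance point concrete, here is a rough lower-bound estimate (illustrative only, ignoring routing, queuing and processing delays): light in fibre covers roughly 200,000 km per second, so distance alone sets a floor on RTT that no amount of bandwidth can remove, and that floor is exactly what a CDN lowers by serving content from nearby.

```python
# Lower bound on round-trip time imposed by distance alone.
# Real RTTs are higher because of routing, queuing and processing delays.
SPEED_IN_FIBRE_KM_PER_S = 200_000         # roughly two-thirds the speed of light

def min_rtt_ms(distance_km: float) -> float:
    return 2 * distance_km / SPEED_IN_FIBRE_KM_PER_S * 1000   # out and back

print(f"{min_rtt_ms(9000):.0f} ms floor for a ~9,000 km trans-ocean origin")  # ~90 ms
print(f"{min_rtt_ms(200):.0f} ms floor for a ~200 km CDN edge")               # ~2 ms
```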

What Affects Bandwidth?

In order to diagnose a network problem, one should use a network monitoring technology. Some popular “flow-based” technologies are NetFlow and sFlow. Bandwidth issues can almost always be traced to one or two specific activities. These activities almost always have two characteristics: large amounts of data and extended duration. Common activities causing bandwidth problems are:

  1. Watching videos from the internet (YouTube, Netflix)
  2. Large file transfers between computers (greater than 100 megabytes in size)
  3. Constant streams of data (surveillance footage from security cameras)
  4. Downloading files from the internet

All of the above can contribute greatly to bandwidth issues in a network, and should be done only when there is light network traffic. Large file transfers or data streams within a network should be placed on a separate network, in order to avoid bottlenecking other users. Bandwidth is important when you have a lot of data to send/receive and it doesn't really need to be real-time, such as transferring large amounts of data to an off-site backup. (You don't really care in what order the data arrives or how quickly the other side can respond, you just need all the data to get there.)
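As a rough illustration of why these activities hurt (numbers are illustrative, not measurements): the time a bulk transfer occupies a link is simply its size divided by the available bandwidth, so one large copy can monopolise a slow link for minutes.

```python
# How long a bulk transfer occupies a link: size / available bandwidth.
def transfer_seconds(size_mb: float, link_mbps: float) -> float:
    return size_mb * 8 / link_mbps        # megabytes -> megabits, then divide by rate

for link_mbps in (10, 50, 100):
    print(f"100 MB over a {link_mbps:>3} Mbps link: "
          f"{transfer_seconds(100, link_mbps):5.1f} s")

# 100 MB over a 10 Mbps link takes ~80 s -- long enough to starve other users.
```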


QoS (Quality of Service) refers to a broad collection of networking technologies and techniques. The goal of QoS is to provide guarantees on the ability of a network to deliver predictable results. Elements of network performance within the scope of QoS often include availability (uptime), bandwidth (throughput), latency (delay), and error rate.

Tweet this: Quality of Service = Uptime + Throughput + Delay + Error rate

While low latency and high bandwidth are the ideal to strive for, high latency has a deeper impact on load times than low bandwidth. At low latencies, data transfers almost instantaneously and we shouldn’t be able to notice a delay. As latency increases, we begin to notice more of a delay. You can measure the latency between your computer and a web address with the ping command.
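For example, on a Unix-like system (where ping accepts -c for a packet count; Windows uses -n instead) you can run it directly or script it; the host below is just a placeholder.

```python
import subprocess

# Send four ICMP echo requests; the "time=" field on each reply line is the
# round-trip latency in milliseconds. Assumes a Unix-like ping with the -c flag.
result = subprocess.run(["ping", "-c", "4", "example.com"],
                        capture_output=True, text=True)
print(result.stdout)
```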

Tweet this: Latency is always with us; it’s just a matter of how significant it is



About GlobalDots

With over 10 years of experience, GlobalDots have an unparalleled knowledge of today’s leading web technologies. Our team know exactly what a business needs to do to succeed in providing the best online presence for their customers. We can analyse your needs and challenges to provide you with a bespoke recommendation about which services you can benefit from.

GlobalDots can help you with the following technologies: Content Delivery Network, DDoS Protection, Multi CDN, Cloud performance optimization and infrastructure monitoring.
