Latency and Bandwidth

According to Internet World Stats, a website that aggregates Internet usage and population data, Internet usage grew by approximately 560% between 2000 and 2012, and there are now over 2.4 billion people using the Internet. That’s a lot of people surfing, and everyone is hurrying to get wherever they are going faster. Companies sell more bandwidth as if it were the elixir for latency, and people often use “latency” and “bandwidth” interchangeably, as if increasing bandwidth will reduce latency as well. It will not.

Latency is the amount of time it takes a packet of data to travel across a network connection. Bandwidth is the amount of data that can be carried from one point to another in a given time period, usually measured in bits per second. Increasing bandwidth does not eliminate latency. Bandwidth is like a speed limit: the maximum allowable speed for a given route, say 60 miles per hour, or one mile per minute. Latency is the time required to travel the distance from one point to the next, so at that speed a 10-mile path has higher latency (10 minutes) than a 2-mile path (2 minutes). When we experience slow response times on our network, one of the first things we look at is bandwidth, because it is cheap and easy to increase. Good network design can minimize node and congestion delay, but most of us are end users of the data center or ISP, not its architects.
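The distinction is easy to make concrete with a little arithmetic. The sketch below models total delivery time as latency plus payload size divided by bandwidth; the numbers are hypothetical, chosen only to show that for a small web resource, a tenfold increase in bandwidth does not come close to a tenfold improvement, because the latency floor never moves.

```python
# Illustrative model: delivery time = latency + payload_size / bandwidth.
# All numbers below are hypothetical, not measurements.

def transfer_time(latency_s, bandwidth_bps, payload_bits):
    """Time to deliver a payload over a link with the given latency and bandwidth."""
    return latency_s + payload_bits / bandwidth_bps

payload = 100_000 * 8          # a 100 KB web resource, in bits
latency = 0.080                # 80 ms of network latency, held constant

slow = transfer_time(latency, 10_000_000, payload)    # 10 Mbps link
fast = transfer_time(latency, 100_000_000, payload)   # 10x the bandwidth

print(f"10 Mbps:  {slow * 1000:.1f} ms")   # 80 ms latency + 80 ms transfer = 160.0 ms
print(f"100 Mbps: {fast * 1000:.1f} ms")   # 80 ms latency +  8 ms transfer =  88.0 ms
```

Ten times the bandwidth cuts the time from 160 ms to 88 ms, not to 16 ms; the 80 ms of latency is still there no matter how wide the pipe gets.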

Therefore, we need to focus on the variables we can control, and understand that although increasing bandwidth may allow more traffic to pass through, that traffic still has to travel the distance to its destination, so there will always be some latency. The routers, firewalls, and other network devices in between will slow things down further.

Enterprise companies that experience high traffic on their websites often use a content delivery network, or content distribution network (CDN), to improve performance. A CDN is a system of distributed servers that deliver web content based on the geographical location of the user, storing cached versions of content for faster delivery. If your server is in Seattle and your website has a visitor in Chicago, the CDN will use an algorithm to determine the best location to serve that visitor from. In doing so, a CDN employs caching, load balancing, and request routing to determine the quickest path to delivery.
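At its simplest, that routing decision is "serve from the edge closest to the visitor." The toy sketch below picks the edge location with the lowest measured latency; the location names and latencies are made up, and a real CDN would combine DNS resolution, anycast routing, server load, and health checks rather than a single lookup like this.

```python
# Toy request routing: choose the edge server with the lowest latency to
# the visitor. Locations and millisecond values are hypothetical.

edge_latency_ms = {
    "seattle": 52.0,   # the origin server's region
    "chicago": 9.0,    # an edge cache near the visitor
    "dallas": 24.0,
}

def pick_edge(latencies):
    """Return the edge location with the lowest measured latency."""
    return min(latencies, key=latencies.get)

best = pick_edge(edge_latency_ms)
print(best)  # chicago
```

For the Chicago visitor above, the nearby edge wins even though the origin content lives in Seattle, which is exactly the point of caching copies at the edge.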

To improve the performance of your application, you can also focus on optimizations that reduce the number of HTTP requests and deploy your content across multiple servers, even if you are not using a CDN. This will make your web pages load faster, because most of the end-user response time is spent downloading the components of the page.
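A rough back-of-the-envelope model shows why fewer requests helps so much. Assuming (hypothetically) that a browser fetches resources over a fixed number of parallel connections and that each wave of requests costs roughly one round trip, the page's load time scales with the number of waves, so concatenating scripts and stylesheets or spriting images pays off directly. The numbers below are illustrative, not benchmarks.

```python
import math

def page_load_time_s(num_requests, rtt_s, parallel_connections):
    """Rough model: requests proceed in waves of `parallel_connections`,
    and each wave costs about one round trip."""
    waves = math.ceil(num_requests / parallel_connections)
    return waves * rtt_s

RTT = 0.080          # 80 ms round trip, hypothetical
CONNECTIONS = 6      # a common per-host browser connection limit

before = page_load_time_s(60, RTT, CONNECTIONS)  # 60 small resources
after = page_load_time_s(12, RTT, CONNECTIONS)   # after concatenation/spriting

print(f"before: {before:.2f} s, after: {after:.2f} s")  # before: 0.80 s, after: 0.16 s
```

Cutting 60 requests down to 12 shrinks ten round-trip waves to two, and none of that improvement required buying more bandwidth.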

The inexorable march to the cloud has created a dizzying array of options for customers. If you seek out system integrators to help you with any sort of transformation involving the cloud or improving the overall efficiency of your technology, choose one that will provide a rigorous analysis of your technology ecosystem and focus on three things: cost savings, agility and security.