How broadband speeds are measured - a teachable moment


Three weeks ago, Akamai released its State of the Internet report with findings from Q3 of 2012. The quarterly report typically includes a list of average speed measurements for different countries around the world. Some commentators have suggested that these figures indicate serious competitive problems in the U.S. broadband marketplace, but things are not necessarily as dire as critics contend. It’s important to understand what the Akamai report – and others that compare speeds among nations – actually measures. While many of these tests provide interesting and useful data – and can clue us in to how broadband markets are performing over time – they do not provide the kind of consistent, calibrated measurements that make direct comparisons among countries possible.

How to Measure Broadband Speed Accurately

When considering how to compare broadband speeds, the focus should be on the connection the consumer has to their ISP. The best means of measuring and comparing such residential access speeds is to use measurement systems configured to isolate the ISP’s access service from other extraneous variables. Working with a number of ISPs and stakeholders, the FCC put together such a system a few years ago. The resulting reports (Measuring Broadband America) provide the best available metrics regarding local broadband connections in the U.S. (unfortunately, the UK is so far the only other country to publicly release data from SamKnows measurements, so it is hard to draw solid comparisons with other nations). My colleague, David Young, explained last week why the SamKnows test yields the most accurate measurements of ISP performance. Another good resource is a thoughtful paper from MIT, “Understanding Broadband Speed Measurements.” The authors identify still more limitations that are beyond the ISPs’ control, including local bottlenecks; the performance of the speed test measuring server; TCP configuration settings; and how traffic is routed beyond the access ISP to the server.
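To see why those limitations matter, consider a minimal sketch of a naive single-session speed test (the function names and numbers here are illustrative assumptions, not any actual test's code). It times one HTTP download and divides bytes by seconds, so the measurement server's performance, TCP settings, and routing beyond the access ISP are all folded into the result – exactly the conflation the MIT paper warns about.

```python
import time
import urllib.request

def throughput_mbps(byte_count, seconds):
    """Bytes transferred over wall-clock time -> megabits per second."""
    return byte_count * 8 / 1_000_000 / seconds

def naive_speed_test(url):
    """Time a single HTTP download (hypothetical helper, for illustration).

    Everything in the path -- server load, TCP configuration, routing --
    is attributed to the 'access speed' this returns.
    """
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        payload = resp.read()           # one TCP/HTTP session
    return throughput_mbps(len(payload), time.monotonic() - start)

# 1,250,000 bytes in one second is exactly 10 megabits per second
print(throughput_mbps(1_250_000, 1.0))  # → 10.0
```

A system like the FCC's instead runs many tests against dedicated, well-provisioned measurement servers precisely so that these extraneous variables do not dominate the result.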

What Akamai Measures

In contrast to the FCC’s system, Akamai’s quarterly report describes the average and peak delivery speeds on its own networks for traffic in different countries around the world. The report does not offer a true picture of the residential access speeds that are of so much interest to so many.

Furthermore, Akamai’s measurements may favor certain nations over others due to other limitations:

Akamai measures all Internet traffic that reaches its servers and does not differentiate between residential and business traffic. Average speed data are therefore skewed in favor of states or countries with a higher ratio of businesses and universities to households (commercial subscribers generally buy higher speed tiers).

Akamai measures “speeds” through individual HTTP (browser) sessions to its servers, rather than adding up multiple simultaneous HTTP sessions from the same IP address. When several sessions run at once, they share the access bandwidth, so each individual session measures a lower speed. This could be another reason Akamai’s speed results are considerably lower than the FCC’s and Ookla’s.
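The arithmetic behind this point can be sketched in a few lines (the line rate and session count below are made-up illustrative numbers, and equal sharing among sessions is an assumption):

```python
# A hypothetical 20 Mbps access line shared by several concurrent HTTP
# sessions, e.g. a web page pulling many assets at once. A per-session
# measurement reports only one session's share; summing the simultaneous
# sessions recovers the actual line rate.
access_mbps = 20.0        # assumed access speed (illustrative)
parallel_sessions = 4     # assumed concurrency (illustrative)

per_session_mbps = access_mbps / parallel_sessions  # equal sharing assumed
print(f"per-session measurement: {per_session_mbps:.1f} Mbps")            # 5.0 Mbps
print(f"sum of simultaneous sessions: {access_mbps:.1f} Mbps")            # 20.0 Mbps
```

A methodology that records each session separately would log four 5 Mbps observations here, even though the subscriber’s access line is delivering 20 Mbps in aggregate.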

The file sizes that Akamai samples to produce speed estimates may vary based upon the distribution of content Akamai regularly serves in a particular location, and this is likely to differ across geographic areas. For example, Western Europeans have a strong preference for using PCs as a video platform, while U.S. users prefer a wider variety of devices. As a result, more video is likely stored – and thus sampled – in Europe than in the U.S. (Internet protocols (TCP) adjust throughput based on available network capacity, and the longer a “package” of content takes to send, the better optimized the throughput can be. Video content gives TCP a longer period of time – compared to webpages or email – to increase its sending rate and fully utilize the capacity provided by the ISP. Thus, populations that stream more Internet video could see a higher average speed than those that stream less.)

“These are Akamai data of the average speeds from Akamai servers, and much of that traffic is rate-limited video streaming by design. This is similar to the Netflix data saying that Google's gigabit network in Kansas City runs at only 2.55 Mbps. Average peak connection speeds in the best nations and best U.S. states, per Akamai, tell a different story (speeds in Mbps):

"Hong Kong is one of the most densely populated places in the world. The land population density as at mid-2010 stood at 6,540 persons per square kilometre, and Kwun Tong, with 54,530 persons per square kilometre, was the most densely populated district among the District Council districts."[1]

"The Population density (people per sq. km) in the United States was last reported at 33.82 in 2010, according to a World Bank report published in 2012."[2]

Do the math on the cost of FTTP (Fiber to the Premises) bearing these figures in mind.”

Conclusion: Better Tests and Better Data Are Needed

Although speed tests can be a useful tool for assessing some aspects of network performance, measurement methodologies vary significantly. This is why the average speeds measured for the U.S. by Akamai (7.2 Mbps), Ookla’s Speedtest.net (15.91 Mbps), YouTube (8.09 Mbps) and the FCC rankings (15.6 Mbps) show large discrepancies. Akamai itself offers a disclaimer in an FCC filing making clear that the report does not purport to measure local access speeds, nor do its measurements “conform to the reporting categories of the FCC’s Form 477 (e.g. speed tiers).” Drawing conclusions about how nations compare with regard to broadband access services from tests that do not actually measure access speeds is not advisable. Very few broadband speed tests provide a technically sound basis for making such comparisons.

Broadband speeds have seemingly become the sole focus of the debate over U.S. broadband and Internet policy. As such, it’s important to understand what the various studies are measuring and how they should be interpreted. Bad or misconstrued data leads to bad policy, and considering the huge role broadband will play in manufacturing, health care, energy management, education, and nearly every other sector of the economy, policy makers cannot afford to get this one wrong. Using a multi-stakeholder approach to test modeling and data gathering, as the FCC/SamKnows test does, provides the best data from which to craft fair and effective broadband policy.