Latency is one of the toughest challenges in live online delivery. Typical delays can exceed a minute, which can be very annoying for viewers. Can this latency be reduced? Akamai has a solution which it says removes it entirely.

Mr. Michels says two key approaches are necessary to reduce latency. The first is to make sure the streaming system receives the live feed from the master control system, not a downstream feed (like a local broadcast). The second is to make sure all subsequent systems are built to process video traffic at very high velocity.

A lot of systems aren’t built from the ground up with live streaming in mind. Mr. Michels talks about the many things Akamai is doing in its mid-tier network to ensure high-velocity processing of live streams, without excessive video buffering.
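The article does not describe Akamai's internals, but the general principle behind avoiding excessive buffering can be sketched: forward each small chunk of the stream as soon as it arrives, rather than accumulating a full video segment before passing it on. The `relay_live_chunks` function below is a hypothetical illustration of that chunk-by-chunk forwarding idea, not Akamai's implementation; the `BytesIO` objects stand in for real network endpoints.

```python
import io

def relay_live_chunks(upstream, downstream, chunk_size=4096):
    """Forward bytes as they arrive instead of buffering a whole segment.

    Hypothetical sketch: holding a complete segment before forwarding adds
    roughly one segment-duration of latency at every hop, while relaying
    small chunks keeps the added delay per hop close to zero.
    """
    forwarded = 0
    while True:
        chunk = upstream.read(chunk_size)
        if not chunk:  # end of feed (in a real relay, the socket closed)
            break
        downstream.write(chunk)  # pass through immediately, no segment buffer
        forwarded += len(chunk)
    return forwarded

# Stand-ins for network endpoints (file-like objects for illustration only).
upstream = io.BytesIO(b"x" * 10_000)   # simulated live feed
downstream = io.BytesIO()
total = relay_live_chunks(upstream, downstream)
print(total)  # 10000
```

The same pattern appears in standardized low-latency approaches such as CMAF chunked encoding, where players and CDNs handle partial segments instead of waiting for complete ones.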

It’s no good reducing latency if the live streaming event doesn’t scale. Mr. Michels described a recent multi-day event where the low-latency solution was used. He says the system delivered a total of 98M streams, with 69M streams delivered over the first 5 days and peak concurrent usage of 1.2M streams. He said the solution excelled in 3 key areas for this event: there was 100% success in live stream ingestion, availability approached broadcast quality, and the rebuffer rate remained very low (less than 1%).