
Who is live-streaming? Who drives the video streaming market? Where are the current potholes? How will we meet the rising demand for real-time latency?

These are a few of the topics covered last month at the Streaming Media East conference on a panel discussion with our CEO and Co-Founder Chris Allen, VP of Architecture Jason Hofmann from our partner Limelight Networks and Bill Zurat, VP, Core Media, BAMTECH Media.

You can watch the full video here, or take a look at our summary below.

To start things off, they broke up the current live streaming market into three sectors:

Transactional Market

- Auctions
- Betting
- HQ Trivia

Events that are Simulcast on Satellite and Cable

- Sports
- Concerts
- Political debates/ceremonies
- Trial verdicts
- Available on set-top boxes, Apple TVs, and mobile devices

Offline Deployment

- Police body cams
- Monitoring soldier activity for the military

For most applications, especially those in the transactional and offline sectors, low latency is absolutely necessary.

Low-Latency Hurdles

Latency is introduced by every processing step, even capturing frames from the camera and encoding them.

The characteristics of HTTP-based protocols like HLS and DASH make real-time latency very hard to achieve because they deliver content in chunks. Since a majority of the latency comes from buffering those chunks on the client, cutting the chunk size down to one second produces lower latency.

However, as you lower chunk sizes, CDN costs go up. If chunk sizes are cut in half, the number of requests the CDN has to handle doubles, and since each request (no matter the size) has to be processed, that adds cost.

Solutions that don't require tiny chunk sizes are therefore much more CDN-friendly.
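This trade-off can be sketched with some quick arithmetic. The three-chunk player buffer and the 10,000-viewer audience below are illustrative assumptions, not figures from the panel:

```python
# Rough sketch of the chunk-size trade-off: smaller chunks mean lower
# latency but more CDN requests. Assumptions (not from the panel): the
# player buffers 3 chunks before starting playback, and latency is
# roughly the buffered chunks times the chunk duration.

def hls_latency_seconds(chunk_seconds, buffered_chunks=3):
    """Approximate delay added by client-side chunk buffering."""
    return chunk_seconds * buffered_chunks

def cdn_requests_per_hour(chunk_seconds, viewers):
    """Each viewer fetches one chunk per chunk duration."""
    return viewers * 3600 / chunk_seconds

for chunk in (6, 2, 1):
    print(f"{chunk}s chunks: ~{hls_latency_seconds(chunk)}s latency, "
          f"{cdn_requests_per_hour(chunk, 10_000):,.0f} requests/hour")
```

Halving the chunk size halves the buffering latency but doubles the request volume, which is the cost pressure described above.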

This really creates problems at scale, especially during busy streaming times like Sunday nights. Everyone is trying to use the same resources, which creates delays and buffering.

Quality Concerns

Rebuffering is an important factor in customer satisfaction.

As chunk sizes are reduced, the buffer is effectively shortened as well. The buffer serves as a safety margin for smooth playback; once network throughput drops and the buffer empties, the stream falls behind the live edge and pauses to rebuffer.
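The relationship between buffer depth and rebuffering can be shown with a toy simulation. The fixed buffer sizes, the 1x playback drain, and the 10% network shortfall are all illustrative assumptions:

```python
# Toy model: the player holds a buffer (in seconds of video), refills it
# at the current network rate, and drains it at 1x playback speed. When
# the buffer empties, playback pauses to rebuffer.

def simulate(buffer_s, network_rate, seconds):
    """Count rebuffer events over `seconds` of playback.
    network_rate is download speed as a fraction of the stream bitrate."""
    level = buffer_s
    rebuffers = 0
    for _ in range(seconds):
        level += network_rate - 1.0   # refill minus playback drain
        if level <= 1e-9:             # buffer empty (tolerance for float drift)
            rebuffers += 1
            level = buffer_s          # pause and refill before resuming
    return rebuffers

# Under the same 10% throughput shortfall, the shorter buffer empties
# far more often:
print(simulate(buffer_s=6.0, network_rate=0.9, seconds=60))
print(simulate(buffer_s=1.0, network_rate=0.9, seconds=60))
```

The six-second buffer absorbs a full minute of shortfall before stalling once; the one-second buffer stalls repeatedly, which is the quality concern with very small chunks.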

New Solutions

WebRTC does not use a buffer; it delivers everything as quickly as it can, so network speed is the largest constraint. Without a buffer, any frames that are dropped stay dropped in order to keep pace with the live stream.

However, as a UDP-based protocol, WebRTC has ways to accommodate packet loss, such as forward error correction, to keep the video smooth. That error checking, correction, and recovery happens at the network level, far below application code. Rather than building it out yourself, it's much easier to use preexisting solutions (such as Red5 Pro) that have already been built to deal with this.
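The idea behind forward error correction can be illustrated with a minimal XOR parity scheme. The schemes WebRTC actually uses (such as FlexFEC) are considerably more sophisticated; this sketch only shows the principle of recovering a lost packet without a retransmit:

```python
# Minimal XOR-based forward error correction: one parity packet protects
# a group of same-length packets, and any single lost packet in the group
# can be rebuilt from the survivors plus the parity.

def make_parity(packets):
    """XOR all packets together to form one repair packet."""
    parity = bytes(len(packets[0]))
    for p in packets:
        parity = bytes(a ^ b for a, b in zip(parity, p))
    return parity

def recover(received, parity):
    """Rebuild a single missing packet by XORing the survivors with parity."""
    missing = parity
    for p in received:
        missing = bytes(a ^ b for a, b in zip(missing, p))
    return missing

group = [b"pkt0", b"pkt1", b"pkt2"]
parity = make_parity(group)
# Packet 1 is lost in transit; the receiver rebuilds it without waiting
# for a retransmission, which is what keeps latency low:
rebuilt = recover([group[0], group[2]], parity)
assert rebuilt == b"pkt1"
```

The key property for live streaming is that recovery happens immediately on the receiving side, with no round trip back to the sender.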

WebRTC is a standard, and most browsers support it well. For example, Google Hangouts, which was built on top of WebRTC, can usually run without the need to install plugins.

As Chris says:

In a couple of years, WebRTC will be ubiquitous and everything will support it in terms of browsers.

Live Streaming to VR

The next big thing in the live-streaming market is streaming directly to VR headsets. The amount of data needed to deliver full 360 video is very large, on the order of 50-100 Mbps.

Methods such as real-time cropping, or only rendering in high quality where the user is actually looking, are current ways to deal with this issue.
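A back-of-the-envelope sketch shows why viewport-dependent delivery helps. The 80 Mbps full-sphere bitrate, the field-of-view angles, and the 1/8 background quality ratio below are illustrative assumptions, not figures from the panel:

```python
# Rough estimate of the bandwidth saved by rendering full quality only
# where the user is looking. Assumptions: an 80 Mbps full-sphere stream,
# a 110-degree horizontal by 90-degree vertical field of view, and
# low-quality tiles at 1/8 the bitrate outside the viewport.

FULL_SPHERE_MBPS = 80.0

def viewport_fraction(h_fov_deg=110, v_fov_deg=90):
    """Fraction of the equirectangular frame the viewer actually sees."""
    return (h_fov_deg / 360) * (v_fov_deg / 180)

def required_mbps(low_quality_ratio=1 / 8):
    """Bitrate if only the viewport is sent at full quality."""
    vp = viewport_fraction()
    return FULL_SPHERE_MBPS * (vp + (1 - vp) * low_quality_ratio)

print(f"viewport covers {viewport_fraction():.0%} of the frame")
print(f"required bitrate: ~{required_mbps():.1f} Mbps")
```

Under these assumptions the stream drops from 80 Mbps to roughly a quarter of that, which is why viewport-dependent techniques are the current stopgap while codecs catch up.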

However, it is going to require a complete rethinking of the codecs to be able to deliver the full 360 video at 50 - 100 Mbps.

Interactive Video

Another emerging trend that is bound to continue is adding interactivity to live events.

Games such as Joyride (built with Red5 Pro) allow users to play an HQ Trivia-style game directly with their friends, who stream alongside each other and play as a group.

The Role of Older Protocols

RTMP is still a universally supported protocol, and unlike Flash, it is not slated to be discontinued. Most encoders support it as an ingest format across different devices. Furthermore, sub-second latency is possible as well after tuning the settings.

HTTP-based protocols are kept around because they are widely accepted, but they come with the latency and scalability problems described above. WebRTC, on the other hand, scales well when using the right platform.