Low-Latency Live Streaming is Key to Capturing Online Audiences

By: Charlie Kraus | December 30, 2020

Original source: Limelight Networks

Nearly every major social media platform — except for maybe Pinterest — now has live streaming built in. Live streaming is everywhere, from e-sports to actual sports to SpaceX launches, and more children want to grow up and become professional live streamers than want to become astronauts. On the technical side, platforms are now in an arms race to provide low-latency live streaming to their users.

It might be easy to go live nowadays, but pulling off that technical feat behind the scenes is anything but simple. Live streaming is now table stakes, so the platform that offers the broadest range of low-latency streaming capabilities will outcompete the rest. That said, not every live-streaming application requires the lowest possible latency; several factors determine which solution best fits a given application. Big-name streamers will flock to the platform that offers the most appropriate video latency solution, because that platform will provide the best viewing experience for their audience.

Low-Latency Streaming is a Multidimensional Problem

Latency in video isn’t quite the same as latency in other forms of internet media, where latency is simply the time it takes a server to fetch content and send it to your browser. Video latency is the lag between the moment a frame is captured and the moment it appears on your screen.

As such, video latency can be a multidimensional problem. For example, imagine you’re streaming an e-sports competition in which two well-known gamers compete against each other. The stream doesn’t include just the action of the video game — it also includes camera shots of each gamer’s face, plus an additional video inset with a live commentator.

To present an optimal experience, four video streams (the game, the two players, and the commentator) must be broadcast simultaneously to a vast audience. (For comparison, congresswoman Alexandria Ocasio-Cortez’s recent Twitch event reached an audience of 430,000 viewers.) If one stream has higher latency than the others, the whole presentation suffers; you could find the commentator remarking on action that happened up to 30 seconds earlier, for instance.
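To make the synchronization problem concrete, here is a minimal sketch of how a platform might measure per-stream glass-to-glass latency and the drift between the streams that make up one presentation. The stream names and timings are made up for illustration; they are not measurements from any real event.

```typescript
// Hypothetical sketch: checking that several related live streams stay in sync.
interface StreamSample {
  name: string;
  captureTimeMs: number; // when the frame was captured at the source
  displayTimeMs: number; // when the same frame reached the viewer's screen
}

// Glass-to-glass latency for a single stream.
const latencyMs = (s: StreamSample): number => s.displayTimeMs - s.captureTimeMs;

// Worst-case mismatch between the streams that make up one presentation.
function maxDriftMs(samples: StreamSample[]): number {
  const latencies = samples.map(latencyMs);
  return Math.max(...latencies) - Math.min(...latencies);
}

const event: StreamSample[] = [
  { name: "game feed", captureTimeMs: 0, displayTimeMs: 2800 },
  { name: "player A cam", captureTimeMs: 0, displayTimeMs: 3100 },
  { name: "player B cam", captureTimeMs: 0, displayTimeMs: 2950 },
  { name: "commentator", captureTimeMs: 0, displayTimeMs: 31000 }, // lagging stream
];

console.log(`worst drift: ${maxDriftMs(event)} ms`); // ~28 s: commentary trails the action
```

If the drift budget is exceeded, the player has to delay the faster streams or drop behind on the slow one, which is exactly the trade-off a low-latency platform is trying to avoid.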

As the streaming audience matures, they’re beginning to demand highly interactive and technically sophisticated streaming events. Interactivity requires the lowest latency, ideally sub-second, to provide the best viewer experience. Today’s audiences also expect broadcast-like picture quality. It’s almost a given that a successful streamer will have a high-bandwidth fiber internet connection, and they will choose the video format and encoding protocol that can deliver their streams with the required latency.

HTTP-Based Chunk Streaming Works Against Low Latency

AWS defines ultra-low latency as 0.2 to 2 seconds and low latency as 2 to 6 seconds. The default latency for delivering common HLS and DASH video formats can be 30 to 60 seconds. With latency that high, it’s impossible to deliver the interactivity that audiences want.

The culprit behind this high latency is what’s known as HTTP-based chunk streaming. HTTP streaming formats such as MPEG-DASH and HLS break the video into small segments or chunks that must be buffered before playback. While it’s possible to reduce the size of the chunks to provide low latency, making them too small increases the chance viewers will experience video rebuffering and other playback issues.
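As a rough illustration of why chunk size drives latency, the sketch below models player startup delay as a few buffered segments plus encode and delivery overhead. The segment counts and overhead figures are assumptions for the example, not measurements of any particular player or CDN.

```typescript
// Back-of-the-envelope model of HTTP chunked-streaming latency: players typically
// buffer a few segments before starting playback, so latency grows with segment duration.
function estimateChunkedLatencySeconds(
  segmentDurationSec: number,   // e.g. 6 s for classic HLS, 2 s for low-latency tuning
  bufferedSegments: number,     // segments the player holds before playback starts
  encodeAndDeliverySec: number  // assumed encoder, packager, and CDN overhead
): number {
  return segmentDurationSec * bufferedSegments + encodeAndDeliverySec;
}

console.log(estimateChunkedLatencySeconds(6, 3, 4)); // ~22 s with default-style settings
console.log(estimateChunkedLatencySeconds(2, 3, 4)); // ~10 s with smaller chunks
console.log(estimateChunkedLatencySeconds(1, 3, 2)); // ~5 s, but rebuffering risk rises
```

Shrinking the segments pushes the number down, but each segment then carries less buffer against network hiccups, which is why chunk size alone can’t get you to sub-second latency.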

Complicating the video delivery workflow, audiences watch streaming video on TVs, phones, laptops, tablets, and more. Each of these devices may use a different video format. This means a broadcaster must take each video stream and transmux it into popular video formats such as HLS and DASH before sending it. In addition, each stream may require configuration to optimize delivery for the best possible picture quality given the condition of the viewer’s internet connection.
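For illustration, here is one way a broadcaster might transmux a single live feed into HLS using ffmpeg invoked from Node. The input URL and segment settings are assumptions for the example, and a production workflow would also produce DASH output and multiple bitrate renditions.

```typescript
// Illustrative only: repackage (transmux) one live input into HLS with ffmpeg.
import { spawn } from "node:child_process";

const ffmpeg = spawn("ffmpeg", [
  "-i", "rtmp://example.invalid/live/stream", // hypothetical contribution feed
  "-c", "copy",                               // transmux: change container, don't re-encode
  "-f", "hls",
  "-hls_time", "2",                           // 2-second segments to trim latency
  "-hls_list_size", "5",                      // keep a short live playlist
  "stream.m3u8",
]);

ffmpeg.stderr.on("data", (chunk) => process.stderr.write(chunk));
ffmpeg.on("close", (code) => console.log(`ffmpeg exited with code ${code}`));
```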

Conclusion: Limelight Networks Offers Sub-Second Latency with Its Realtime Streaming Solution

Limelight Realtime Streaming uses WebRTC (Web Real-Time Communication) technology. This open-source standard can deliver streams with less than a second of latency and is well suited for interactive applications. Well-implemented solutions can deliver reliable, broadcast-quality, real-time video streaming at scale. This has been a significant area of development in 2020 for Limelight’s next-generation Realtime Streaming solution, which is integrated with the scale and scope of its global CDN capacity.

While enabling sub-second latency, Realtime Streaming also incorporates data sharing that can be used to build interactive services. We are seeing these emerge in new applications for live video streaming, including online casinos, auctions, learning, and in-event sports wagering. We expect WebRTC to emerge as the leading option for scaling the delivery of live content incorporated into the social audience experience.
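As a sketch of what this looks like on the viewer’s side, the browser code below uses the standard WebRTC APIs to receive a real-time stream and open a data channel for interactive metadata. The /signal endpoint and the player video element are placeholders for this example, not part of Limelight’s actual API.

```typescript
// Browser-side sketch: play a WebRTC stream and listen on a data channel.
async function watchRealtimeStream(): Promise<void> {
  const pc = new RTCPeerConnection();

  // Render incoming video/audio tracks as soon as they arrive.
  pc.ontrack = (event) => {
    const video = document.getElementById("player") as HTMLVideoElement;
    video.srcObject = event.streams[0];
  };

  // Data channel for low-latency metadata (scores, odds, chat prompts, etc.).
  const data = pc.createDataChannel("metadata");
  data.onmessage = (event) => console.log("stream metadata:", event.data);

  // Receive-only media, then a minimal offer/answer exchange with a
  // hypothetical signaling endpoint.
  pc.addTransceiver("video", { direction: "recvonly" });
  pc.addTransceiver("audio", { direction: "recvonly" });
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);

  const response = await fetch("/signal", {
    method: "POST",
    headers: { "Content-Type": "application/sdp" },
    body: offer.sdp,
  });
  await pc.setRemoteDescription({ type: "answer", sdp: await response.text() });
}
```

Because the media flows over a peer connection rather than buffered HTTP segments, the same channel that carries the video can carry the interactive data, which is what makes use cases like in-event wagering practical.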

In short, the technical side of live streaming matters almost as much as the on-screen talent. Make the most of your live streaming implementation with Limelight, now Edgio — contact us today!
