When do you need low-latency HTTP live streaming?

Online entertainment continues to see unprecedented growth. Live streaming was up a whopping 99% year over year in 2020, and the number of people streaming video on demand was projected to grow by nearly 50 million by the end of the year.

Although some of this growth can be attributed to changing viewer behavior during the pandemic, those behaviors really just accelerated a shift that was already underway; look at the growing streaming audiences for America’s Big Game or Australia’s Melbourne Cup for proof. And the trend looks set to continue: some entertainment companies are now planning to premiere content online exclusively.

This rapid growth presents its own problems, though. As content owners and distributors flock to the internet to deliver live content, viewers and online broadcasters alike have discovered that IP-based delivery still has a long way to go in terms of latency.

Traditional broadcasting typically carries a 4- to 5-second delay. Today, however, many online services exceed that by a significant factor; some serve content with up to 60 seconds of travel time from capture to consumption. Nowhere has this been more evident than in live sports, where the neighborhood cheers of viewers watching via cable or satellite can arrive well before the same moment reaches the stream.

Even as satellite delivery takes a backseat, content providers must get their streams to viewers ahead of, or at least simultaneously with, social media. A tweet typically takes about 5 seconds to be published; add another 5 to 10 seconds to write it, and a live stream has a window of roughly 10 to 15 seconds to reach viewers if it is to keep them engaged and satisfied with delivery.

The industry is therefore constantly seeking to optimize workflows and minimize any delay, and with the introduction of the iPhone 12, Apple officially released its new low-latency HLS format. While the HTTP Live Streaming (HLS) protocol was originally built for scale and stability, this latest iteration is built for speed and seeks to push the boundaries of even live linear broadcasting. 

Compatible with one of the biggest operating systems and platforms right out of the box, it is well positioned for broad adoption: the spec has already been adopted, or committed to, by a large number of companies in the media and entertainment industry. And the use cases are growing. Here are a few examples of business cases that warrant low-latency HLS.

Business cases that warrant low-latency HLS

Of course, the user experience must be impeccable to attract and retain viewers, but that isn’t the only reason content owners and distributors should deliver with low latency. Physical distance is another driver: as they move into new territories, content travels over wider distances, and every extra hop adds delay.

Another significant driver of low-latency formats is the immediacy with which social media platforms deliver news, as mentioned previously. This is particularly true in live sports. Online broadcasters can’t afford to deliver such events with delays that significantly exceed the time it takes for a tweet to be written and delivered.

Other new business cases for low-latency HLS include quizzes and real-time voting built around content and entertainment that must be available with little to no delay. Betting and live online auctions are other areas that are expected to benefit significantly.

Currently, though, the most critical application is two-way video communication: teleconferencing, video-based education, and more specialized cases such as telemedicine all deliver far better user experiences and engagement when there is little to no delay.

However, not all use cases require the immediacy that low-latency HLS brings. In fact, the video-on-demand experience in particular seems resilient to slower startup times, with no significant drop-off in viewers.

Use cases for low-latency HTTP live streaming

  • Live video broadcasting

  • User-generated content

  • Online quizzes and real-time voting

  • Video chatting and teleconferencing

  • Virtual gambling and betting

How a modern CDN supports low-latency HLS

The underlying architecture of a modern CDN, like our edge cloud platform, is radically different from that of legacy CDNs. Because we use a pull — rather than a push — content delivery model on our platform, content owners maintain full control and can focus on delivery rather than waiting on a vendor to support new formats. 

For example, using this approach, our platform was able to support protocols like HTTP/3, QUIC, and low-latency HLS as soon as the specs were put into practice. As we continue to push the envelope, this integration will serve as the foundation for advanced analytics, emerging low-latency technologies, and edge-compute capabilities that bring real-time processing and decision-making closer to the user.
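To make that concrete, here is a minimal sketch of the request pattern an LL-HLS-aware edge has to handle: the client requests the media playlist using the spec’s _HLS_msn and _HLS_part delivery directives, and the server or CDN edge holds the response open until that partial segment has been published. The host, playlist path, and sequence numbers below are placeholders for illustration, not a real endpoint or a specific platform API.

    # Minimal LL-HLS "blocking playlist reload" client (illustrative only).
    # The server holds the request open until the requested media sequence
    # number (_HLS_msn) and partial segment (_HLS_part) have been published,
    # which is what keeps players within a few seconds of live.
    import urllib.parse
    import urllib.request

    PLAYLIST = "https://live.example.com/channel/stream.m3u8"  # hypothetical URL

    def fetch_playlist(next_msn: int, next_part: int) -> str:
        """Block until the given partial segment is available, then return the playlist."""
        query = urllib.parse.urlencode({
            "_HLS_msn": next_msn,    # media sequence number to wait for
            "_HLS_part": next_part,  # partial segment index within that sequence
            "_HLS_skip": "YES",      # ask for a delta playlist to keep responses small
        })
        with urllib.request.urlopen(f"{PLAYLIST}?{query}", timeout=10) as resp:
            return resp.read().decode("utf-8")

    if __name__ == "__main__":
        # In a real player, the msn/part values come from parsing the previous
        # playlist response (EXT-X-MEDIA-SEQUENCE, EXT-X-PART, EXT-X-PRELOAD-HINT).
        print(fetch_playlist(next_msn=1234, next_part=2))

The point of the sketch is that the low latency comes from this request-and-hold pattern, which a pull-based edge can cache and serve close to viewers rather than waiting on a new push-based delivery path.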

For more information on low-latency HLS, watch the recent webinar we hosted on the topic. And to explore what else you can do with a modern CDN that supports these protocols, read our Guide to the Modern CDN: security and performance for today’s developer.

John Agger
Principal Industry Marketing Manager, Media & Entertainment

John Agger is Fastly’s Principal Industry Marketing Manager, Media & Entertainment. He has been involved with digital media for more than two decades with a strong focus on publishing and streaming media workflows.


In his role, John works with key strategic accounts to build awareness of Fastly’s growing product line as it relates to M&E. Over the course of his career, John has also worked for Adobe, Dolby, IBM, and Ericsson on go-to-market strategies, awareness, and sustainability.
