Streaming: Adobe HLS

GlobalDots

Welcome back to the second article in our streaming mini-series. In the opening article, we introduced streaming as a high-level concept in the delivery landscape and focused on streaming via RTMP, an Adobe proprietary protocol.


Today, our technical journey into proprietary streaming technologies continues with a closer look at HLS, also known as HTTP Live Streaming: Apple's own protocol, which Adobe's media servers also adopted in deference to Apple's diktats. This is peculiar territory, as HLS is one of the few proprietary technologies compatible with all iDevices, and its adoption since introduction has been quite wide. Without further ado, let's take a deeper look into it.


HLS or HTTP Live Streaming Technology: Introduction

To begin with, the name HTTP Live Streaming is something of a misnomer: HLS can stream both pre-recorded (on-demand) and live video and audio. Setting aside this confusing nomenclature (not that Apple ever put much effort into reasonable marketing names here), let us look at the technology itself. HLS was developed by Apple to serve its range of mobile devices and deliver video and audio to iDevices (running Apple's proprietary iOS 3.0 or later, or the Safari 4.0 or later browser).

It is interesting to note that Apple has always adopted technologies that differ from the mainstream. When Microsoft Windows was the de facto operating system, Apple came up with Mac OS for its Mac computers. It was natural that Apple would continue its romance with its own creations, and so when the time came for streaming, it adopted HLS as its primary vehicle. Given that the iPhone and iPad are wildly successful and widely adopted platforms, video and audio content developers have shown considerable interest in using the HLS standard to deliver streaming content.

Though there have been attempts by Google, through its Android devices, and by others to support streaming via HLS, the experience so far has been rather mixed and much less than a "total success". It is not a daring statement to say that HLS is primarily a streaming technology meant for Apple devices.

HTTP Live Streaming Technology: Overview

We must say kudos to Apple for providing extensive documentation and tools to help developers adopt HLS. Like (almost) any other streaming technology, HLS accomplishes its task by breaking the video file down into smaller segments (also called chunks; the process is hence called "chunking"). Since it is an adaptive streaming process, the bitrate must accommodate and adapt to the channel bandwidth and other parameters of the media player. This means there is an element (the "manifest file", the one ending in ".m3u8", as opposed to the raw content chunks carrying the ".ts" file extension) that communicates all available bitrates from the media server to the player. An HLS stream therefore essentially consists of two channels: content and signalling.
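To make the "two channels" idea concrete, here is a minimal sketch in Python of what an on-demand media playlist looks like and how a player might read the segment list out of it. The tag names (#EXTM3U, #EXTINF and friends) are standard HLS; the segment file names and durations are purely illustrative.

```python
# A minimal, hypothetical on-demand media playlist (.m3u8): the "signal" channel.
# The .ts files it points to are the "content" channel.
MEDIA_PLAYLIST = """#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXTINF:9.5,
segment2.ts
#EXT-X-ENDLIST
"""

def parse_segments(playlist: str) -> list[tuple[float, str]]:
    """Return (duration_in_seconds, uri) pairs in playback order."""
    segments, duration = [], 0.0
    for line in playlist.splitlines():
        line = line.strip()
        if line.startswith("#EXTINF:"):
            # "#EXTINF:<duration>,[<title>]" -> keep the duration
            duration = float(line[len("#EXTINF:"):].split(",")[0])
        elif line and not line.startswith("#"):
            segments.append((duration, line))  # a chunk URI follows its #EXTINF line
    return segments

if __name__ == "__main__":
    for duration, uri in parse_segments(MEDIA_PLAYLIST):
        print(f"{uri}: {duration:.1f}s")
```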

The three elements of HLS

Conceptually, HTTP Live Streaming’s Delivery workflow consists of three parts: the Server, the Distribution Component and the Client Software.

The server takes care of preparing the media for distribution by encoding the original video/audio file and formatting it for delivery.

Since HLS streaming is based on the HTTP protocol, distribution happens through an off-the-shelf web server, which is responsible for delivering the formatted video and its manifests to clients (a minimal serving sketch follows right after the three components).

The third architectural component of HLS is the client software, which determines the required media, downloads it and reassembles the original content before playing it back to the end user. On Apple devices this software is part of iOS 3.0 and later, and of the Safari 4.0 and later browser on the desktop.
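Because distribution is plain HTTP, any stock web server will do, as long as it returns sensible content types for playlists and chunks. As a rough sketch, Python's built-in http.server can serve a folder of HLS output; the "./hls" directory and port 8080 are assumptions, and a production setup would of course sit behind a CDN.

```python
# Sketch: serve a folder of HLS output (assumed to live in "./hls") over plain HTTP.
# The only HLS-specific detail is registering MIME types for .m3u8 and .ts files.
from functools import partial
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

class HLSRequestHandler(SimpleHTTPRequestHandler):
    extensions_map = {
        **SimpleHTTPRequestHandler.extensions_map,
        ".m3u8": "application/vnd.apple.mpegurl",  # manifest / index files
        ".ts": "video/mp2t",                       # MPEG-2 Transport Stream chunks
    }

if __name__ == "__main__":
    handler = partial(HLSRequestHandler, directory="./hls")
    with ThreadingHTTPServer(("0.0.0.0", 8080), handler) as server:
        print("Serving HLS on http://0.0.0.0:8080/")
        server.serve_forever()
```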

Typically, in a live streaming context, the server receives raw video and encodes it as H.264 video and AAC audio using a media encoder, outputting an MPEG-2 Transport Stream. A stream segmenter, in turn, breaks this stream into segments (or chunks, with the ".ts" file extension mentioned above). It also creates an index file (or manifest, with the ".m3u8" extension) which stores information on the segments' numbering and their distribution across bitrate renditions. The index file and the associated segments are then placed on a web server. The client software recreates a continuous media stream by reading the index file and fetching the video/audio segments it references.
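Apple provides its own encoder and stream segmenter tools for this step; as a stand-in, the same encode-and-segment flow can be sketched with ffmpeg, whose HLS muxer writes the ".ts" chunks and the ".m3u8" index in one pass. The input file, output paths and the 10-second target duration below are illustrative assumptions.

```python
# Sketch: encode an input to H.264/AAC and cut it into HLS chunks plus an index file,
# using ffmpeg as a stand-in for a dedicated media encoder + stream segmenter.
# The input file, output directory and segment length are illustrative assumptions.
import subprocess
from pathlib import Path

def encode_and_segment(source: str, out_dir: str = "hls", target_duration: int = 10) -> None:
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    subprocess.run([
        "ffmpeg", "-i", source,
        "-c:v", "libx264", "-c:a", "aac",         # H.264 video + AAC audio
        "-f", "hls",                              # HLS muxer: .ts chunks + .m3u8 index
        "-hls_time", str(target_duration),        # target chunk duration, in seconds
        "-hls_list_size", "0",                    # keep every chunk in the index (VOD-style)
        "-hls_segment_filename", f"{out_dir}/segment%03d.ts",
        f"{out_dir}/index.m3u8",
    ], check=True)

if __name__ == "__main__":
    encode_and_segment("input.mp4")
```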

Adding complexity to HLS Streaming: Adaptive Streaming

Recall that the strength of HLS lies in adaptive streaming. This matters all the more because HLS lives in a mobile world, where available bandwidth may vary greatly from one moment to the next, say when switching from cellular to Wi-Fi or from 3G to EDGE. Needless to say, the turn-of-the-century approach of a single-bitrate stream with an associated index file would be totally inadequate for today's requirements (and would make end users connecting via satellite from the Sahara Desert very unhappy). Therefore we need to introduce a bit of complexity into the basic structure discussed above.

One way to handle alternate stream renditions is to generate and store several sets of segment files and deliver only a single compatible stream at a time. This is exactly how HLS handles adaptive streaming. Chances are you may now be wondering: "Doesn't having a number of different sets of segments put a strain on server resources?" It certainly does. You have to store several renditions of each file, which means more storage on your server's part. Imagine having to keep thousands of video files for a video-on-demand service; in such a case, storing several versions of the same original video can become a serious issue.

In HLS, the master index (or manifest) file contains information about the alternate index files and their related segments. Several alternate index files with segments are available, but only one of them is streamed at any moment, depending on the receiver's current bandwidth. In an on-demand environment, the master index file is downloaded only once, at the beginning of the streaming session, when the first alternate index file and its segments start streaming; later, the receiver can switch to any of the alternates depending on the available bandwidth. In a live environment, instead, the manifest file is updated every time a new live chunk is produced and must be re-fetched for clients to keep enjoying the "on air" event.
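In practical terms, the master index is just another small text file whose #EXT-X-STREAM-INF entries point at the alternate media playlists. Here is a minimal sketch of generating one; the three-step bitrate ladder and the playlist URIs are illustrative assumptions, not a recommendation.

```python
# Sketch: build a master playlist advertising three hypothetical renditions.
# The player picks the entry it can sustain and may switch to another one later.
RENDITIONS = [
    # (peak bits per second, resolution, media playlist URI) - illustrative values only
    (800_000,   "640x360",  "low/index.m3u8"),
    (1_400_000, "842x480",  "mid/index.m3u8"),
    (2_800_000, "1280x720", "high/index.m3u8"),
]

def build_master_playlist(renditions) -> str:
    lines = ["#EXTM3U", "#EXT-X-VERSION:3"]
    for bandwidth, resolution, uri in renditions:
        lines.append(f"#EXT-X-STREAM-INF:BANDWIDTH={bandwidth},RESOLUTION={resolution}")
        lines.append(uri)
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    print(build_master_playlist(RENDITIONS))
```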

Some further thoughts on HLS’ chunks or segments

Each content chunk or segment accounts for a given amount of playback time on the end user's side, as you may know by now. This setting, like most others, is configurable, and as always there are different views on what the "ideal" playback duration of a chunk should be.

Apple's recommended segment (chunk) duration for HLS is 10 seconds. Though Apple claims that this enables optimum caching on a CDN, some tech-heads (including us) have raised doubts. Our opinion is that ten-second segments may be too long for efficient caching when a CDN is adopted to deliver a live event.
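As a back-of-the-envelope illustration of that opinion, assume a player buffers roughly three chunks before starting playback (a common rule of thumb, not a guarantee of any particular client); chunk duration then translates directly into how far behind live a viewer sits, and into how many segment requests the CDN must absorb.

```python
# Back-of-the-envelope sketch: how chunk duration trades live latency against request volume.
# Assumes the player buffers ~3 chunks before playback starts (a rule of thumb, not a spec rule).
BUFFERED_CHUNKS = 3

def latency_and_requests(chunk_seconds: float, event_minutes: float = 60) -> tuple[float, int]:
    startup_latency = BUFFERED_CHUNKS * chunk_seconds    # rough delay behind the live edge
    requests = int(event_minutes * 60 / chunk_seconds)   # segment requests only; playlist refreshes excluded
    return startup_latency, requests

if __name__ == "__main__":
    for seconds in (10, 6, 4, 2):
        latency, requests = latency_and_requests(seconds)
        print(f"{seconds:>2}s chunks -> ~{latency:.0f}s behind live, "
              f"{requests} segment requests per viewer per hour")
```

With these (assumed) numbers, dropping from ten-second to four-second chunks cuts the delay behind live from about 30 to about 12 seconds, at the price of two and a half times as many segment requests per viewer.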

Conclusions to our HLS Introduction

The subject of HLS is vast and deserves far more discussion than could be presented here. However, we have covered the essentials of HLS, which was our intention, and further posts will examine HLS as a streaming technology in much more depth, especially with a CDN sitting in the middle for delivery and the challenges that brings. To wrap it up:

  • HLS is a standard developed by Apple for streaming video and audio content to its native receivers and media players;
  • HLS consists of index files (manifests) and segment files (chunks), which together enable streaming at various adaptive bitrates;
  • HLS does not directly support DRM (Digital Rights Management);
  • The HTML5 video tag is the recommended way to play HLS video on a website (even yours!);
  • Due to HLS' popularity, most hardware vendors today offer live HLS-compatible products;
  • HLS is relatively simple to implement;
  • HLS streaming has limitations, which we'll discuss in upcoming articles.

