
Livestreaming with libav* – Tutorial (Part 1)

Benjamin Binder

Green screen live streaming production at Mediehuset København. Author: Rehak

Livestreaming is the real deal of video today, yet there aren’t that many content creation tools to choose from. YouTube, Facebook and Twitter are pushing hard to enable their users to stream vlogging-style content live from their phones with proprietary apps, and OBS is used for Let’s Plays and Twitch streams. But when you want to stream events or lectures, you are pretty much on your own.

In this series of posts I want to share the experience I gained over the past couple of weeks while writing an application that captures video and audio and creates a simple livestream. This application is designed to be the basis of a simple streaming desktop application. The series is also meant to help people better understand the ffmpeg libraries for creating videos. While there are some great tutorials on how to build video players, hardly anyone bothers to write tutorials on creating and encoding videos.
The posts are intended for people who already have a little experience with video creation. The first part of the series covers background, while the second post will focus on building an application with the ffmpeg libraries.

Assuming that you want to embed your livestream in a web page and play it in common browsers, one of the first things to think about as a content creator is which codecs and container formats to use. This choice greatly influences the experience of your users.

Not all operating systems and browsers support the same set of techniques, so I want to give a very quick overview of the most common codecs and containers. This overview is mainly based on playback support for the codec rather than on quality, because support will be your first concern when starting with livestreaming. Ronald S. Bultje provided an awesome quality- and performance-oriented comparison of modern video codecs.

Video and audio formats

First of all you have to look at your users: Which operating systems and devices do they use?

While on a PC practically every codec and container you can think of is supported by some library, it is a very different story on mobile devices.

Because video playback (decoding) uses a lot of computing power and therefore drains your battery, mobile operating systems try to use hardware acceleration as much as possible. But this also means that not all codecs and containers are supported for livestreaming. Here is a support list of the most common operating systems and browsers.

                           vp8/9+webm  h264+mp4/DASH   h264+mkv
Chrome (+Opera, Vivaldi...)     x               x           -
Firefox                         x               x           -
Safari                          -               x           -
Edge                            vp8             x           -
iOS (all browsers)              -               x           -
Android (Chrome)                x               x           -

Sources: caniuse.com, MDN, Android developer guide, iOS developer guide, Microsoft developer guide

Even on PC support for hardware accelerated video playback is important to think about, since it relieves the CPU of a lot of work and lets the PC run quieter.

When you want to livestream to mobile devices and the most common browsers on PC, there are basically three options to go with:

h264 + rtmp

h264 is an older codec and by now well supported on almost all devices, most of the time even with hardware acceleration. At low bitrates you get the characteristic block artefacts, but all in all the codec offers good performance and efficiency. If you want to archive or post-process your videos, h264/5 is a good basis. Keep in mind that the use of both h264 and h265 may incur license costs if you plan to use the codecs commercially.

Strictly speaking, rtmp is not a container but a streaming protocol that carries flv-packaged data. It is probably the most broadly used ingest technology, with Twitch being one of its biggest users.
But rtmp is proprietary Adobe technology. The company’s support (or rather lack thereof) for some of its products should make you think twice about whether to bet on such a technology.
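To make this concrete, here is a sketch of an h264 + AAC encode with the ffmpeg CLI, packaged as flv (the format rtmp carries on the wire). It uses synthetic lavfi test sources so it runs without capture hardware, and it assumes an ffmpeg build with libx264; for a real stream you would replace the file output with an rtmp URL of your server (the URL in the comment is a placeholder).

```shell
# Encode 2 s of synthetic test video/audio to h264 + AAC in an flv file.
# For a real stream, replace the output with something like:
#   -f flv rtmp://your-server/live/streamkey   (placeholder URL)
ffmpeg -hide_banner -y \
  -f lavfi -i testsrc2=duration=2:size=640x360:rate=30 \
  -f lavfi -i sine=frequency=440:duration=2 \
  -c:v libx264 -preset veryfast -tune zerolatency -b:v 800k \
  -c:a aac -b:a 128k \
  -f flv out.flv
```

The `-tune zerolatency` option disables encoder features (like frame lookahead) that would add delay, which matters for live use but not for file encodes.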

h264/5 + mp4

h265 (HEVC) is the successor to h264. Because it is relatively new, not all hardware supports the codec (roughly: Intel Skylake or newer, AMD Carrizo or newer, AMD Radeon RX 400 or newer, Nvidia GeForce 900 or newer).

Unfortunately the plain mp4 container is not usable for livestreaming, because it needs to know in advance exactly how long the video will be (which is kind of hard for a live stream). MPEG-DASH can be used as a workaround: it splits the video of unknown length into many small chunks of known size. This also allows you to change the resolution of the stream during playback (like YouTube does).
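The chunking described above can be sketched with ffmpeg’s dash muxer, which writes a manifest plus small segment files that a player fetches over HTTP. This assumes a reasonably recent ffmpeg (the `-seg_duration` option; older builds used a different option name) built with libx264:

```shell
# Split a short h264 encode into DASH segments of roughly 2 s each.
# Produces manifest.mpd plus init/chunk files next to it.
ffmpeg -hide_banner -y \
  -f lavfi -i testsrc2=duration=6:size=640x360:rate=30 \
  -c:v libx264 -preset veryfast -b:v 800k \
  -seg_duration 2 -f dash manifest.mpd
```

Because each chunk is a self-contained fragment, the muxer never needs to know the total length, which is exactly what makes the format live-capable.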

h264/5 + mkv

Because of browser limitations this combination cannot be used to stream directly to the user. But if you are using a server to distribute the stream to all users (which you almost certainly will), you can use this combination to stream to the server as an intermediate step in your livestreaming pipeline.
The open-source container matroska is probably the best choice for streaming h264/5, because unlike mp4 it doesn’t need to know how long the video is. Since it can be streamed via HTTP, you also won’t run into issues with restrictive firewalls.

Because webm is a subset of matroska, you can then easily rewrite the container format and transcode the video to vp8/9 on the server, and stream it to your users from there.
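The two-step pipeline above can be sketched as follows. Both steps use file outputs so the sketch is self-contained; in a real pipeline the first command would write to a network output towards your ingest server instead (the address in the comment is a placeholder), and the assumption is an ffmpeg build with libx264 and libvpx:

```shell
# Step 1: contribution encode, h264 in matroska, sent to the server.
# A real pipeline might use e.g. tcp://ingest-host:9000 as the output
# instead of out.mkv (placeholder address).
ffmpeg -hide_banner -y \
  -f lavfi -i testsrc2=duration=2:size=640x360:rate=30 \
  -c:v libx264 -preset veryfast -b:v 800k \
  -f matroska out.mkv

# Step 2: on the server, transcode the matroska stream to vp9 + webm
# for delivery to browsers. The realtime deadline keeps libvpx fast
# enough for live use at the cost of some compression efficiency.
ffmpeg -hide_banner -y -i out.mkv \
  -c:v libvpx-vp9 -b:v 500k -deadline realtime -cpu-used 8 \
  -f webm out.webm
```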

vp8/9 + webm

Wanting to avoid license costs for h264/5, Google implemented the open and royalty-free VP8/9 codecs as well as the webm container. This combination shines especially in combination with the HTML5 video tag: since most common browsers (Chrome, Firefox, Edge) implement the codec and container, building a player is as simple as inserting the video tag into your page. VP9 offers far better efficiency, with better image quality (especially at low bitrates) than h264, while being resource-friendly on the decoding side. A very good tutorial on setting up VP9 can be found in the webm-wiki.

A real issue is Apple’s refusal to support this combination in Safari and on iOS. While on macOS you can simply switch to another browser, on iOS playback of vp8/9 and webm isn’t supported in any browser.


So far we haven’t talked about audio a lot. But fortunately this is a very easy choice:

  • If you are using h264/5 in an mp4 or rtmp container you have to use either AAC or mp3.
  • If you are using vp8/9 you should use the open-source, royalty-free codec Opus. It uses two components to be able to encode all audio situations with great efficiency. SILK is used for speech oriented audio and CELT for the rest (such as music).

These three audio codecs are supported on almost all operating systems and browsers (Safari once again being the great exception). If you want to be 100% sure to cover all platforms, use mp3.
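As a small sketch of the two rules above, here is an Opus encode (for webm/mkv) next to an AAC encode (for mp4/flv), again using a synthetic lavfi source; it assumes an ffmpeg build with libopus and the native AAC encoder:

```shell
# Opus for the vp8/9 + webm world.
ffmpeg -hide_banner -y -f lavfi -i sine=frequency=440:duration=2 \
  -c:a libopus -b:a 96k opus_out.webm

# AAC for the h264/5 + mp4/flv world.
ffmpeg -hide_banner -y -f lavfi -i sine=frequency=440:duration=2 \
  -c:a aac -b:a 128k aac_out.mp4
```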

A video codec to rule them all?

In the (hopefully) very near future a new video codec will arise and lay waste to all codecs that came before. Because of compatibility issues, but mainly for legal reasons, the Alliance for Open Media (including, among others, Amazon, AMD, ARM, Broadcom, Cisco, Google, Intel, Microsoft, Mozilla and Netflix, basically all the ginormous IT companies) decided to develop a new codec that will be completely open-source, patent-free and royalty-free: the AV1 codec. Early tests have already shown great efficiency and an image quality that may even surpass h265.


If you just want a simple and free solution, use the webm container with the vp9 video codec and Opus. Ignore iOS devices; Apple has to learn at some point. Also pray for AV1 to become usable soon.


