live video streaming, html5, mpeg-dash, ffmpeg

Live streaming with HTML5

This document contains collected notes on HTML5 live streaming approaches. I'm trying to understand how to build a system that streams live video to HTML5 clients. The following content mainly centers on MPEG-DASH, the modern way of addressing this need.

Approach 1

To prepare movie.avi for MPEG-DASH streaming, we will execute the following ffmpeg commands:

> ffmpeg -y -i movie.avi -an -c:v libx264 -x264opts 'keyint=24:min-keyint=24:no-scenecut' -b:v 1500k -maxrate 1500k -bufsize 3000k -vf "scale=-1:720" movie-720.mp4
> ffmpeg -y -i movie.avi -an -c:v libx264 -x264opts 'keyint=24:min-keyint=24:no-scenecut' -b:v 800k -maxrate 800k -bufsize 1600k -vf "scale=-1:540" movie-540.mp4
> ffmpeg -y -i movie.avi -an -c:v libx264 -x264opts 'keyint=24:min-keyint=24:no-scenecut' -b:v 400k -maxrate 400k -bufsize 800k -vf "scale=-1:360" movie-360.mp4
> ffmpeg -y -i movie.avi -vn -c:a aac -b:a 128k movie.m4a
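
The x264 options force a constant keyframe interval (keyint=24:min-keyint=24:no-scenecut), which keeps the segments of all three variants aligned for bitrate switching. As an optional sanity check (not required for the workflow), ffprobe can confirm the codec and resolution of each rendition, e.g.:

> ffprobe -v error -select_streams v:0 -show_entries stream=codec_name,width,height -of default=noprint_wrappers=1 movie-720.mp4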

This generates an m4a audio file and 3 mp4 video files with different bitrate variants: 1500k, 800k and 400k. Now it's time to fragment them into video segments and generate the MPEG-DASH manifest file. We'll do it using the MP4Box utility:

> mp4box -dash-strict 2000 -rap -frag-rap -bs-switching no -profile "dashavc264:live" -out movie-dash.mpd movie-720.mp4 movie-540.mp4 movie-360.mp4 movie.m4a
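
To actually play the result in an HTML5 client, movie-dash.mpd and the generated segment files just need to be served over plain HTTP and opened with an MSE-based player such as dash.js. A minimal local test (assuming Python 3 is available; any static file server will do):

> python -m http.server 8080

Then point a dash.js-based player page at http://localhost:8080/movie-dash.mpd. Note that a player hosted on a different origin will additionally need CORS headers from the server.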


Approach 2

It looks like it is possible to generate MPEG-DASH content with a single ffmpeg command, but I'm not sure how it works with multi-bitrate video variants:

> mkdir movie-dash
> ffmpeg -y -re -i movie.mp4 ^
  -c:v libx264 -x264opts "keyint=24:min-keyint=24:no-scenecut" -r 24 ^
  -c:a aac -b:a 128k ^
  -bf 1 -b_strategy 0 -sc_threshold 0 -pix_fmt yuv420p ^
  -map 0:v:0 -map 0:a:0 -map 0:v:0 -map 0:a:0 -map 0:v:0 -map 0:a:0 ^
  -b:v:0 250k  -filter:v:0 "scale=-2:240" -profile:v:0 baseline ^
  -b:v:1 750k  -filter:v:1 "scale=-2:480" -profile:v:1 main ^
  -b:v:2 1500k -filter:v:2 "scale=-2:720" -profile:v:2 high ^
  -use_timeline 1 -use_template 1 -window_size 5 -adaptation_sets "id=0,streams=v id=1,streams=a" ^
  -f dash movie-dash\movie.mpd

A few notes on the options: the three -map 0:v:0 -map 0:a:0 pairs create three copies of the input video and audio streams, one per DASH representation; -b:v:N, -filter:v:N and -profile:v:N then set the bitrate, scaling and H.264 profile of each video copy. -use_timeline 1 and -use_template 1 make the manifest use SegmentTemplate/SegmentTimeline addressing, -window_size 5 keeps only the last 5 segments in the manifest (a live sliding window), and -adaptation_sets puts all video streams into one adaptation set and all audio streams into another.

The -re option tells ffmpeg to read the input file in real time rather than in the default as-fast-as-possible manner, which is what makes this behave like a live stream.
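
One quick way to sanity-check the output while (or after) the command runs is to look for the two adaptation sets in the generated manifest, assuming the Windows shell implied by the ^ line continuations above:

> type movie-dash\movie.mpd | findstr AdaptationSet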

The command above does all the needed conversion on the fly, but this could be optimized by dividing the process into two steps:

  1. prepare an output video with multiple video/audio streams that correspond to the adaptive DASH streams. Consider -preset slow here, since this encoding is done ahead of time (a quick ffprobe check of the result is sketched after this list)

    > ffmpeg -y -i sample.divx ^
        -c:v libx264 -x264opts "keyint=24:min-keyint=24:no-scenecut" -r 24 ^
        -c:a aac -b:a 128k ^
        -bf 1 -b_strategy 0 -sc_threshold 0 -pix_fmt yuv420p ^
        -map 0:v:0 -map 0:a:0 -map 0:v:0 -map 0:a:0 -map 0:v:0 -map 0:a:0 ^
        -b:v:0 250k  -filter:v:0 "scale=-2:240" -profile:v:0 baseline ^
        -b:v:1 750k  -filter:v:1 "scale=-2:480" -profile:v:1 main ^
        -b:v:2 1500k -filter:v:2 "scale=-2:720" -profile:v:2 high ^
        sample_dash.mp4
    
  2. stream the prepared video with a minimal DASH configuration: -re enables pseudo-live streaming and -c copy reuses the already encoded streams instead of re-encoding them

    > mkdir sample_dash
    > cd sample_dash
    > ffmpeg -y -re -i ..\sample_dash.mp4 ^
        -map 0 -c copy ^
        -use_timeline 1 -use_template 1 -window_size 5 -adaptation_sets "id=0,streams=v id=1,streams=a" ^
        -f dash sample.mpd
    

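Before streaming, it may be worth confirming that sample_dash.mp4 really contains the three video variants plus the three audio streams; a quick ffprobe listing (filenames as above) prints one line per stream:

> ffprobe -v error -show_entries stream=index,codec_type,width,height -of csv=p=0 sample_dash.mp4
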

@Eddyhanderson commented:

Hey, thanks for sharing this content. Just one question: why didn't you specify the duration of the video segments?
