Live Streaming MPEG-DASH with Raspberry Pi 3

Hardware

This post describes the setup on a Raspberry Pi 3. It should also work on other models (rpi0, rpi1, rpi2) that share the OpenMAX h264 hardware encoder implementation, but this has not been tested.

It also uses a Logitech C310, which can be picked up for a few dollars more than the RPI Camera module. Any webcam that supports raw video (YUV) capture should work as well, but again, this has not been tested.

MPEG-DASH vs MJPEG

Most solutions for streaming a video feed from an RPI use MJPEG via mjpg-streamer. While this works well on the same LAN as the RPI, it becomes laggy when streaming over the internet or a WAN link.

MPEG-DASH is a codec-agnostic format for adaptive streaming that specifies a manifest file describing a series of representations (resolution, bitrate, etc.).
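
To make that concrete, here is an illustrative manifest with two representations of the same stream. This is a hand-written sketch, not output captured from this setup, and the attribute values are placeholders:

<?xml version="1.0" encoding="utf-8"?>
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="dynamic"
     profiles="urn:mpeg:dash:profile:isoff-live:2011"
     minimumUpdatePeriod="PT4S">
  <Period id="0" start="PT0S">
    <AdaptationSet contentType="video" segmentAlignment="true">
      <!-- Templated segment names; the player substitutes the
           representation id and segment number -->
      <SegmentTemplate timescale="1000" duration="4000" startNumber="1"
                       initialization="init-camera01-$RepresentationID$.mp4"
                       media="camera01-$RepresentationID$-$Number$.mp4"/>
      <!-- Each Representation is one encoding of the same content;
           the player picks one based on available bandwidth -->
      <Representation id="0" mimeType="video/mp4" codecs="avc1.640028"
                      width="1280" height="720" bandwidth="1000000"/>
      <Representation id="1" mimeType="video/mp4" codecs="avc1.640028"
                      width="640" height="360" bandwidth="500000"/>
    </AdaptationSet>
  </Period>
</MPD>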

Archlinux ARM vs Raspbian

This tutorial uses Archlinux ARM. To achieve similar results on Raspbian, the instructions to recompile FFMPEG can be found here.

Recompiling FFMPEG

FFMPEG as supplied by most vendors will not take advantage of the hardware H264 encoding capabilities of the RPI. To recompile FFMPEG, create a directory $HOME/ffmpeg and download the PKGBUILD from the Archlinux ARM PKGBUILDs repo at https://github.com/archlinuxarm/PKGBUILDs/blob/master/extra/ffmpeg/PKGBUILD.

Apply the following patch, or edit the file by hand to make the same changes:

PKGBUILD.patch
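
Whatever form the edit takes, the key change is enabling FFMPEG's OpenMAX support at configure time. A sketch of the flags the configure call in build() needs; the surrounding flags are whatever the stock PKGBUILD already passes:

./configure \
    ... existing flags from the PKGBUILD ... \
    --enable-omx \       # Enable the OpenMAX IL encoder wrapper
    --enable-omx-rpi     # Use the Raspberry Pi's OpenMAX implementation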

Then compile as you would any other package, making sure to pass the --ignorearch flag to makepkg:

$ makepkg -sri --ignorearch

makepkg will automatically install the dependencies and the resulting package. Verify that the h264_omx encoder is available:

$ ffmpeg -encoders 2>/dev/null |grep h264_omx
V..... h264_omx             OpenMAX IL H.264 video encoder (codec h264)

NGINX

Install NGINX (pacman -S nginx) and configure it to serve a directory. This example serves out of /srv/http/dash:

nginx.conf
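
A minimal sketch of such a config, serving /srv/http/dash at the web root. The no-cache and CORS headers are sensible additions for live DASH, not requirements from the original setup:

worker_processes 1;

events {
    worker_connections 1024;
}

http {
    include      mime.types;
    default_type application/octet-stream;

    server {
        listen 80;

        # DASH manifests and segments written by ffmpeg, plus the player page
        root /srv/http/dash;

        location / {
            # The manifest changes every few seconds; don't let clients cache it
            add_header Cache-Control no-cache;
            # Let players hosted on other origins fetch the stream
            add_header Access-Control-Allow-Origin *;
        }
    }
}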

Start and enable nginx:

# systemctl enable nginx
# systemctl start nginx

Consistent Camera device names

By default the cameras will be assigned an indexed /dev/videoN device name, as well as symlinks in /dev/v4l/by-id and /dev/v4l/by-path. A simple udev rule can be created to also symlink them to a consistent path using the device's USB serial number. Create /etc/udev/rules.d/camera.rules as follows:

camera.rules
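
A sketch of what such a rule can look like, using the redacted serial and symlink path referenced below:

# /etc/udev/rules.d/camera.rules
# Give the camera with this USB serial a stable symlink under /dev/cameras/
SUBSYSTEM=="video4linux", ENV{ID_SERIAL}=="046d_081b_XXXXXXXX", SYMLINK+="cameras/camera01"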

You must modify it to match the ID_SERIAL of your camera. To determine this, use the udevadm test command:

$ udevadm test $(udevadm info -q path -n /dev/video0)

The symlink path can also be adjusted; with the above rules it will create /dev/cameras/camera01 for the device with the ID_SERIAL of 046d_081b_XXXXXXXX.
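
The test output is verbose; to pull out just the serial for, say, /dev/video0:

$ udevadm test $(udevadm info -q path -n /dev/video0) 2>&1 | grep '^ID_SERIAL='
ID_SERIAL=046d_081b_XXXXXXXX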

User creation

Create /usr/lib/sysusers.d/ffmpeg.conf with the following contents:

ffmpeg.conf
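
A sketch of a sysusers.d entry that creates a dedicated ffmpeg user. Adding it to the video group (typically required to read v4l2 devices) is an assumption about your setup:

# /usr/lib/sysusers.d/ffmpeg.conf
# Type  Name    ID  GECOS                  Home
u       ffmpeg  -   "DASH streaming user"  -
m       ffmpeg  video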

Then create the user with systemd-sysusers:

# systemd-sysusers /usr/lib/sysusers.d/ffmpeg.conf

Systemd Service

Now that NGINX is set up to serve the content and the cameras are named consistently, the systemd service can be created:

dash@.service
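
A sketch of such a templated unit, wrapping the annotated ffmpeg command below. The ExecStartPre steps and Restart policy are additions, not from the original unit; also note that systemd needs $$ to pass a literal $ through to the DASH name templates:

# /etc/systemd/system/dash@.service
[Unit]
Description=MPEG-DASH stream for camera %i
After=network.target

[Service]
User=ffmpeg
# ffmpeg will not create the output directory itself; '+' runs this step as root
ExecStartPre=+/usr/bin/mkdir -p /srv/http/dash/%i
ExecStartPre=+/usr/bin/chown ffmpeg /srv/http/dash/%i
# %i expands to the instance name (e.g. camera01)
ExecStart=/usr/bin/ffmpeg -y -f v4l2 -video_size 1280x720 -framerate 25 \
    -i /dev/cameras/%i -vcodec h264_omx -keyint_min 0 -g 100 \
    -map 0:v -b:v 1000k \
    -f dash -min_seg_duration 4000 -use_template 1 -use_timeline 0 \
    -init_seg_name init-%i-$$RepresentationID$$.mp4 \
    -media_seg_name %i-$$RepresentationID$$-$$Number$$.mp4 \
    -remove_at_exit 1 -window_size 20 \
    /srv/http/dash/%i/%i.mpd
Restart=on-failure

[Install]
WantedBy=multi-user.target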

The ffmpeg args are quite confusing at first. To simplify, refer to this annotated command:

/usr/bin/ffmpeg \            # The path to ffmpeg
    -y \                     # Overwrite output files without asking
    -f v4l2 \                # Input format
    -video_size 1280x720 \   # Input video size
    -framerate 25 \          # Input framerate
    -i /dev/cameras/%i \     # Input device
    -vcodec h264_omx \       # Encoding codec
    -keyint_min 0 \          # Allow every frame to be a key frame
    -g 100 \                 # But at most every 100 frames will be a key frame
    -map 0:v \               # Map input stream 0 to the video of this stream
    -b:v 1000k \             # Set the bitrate to 1000k
    -f dash \                # Output format
    -min_seg_duration 4000 \ # Min segment duration (µs); cuts land on key frames, so ~4s with -g 100
    -use_template 1 \        # Use templated names for output
    -use_timeline 0 \        # Don't use the segment time in the template
    -init_seg_name \         # Initial segment name
        init-%i-$RepresentationID$.mp4 \
    -media_seg_name \        # Segment names
        %i-$RepresentationID$-$Number$.mp4 \
    -remove_at_exit 1 \      # Remove all files when stopping
    -window_size 20 \        # Keep 20 segments on disk
    /srv/http/dash/%i/%i.mpd # Dash manifest name

The %i in the service file will be substituted with the service's instance name. That is, if dash@camera01 is started, %i becomes camera01.

Start and enable the service with:

# systemctl enable dash@camera01
# systemctl start dash@camera01

Multiple output streams

To transcode to multiple output streams, such as one at 1280x720 and another at 640x360 to allow for adaptive quality and bandwidth, map the input a second time and use output stream specifiers (:v:0, :v:1) to give each stream its own bitrate and scaling:

/usr/bin/ffmpeg \            # The path to ffmpeg
    -y \                     # Overwrite output files without asking
    -f v4l2 \                # Input format
    -video_size 1280x720 \   # Input video size
    -framerate 25 \          # Input framerate
    -i /dev/cameras/%i \     # Input device
    -vcodec h264_omx \       # Encoding codec, applied to both output streams
    -keyint_min 0 \          # Allow every frame to be a key frame
    -g 100 \                 # But at most every 100 frames will be a key frame
    -map 0:v \               # First output stream: the full-size video
    -map 0:v \               # Second output stream: the scaled-down video
    -b:v:0 1000k \           # Bitrate for the first stream
    -b:v:1 500k \            # Bitrate for the second stream
    -filter:v:1 "scale=640:360" \ # Scale only the second stream down to 640x360
    -f dash \                # Output format
    -min_seg_duration 4000 \ # Min segment duration (µs); cuts land on key frames, so ~4s with -g 100
    -use_template 1 \        # Use templated names for output
    -use_timeline 0 \        # Don't use the segment time in the template
    -init_seg_name \         # Initial segment name
        init-%i-$RepresentationID$.mp4 \
    -media_seg_name \        # Segment names
        %i-$RepresentationID$-$Number$.mp4 \
    -remove_at_exit 1 \      # Remove all files when stopping
    -window_size 20 \        # Keep 20 segments on disk
    /srv/http/dash/%i/%i.mpd # Dash manifest name

Do note that there is only one hardware encoder on the RPI, and access to it is controlled by a mux that flips back and forth between the streams.

Playing in the browser

There are several options for watching the stream. In the browser, MediaSource Extensions are used to feed the content to a <video> element. The reference implementation is dash.js, but other options such as Shaka or clappr can play multiple formats in addition to MPEG-DASH. This tutorial uses Shaka since it's less code to get going, though clappr has a nicer default UI.

Create /srv/http/dash/camera01.html with the following content:

camera01.html
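
A sketch following the basic Shaka tutorial; the CDN path and pinned version are assumptions, so substitute a current release:

<!DOCTYPE html>
<html>
<head>
  <!-- Assumed CDN path/version; swap in a current Shaka release -->
  <script src="https://ajax.googleapis.com/ajax/libs/shaka-player/2.5.23/shaka-player.compiled.js"></script>
</head>
<body>
  <video id="video" width="1280" controls autoplay muted></video>
  <script>
    // Manifest written by the dash@camera01 service
    const manifestUri = 'camera01/camera01.mpd';

    async function initPlayer() {
      // Install polyfills for browsers with incomplete MSE support
      shaka.polyfill.installAll();
      if (!shaka.Player.isBrowserSupported()) {
        console.error('Browser not supported by Shaka');
        return;
      }
      const player = new shaka.Player(document.getElementById('video'));
      try {
        await player.load(manifestUri);
      } catch (e) {
        console.error('Error loading manifest', e);
      }
    }

    document.addEventListener('DOMContentLoaded', initPlayer);
  </script>
</body>
</html>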

This example is taken straight from the Shaka tutorial, modified only to load Shaka from the CDN and to point the manifest at our local camera01.mpd.

Browsing to http://<rpi_ip>/camera01.html should now show the live stream. Since the manifest is constantly being updated with the current segment times, refreshing the page will not start the video from the “beginning” but from the last segment time.
