Apple Streaming


HTTP Live Streaming (HLS) is a technology that allows Apple devices such as iPhones, iPads and iPod Touches, as well as Safari, to receive live video streams, given Apple's disapproval of Flash RTMP streaming.

Please note that at the time of writing (05/10/12) this is how it will work, but it hasn't been implemented just yet. It will be implemented as soon as the stream is back up, and this notice will be removed when implementation is complete.

Functional Overview

Broadly, video is ingested from some source (in our case the YSTV live stream), transcoded to an H.264 format and fed to a segmenter. The segmenter splits the incoming video into short chunks of a few seconds each, and generates an M3U8 playlist file listing the addresses of the currently active chunks. Were this a static, non-live video file, the transcoded video could be passed to the segmenter and all of the chunks generated at once; however, there are other, simpler ways to serve on-demand video, using HTML5 players. The segments and playlist file are published to a directory on a web server; the client reads the playlist file, downloads the chunks it lists and plays them back to back. Meanwhile, the playlist file is continually updated to remove old chunks and add new ones.
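
To make the playlist side of this concrete, a live M3U8 file is just a short text file that the segmenter rewrites as new chunks are produced. A sketch of what one might look like (filenames, durations and sequence numbers here are illustrative only, not taken from our setup):

#EXTM3U
#EXT-X-TARGETDURATION:15
#EXT-X-MEDIA-SEQUENCE:42
#EXTINF:15,
stream1-42.ts
#EXTINF:15,
stream1-43.ts
#EXTINF:15,
stream1-44.ts

The player re-fetches this file every few seconds; the absence of an #EXT-X-ENDLIST tag tells it the stream is live, so it keeps polling for newly listed chunks.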

Encoding

Ideally for HLS we would take a second copy of the live video from the Intensity Shuttle, or run a second server with another capture unit; however, the former does not appear to be possible, and the latter is prohibitively expensive.

Instead, ystvencode, which acts as the encode server for HLS (given its spare cycles most of the time), reads the high-quality internal stream (stream2) using rtmpdump and transcodes it with FFmpeg using the command below. This generates 512x288 H.264 encoded video, using the -re switch to have FFmpeg run as close to realtime as possible, and pipes it out to STDOUT. The segmenter, from https://github.com/johnf/m3u8-segmenter, receives the video stream via a pipe and segments it into chunks (15 seconds each, set by the -d option), with 3 active at a time. The complete command is:

rtmpdump -r "rtmp://ystvstrm1.york.ac.uk/live/stream2" -v |
ffmpeg -re -i - -f mpegts \
    -acodec libmp3lame -ar 48000 -ab 64k \
    -s 512x288 -vcodec libx264 -b:v 200k \
    -flags +loop -cmp +chroma -partitions +parti4x4+partp8x8+partb8x8 \
    -subq 5 -trellis 1 -refs 1 -coder 0 -me_range 16 \
    -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 \
    -bt 200k -maxrate 400k -bufsize 96k \
    -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 10 -qmax 51 -qdiff 4 \
    -level 30 -aspect 16:9 -g 30 -async 2 - |
./m3u8-segmenter -i - -d 15 -p iphone/stream1 -m iphone/stream1.m3u8 -u http://ystv.co.uk/static/

This runs in /usr/local/m3u8-segmenter, and the iphone directory is an NFS mount of the /data/webs/ystv/static/iphone folder on the webserver.
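
If the mount ever needs recreating, something along these lines should work (the <webserver> hostname is a placeholder; substitute the real machine):

mount -t nfs <webserver>:/data/webs/ystv/static/iphone /usr/local/m3u8-segmenter/iphone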

Clientside

For clients, a simple browser detection script checks whether the user is on an iPhone or similar device, and if so queries the database for an HLS stream name for that stream. If one is found, it is prepended with the site address (http://ystv.co.uk/ unless on a development server) and presented, in place of the usual live stream box, as a video tag with an image of the YSTV logo as its poster, linked to the M3U8 playlist. The user's device will normally render a play symbol over the top; tapping this opens the phone's native HLS player.
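
The resulting markup is roughly along these lines (the playlist URL follows from the segmenter configuration above, while the poster image path is illustrative only):

<video src="http://ystv.co.uk/static/iphone/stream1.m3u8" poster="/static/images/ystv-logo.png" controls></video>

When the device can play HLS natively, tapping the play overlay simply hands this URL to the built-in player, as described above.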