a different point of boo

june 2019

The world can be more interesting – more loving, more empathetic – when seen from a different point of view. That’s the thought that seeded this month’s hack project.

Meet Boo, my 10-month-old Golden Retriever.

Okay fine, he’s not quite that tiny now.

Wouldn’t it be fun to watch life through his eyes, streaming video and location in realtime to my friends and family?


The Hardware

We start with the GoPro. I had a HERO 4 Black on hand, but unfortunately, it wouldn’t do the job here for a couple reasons. First, it doesn’t support RTMP streaming. Second, it has virtually no video stabilization (neither EIS nor OIS). Without it, you’d feel like throwing up 30 seconds into Boo’s walk.

Luckily the HERO 7 Black has both of these things.

HERO 7 Black with EIS enabled

Next, I needed to attach the GoPro to Boo. GoPro’s got us covered with the Fetch Dog Harness, which should fit any dog from 20 to 120 pounds.

Now I wanted to make sure the GoPro had a way of streaming even when Boo was out of range of my iPhone. Any LTE mobile hotspot with good coverage will do the trick (GoPro’s 720p streaming runs between 3,500 and 4,000 kb/s). I used a Verizon one, and picked up a cheap case on Amazon to tether it to the GoPro harness.

Boo, all rigged up

Lastly, to stream realtime GPS telemetry, I turned to a Raspberry Pi Zero W (which – as an aside – seems like an absolute miracle of technology every time I use one) and Adafruit's GPS module. The Pi measured 66mm x 30mm x 5mm, making it super easy to mount to Boo's camera kit alongside a small battery to keep it running.


The Software

Moving on to the virtual side of things, I wanted to power this end-to-end via the server rack in my garage (that’s for another story). I'm sure you could skip a bunch of this using AWS/GCP, but that's no fun. 😋


Streaming Footage

I started with Node-Media-Server (nms), a simple and stable RTMP server implementation, running in a tmux session on its own Ubuntu Server VM. The idea here was to output an HLS stream anytime the GoPro went live, and to provide an API to query stream activity.

First, I updated nms's configuration to enable HLS remuxing, MP4 archiving of streamed content, and its internal API. (Note, you'll need to install ffmpeg if you haven't already.)

const NodeMediaServer = require('node-media-server');

const config = {
  rtmp: {
    port: 1935,
    chunk_size: 60000,
    gop_cache: true,
    ping: 30,
    ping_timeout: 300
  },
  http: {
    port: 8000,
    mediaroot: './media',
    allow_origin: '*'
  },
  https: {
    port: 8443,
    key: './privatekey.pem',
    cert: './certificate.pem',
  },
  trans: {
    ffmpeg: '/usr/bin/ffmpeg',
    tasks: [
      {
        app: 'APP_NAME',
        hls: true,
        hlsFlags: '[hls_time=2:hls_list_size=3:hls_flags=delete_segments]',
        dash: true,
        dashFlags: '[f=dash:window_size=3:extra_window_size=5]',
        mp4: true,
        mp4Flags: '[movflags=faststart]'
      }
    ]
  },
  auth: {
    api: true,
    api_user: 'USERNAME',
    api_pass: 'PASSWORD',
    play: false,
    publish: false,
    secret: 'SECRET'
  },
};

const nms = new NodeMediaServer(config);
nms.run();

Once running, I used ffmpeg on my local machine to start streaming a test clip to the server via RTMP.

ffmpeg -re -i test-clip.mp4 -c copy -f flv rtmp://SERVER_HOST/APP_NAME

Worked! The HLS stream became available at:

http://SERVER_HOST:8000/APP_NAME/index.m3u8


Processing for the Archives

For those who'd miss Boo live and want to catch up on his San Francisco strolls, I wanted to create an archive of streams that felt alive. This required writing a processing job that generated two artifacts from each nms recording: a short video thumbnail and a matching image thumbnail.

The job is also responsible for uploading the resulting files to a local file server, making them publicly accessible, and for writing the resulting metadata to a SQL database.

To create the video thumbnail, I first calculated the midpoint of the clip:

ffmpeg -i %s 2>&1 | grep Duration | awk '{print $2}' | tr -d , | awk -F ':' '{print ($3+$2*60+$1*3600)/2}'

Then I generated the 20-second-long version with no audio track and a conservative resolution and bitrate to keep it small:

ffmpeg -hide_banner -loglevel panic -ss MIDPOINT_DURATION -i VIDEO_PATH -an -t 20 -preset medium -b:v 1000k -vf scale=-2:360 VIDEO_THUMBNAIL_PATH

To create the image thumbnail, I captured the frame at the clip's midpoint. This creates a seamless transition to the video thumbnail for clients that load the video poster before autoplaying the video itself.

ffmpeg -hide_banner -loglevel panic -i VIDEO_PATH -vcodec mjpeg -vframes 1 -an -f rawvideo -ss MIDPOINT_DURATION IMAGE_THUMBNAIL_PATH
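Stitched together, the job is little more than a thin wrapper around these ffmpeg calls. Here's a rough sketch of what that wrapper could look like (paths are placeholders, the upload and database steps are elided, and ffprobe stands in for the grep/awk pipeline above):

import subprocess

def midpoint_seconds(video_path):
    # Half the clip's duration, in seconds, via ffprobe (equivalent to the
    # grep/awk pipeline above).
    duration = subprocess.check_output([
        'ffprobe', '-v', 'error', '-show_entries', 'format=duration',
        '-of', 'default=noprint_wrappers=1:nokey=1', video_path,
    ])
    return float(duration) / 2

def make_thumbnails(video_path, video_thumb_path, image_thumb_path):
    mid = str(midpoint_seconds(video_path))
    # 20-second, muted, downscaled video thumbnail.
    subprocess.check_call([
        'ffmpeg', '-hide_banner', '-loglevel', 'panic', '-ss', mid,
        '-i', video_path, '-an', '-t', '20', '-preset', 'medium',
        '-b:v', '1000k', '-vf', 'scale=-2:360', video_thumb_path,
    ])
    # Matching still frame from the same midpoint.
    subprocess.check_call([
        'ffmpeg', '-hide_banner', '-loglevel', 'panic', '-i', video_path,
        '-vcodec', 'mjpeg', '-vframes', '1', '-an', '-f', 'rawvideo',
        '-ss', mid, image_thumb_path,
    ])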

Moving on!


GPS Streaming

To power a little heads-up map display of Boo’s whereabouts over the live stream, I wrote a script powered by gpsd for the Raspberry Pi. The script connects to the same cellular hotspot as the GoPro, tests for internet connectivity, waits for a satellite fix, then broadcasts each GPS reading to a redis channel. Readings include latitude, longitude, speed, altitude, rate of ascent, heading, and more.
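Here's a minimal sketch of what that publisher loop could look like, using gpsd's Python client bindings and redis-py (the boo:gps channel name is just illustrative, and the connectivity checks are omitted):

import json
import time

import gps     # gpsd's Python client bindings
import redis

CHANNEL = 'boo:gps'  # illustrative channel name; the site subscribes to it

r = redis.Redis(host='SERVER_HOST', port=6379)
session = gps.gps(mode=gps.WATCH_ENABLE)

while True:
    report = session.next()
    # Only TPV (time-position-velocity) reports carry a fix; mode >= 2 means
    # gpsd has locked onto satellites.
    if report['class'] != 'TPV' or getattr(report, 'mode', 1) < 2:
        continue
    reading = {
        'lat': getattr(report, 'lat', None),
        'lon': getattr(report, 'lon', None),
        'speed': getattr(report, 'speed', None),   # meters/second
        'alt': getattr(report, 'alt', None),       # meters
        'climb': getattr(report, 'climb', None),   # rate of ascent
        'track': getattr(report, 'track', None),   # heading, degrees
        'time': getattr(report, 'time', None),
    }
    r.publish(CHANNEL, json.dumps(reading))
    time.sleep(1)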


Bringing It All Together

With the streaming server configured, I moved on to build the site where friends and family could watch Boo. Of course, any fun site deserves a fun domain. Enter boo.dog.

To power boo.dog, dokku serves as the Docker container orchestrator (and also manages other fun projects on the same server), gunicorn as the HTTP server, and flask as a wonderful micro web framework.

To start, I knew I wanted a bit of a magical experience where the site would automatically switch to the stream when active and fall back to a cover photo when offline. To achieve this, the frontend polls the nms API via a flask proxy endpoint. When the stream becomes active, the background flips to the HLS stream, and vice versa.
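Here's a minimal sketch of what that proxy could look like, assuming nms's /api/streams endpoint and reusing the placeholders from the config above (the /api/status route and its response shape are just illustrative):

import requests
from flask import Flask, jsonify

app = Flask(__name__)

# Assumed values, matching the placeholders in the nms config above.
NMS_API = 'http://SERVER_HOST:8000/api/streams'
NMS_AUTH = ('USERNAME', 'PASSWORD')

@app.route('/api/status')
def status():
    # Ask nms which streams are currently publishing and report whether
    # anything is live under APP_NAME.
    streams = requests.get(NMS_API, auth=NMS_AUTH, timeout=2).json()
    return jsonify({'live': bool(streams.get('APP_NAME'))})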

That same proxy endpoint is also responsible for keeping track of active viewers. Every time it’s called, I set a low-TTL key in redis with a unique device fingerprint. When a viewer leaves, their key expires and they fall off the count. And that heads-up map display? It's using server-sent events via flask-sse to receive updates from the previously-mentioned redis channel, moving the marker with every GPS reading.
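Continuing the sketch above, the viewer counting and the redis-to-SSE bridge could look something like this (the key prefix, TTL, fingerprint handling, and channel name are all illustrative):

import json

import redis
from flask_sse import sse

r = redis.Redis()

# flask-sse publishes over redis; browsers subscribe at /events.
app.config['REDIS_URL'] = 'redis://localhost'
app.register_blueprint(sse, url_prefix='/events')

def track_viewer(fingerprint):
    # Called from the proxy endpoint: one short-lived key per device
    # fingerprint. When a viewer stops polling, the key expires and they
    # fall off the count.
    r.set(f'viewer:{fingerprint}', 1, ex=30)

def viewer_count():
    return sum(1 for _ in r.scan_iter('viewer:*'))

def relay_gps():
    # Small background worker: forward readings from the Pi's redis channel
    # to connected browsers as server-sent events.
    pubsub = r.pubsub()
    pubsub.subscribe('boo:gps')
    with app.app_context():
        for message in pubsub.listen():
            if message['type'] == 'message':
                sse.publish(json.loads(message['data']), type='gps')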

Making the HLS stream work without any user interaction required a few extra tweaks for mobile devices. WebKit on iOS will not autoplay any HTML5 <video> source with an audio track unless it's explicitly muted. In addition, up until the release of iOS 10, <video> elements could not play inline. Instead, they were required to go fullscreen with iOS playback controls. For devices running iOS 10 or above, adding playsinline as a <video> attribute allowed me to avoid fullscreen playback and retain my UI and media controls.


For the Future

And that about wraps it for v1 of this project, though I’m already itching to build its next feature: SMS notifications to replace blasting family and friends one-by-one over iMessage when Boo starts streaming. I'm sure they'd love to sign up.


Try catching Boo live at boo.dog.
