
If you’ve been following my blog post series on the development of my ever so useful cat cam, powered by a Raspberry Pi, you’ll know I’ve made several attempts at a more stable and scalable streaming solution for my Cat Cam. As it stands today, I’ve been using Motion. While it’s a decent tool, bandwidth has been my primary concern, and I’d like to be able to stream in real time without sucking up what measly bits my ISP gives me if more than a few folks decide to show interest.

So far we’ve tried ffmpeg => ffserver and that turned out exactly how you probably thought it would. Next, I tried swapping ffserver with an Nginx-powered RTMP server. While not an entirely fruitless endeavor, there were some blockages that I just couldn’t get past.

I received a suggestion from a colleague to fire up the Raspberry Pi’s hardware encoder/decoder. Up until yesterday, I didn’t know this was a thing. Shame on me for not looking into it. So that’s what we’re going to cover in tonight’s post: taking some of what we learned from our first RTMP attempt and making the hardware do all the work. With any luck, we should see some real perf gains, possibly enough for live streams to start instantly (which would make web players happy).

Since I felt like including it here would deviate from the purpose of this post too much, I wrote up how to Add RTMP Support to Nginx if you installed it via apt-get like me. If you’re in that boat, take a moment to read over that post then come back to this one.

Setting up ffmpeg to use hardware H.264 encoding used to be a fat challenge, but they’ve since added support to the official codebase. If you followed my original ffmpeg post, you’ll have a recent enough version that includes this code, but we’ll still need to compile it.

What we’re looking for this time is the OpenMAX IL (Integration Layer) acceleration module.
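To give a sense of what that build looks like, here’s a rough sketch of the configure step. I’m carrying the x264 flags over from the earlier build, and the exact flag set you need may vary with your ffmpeg version, so treat this as a starting point rather than gospel:

```shell
# Build ffmpeg with the OpenMAX IL hardware encoder enabled.
# --enable-omx-rpi pulls in the Raspberry Pi specifics;
# --enable-mmal adds hardware *decoding* as a bonus.
cd ffmpeg
./configure --enable-gpl --enable-libx264 \
            --enable-omx --enable-omx-rpi \
            --enable-mmal
make -j4
make install
ldconfig

# Afterwards the hardware encoder should show up as h264_omx:
ffmpeg -encoders | grep omx
```

If `h264_omx` appears in that last grep, the hardware encoder is available, and you can swap it in wherever you were using `libx264` in your streaming command.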

I confirmed VLC is able to play the stream, which is excellent, and there are no lag or jitter issues. It’s about 10-15 seconds behind live, which is totally fine.

I was able to set up an HTML5 player using tools from Bitmovin. I’m not entirely happy with this setup, though, as the player isn’t free and only HLS is supported right now. In my next post I’ll cover a new idea that came to mind when looking into the coolness of Ruby on Rails 5: WebSockets.
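For reference, serving HLS out of the nginx-rtmp-module looks roughly like this. The application name, path, and fragment lengths below are placeholders, not my actual config:

```nginx
rtmp {
    server {
        listen 1935;
        application live {
            live on;
            # write HLS segments alongside the RTMP relay
            hls on;
            hls_path /tmp/hls;
            hls_fragment 3s;
            hls_playlist_length 30s;
        }
    }
}
```

The resulting `.m3u8` playlist under `hls_path` is what an HLS-capable HTML5 player actually loads.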

Update July 11, 2017: @HPGMiskin pointed out that libomxil-bellagio-bin is not a thing. I’ve pulled it from the optional step for missing OpenMAX headers.

This is a follow-up to the article I wrote about trying to get ffmpeg + ffserver running the Cat Cam. I abandoned that project and went in search of a new solution. What I came up with was ffmpeg + nginx. Here’s how that worked out.

After a night of streaming failure, I decided I’d give a shot at using ffmpeg to stream to an RTMP server via nginx. RTMP servers are generally pretty basic in that they just relay what they receive to whoever connects. This seems like a pretty straightforward process, from what I can tell. The hardest part would be to get the nginx source bits and compile them with the nginx-rtmp-module. Here’s how we’ll do that.

Quick Side Note

Before we begin: I found out during this process that I never compiled ffmpeg with H.264 support. If you didn’t either, let’s sidetrack for a moment. Run this to find out:

ffmpeg -encoders | grep 264

If H.264 isn’t on the list, then let’s re-compile:

./configure --enable-gpl --enable-libx264
make
make install
ldconfig

NOTE: If you get an error saying it can’t find the library:

ERROR: libx264 not found

then you’ll need to install the missing libraries first:

apt-get install yasm libvpx-dev libx264-dev

Once that’s done, verify ffmpeg has H.264 support and let’s move on.

ffmpeg -encoders | grep 264

Nginx

Compile & Install

Since we’re getting the generic nginx from source, we’ll need to make sure some libraries are installed. You can always compile nginx without them, but that’s more work, in my opinion, and could lead to problems later. You might not need all of these, but the Linux system I’m working on was missing most of them; never hurts to share.

apt-get update
apt-get install libpcre3 libpcre3-dev libssl-dev

We’ll need the nginx source. Pick it up here. I used nginx-1.10.1. I’m a fan of newer versions when possible, and it’s been out for a month now, so I suspect it’s stable.
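The build itself is short once the source is down. Here’s a sketch, assuming you grab the RTMP module from its GitHub repo and keep the default install paths; adjust the version number to whatever you downloaded:

```shell
# fetch and unpack the nginx source
wget http://nginx.org/download/nginx-1.10.1.tar.gz
tar -xzf nginx-1.10.1.tar.gz

# the RTMP module lives on GitHub
git clone https://github.com/arut/nginx-rtmp-module.git

# compile nginx with the module baked in
cd nginx-1.10.1
./configure --with-http_ssl_module --add-module=../nginx-rtmp-module
make
make install
```

From there, an `rtmp { }` block in nginx.conf with a `live on;` application is all it takes to start relaying.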

Now that we have that set up, let’s get a player. I’m opting for JW Player. You can pay money for it if you want, but you’re really just paying for their service; the player files are 100% free. This is their official site, but you can also snag it from GitHub.

How you want to implement it is up to you.

Setting Up ffmpeg

Let’s give this another go. The command you can use here is a little more complicated, as we’ll need to stream a legitimate video, but here’s the idea:
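A minimal sketch of the kind of command I mean, using a file as input so we have legitimate video; the input filename, server address, and stream name are all placeholders:

```shell
# -re reads the input at its native frame rate, so we stream
# in real time instead of as fast as the Pi can push bytes
ffmpeg -re -i test-video.mp4 \
       -c:v libx264 -preset veryfast -tune zerolatency \
       -c:a aac -b:a 128k \
       -f flv rtmp://localhost/live/catcam
```

The `-f flv` bit matters: RTMP carries FLV, so that’s the container the server expects regardless of what the input was.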

We’re Streaming!

At first I was super excited that it started working, but a problem arose pretty quickly: it’s streaming way behind real time. After about ten minutes, the speed was near 1x, but still not quite there, and I suspect the buffer was quite full. VLC takes about 30 seconds to start playing the video.

At least it’s sorta working, right?

My JW player isn’t able to load the stream, so some tweaks will be needed.

If only…

The RPi 3 isn’t strong enough to live-encode H.264. I’d bet lots of dollars that if I had a stronger piece of hardware to work with, it could do it, and this wouldn’t be an issue. I might see what would happen if I used my MacBook as a test.

I really wish I could embed this into my site. JW Player isn’t ever going to be happy with the stream the way it is. A thought that came to mind is going back to Motion, using my streaming server to capture the MJPEG stream with ffmpeg, then relaying it to the RTMP server in a better format.
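If I go that route, the relay would look something like this. The port is Motion’s default stream port and the stream name is a placeholder; I haven’t actually tried this yet:

```shell
# pull Motion's MJPEG stream over HTTP and re-encode it
# into H.264/FLV for the RTMP server
ffmpeg -f mjpeg -i http://localhost:8081/ \
       -c:v libx264 -preset veryfast -tune zerolatency \
       -f flv rtmp://localhost/live/catcam
```

The appeal is that the Pi keeps doing what Motion is already good at, and a beefier box eats the re-encoding cost.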