Quick update to the home theater system. I installed the Max with the Silverjaw Lure (which adds mPCIe and mSATA). To attach the lights, I also needed a level shifter to take the Max's 3.3V signals up to the 5V required by the APA102s. Adafruit had just the component for the job: the 74AHCT125.

Here’s the panel with the max in there:

The results are pretty awesome. The APA102s perform much better than the WS2801s: no flicker, fast updates, and most of all, more lights!

Next up is adding some buttons to turn the thing off when needed, plus USB capabilities and a remote.

My room TV is working great. I've got the MinnowBoard MAX hooked up to some WS2801 LEDs and acting as a DLNA renderer. Since my last post on it, I've also paired up my PS3 SIXAXIS controller and have been using it as a mouse. I have other cool ideas planned for it, but before those are finished, I wanted to get a Max-based system on my other TV and work it into a home theater setup.

First thing I had to do was get my TV off the floor. The base for the TV broke, so when we moved we decided to mount it on the wall. We found an "articulating" wall mount that looked like it would meet our needs:

What I like about this mount is that it has an open area on the base plate, on the right and left, that's perfect for a double-gang box. So I grabbed one of these:

It fit very nicely in the open area:

I think it looks really clean, especially after I zip-tie up those cables!

I also wanted to add some ambilights to this setup like I did for my room TV. However, I wanted to do something better. Enter the APA102.

The APA102 is similar to the older WS2801 in that it's an individually addressable LED strip that supports 24-bit color. But that's about where the similarities end. The WS2801, shown on the right in the image below, is a larger chip that takes up valuable space, and you can typically only find WS2801 strips with 32 LEDs/meter. The APA102 (shown on the left), however, has the IC built right into the LED. This allows up to 144 LEDs/meter.

The protocol is very similar to the WS2801's. You send an array of bytes in 32-bit segments. The first 32-bit segment (the start frame) must be all 0s. In each LED frame that follows, the first byte sets brightness, usually all 1s for full brightness, and the remaining 3 bytes are the color data. I updated my Python light library to support the APA102 in only a few minutes. Here's basically the meat of the code:

data = bytearray()

self.spiDev.write(data)

The fun thing was discovering that it uses GBR color format.
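Filled out a bit, the frame construction looks something like this. It's a minimal sketch rather than my actual library code; the function name and the fixed 4-byte end frame are illustrative choices, while the zero start frame, brightness byte, and GBR ordering come straight from the description above.

```python
def apa102_frame(pixels, brightness=31):
    """Build the APA102 byte stream for a list of (r, g, b) tuples."""
    data = bytearray(4)  # start frame: 32 bits of zeros
    for r, g, b in pixels:
        # each LED frame starts with 0b111 marker bits plus 5 bits of brightness;
        # brightness=31 makes the byte all 1s, matching the text above
        data.append(0b11100000 | (brightness & 0x1F))
        data.extend((g, b, r))  # this strip wants GBR order
    # end frame: extra clock pulses so the last LEDs latch their data
    data.extend(b"\xff" * 4)
    return data

# then, as in the snippet above: self.spiDev.write(data)
```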

Mounting the LEDs

Instead of mounting the LEDs on the back of the TV, I'll be attaching them to the wall. I'm using the same aluminum brackets I used with my last LEDs. However, this time I'm going with a flat black strip at 60 LEDs/meter, 4 meters of it for a grand total of 240 LEDs. To power them all, I need about 80W @ 5V. This turned out to be a problem: the only 5V power brick I could find at first was 50W. Not good enough. After some more searching, I found this on Amazon:
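The 80W figure follows from a quick back-of-the-envelope calculation. The 60 mA per-LED draw at full white is a common datasheet figure (an assumption on my part, not stated in the post), and the raw number gets rounded up for headroom:

```python
leds = 240
amps_per_led = 0.060                # typical full-white draw per LED
total_amps = leds * amps_per_led    # 14.4 A total
watts = total_amps * 5              # at 5 V: about 72 W, rounded up to 80 W
print(watts)
```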

This can supply up to 100W @ 5V, which should be more than enough. The problem is… where am I going to put this? It's too large to fit in a double or triple gang box. And if I run a cable through the wall for this, I'm guaranteed to lose a bit of voltage (4-8% depending on which wire I use). I also need a place to put the Max. After a trip to Home Depot, I found what I needed: a "telecommunications" box, 14″ by 14″, that fits and mounts between two studs. Turns out this was almost perfect. I cut a hole in the laundry room wall, right behind the TV, to the size of the box. One thing I didn't plan well was that the double-gang box for the TV prevented both from being in the same spot. To fix this, I just cut a large hole in the box to let the gang box come through. This ends up being advantageous because it's much easier to wire and rewire from inside the box instead of removing the outlet.
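The 4-8% estimate is just Ohm's law. Here's the arithmetic for one plausible case; the 2 m run length and 12 AWG gauge are my assumptions for illustration, not figures from the build:

```python
current = 14.4         # full-white load in amps
length_m = 2 * 2       # 2 m out and 2 m back
ohms_per_m = 0.00521   # 12 AWG copper, approximate resistance per meter
drop = current * length_m * ohms_per_m  # volts lost in the wire
pct = drop / 5.0 * 100                  # as a fraction of the 5 V supply
print(round(pct, 1))
```

A thinner wire (higher ohms per meter) pushes the loss toward the top of that 4-8% range, which is why the gauge matters for an in-wall run.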

I have plenty of room to mount the Max in the middle. There's also a nice-looking cover that screws shut to protect the insides from little hands:

That’s it for this part. In the next part, I’ll continue the adventure of building a home theater, including mounting the LEDs, the Max, and the speaker system.

In this video, I put some more LEDs around my TV. I already had a strip of LEDs at the bottom; this time, I added them to the remaining sides. I bought the aluminum angle brackets at Home Depot for about $5, then drilled 1/2″ holes and glued in the LEDs. The results are pretty nice.

The MinnowBoard MAX is a pretty cool platform. It’s small (just a little larger than a credit card), power efficient, and best of all: powerful. It uses a single- or dual-core Intel Atom with Hyper-Threading. The best part about it, however, may be the integrated graphics with open source accelerated drivers.

Because the drivers are open source, you can expect them to generally “just work” on a typical Linux distro. No extra EULA or compiling necessary like on other embedded systems.

The Max’s Intel HD graphics also supports OpenCL 1.2 via the open source Beignet project. OpenCL allows you to offload otherwise CPU-intensive computations onto the GPU, which is specialized for specific tasks. Having OpenCL available in an embedded system opens up a lot of possibilities, including image processing via the open source OpenCV (Computer Vision) project. I will be using all of these components in this project.

Goal: To create a DLNA renderer that uses an LED strip to display an ambient light which correlates to the image on the screen. There are several projects out there that do this: boblight and hyperion are a few. In an effort to teach myself some new skills, I opted not to use any of these projects and instead start from scratch with an architecture that could utilize the CPU power the Max provides. I believe this exercise has created something simple, yet unique.

Components of the system:

OpenCV for image analysis

Gstreamer to play the video

Beignet for OpenCL/GPU offloading

Rygel for DLNA renderer support

Vaapi for hardware accelerated decoding/encoding

MRAA for accessing IO on the Max

MaxVideoRenderer

Python – the language

Ubuntu 15.04

Hardware:

Minnowboard Max

LED Strip with the WS2801 IC (google for LED strip and WS2801 and you’ll find dozens of options that aren’t very expensive)

Aluminum right-angle bracket I got from Home Depot for $2

Double-sided heavy duty 3M tape.
(More about hardware in Part 2!)
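The heart of the goal above is simple: sample each border region of the decoded frame, average it, and send that color to the matching LED. Here is a minimal sketch of the sampling step; the function name, zone sizes, and the single-edge simplification are my own, and the real pipeline would wrap all four sides of the screen:

```python
import numpy as np

def edge_colors(frame, num_leds):
    """Average colors along the bottom edge of an H x W x 3 RGB frame.

    Returns one (r, g, b) tuple per LED, walking left to right.
    """
    h, w, _ = frame.shape
    strip = frame[h - max(h // 8, 1):, :, :]         # bottom slice of the image
    zones = np.array_split(strip, num_leds, axis=1)  # one zone per LED
    return [tuple(int(c) for c in z.reshape(-1, 3).mean(axis=0))
            for z in zones]
```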

OpenCV

OpenCV is a library for computer vision. It’s used for object recognition and detection, and it has a lot of image manipulation routines that can take advantage of hardware acceleration where available. OpenCV 3.0, now in beta, features transparent OpenCL usage. In the 2.4 days, you had to use special OpenCL-specific function calls to take advantage of OpenCL. In 3.0, these have been unified into the same calls; the underlying OpenCV system decides whether it can use OpenCL on the GPU or not.
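Once everything is built, a quick way to check whether the transparent OpenCL path is active is OpenCV 3.0's `cv2.ocl` module. The import guard is just so the snippet runs on machines without OpenCV installed:

```python
try:
    import cv2
    cv2.ocl.setUseOpenCL(True)       # ask OpenCV to prefer OpenCL
    have_ocl = cv2.ocl.haveOpenCL()  # True if an OpenCL runtime (e.g. Beignet) was found
    print("OpenCL available:", have_ocl, "- in use:", cv2.ocl.useOpenCL())
except ImportError:
    have_ocl = None  # OpenCV isn't installed on this machine
```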

Ubuntu 15.04 doesn’t have OpenCV 3.0, so we will have to build it from source. First, let’s get the dependencies going.
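The exact commands appear to have been lost in formatting. Something along these lines gets the build dependencies, the sources, and a configured build directory; the package names are typical for an Ubuntu 15.04 OpenCV build (and the repository URL is the current one) so they may need adjusting:

```shell
sudo apt-get install build-essential cmake cmake-qt-gui git pkg-config \
    python-dev python-numpy libgtk2.0-dev \
    libavcodec-dev libavformat-dev libswscale-dev
git clone https://github.com/opencv/opencv.git
cd opencv && mkdir build && cd build
cmake-gui ..
```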

These commands will have brought you to the CMake GUI. Click Configure to generate Unix-style makefiles, and make sure you enable Python and the Python examples. After configuring, look at the output to make sure the Python module was enabled. If it wasn’t, look for clues in the output as to what was missing.

Tip: To make compiling faster and to eliminate errors, I usually turn off the opencv_java module in cmake.

Type “make -j5”, then get yourself a drink and maybe something to eat; OpenCV takes a while to compile. After make is done, run “sudo make install” to install OpenCV.

Beignet

Beignet is an open source project that provides OpenCL support for Intel graphics platforms. It supports the MinnowBoard MAX as well as Core “i” platforms. Ubuntu 15.04 already has version 1.0.1 in the repository, which will work wonderfully for our needs:

sudo apt-get install beignet ocl-icd-libopencl1 ocl-icd-dev

Gstreamer and Vaapi

Gstreamer is a powerful media framework that supports decoding and encoding of numerous media types. It has a plugin framework where you can combine several “elements” into a “pipeline”. We will use this framework with our own customized and optimized pipeline. Ubuntu comes with Gstreamer 1.0 by default, but we need a few extra packages for rygel and for Vaapi support:
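The package list itself seems to have been dropped from the post. These are plausible candidates for the Vaapi plugin and the GObject introspection data that a Python rygel renderer needs; verify the names against your Ubuntu release:

```shell
sudo apt-get install gstreamer1.0-vaapi gstreamer1.0-plugins-good \
    gir1.2-gstreamer-1.0 gir1.2-gst-plugins-base-1.0
```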

Rygel is a DLNA framework for serving and rendering DLNA content. Ubuntu has a slightly older version of rygel that doesn’t have python bindings enabled. Further, upstream rygel does not yet have python bindings for the gstreamer renderer library. I created a patch to be merged upstream that enables the bindings. So for now, we’ll use my github fork until the patch is merged upstream.

git clone https://github.com/tripzero/rygel.git

Next, let’s get the build dependencies:

sudo apt-get build-dep rygel

sudo apt-get install python-gi libgirepository1.0-dev

We also need to grab libmediaart from GitHub:

git clone https://github.com/GNOME/libmediaart.git

cd libmediaart

./autogen.sh --enable-introspection=yes

make -j5

sudo make install

Build Rygel:

cd rygel

./autogen.sh --enable-introspection=yes

make -j5

sudo make install

If everything compiled and installed, we can now test rygel out. I use BubbleUPnP on my Android phone to control DLNA renderers. It also allows me to play content from my phone. There are probably DLNA apps for other platforms; look around and find the one that’s best for you.

To run the example rygel renderer, navigate to rygel/examples/gi and run “python example-gst-renderer.py”. Note that you may have to edit the network interface, which is hardcoded to “eth1” at the time of this writing, to whichever interface on your system has an active connection. When I run this, I see some output about deprecated “SOUP” calls; this usually indicates that it’s working. I can now launch BubbleUPnP on my phone and select “rygel gst renderer” from the renderers list.
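If you'd rather not hardcode the interface at all, a small helper along these lines picks the first non-loopback interface. This is my own sketch, not part of the rygel example, and `socket.if_nameindex()` requires Python 3.3+ on Linux:

```python
import socket

def first_non_loopback():
    """Return the name of the first network interface that isn't 'lo'."""
    for _, name in socket.if_nameindex():  # (index, name) pairs on Linux
        if name != "lo":
            return name
    return "lo"  # fall back to loopback if nothing else exists

iface = first_non_loopback()
print(iface)
```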

MRAA

MRAA is a library for accessing IO on various devices including the Max, Raspberry Pi, Intel Edison, and others. It has C++ and Python bindings and is pretty easy to use. It supports SPI, I2C, GPIO, PWM, and analog IO.
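As a taste of the Python bindings, here is roughly how the WS2801 strip gets driven over SPI. The bus number and clock speed are assumptions for the Max, and the import guard just lets the snippet run on a machine without mraa installed:

```python
try:
    import mraa
except ImportError:
    mraa = None  # not on the Max / mraa not installed

def open_spi(bus=0, hz=1_000_000):
    """Open an SPI device for the LED strip (bus 0 is an assumption)."""
    if mraa is None:
        return None
    dev = mraa.Spi(bus)
    dev.frequency(hz)  # the WS2801 is comfortable around 1 MHz
    return dev

# the WS2801 takes raw color bytes, 3 per LED, clocked out over SPI
frame = bytearray([255, 0, 0] * 10)  # 10 red pixels
spi = open_spi()
if spi is not None:
    spi.write(frame)
```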