How to access TinyRex peripherals (Examples)

On this page, you will find examples of how to use the peripherals on TinyRex.

HDMI Video Input

Setup

This tutorial assumes you are using an i.MX6 TinyRex with a Tiny Baseboard that includes an HDMI input and an ADV7610 HDMI decoder. You’ll also need an SD card image that includes gstreamer-0.10 and the Freescale Gstreamer plugins. If you are not sure what I am talking about, simply follow How to start with TinyRex YOCTO and run “bitbake fsl-image-gui” (fsl-image-multimedia should also work).

Drivers

Once your i.MX6 TinyRex is up and running, you’ll need to insert a kernel module. To do so, use the command:

modprobe mxc_v4l2_capture

You should also ensure that the ADV7610 was recognized and that its kernel module was loaded.
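One way to check this (a sketch; the exact probe message and module names can differ between BSP versions) is to search the kernel log and the loaded-module list:

```shell
# look for the ADV7610 probe message in the kernel log and confirm
# the capture module is loaded (names may differ on your BSP)
dmesg | grep -i adv761
lsmod | grep mxc_v4l2_capture
```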

Gstreamer

Freescale heavily uses Gstreamer in both its BSP and its example code. Gstreamer is a very handy way to create video “pipelines”. A pipeline consists of elements put together in a particular order to convert video or audio data to encoded and/or packaged data, or vice versa. In our case, we want to take video input from the ADV7610 (connected to the i.MX6’s CSI input), convert or encode it, and display it on the HDMI output or transport it over a network. A standard Gstreamer h.264 encoding pipeline chains a capture source, a hardware encoder, a payloader and a sink.
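The stages of such a pipeline can be sketched like this (using the Freescale gstreamer-0.10 element names that appear later on this page):

```
capture source  ->  encoder  ->  payloader   ->  network sink
 (imxv4l2src)      (vpuenc)    (rtph264pay)      (udpsink)
```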

Gstreamer allows you to create these pipelines quickly and easily from the command line using the gst-launch utility and various Gstreamer plugins that make up the elements of the pipeline. To get a list of available plugins, use the gst-inspect command. You can then use gst-inspect on a particular plugin to get more details, e.g.:

gst-inspect imxv4l2src

For a much more detailed discussion of Gstreamer, see the official gst-launch documentation and tutorials.

On the output side, there are two available output sinks: imxv4l2sink and mfw_isink. As the name implies, imxv4l2sink uses the v4l2 library to output to the first framebuffer device. Alternatively, mfw_isink writes data directly to the framebuffer, skipping v4l2. It also does other cool things like output to multiple framebuffers simultaneously, and its latency is much less than imxv4l2sink. However, it does require more configuration and can be difficult to work with depending on the window manager being used. Generally, the best output sink to use is autovideosink, which will automatically select the element for you.

So, connect an HDMI source (like your computer, a GoPro, or a Blu-ray player) to the HDMI input on your TinyRex baseboard, and the HDMI output to a monitor. To display the HDMI input on your monitor, run the pipeline:

gst-launch imxv4l2src ! autovideosink

Example Gstreamer Pipelines: HDMI input -> encoder -> network

If you want to send video over a network, you will need to encode and payload it first. As an example, let’s use Freescale’s vpuenc plugin, which can use the i.MX6’s hardware encoding engine (the VPU) to encode video into MPEG4, MPEG, h.263 and h.264 formats. To get the video into a form that video players and other Gstreamer clients can understand, we’ll payload the h.264-encoded video using RTP and the rtph264pay plugin. Finally, we’ll send the video over UDP to a client on port 5550.
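Putting that together, the encoding pipeline might look like the following (a sketch; the vpuenc property names and units should be confirmed with gst-inspect vpuenc on your image, and 192.168.1.1 stands in for your receiver’s address):

```shell
# HDMI input -> VPU h.264 encode at ~4 Mbps -> RTP payload -> UDP
# (property names assumed from the gstreamer-0.10 Freescale plugin)
gst-launch imxv4l2src ! vpuenc codec=avc bitrate=4000000 ! \
    rtph264pay ! udpsink host=192.168.1.1 port=5550 -v
```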

Note that I’ve added arguments to the vpuenc and udpsink elements. Specifically, we use the “avc” codec for vpuenc (telling it to use h.264) at a fixed bitrate of 4 Mbps. Additionally, we tell the udpsink to send the data via UDP to 192.168.1.1 on port 5550.

You’ll also notice the -v argument on the end (verbose). This tells gst-launch to print a bunch of additional information, including the “capabilities” of each element in the pipeline. These capabilities are important: they tell other elements about the stream, like width/height and the encoded stream type. In order to display the video in a receiving pipeline, we need to tell the receiver pipeline what the capabilities of the stream are. Specifically, we need the capabilities reported at the udpsink element, which appear in the -v output.
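The caps line of interest looks roughly like the following (illustrative only; the sprop-parameter-sets value is specific to each stream and must be copied from your own -v output, so the “...” placeholder is left unfilled here):

```
application/x-rtp, media=(string)video, clock-rate=(int)90000,
encoding-name=(string)H264, sprop-parameter-sets=(string)"...",
payload=(int)96
```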

Example Gstreamer Pipelines: Network -> decoder -> HDMI output

We can now create a receiving pipeline, either on your computer or on another i.MX6 TinyRex. Let’s use the example of another i.MX6 TinyRex, where we use the vpudec and rtph264depay elements to again leverage the i.MX6’s hardware h.264 decoder to decode the video and display it on the HDMI output. Note: we need to use the caps from the encoding pipeline in the udpsrc stream to tell the receiving pipeline about the capabilities of the stream. Do this by setting the “caps” property of the udpsrc to the same properties, copied and pasted from above and put in double-quotes:
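Such a receiving pipeline might look like this (a sketch; replace the "..." placeholder with the sprop-parameter-sets value printed by your sender’s -v output):

```shell
# UDP in -> RTP depayload -> VPU h.264 decode -> HDMI out
gst-launch udpsrc port=5550 \
    caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"...\", payload=(int)96" ! \
    rtph264depay ! vpudec low-latency=true ! autovideosink
```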

The vpudec automatically knows what type of stream to decode from the capabilities passed to it from rtph264depay, but I do add the low-latency=true argument to have it produce frames as soon as they are decoded. The autovideosink is again used to output on the HDMI.

Example: Decoding with VLC on your computer

VLC is also capable of decoding and displaying video streamed to it from the i.MX6. However, much like we had to tell the receiving pipeline about the capabilities of the stream, we need to do the same for VLC through the use of an .sdp file.
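An example .sdp file for this stream might look like the following (a sketch; the address, port and sprop-parameter-sets must match your own pipeline, and the sprop value is copied from the sender’s caps, so the “...” is left as a placeholder):

```
v=0
o=- 0 0 IN IP4 127.0.0.1
s=i.MX6 h.264 stream
c=IN IP4 127.0.0.1
m=video 5550 RTP/AVP 96
a=rtpmap:96 H264/90000
a=fmtp:96 sprop-parameter-sets=...
```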

This .sdp file is compatible with the above pipeline. Specifically, you’ll need to set the port (5550) as well as the sprop-parameter-sets from the capabilities of the stream. It is also important to specify that this is an RTP h.264 stream. To view the video, just point the encoding pipeline at your computer’s IP address and open the .sdp file in VLC. You should see video after a few seconds.

UART Serial Port

We can use a basic Python script to test the UART ports on the TinyRex board.
Connect the FTDI cable to the console debug port on the TinyRex board and open your favourite serial terminal (e.g. PuTTY or TeraTerm for Windows, CoolTerm for OS X), or alternatively SSH into the TinyRex with your favourite SSH client.

Log in to the TinyRex; the default username is ‘root’ with no password.

In this example we will use 9600 bps 8-N-1 on UART2. We can create a Python script using the nano text editor.
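A minimal script might look like the following (a sketch, assuming the pyserial package is present on the image and that UART2 appears as /dev/ttymxc1 — on i.MX6, UART n is usually /dev/ttymxc(n-1), but check your board):

```python
# uart_test.py -- open UART2 at 9600 bps, 8-N-1
# (the device node /dev/ttymxc1 is an assumption; adjust as needed)
import serial  # pyserial

ser = serial.Serial('/dev/ttymxc1', baudrate=9600,
                    bytesize=8, parity='N', stopbits=1, timeout=1)
```

Run it with “python -i uart_test.py” so the port stays open at an interactive prompt; ser.write('test') transmits characters and ser.read(16) returns whatever was received.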

You should now see the Python prompt >>>
Type some characters and they should be transmitted; you should also see received characters in the terminal.
You have successfully tested the UART ports on the TinyRex board.

GPIO

As an example we will use the baseboard USER LED. First, check which CPU pin is used to control the LED. Have a look into the schematic and you will find that the baseboard LED is connected to GPIO3_29.

Use this equation to get the magic number we need: MagicNumber = (X - 1)*32 + YY, where the pin is named GPIOX_YY. In our case: MagicNumber = (3 - 1)*32 + 29 = 93.
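The calculation and the LED control can be sketched like this (assuming the legacy sysfs GPIO interface is enabled in your kernel; run the sysfs commands as root on the board):

```shell
# compute the sysfs number for GPIO3_29: (bank - 1) * 32 + pin
BANK=3
PIN=29
MAGIC=$(( (BANK - 1) * 32 + PIN ))
echo "$MAGIC"   # prints 93

# with that number, the LED can be driven through sysfs
# (these paths appear only after the pin has been exported):
#   echo 93  > /sys/class/gpio/export
#   echo out > /sys/class/gpio/gpio93/direction
#   echo 1   > /sys/class/gpio/gpio93/value   # LED on
#   echo 0   > /sys/class/gpio/gpio93/value   # LED off
```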