Wednesday, December 26, 2012

This is a custom QQuickItem subclass rendering hardware-decoded H.264 1080p video on the Raspberry Pi. It builds on the egl_render component used in some previous posts. The QML component plays nicely with the rest of the scene rendered by the Qt Quick scene graph renderer.

The code will be available when I find the time to clean it up, but it is a direct consequence of the previous posts, which include the code implementing decoding and rendering.
In the QML test code, I created some simple standard QML animations and also placed an Image element with 0.5 opacity overlapping the video element.
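As a sketch of the approach (the class and helper names below are hypothetical, not taken from the actual sources), such an item can subclass QQuickItem and hand the scene graph a texture node wrapping the GL texture that egl_render fills:

```cpp
// Hypothetical sketch of a QQuickItem displaying an externally
// updated OpenGL texture (e.g. one filled by OpenMAX egl_render).
// OMXVideoSurface and textureIdFromDecoder() are illustrative names.
#include <QQuickItem>
#include <QQuickWindow>
#include <QSGSimpleTextureNode>

class OMXVideoSurface : public QQuickItem
{
public:
    OMXVideoSurface(QQuickItem *parent = 0)
        : QQuickItem(parent), m_texture(0) {
        setFlag(ItemHasContents, true); // we provide scene graph content
    }

protected:
    QSGNode *updatePaintNode(QSGNode *oldNode, UpdatePaintNodeData *) {
        QSGSimpleTextureNode *node =
                static_cast<QSGSimpleTextureNode *>(oldNode);
        if (!node)
            node = new QSGSimpleTextureNode;
        if (!m_texture) {
            // Wrap the GL texture id egl_render writes into; how the
            // decoder exposes it is left out here.
            m_texture = window()->createTextureFromId(
                        textureIdFromDecoder(), QSize(1920, 1080));
        }
        node->setTexture(m_texture);
        node->setRect(boundingRect());
        update(); // schedule another frame while the video plays
        return node;
    }

private:
    uint textureIdFromDecoder(); // provided elsewhere (hypothetical)
    QSGTexture *m_texture;
};
```

Because the texture is updated on the GPU side, the item composes with opacity, animations and other elements just like any other scene graph node.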
NOTE: New information is available here.

Sunday, December 9, 2012

After accomplishing the goal of decoding and rendering compressed image formats directly onto OpenGL textures
using OpenMAX, I looked into how to use the same concept to render an H.264 stream directly into a texture
using the same OpenMAX component on my Raspberry Pi.
As a starting point I took the sample code I used in this post, together with the hello_video sample code by
Broadcom in VideoCore.
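The core of the approach can be sketched with raw OpenMAX IL calls (an illustrative outline only, not the actual sample code: error checking and the component state transitions are omitted, and the Broadcom ilclient helpers used by hello_video wrap these same steps). The port numbers are the ones VideoCore assigns to these components:

```cpp
// Hypothetical outline: tunnel the video_decode output into
// egl_render so decoded frames land in a texture-backed EGLImage.
#include <IL/OMX_Core.h>
#include <IL/OMX_Component.h>

void setupDecodeToTexture(OMX_CALLBACKTYPE *callbacks, void *appData)
{
    OMX_HANDLETYPE decoder, renderer;
    OMX_Init();

    // Component names as exposed by the VideoCore IL core.
    OMX_GetHandle(&decoder, (OMX_STRING)"OMX.broadcom.video_decode",
                  appData, callbacks);
    OMX_GetHandle(&renderer, (OMX_STRING)"OMX.broadcom.egl_render",
                  appData, callbacks);

    // Tunnel the decoder's output port (131) into egl_render's input
    // port (220): frames flow between the components on the GPU side.
    OMX_SetupTunnel(decoder, 131, renderer, 220);

    // After moving both components to Executing, egl_render's output
    // port (221) is given a buffer backed by an EGLImage; every
    // decoded frame is then rendered into the texture behind it:
    // OMX_UseEGLImage(renderer, &eglBuffer, 221, NULL, eglImage);
}
```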

This is the code I wrote to make it work (PiOmxTextures_2.0.tar.bz2, version 2.0, https://github.com/carlonluca/pi). Note that this is only a collection of notes which compiles and seems to work
fine on the wheezy Raspberry Pi image; it is not a fully reusable component. I'm still working on that.
The code is pretty messy and much of it is not fully implemented. Error handling is almost nonexistent, but it can still be useful to see how to make things work. The rest is up to you ;-)
This is a video illustrating the sample code running:

To compile the code refer to this article.
You will need the Qt 5.0 libraries (version 4.0 might be sufficient) running with the eglfs plugin.
Hope this helps!

Friday, December 7, 2012

After building Qt 5.0 on the Raspberry Pi, I focused my attention on hardware acceleration using OpenMAX. This
is quite an interesting subject to study. The most interesting element I found among those available for
VideoCore is the egl_render component. This component should be able to render the output
of the decoder component directly into an EGL image, which is bound to a texture.
What interests me most is rendering video into a texture, but I thought for the moment I could start
with decoding an image and placing it into an OpenGL texture for rendering. This can be done by using the
image_decode component together with the egl_render component. Refer to the documentation in the git repository
for some more information.
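The key step either way is turning an OpenGL texture into an EGLImage that egl_render can fill. A minimal sketch of that step, assuming a current EGL context and an already created GL texture:

```cpp
// Create an EGLImage backed by an existing GL texture, so that it can
// later be handed to egl_render via OMX_UseEGLImage.
#define EGL_EGLEXT_PROTOTYPES
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>

EGLImageKHR imageFromTexture(EGLDisplay display, EGLContext context,
                             GLuint texture)
{
    // The texture must already be allocated and sized, e.g. with
    // glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
    //              GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    return eglCreateImageKHR(display, context,
                             EGL_GL_TEXTURE_2D_KHR,
                             (EGLClientBuffer)(intptr_t)texture, NULL);
}
```

On some platforms eglCreateImageKHR has to be resolved through eglGetProcAddress; on the Raspberry Pi EGL headers the prototype is available when EGL_EGLEXT_PROTOTYPES is defined.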
Unfortunately not much documentation is available for this task, but I found some very helpful material, including the method using the egl_render OpenMAX component.
These are the results I got by loading six 1920x1080 JPEGs:

Average time out of 5 runs without OMX: 6750 ms;

Average time out of 5 runs with OMX: 918 ms.

That is roughly a 7x speedup from hardware decoding.

Here is the package containing the sources (version 1.0): PiOmxTextures.tar.bz2. The code links to Qt 5.0 and needs six images,
specified on the command line, to load. Place the images in the same directory (paths without spaces) and
name them like:

prefix{0, 1, 2, 3, 4, 5}.jpg
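As a standalone illustration of the naming scheme (this helper is not part of the package), the six file names the test program expects for a given prefix are built like this:

```cpp
#include <sstream>
#include <string>
#include <vector>

// Build the six file names expected for a given prefix:
// prefix0.jpg, prefix1.jpg, ..., prefix5.jpg.
std::vector<std::string> expectedNames(const std::string &prefix)
{
    std::vector<std::string> names;
    for (int i = 0; i < 6; ++i) {
        std::ostringstream s;
        s << prefix << i << ".jpg";
        names.push_back(s.str());
    }
    return names;
}
```

So with prefix "img" the program looks for img0.jpg through img5.jpg.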

To compile you'll need Qt Creator or at least qmake:

cd PiOmxTextures
your_qmake
make

This is a video showing the performance when loading six 1080p JPEGs: the first run uses software decoding, the second and third runs are hardware accelerated:

Sorry, the quality is really bad, but the performance difference can be appreciated anyway. It also seems that, for some reason, the software implementation failed to load one image: you can see a black texture on one side of the cube. The same does not happen with the hardware implementation; I didn't investigate the reason.
This is not ready-to-use code, just some notes that might be useful. I've hardly ever used OpenMAX or OpenGL, so
if you have any suggestions or observations, feel free to comment! ;-)