
Last week, TI released two new software development tools that they hope will encourage more developers (especially ARM developers) to make use of the DSP on their ARM+DSP platforms (which includes the OMAP-L138 at the heart of the Hawkboard).

The first tool, called C6Run, aims to provide an automated way to compile and run code on the DSP core without having to learn any new tools or APIs. When a program built with these tools is run from the target Linux system, the ARM enables and loads the DSP and handles all communication with it. Either individual portions of an application or the entire application can be executed on the DSP. Documentation for the tool is available on TI’s embedded processor wiki. The tool is also open source and is being developed on TI’s external Subversion server.

Further encouraging development of innovative open source applications, HawkBoard.org and element-14.com announced a series of “Hawk Tawks” webinars aimed at familiarizing both industry innovators and hobbyists with the HawkBoard’s wide range of end applications.

The three topics and the associated schedule are below.
_________________________________________________

The OMAP-L138 SoC on the HawkBoard features a fixed-/floating-point C674x DSP core, an ARM926 core, and a rich set of peripherals. This talk will give an overview of the two processors and go over many of the on-chip peripherals. It will focus on the Programmable Real-Time Unit (PRU) and the Universal Parallel Port (uPP), two new peripherals found on this device.

The talks will also cover board bring up and demonstrate porting of Linux Kernel and drivers on ARM9 and working with all the peripherals on board with sample applications.
_________________________________________________
July 22nd 2010, 11 a.m. EDT/4 p.m. BST and 8 p.m. EDT/1 a.m. BST (8:30 PM India time)

Topic: Developing and porting Qt applications on the HawkBoard

Presenter: Prabindh Sundareson, Texas Instruments

User interfaces have become fundamental components of products and, at the same time, have grown more complex to suit the needs of new and emerging applications. To drive down development cost and to bring products to market faster, it is imperative to use standard frameworks like Qt, Android, and X, and customize them to the needs of specific applications. To make the system more efficient and performance-oriented, developers have to understand the fundamental graphics layers of each framework. In this talk, we will cover the fundamentals of the Qt framework, its programming model and tools, and performance benchmarking with the Xgxperf toolkit from TI on target development boards like the Hawkboard.

Learn how to stream data through the DSP, using the ARM for control and data routing. On the ARM side, data exchange can be done over USB, TCP/IP networking, SD/hard-disk storage, and audio/video connections. DSP algorithms that are XDM- and xDAIS-compliant can be built into a DSP-based codec server. The IUNIVERSAL interface is used to expose the codec server's algorithms to the ARM via the GStreamer streaming-media library. GStreamer has many features (over 200 pipeline elements), so you can focus on your DSP algorithm and let GStreamer take care of the rest.
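As a hedged sketch of what such a pipeline can look like, the following decodes a clip on the DSP through the codec server; the element names (`TIViddec2`, `TIDmaiVideoSink`) and properties come from TI's gstreamer_ti project and may differ between releases.

```shell
# Decode an H.264 elementary stream on the DSP via the Codec Engine
# codec server, displaying through DMAI. Element and property names
# assume TI's gstreamer_ti plugin and may vary by release.
gst-launch filesrc location=clip.264 \
    ! TIViddec2 engineName=decode codecName=h264dec \
    ! TIDmaiVideoSink
```

The ARM-side application only assembles the pipeline; the decode work itself runs on the DSP.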

With all my experience with the IGEP and the SheevaPlug, I was ready for a new experience with an ARM board that has a SATA connector. My desktop environment at home is entirely ARM-based. First I tried using the SheevaPlug as my desktop, but I was not completely satisfied because of the instability of its USB-based display. Then I tried the IGEP, which has proper display handling, but when it does I/O on USB or SDHC it largely blocks the system. Finally I set up the SheevaPlug as my NFS server and run the IGEP from an NFS root. I compiled a Gentoo system on the IGEP, and this became a pretty usable system with acceptable performance. The story was almost done, but there was one point I couldn’t digest: the NFS performance. With a USB disk connected to the SheevaPlug I see around 16 MB/s; when it is exported to the IGEP over NFS, that drops to 4 MB/s. There are also issues with the USB disk’s power saving. These are the factors that made me curious about the Hawkboard.

Hi all,
My cross-development base is a virtual Ubuntu 9.1. To be honest, I don’t
even know if it is possible to reach that machine from the host
network. Since I had to get a serial link up to be able to fiddle
around with uboot, I searched for other options to get the kernel in.
There are several serial transfer protocols included in uboot; I
picked ymodem for its simplicity.
I will include my particular kernel in this example to highlight the
importance of size.
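As a sketch of the transfer itself (the `loady` command is standard uboot; the load address and serial device are assumptions for this board):

```shell
# On the board, at the uboot prompt: receive the kernel over ymodem
# into RAM (an address in the OMAP-L138's DDR window, e.g. 0xc0700000).
loady 0xc0700000

# On the host, start the ymodem send, either from the terminal
# program's "send file" menu or directly with the lrzsz sb tool:
#   sb uImage < /dev/ttyS0 > /dev/ttyS0
```

This is slow for a multi-megabyte uImage, which is why the kernel's size matters here.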

Now that seems quite ok, so we have the uImage flashed to NAND at
0x200000, and its length is 0x256de4.
I don’t use a ramdisk, so I’ll leave that for you to figure out.
I use an SD card with two partitions: the first for the uImage (not used
yet, but soon(tm) we hope for a new uboot), the second for my rootfs. So I
continue by setting some bootargs and bootcmd in uboot.

The part above I have more or less copied from another post here; it
has a typo in it, so just ignore the error message about the “mem=80M”
stuff.
mem=80M (or at least something smaller than 128M) needs to be passed
to the kernel in order to get DSPLink to work.

I found this discussion and the method useful for all Hawkboard users, so I am just logging it here. I am waiting for the DSP-side code and will update this blog when I upload the other sources as well.

> Caglar,

> Can you share information on how to make composite video work on the
> Hawkboard?

Sure, though it’s a little bit picky. Here is what I did to make it work:

I downloaded the sources for the kernel that comes with Angstrom [1]. The kernel
version is 2.6.33-rc4; I guess it is the PSP kernel.

This kernel supported VPIF input and output out of the box; moreover, the
/dev/video0 node was created automatically. However, in my experience the
TVP5147 driver was a little buggy (due to noise, I guess), so it wasn’t
detecting the camera standard. I then patched my kernel with the attached
patch to solve this issue.

The next trouble was color-space conversion. VPIF delivers data in NV16
(two-plane YUV 4:2:2) format, but the LCDC expects RGB data. This was a little
problematic for generic players; for instance, GStreamer’s v4l2 plugin does not
support NV16, and mplayer was unable to detect the format. So I wrote my own
conversion program.

But this was not enough, as the frame rate was not very good. So I modified the
Codec Engine video_copy example to utilize the DSP, and that way offloaded the
color-space conversion to the DSP. This gave me a beautiful 25 fps from my camera.

Currently I don’t have an appropriate place to upload my example code, but I
can share it with anyone interested. The binaries can also serve anybody who
wants to give it a quick try on their board.

I am really sorry for frustrating everyone by not giving details about the platform. The board has come along well; I don’t find any issues in bringing up the Linux kernel, and I have tested all the peripherals on the board.

We have finalized the cost and the distribution channel. The biggest problem for the Hawkboard team is arranging all the required chipsets and components; due to the lead times in getting these components, we expect the board to be available only by the end of February ’10.

We will get a few boards by early November, which will be delivered to early adopters and MCU Day registered users interested in this platform.

For worldwide distribution we still need more time.

Sorry for all the delays; we will keep you all updated on the proceedings.