These are mostly SMD components, so I'll get more experience dealing with these little guys. I have used veroboard with SMD in the past, so there's that, but I might take the opportunity to make a batch of useful 8-pin/14-pin/16-pin SMD breakout boards for the space [2].

Today I drove my small car in an after-work dash to a quiet low cliffside car park in-between the harbour lighthouses.

There I sat, intently listening to radio static blasting from my mac. A freshman radio enthusiast, I was planning to receive magic space images from orbit via my magnetic roof antenna plugged into a cheapo SDR radio receiver. The long, drawn-out Sputnik-sounding 'weep weep' with its superimposed donkey-like short 'clip clop' - the NOAA Automatic Picture Transmission (APT) signal - was my paydirt. Once captured, the audio signal would be processed via a simple toolchain into a beautiful, fully-formed live weather photo from orbit. This exciting outcome was not a given however, as I had found on my last two attempts. I was pretty sure this run would be a success and not a waste of hours. I had brought sandwiches and a banana just in case.

I'd heard about the magic NOAA weather broadcasts from hackerspace radio folk, and it seemed like a fun way to use the RTL-SDR and Sky Scanner Rx antenna combo I'd bought on a whim. A few weeks ago, space members hosted a radio weekend at Aberdeen Uni where folks erected their roof dipoles and made contacts over HF. Making use of the roof too, I played with receiving the NOAA signals, sprinting to the chilly roof whenever the sats passed overhead. The orbiting National Oceanic and Atmospheric Administration (NOAA) satellites circle pole-to-pole 540 miles above the earth. NOAA have a few sats up there, and the three main APT players - NOAA15, NOAA18 and NOAA19 - pass overhead a few times a day. NOAA15 and 18 have their downlinks at 137.6200 MHz and 137.9125 MHz respectively, and seem to be the strongest signals for me.

To track sat passover times, I found gpredict [1] to be a great open source tool that can be installed cross-platform. To capture the actual audio via an RTL-SDR once I had a sat overhead, I used GQRX [2] on the mac set to narrow FM, but SDR# does a great job on a windows machine. On the mac, outgoing audio can be directed to an input device via the virtual Soundflower device and recorded via Audacity set to a sample rate of 11025 Hz. On a non-mac, you'll probably have a 'stereo mix' device in place already. The resulting wav file then needs to be imported into the very cool WXtoImg tool. It does a one-click job of automatically transforming the audio into imagery, even if the signal is incomplete. This was the audio [3] and resulting image [4] from my first attempt. Not great, but a good start. At 38 secs the audio is perfect, but it does not last. I sought a clearer, more rural sky, so a few weeks later I drove to the harbour coastal car parks and took a shot there, but the cloudy day seemed to prevent a strong signal. I got the 'weeps' but not the 'clops' :(
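WXtoImg hides what is actually going on: the APT picture brightness rides on a 2400 Hz AM subcarrier, so most of the decode is envelope detection plus line syncing. Just for my own notes, here's a toy pure-Python sketch of the envelope step - the function name, windowing and scaling are my own guesses at a minimal version, nothing like what WXtoImg really does internally:

```python
import math

def am_envelope(samples, sample_rate, carrier_hz=2400):
    """Toy AM envelope detector: rectify, then smooth with a running
    average roughly two carrier cycles long."""
    window = max(1, round(2 * sample_rate / carrier_hz))
    rect = [abs(s) for s in samples]
    out = []
    acc = 0.0
    for i, r in enumerate(rect):
        acc += r
        if i >= window:
            acc -= rect[i - window]       # slide the averaging window along
        out.append(acc / min(i + 1, window))
    # a rectified sine averages 2/pi of its peak, so scale back up
    return [min(1.0, v * math.pi / 2) for v in out]

# demo: a 2400 Hz tone whose amplitude ramps 0 -> 1 over 0.1 s,
# sampled at the 11025 Hz rate the recording step uses
rate = 11025
n = int(rate * 0.1)
tone = [(i / n) * math.sin(2 * math.pi * 2400 * i / rate) for i in range(n)]
env = am_envelope(tone, rate)
```

A real decoder would then hunt for the sync pulses and slice the envelope into image lines, but the ramp-recovery above is the guts of it.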

So back to the present: there I was on my third attempt, waiting in my little car in a carpark with the world's smallest entry gate (seriously, my Corsa barely fit through). After waiting a spell, I started to hear shadows of that magic sound. I had run my mac battery down by this point and did not want it to die at this crucial stage, so I plugged in my car charger for more juice. Not long after, I COMPLETELY lost the signal. Having not yet started the audio capture, I was pretty pissed when no amount of tweaking the freq made any difference. The sat was still in full view but I was getting nothing, with 3 mins till LOS. In fact the entire noise floor had risen (that should have been a clue). I checked the usb/coax connectors and the roof aerial, but no change. The sat passed below the horizon just as I realised the culprit: I had started the car engine to make sure my charging laptop and phone would not kill my car battery. Stopping the engine brought the noise floor back to normal. DAMMIT DAMMIT!! That's how I learnt that cheap radios used from inside a car sometimes work better with the engine and chargers off. Feel free to write that down if you are a radio dumbass like me. On the plus side, I guess I forever learnt something new about SDR radio and found a nice radio spot, so not a waste of hours at all.

If *you* want to have the same (or hopefully more successful) raw audio-to-imagery experience, you should hurry. The APT sats are getting pretty old and cannot hold orbit forever. NOAA APT transmissions are scheduled to die from 2017 [5] and will be replaced with something digital. Boo hiss boo. If you do have a go though, for god's sake make sure you bring sandwiches. It seems brains are optional.

As a distraction, I purchased a TG12864H3 RGB LED dot-matrix screen on the cheap from a fellow hackerspace member, with the aim of quickly adding a status display to my HackSack using a prewritten library [1]. I expected the screen to be very simple to use with the Pi, and generally it is, but I had to dig a little to get all the info about setting it up.

Expanding use beyond the HackSack, I plan to use this with a Pi to control and view a number of Pi-connected sensors (SDR, RFID, light, Bus Pirate). To log my notes, progress and resources, I am documenting my steps below.

Test wiring the LCD SPI

Wiring is pretty easy - I made the table below from available guides to help me not fry the I/Os.

Note to self: this is the Pi v1. I'll need to update this for the v2 I am now using.

The demo display code [1] runs beautifully, so that works :). I can compile/run the C++ code, but the python equivalent [2] seems to perform well enough on the Pi v2, speed-wise. The Bus Pirate, RTL-SDR and the I2C hot-swap interface IC are also tested as working from the command line (the latter only briefly tho). I'll need to figure out how to make calls to these later.
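A note-to-self on what those libraries are doing under the hood: most 128x64 SPI LCDs in this class are page-addressed - the screen is split into height/8 horizontal 'pages', and each byte sent is a vertical strip of 8 pixels within a page. I haven't confirmed which controller the TG12864H3 actually carries, so treat this as a hedged sketch of the framebuffer-to-page packing rather than the library's real code:

```python
def framebuffer_to_pages(pixels, width=128, height=64):
    """Pack a row-major 1-bit framebuffer (pixels[y][x] in {0, 1}) into
    page-addressed bytes: height//8 pages of 'width' bytes, where bit n
    of each byte is the pixel n rows down within that page."""
    pages = []
    for page in range(height // 8):
        row_bytes = []
        for x in range(width):
            b = 0
            for bit in range(8):
                if pixels[page * 8 + bit][x]:
                    b |= 1 << bit
            row_bytes.append(b)
        pages.append(row_bytes)
    return pages

# demo: tiny 8x8 framebuffer, one pixel top-left, one at the bottom of column 3
fb = [[0] * 8 for _ in range(8)]
fb[0][0] = 1   # top row -> bit 0 of column byte 0
fb[7][3] = 1   # bottom row -> bit 7 of column byte 3
pages = framebuffer_to_pages(fb, width=8, height=8)
```

The win of this layout is that drawing text becomes 'blit a column of bytes per character', which is presumably why the python version keeps up speed-wise.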

Next I need to get the case readied for the next stage since the project will be 'additive' and will need to be robust pretty early on.

Next steps:

Sourcing the hard case and input device

3D print box spacer

Using the rf keypad for control

Adding peripheral devices/sensors

I have found a box from Maplin (£11) that is nicely shaped for handheld use [3]. I can 3D print and reprint the middle part (see below) as I add internal devices. I think that combining a shop-bought case with printed guts gives me a good balance of robust and custom. Feel free to disagree.

Also, about 10 months ago I bought a cheap RF numpad to speed up data entry on the laptop [4]. This will be used both to control the python-powered lcd menu and as a remote control as needed. It's USB, so I can switch it out for a keyboard when I am using the headset viewer via the composite output. Oh yes, I am using a hacked set of video goggles with this project too [5].

Boxing it for handheld use

As it stands, the Pi fits beautifully inside the box, but there is little space for the other devices, to say nothing of the power supply. That's fine, since the box's thickness and internal fixings can be modded via a simple wrap-around spacer where the case top and bottom join. This will be 3D printed once our hackerspace printer is up and running.

Next task - 3D design the box wrap-around spacer and fit the box for the power supply and peripherals.

Getting MultiWii talking to the WiiMote camera is not easy.

Funnily enough, MultiWii was initially developed around the motion sensors of the WiiMote. Since my interest is in using the WiiMote camera to stabilise instead, I am in ironic and uncharted territory as far as I can tell.

About the Camera

The Wiimote camera is manufactured by Pixart. It contains a 1024x768 infrared camera with built-in hardware blob tracking. Over the I2C bus, it outputs X/Y coordinates for up to four IR points, as well as approximate intensity/size.
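For reference, this is my understanding of the camera's 12-byte extended-mode report: 3 bytes per blob - low X, low Y, then a byte mixing the top two bits of each coordinate with a 4-bit size. A small parser sketch, with the byte layout taken from community documentation rather than anything official, so double-check before trusting it:

```python
def parse_ir_extended(data):
    """Parse one 12-byte extended-mode report from the Pixart IR camera
    into up to four (x, y, size) tuples. The camera reports coordinates
    0x3FF,0x3FF for an empty blob slot, which we drop."""
    blobs = []
    for i in range(0, 12, 3):
        lo_x, lo_y, mix = data[i], data[i + 1], data[i + 2]
        x = lo_x | ((mix >> 4) & 0x3) << 8   # bits 5-4 carry x[9:8]
        y = lo_y | ((mix >> 6) & 0x3) << 8   # bits 7-6 carry y[9:8]
        size = mix & 0x0F                    # low nibble is blob size
        if (x, y) != (1023, 1023):
            blobs.append((x, y, size))
    return blobs

# demo: one blob at (512, 384) with size 5; other three slots empty (0xFF fill)
report = [0x00, 0x80, 0x65] + [0xFF, 0xFF, 0xFF] * 3
blobs = parse_ir_extended(report)
```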

Interfacing it

A lot of the interesting human-interaction hacks done with Wiimotes come from hero Wiimote hacker Johnny Chung Lee. If you saw a news piece about something cool done with a Wiimote, chances are this chap made the video showing it. His site offers the background for interfacing a Wiimote to a PIC or BASIC Stamp as a USB HID device. To achieve either, you'll at least need to add a 20 MHz clock (ideally 25 MHz) and most likely an I2C logic level converter.

Trawling the net, I came across the blog of one Stephen Hobley, another WiiMote camera rockstar, who has written an Arduino library for the Pixart camera. This was downloaded and added to an Arduino Nano v3, which uses the same atmega328p AVR. I also stripped down a Wiimote, removed the camera and soldered/heat-shrunk wires to it. The whole lot was shoved onto a breadboard and put on hold till the crystal, 74AC04 hex inverter and LTC4301L level shifter ICs arrived.

Dead at square one

Sadly, for some reason, I was not having any luck getting my breadboarded 25 MHz clock to work. I am using the parts shown above and a Bus Pirate to measure, but nothing. I have probably done something very stupid. I don't want to spend too much time on this, but it's damn strange that I am detecting nothing on the CLK line into the camera module. I have ordered another crystal as this is probably the cause - can you blow crystals? I have no idea.

It is probably time for a progress report on this.

My early goal here is to teach the tiny drone to take off and hover at the same relative height and position over a moving target; that is to say - to move as the target moves.

In the time since, I have been distracted by a great many things; however, I have made progress on this.

Learning to pilot these bloody things

If I aim to teach the micro-quad to fly by itself, I should at least have some ability to take control in case it veers away to attack someone's bobble hat (or lack of bobble hat). To this end, I have been sneaking out of work at lunch to my local sports hall to get to grips with the pilot controls. The hall is huge, and these quads have great range and manoeuvrability. I'll film my own progress next time if I can remember. I have also been dicking about with the PID settings using this beautifully succinct guide.
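While dicking about with those settings it helped to keep the textbook PID form straight in my head. MultiWii's real implementation is fixed-point and rather different, so this is purely an illustrative sketch of what the P, I and D terms mean:

```python
class PID:
    """Textbook PID: output = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt        # I term accumulates history
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

# demo: a pure-P controller just scales the error...
p_only = PID(kp=2.0, ki=0.0, kd=0.0)
p_out = p_only.update(error=1.0, dt=0.02)

# ...while a pure-I controller winds up over repeated updates
i_only = PID(kp=0.0, ki=1.0, kd=0.0)
i_first = i_only.update(1.0, 0.5)
i_second = i_only.update(1.0, 0.5)
```

Which maps roughly onto the symptoms while tuning: too much P and the quad oscillates, too much I and it slowly winds into a lean, and D damps the wobble.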

For context, I am using the MultiWii firmware on my micro-quad and a 5-channel Spektrum DX5 TX. Like most quad firmwares, MultiWii has a number of flight modes triggered from spare TX channels, with the self-explanatory horizon and heading-hold modes being the most useful for beginners.

With the beginner modes on, the quad does a grand job of hanging in the air *eventually*; however, I am not able to take off and land in a small space, and the small adjustments required do not yet come naturally. Generally I need room to get up and reach a stable position, and before reaching that it always veers close to room objects, whereupon my wussy prop-protective nature takes the leash and forces me to kill all power. Fellow hackerspace members at 57north.co who were present while I've been trimming the wood from nearby table legs will appreciate that this sucks.

I cannot help but think that adding some static positioning feedback would help here. Sadly, these quads are generally considered too small for GPS modules that would be of any use. As with all my DIY drone problems, a good (and undeniably lazy) first step is to ask: how does nature do it, and can I rip it off?

Insect positioning

Space members have suggested ultrasonic sounders and/or base-station OpenCV solutions, but I am keen to make the solution onboard (so light), cheap and, if at all possible, via a Wiimote Pixart camera (I REALLY want to use these things on a project). A googling session brought me repeatedly to the same paper from the journal Current Biology, namely Visual Control of Altitude in Flying Drosophila (PDF 1.6mb).

In it, the researchers ask how a fly chooses a particular altitude to fly at, and why it isn't flying at some other height. Their answer, complete with some interesting math (I presume it is; I cannot understand much of it yet), suggests the cheeky critters cheat and use horizontal landmarks rather than anything barometric or ranging-based.
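As a thought experiment (every number below is made up), the fly trick might translate to something as dumb as a proportional rule: track the vertical position of a horizontal landmark in the camera frame and nudge the throttle to keep it put. With a 1024x768 camera, 'put' could be the vertical centre at y = 384:

```python
def altitude_correction(edge_y, target_y=384, gain=0.004, limit=0.2):
    """Toy proportional rule inspired by the fly result: return a clamped
    throttle delta that pushes a tracked horizontal edge back towards
    target_y. Pixel y grows downward, so edge drifting down in frame
    means we climbed and should throttle back. All constants invented."""
    error = edge_y - target_y
    delta = -gain * error
    return max(-limit, min(limit, delta))   # never command a huge change

# demo: on target -> no correction; edge has sunk in frame -> throttle back
hold = altitude_correction(384)
sink = altitude_correction(484)
```

No integral or derivative term, no handling of a lost landmark - but it is the kind of loop I could imagine feeding from the Wiimote camera's blob coordinates.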

So after campGND, I was given a Turnigy Micro-X quadcopter to play with. For some exciting context, below is the original project blog post dated Aug 2014 (ported from a previous blog):

After being kindly given a Turnigy Micro-X quad to play with, Nordin is seeking other interested parties to build a drone quad army and is asking if others are interested in some r/c quad fun. The goal is to achieve an airborne version of those line-following robots using principles from stigmergy rather than a control structure based on complex representations etc.

About

Ed Watson

I am an elearning developer who works mostly with dynamic languages and media production during the day, then plays with µCs and desktop fabrication methods during the evening. If it has I/O, melted plastic and needs a Dremel tool, it's awesome.