Saturday, October 31, 2015

This is the second revision of my Wifi controlled rover. The first used an old Android phone and an IOIO board as a way to learn some Android programming and figure out how to control a vehicle over the network. It worked pretty well - a friend drove it across the internet from 40 miles away, and I learned a lot.

I did figure out that I didn't like Android for a robotics platform, since it is so highly optimized for GUI use. Keeping a program running at high priority in the background on an Android device isn't trivial, because it's not designed for that. You have to assume your program can be interrupted and resumed at any time. I decided that even though the phone came with all sorts of cool sensors and was extremely compact, I wanted the control over what was happening that comes with Linux.

My intent is to use this rover as a testbed for systems I will eventually install in an underwater remotely operated vehicle. It's also a lot of fun to drive. I wanted to document the overall design here - it has been done lots and lots of times, but a detailed writeup might be useful to someone.

So here we go. This is intended to show you one possible way to do it, and the thought processes that went into the design. It can all be improved.

Hardware

Dagu Rover 5 chassis. This thing is pretty awesome, but I have a problem with mine shedding tracks occasionally that I have not been able to fix. I understand this was fixed in versions later than mine.

A 2200 mAh LiPo battery from my quadcopter, providing 12.6V. LiPo batteries are a significant fire hazard if not handled properly. The rover currently has no method to automatically kill power when pack voltage drops - consider a safer battery chemistry, like NiMH, if you are unfamiliar with the risks inherent in big unprotected LiPo packs. An excellent guide is here.

A battery eliminator circuit like those used in RC aircraft to generate a nice steady 5 volts from the 12.6V LiPo pack to power the Raspberry Pi and motor board.

Network Design

The rover starts up an access point and also starts two servers on the Raspberry Pi. Each listens on a different port. Full details on the software configuration are below.

An Android device connects to the access point and is issued an IP address. The IP address of the rover is fixed - it's acting just like a router for your home internet. This makes it very easy to take the rover somewhere and run it with no additional infrastructure. However, it makes it harder to run over the internet. If that's your goal, it's better to just connect your rover to an existing Wifi network so that traffic can be routed to it from anywhere. I may add a switch later that allows me to flip between these modes.

My ROV will be designed to take to places with no infrastructure, and I wanted to be able to easily take the rover to show friends, so I chose to make the rover the access point.

I currently have the rover configured to act as an access point and hand out IP addresses in the 192.168.42.x range with no DNS or default gateway. The rover itself is on 192.168.42.1.

Software Design
A traditional robotics paradigm is the "Sense, Think, Act" cycle. A robot takes input from its sensors, processes that input to identify the best course of action, and then commands its actuators to do something. The process then repeats.

We're not building a robot in the typical sense. That's because a human is in the loop, making the decisions based on sensor input. I wanted to make sure that the platform could be used as a robot, just by changing the software on the server, but right now I'm interested in building a reasonably robust remotely operated vehicle rather than something autonomous.

On reflection, I decided that a remotely operated vehicle can follow the same sense-think-act cycle. The primary difference is that the thinking is done off-vehicle, by the human operator.

I wanted to be able to send back sensor data from the rover, such as video, voltage levels, accelerometers, GPS data, etc. and display them on a simple console. So on the network, the command traffic would look like:

Currently, I'm not sending back any data from sensors. I will detail plans for that in the "Next Steps" below.

The server sets the appropriate IO pins, which drives the motor controller board. My rover has 4 motors, each controlled by a direction line and an enable line.

If the timeout value is exceeded, the server shuts down the motors, resets, and waits for another connection.

The Python program at the end of this post implements this. Sending the full string defining the direction is horrendously inefficient - in the next revision of the client program I'll reduce that to, say, a single character. I originally did it this way to aid in debugging the client, and never got around to fixing it.
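To make the command handling concrete, here is a small sketch of the logic the server needs: map a one-character command to a (direction, enable) state for each of the 4 motors, and stop everything when the watchdog timeout expires. This is not the actual program from the post - the command characters and motor ordering are illustrative assumptions.

```python
# Sketch of the rover server's drive logic. Each motor has a direction
# pin and an enable pin; a watchdog stops all motors on timeout.
# Command characters and motor ordering (left = 0-1, right = 2-3) are
# illustrative assumptions, not the real pin mapping.

# (direction, enable) per motor for each command; True = pin high.
COMMANDS = {
    "f": [(True, True)] * 4,                        # forward
    "b": [(False, True)] * 4,                       # backward
    "l": [(False, True)] * 2 + [(True, True)] * 2,  # spin left
    "r": [(True, True)] * 2 + [(False, True)] * 2,  # spin right
    "s": [(True, False)] * 4,                       # stop (enables low)
}

def pin_states(cmd):
    """Map a one-character command to (direction, enable) per motor."""
    return COMMANDS.get(cmd, COMMANDS["s"])  # unknown commands stop

def watchdog_expired(last_cmd_time, now, timeout=0.5):
    """True when no command has arrived within the timeout window."""
    return (now - last_cmd_time) > timeout
```

On the Pi, each (direction, enable) pair would then be written out to the motor controller's IO pins with something like RPi.GPIO's GPIO.output().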

Client Program

The client I wrote is fairly simple. It rapidly makes HTTP requests to get an updated JPEG image from the rover, and updates the screen. A separate thread sends commands and gets a fake sensor value back. It attempts to reconnect when the connection is lost.

Doing the video this way is crude and eats a lot of network bandwidth compared to something like H.264, but it's easy to implement and actually works pretty well at 320x240 and 640x480.
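The frame-polling side of the client can be sketched in a few lines: repeatedly GET a JPEG from the rover, hand it to the display code, and back off briefly when the connection drops. The URL and retry delay here are illustrative assumptions, not the values from my client.

```python
# Sketch of the client's video loop: poll for JPEG frames over HTTP,
# retrying on failure. Not the actual client code from this post.
import time
import urllib.request

def fetch_frame(url, timeout=2.0):
    """Fetch one JPEG frame; return its bytes, or None on any error."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.read()
    except OSError:  # covers URLError, timeouts, connection resets
        return None

def poll_frames(url, handle_frame, retry_delay=1.0, max_frames=None):
    """Fetch frames as fast as possible, backing off briefly on failure."""
    count = 0
    while max_frames is None or count < max_frames:
        frame = fetch_frame(url)
        if frame is None:
            time.sleep(retry_delay)  # connection lost; wait and retry
            continue
        handle_frame(frame)          # e.g. decode and blit to the screen
        count += 1
```

The command/sensor traffic would run in a separate thread, exactly as described above, so a stalled image fetch never blocks the drive commands.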

Low(er) Lag Video Streaming on the Raspberry Pi
There are a number of tutorials for using the raspistill command to grab a still frame and shove it across the network via a couple of methods. These work well for a stream that can tolerate a lag, but the result is a delay of up to a second and low framerates. This is due to an inherent delay in the raspistill program - it's not designed for rapid capture.

I got much better results using the Video for Linux (V4L) driver and MJPG-Streamer. It took some doing - you first have to compile the V4L driver. Good instructions are available here and here.

I ran into a problem getting mine to compile. I got an error, "undefined reference to symbol 'clock_gettime'". The solution was found here.
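Once built, mjpg-streamer serves frames from the V4L device over HTTP. An invocation along these lines is what I'd start from - the paths, resolution, and port are placeholders to adjust for your setup, not a verified command line:

```shell
# Serve MJPEG from the camera over HTTP on port 8080.
# Paths depend on where you built mjpg-streamer.
export LD_LIBRARY_PATH=/home/pi/mjpg-streamer
./mjpg_streamer \
    -i "input_uvc.so -d /dev/video0 -r 640x480 -f 30" \
    -o "output_http.so -p 8080 -w ./www"
```

The client then fetches individual JPEG frames from the HTTP port as fast as it can.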

Access Point Configuration
One way to turn your Raspberry Pi into an access point is to use hostapd and dhcpd. The Edimax WiFi dongle is not supported by the stock hostapd binary that you get with apt-get install. Dave Conroy has figured out how to make it work - he has a great document describing the process here (starts at the Prerequisites section). I used that to get it working, along with some of the configuration options described in Adafruit's tutorial. My dhcpd.conf file and hostapd.conf file are below.

The following commands are in a small script, /home/pi/startap, to start the dhcp server and hostapd.
sudo service isc-dhcp-server start
sudo hostapd /etc/hostapd/hostapd.conf &
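For reference, a minimal pair of config files consistent with the network described above might look like this. The SSID, passphrase, channel, and DHCP range are placeholders - only the 192.168.42.x subnet and the lack of DNS/gateway come from my actual setup. The rtl871xdrv driver name comes from Conroy's patched hostapd build.

```
# /etc/hostapd/hostapd.conf - minimal sketch
interface=wlan0
driver=rtl871xdrv        # from the patched hostapd build for the Edimax
ssid=rover
hw_mode=g
channel=6
wpa=2
wpa_passphrase=changeme
wpa_key_mgmt=WPA-PSK

# /etc/dhcp/dhcpd.conf - hand out 192.168.42.x, no DNS or gateway
subnet 192.168.42.0 netmask 255.255.255.0 {
  range 192.168.42.10 192.168.42.50;
}
```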

Automatic startup
There are a number of ways to do this, but I decided the simplest way was to make small scripts to start each subsystem and then launch them from /etc/rc.local. I appended these commands to /etc/rc.local:
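A sketch of what those rc.local additions might look like - the startap name matches the script above, but the other script names are illustrative placeholders:

```shell
# Added to /etc/rc.local (before the final "exit 0"); script names
# other than startap are placeholders for the per-subsystem scripts.
/home/pi/startap &        # access point + DHCP
/home/pi/startstream &    # mjpg-streamer video
/home/pi/startserver &    # the rover's Python motor server
```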

Next Steps
I intend to add an Arduino that can communicate via USB to gather sensor data such as pack voltage. Ideally, the Arduino could control power to the Raspberry Pi to allow a complete shutdown. Even if you shut down the Raspberry Pi via a shutdown -h, it will still draw significant power while halted. That's not good - you need to be able to kill power when the pack is dead. I intend to design and test this prior to using it in the ROV.

It needs some big honkin' bright lights. Just because.

A Sharp IR sensor or ultrasonic range finder would be cool to have and would allow for simple autonomous behavior, as well as being useful to a human operator.

Saturday, June 27, 2015

After a few months of distractions to prepare for a new kid, build a quadcopter, and work a bunch, I'm trying to get my sonar project moving forward. As documented in previous articles, I've got a simple digital sonar working in air. It was a simple way to test the echo detection algorithms. I'm convinced if I can figure out a way to use piezo transducers to transmit sound in the water, I can make it work.

So the next challenge is how to mount the piezo element and efficiently couple the sound to the surrounding water. One way to do this appears to be to pot the transducer in a potting compound that closely matches the density of water.

This article from NOAA on building hydrophones for listening to whales details one way to pot a transducer, and also includes a high gain amplifier circuit. My current plan is to build one, and figure out how to get the ADC on the Launchpad reading the audio. From there, I can make a transmit circuit. The challenges will likely be in acoustic coupling, transducer selection, and getting enough power into the water to travel a reasonable distance.

I considered using piezo disks, but I found that getting any sort of output from them at all requires them being mounted at either their edges or nodal points in a resonant cavity known as a Helmholtz chamber. I don't think I can manufacture one to the precision needed for the small size. I'm going to work first with cylindrical piezo units as used in the hydrophone above.

I intend to try one with a resonant frequency in the audible range - that's not going to result in very good resolution, but should be easier to debug since I can hear it and use PC audio equipment to measure it. Once that works, I'll switch to higher frequencies.

The next step is to build a functioning hydrophone with a piezo element, get it working with the op-amp, and get that feeding into the ADC of the Launchpad. That will complete the receiver side, and test the methods of acoustically coupling the transducer to the water.

Sunday, June 14, 2015

I built my F450 with aerial video in mind. Once I got it flying, it was time to select a camera and gimbal.

The camera needs to be able to record at high framerates to reduce the "jello" effect of rolling shutter. If you try to strap a cheap keychain camera to the frame of your quad, it is very likely that the result will be a garbled mess of distortion. This is because the CMOS sensors in those cameras scan each frame into memory over a small period of time. Vibration causes the frame to move as it is being captured.

Additionally, even if you get the vibration under control, the rapid movements in all directions as the quad flies around will make you ill. It's not a lot of fun to watch.

The solution is a camera that can record at 60 fps and a motorized gimbal to compensate for the motion of the quadcopter and keep the camera level. There are gimbals that use servo motors, but the best use brushless motors, which are quiet and smooth. They nearly instantly compensate for the motion in pitch and roll that occurs from pilot inputs and wind gusts.

I selected the Xiaomi Yi camera. This has the same imaging sensor as a GoPro without some of the frills, and is much less expensive. They are currently available on Amazon Prime for $88. They don't come with a case, or even a lens cap. The Android version of the app is rather untrustworthy looking - it is currently distributed off of a file sharing site I normally associate with pirated software, rather than from the company's website. "Here! Run this random APK from the Internet on your phone! It will be fine!"

Yeah. I dug out an old phone that doesn't have access to any of my important stuff and used that. I used the app to set up the video mode (60 fps at 1080p) and timelapse mode (still frame every 3 seconds). You can toggle between these modes with the camera's button - you really only need the app once.

I also ordered a Walkera G-2D 2-axis gimbal. This only compensates for pitch and roll, but uncommanded yaw motions don't seem to be much of a problem. I am extremely pleased with this gimbal for the money. It has an onboard regulator, so you can run it straight off your 3S lipo pack. I connected it to my main power line on the quad and it fired right up. It supports the use of auxiliary channels on your receiver to aim it in roll and/or pitch, but it doesn't require it - you can set the tilt and roll angle with a couple of trim pots and leave it alone, and it requires no connection to your receiver. It even comes with a small tool to adjust the pots with and the needed Allen keys. It worked right out of the box, and bolted directly onto the lower frame of the F450, aligning nicely with the slots on the lower frame. I secured it with 4 bolts.

One note: the gimbal is not designed for the Xiaomi Yi and the existing mount doesn't fit. I found that the frame could easily be removed, a 1/4" cardboard shim cut to level off the mounting plate, and a large zip tie easily secures the camera to the gimbal. There is probably a more dignified way, but that works just fine.

I am really pleased with this combo. I am still seeing some vibration in the video that I want to eliminate, but it's by far the best video I've gotten from an RC model so far. More to come on the vibration problem as I work it out. (Update on how to fix this below)

Here are a couple of still frames of a local park, shot in timelapse mode.

1) The vibration was improved by replacing the vibration dampeners that came with the gimbal with more rigid ones from HobbyKing. The dampeners it comes with are too soft.

2) Additional improvements were made by inserting soft foam earplugs into all four vibration dampeners.

3) The lens rectification function on the Xiaomi Yi makes the edges of the video very blurry. Once I fixed the vibration, the edges were still bad. I turned the lens rectification off, and it's much better. Here's a test flight with these improvements.

Saturday, June 6, 2015

This is my F450 quadcopter. There are many like it, but this one is mine.

I have a lot of experience with RC airplanes, but I'm new to quadcopters, so I want to document the build in case it is useful to others. I learned on the excellent and incredibly affordable Syma X1, which is serious fun for the money and a perfect trainer when flying indoors. I put a number of flights on an ARDrone, until it went berserk and parked itself in a very tall tree. At that point, I decided something with a real, proper RC system was in order.

A few abbreviations:

ESC - electronic speed control. Converts control inputs from you (through the flight controller) into a throttle output to one of the motors.

FC - Flight controller - a small microprocessor board with gyros and accelerometers that stabilize your quadcopter in flight. It handles the mechanics of keeping the machine in the air by making small adjustments to the motor power many times a second, and turns your stick input into your desired motion.

BEC - battery eliminator circuit. Steps down the main flight pack's 12.6 volts to the 5V the receiver and flight controller need. A regulator.

A dedicated low voltage alarm. I never got the low voltage cutoff on the flight controller to work right. This one works great. You need one or the other, since quadcopter ESCs don't have a low voltage cutoff like airplanes do. Set to 10.8V, you have 30-60 seconds to get it on the ground before you lose power.

A pack of 3S balancing wires, to connect the battery to the low voltage alarm.

Whew. OK. Once you have the stuff, building it is actually quite easy. You need a higher power soldering tool - I used a soldering gun - since you need to solder heavy wires to the copper traces on the frame. There is an extremely helpful build video from Legend RC here:

Be sure to read the KK2.1 manual section on powering the board carefully. I chose to cut the red wire from all 4 ESCs that connects to the FC motor outputs and power it with a dedicated switching Battery Eliminator Circuit (BEC). The switching regulator runs cooler and more efficiently than the linear regulators on the ESCs.

One of my ESCs was dead on arrival. I didn't find it until the kit was 90% built. I couldn't return the whole kit, and even returning the dead ESC to China would have been a serious pain. I tracked down the same part on Amazon and bought a replacement, along with a spare. This is a serious drawback to buying the kit.

After very carefully checking propeller rotation direction, as well as making sure the correct prop was on the correct motor, I did a quick test flight, and was surprised to find that it flew fine with stock settings on the KK 2.1. I did make some PID adjustments, but it was quite controllable.

There were bugs to work out. My KK 2.1's low voltage alarm, set to 10.8V, would howl continuously in flight, and cease on landing. I never figured out why. I turned it off and installed a dedicated low voltage alarm, listed above, and it works superbly.

On the first few flights, I had trouble with split-second instances where the motors would just STOP. All at once, for a fraction of a second. It would fall abruptly, and then recover, unless I happened to be low. I first blamed the linear regulator on my BEC. I tested with a dedicated receiver pack, did a quick test flight, and presto, it was fixed. Victory! I installed a nice dedicated switching BEC, went flying, and SMACK, it fell out of the sky again. It finally dawned on me to range test it. On the ground, with the motors spinning just above idle, I started walking backwards. At 40 feet or so, the receiver light blinked out. A few steps forward, it came back on.

Argh. Radio trouble. Gambled. Ordered new receiver. Got lucky - that fixed the problem. No way to return cheap dead receiver, at least not economically, so into the trash it went and I ate the $15. But it passed a range test and works fine farther than I can see the quadcopter.

ALWAYS RANGE CHECK YOUR MODELS. I have known this for years, and got lazy, and it bit me.

Several more flights, and a new problem cropped up. Propeller blades started randomly separating from the hubs. Once in flight, causing a crash from 30 feet, and once on takeoff, narrowly missing me. The cheap plastic props that came with the kit are absolute garbage - to the point that they are dangerous. Into the trash they went. I ordered some 10x4.5 carbon fiber props, which are absolutely superb. My flight time immediately improved from 7 minutes to 9. I'm not sure if they would fail before the bones in my finger would, so... respect.

One final note about propellers - they aren't perfectly balanced from the factory. Take the time to balance them - mine flew much more smoothly and quietly after they were balanced. My video quality dramatically improved too, since balancing eliminated the jello/rolling shutter artifacts I was getting.

I borrowed a friend's Dubro prop balancer and used scotch tape on the back of the blades to balance them. Went surprisingly quickly. I intend to buy a balancer and add it to the periodic maintenance list. I never bothered with planes, but it matters a lot for multicopters.

I now have perhaps two dozen flights, and the bugs are worked out. It is a reliable machine, climbs well, and has plenty of lifting power. I printed a camera mount for an ancient Canon point and shoot camera and it hauled it around just fine - all 1/2 lb of it. I have since upgraded camera and added a gimbal - more on that soon.

Knowing what I know now, I would not have bought the kit - I would have bought the same components, with decent propellors. That way, if I got a bad speed control, I could return it, rather than the entire kit. Other than that, I am pretty pleased with it.

Saturday, February 21, 2015

One of the defining characteristics of the 2013/early 2014 versions of the Printrbot Simple was the use of Kevlar fishing line for the motion transfer on the X/Y axis. A rubber hose gets superglued to the stepper shaft, a Dremel sanding wheel gets glued to that, and the fishing line gets several wraps around it. It kept cost down (the original was just shy of $300 in kit form) and it works surprisingly well. Mine has held up for quite a lot of printing over the 13 months I have had it running.

However, it did have a couple of disadvantages. It required tightening every now and then. It can result in a loss of precision, because the fishing line can walk back and forth on the drum. And frankly - it's just not very dignified looking. Here is a view of the X axis drive with the bed removed.

Thanks to the work of Thingiverse contributor iamjonlawrence there is a printable conversion to GT2 belts for both the X and Y axis. Newer Simple models come with belts, though they cost more than the original Simple kit did.

Jon is a mechanical engineer, and it shows in his hobby work. He has released a number of upgrades for various versions of the Printrbot Simple, each accompanied by professional drawings and detailed bills of materials. The parts in these kits were very well thought out and fit perfectly. I highly recommend his work.

I printed two sets of the parts - I was concerned that I would have my printer torn apart, and if I messed up a part I would not be able to print replacements. This turned out to not be necessary, but I still think it is worth doing.

I made sure my printer was calibrated well before printing the parts. The tolerances are snug, but if your printer is printing accurately it will fit with only minor brushes with a file to remove burrs or other loose material from the print.

In addition to the McMaster Carr part numbers called out on Jon's BOM, I used the following components from Amazon:

808 Bearings
Belts and pulleys (there is plenty of belt for this conversion - I had enough left over to replace one of the belts if I ever need to)

Note that FDM printers tend to print holes and slots slightly small. I calibrated mine to accurately print outside dimensions, and just drill my holes to the right size. With the hardware specified on the BOM, a 7/64 bit will make the hole sized nicely for the screw to thread into. A 1/8" bit will allow the screw to pass through smoothly.

You'll need to recalibrate the X and Y axis since the pulleys are slightly larger than the original drums.
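The recalibration is simple arithmetic: steps per millimeter is the motor's steps per revolution (including microstepping) divided by the belt travel per revolution. The sketch below assumes a 200-step motor at 1/16 microstepping driving a 20-tooth GT2 (2 mm pitch) pulley - those specific numbers are my assumptions, not values from the BOM - which works out to 80 steps/mm, in line with the measured values reported later in this post.

```python
# Steps-per-mm for a belt-driven axis: motor steps per revolution
# divided by belt travel per revolution. The defaults (200-step motor,
# 1/16 microstepping, 20T GT2 pulley) are illustrative assumptions.
def steps_per_mm(full_steps=200, microsteps=16, belt_pitch_mm=2.0, pulley_teeth=20):
    travel_per_rev = belt_pitch_mm * pulley_teeth  # mm of belt per revolution
    return (full_steps * microsteps) / travel_per_rev

print(steps_per_mm())  # 80.0
```

In practice you'd still fine-tune the value against a measured test print, since the real pulley and belt tolerances shift it slightly.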

Procedure - Y Axis
First, I clipped the zip ties holding the Y axis carriage to the motion rods. I then removed the stepper.

Next, I installed the new motor plate and bearings, and aligned the pulley.

I fed the belt in and checked motion, and secured one end of the belt to the stop.

The stop gets belted on.

The second stop gets attached.

Securing the belt to the tension block is a little tricky. I had to remove material from the slot that the belt passes through to let it pass through twice. The drawing shows clearly that the belt should just fit through the slot when folded back on itself. A short length of filament acts as a pin to hold the belt in place. There are detailed shots of this in the X axis section.

Tension is adjusted by turning the screws in the tension block. At this point, I connected to the printer and tested the motion. All looked good, so I moved on to the X axis.

Procedure - X Axis
The X axis is more involved because you have to install a replacement motor mount plate for the X axis stepper. This is not a trivial process, but it went pretty smoothly.

First, the bed is removed and the X carriage is removed by clipping the zip ties holding it in place.

The side opposite the control board is removed.

The bottom plate can now be pulled free and the X axis motor mount is removed.

The new bearing plate is assembled.

Carefully align the pulley with the bearings. They are a snug fit, but they do fit, and don't allow any slop when assembled.

Install the new motor plate and reassemble.

The new belt ends are held in place with the zip ties securing the carriage to the motion rods. As the drawings call out, the carriage is flipped over and a new hole drilled for the X axis end stop screw.

Here is a detail shot of how the belt tension blocks work. I had to open the slots a bit with an exacto knife, just enough to pass the belt when it is folded back on itself.

Test it! I had to remove a little material from the carriage to get it to run smoothly, just rounding over an edge.

Initial test prints look really good. I am in the process of recalibrating the X and Y motion in the Printrboard, since the pulleys are slightly larger than the original sanding drums. I will post the final values once they are determined.

Update: X and Y values for M92 are both just a hair above 80. I have mine printing to within .001" on a 2.000 inch square test model.

In Repetier Host, the GCode commands can be entered into the GCode command box on the manual tab, shown below. Enter the command you want to run and hit the Send button.
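The commands themselves would look something like the following. The 80.1 values are placeholders for whatever your calibration yields; M500/M503 apply to Marlin-derived firmware with EEPROM support enabled, which not every Printrboard build has.

```gcode
M92 X80.1 Y80.1  ; set steps/mm for X and Y (placeholder values)
M500             ; save settings to EEPROM (if firmware supports it)
M503             ; report current settings to verify
```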

Sunday, January 18, 2015

Over the last year I've been working towards an underwater sonar system for ROVs and surface boats. In order to learn the basic signal processing required to detect the echoes, I initially got a simple sonar working in air with a desktop conferencing USB speaker/mic running on Windows. A writeup, including source, is here. That article describes the algorithms used in detail and would be a good read if you want the details of how this works.

The next logical step seemed to be to get it working on a microcontroller. There are plenty of low cost ultrasonic sonar modules available that work really well in air, but the idea was to work towards getting a sonar that worked in water. There are currently no low cost sonar modules for hobby use in water. Additionally, the low cost modules only give one echo - with a signal processing approach like this, you get a series of echoes that may convey more information about the environment. As an example, a boat floating above a school of fish could detect both the fish and the bottom.

I selected a Stellaris Launchpad because of its high speed analog to digital converters (ADC) and 32K of RAM. At the required sample rates, the Launchpad has just enough RAM to send a chirp, and then record a fraction of a second of audio so that the echoes can be determined. Higher frequency sound will require a higher sampling rate, so I may need to switch to a Teensy 3.1, which has 64K of RAM.

A chirp waveform is computed and sent to a small piezo speaker driven by a simple transistor circuit. The piezo supply voltage (VCC in the diagram below) is provided by 3 nine-volt batteries in series to obtain 27V. This diagram shows how it is connected. This is not my diagram - I found it online, but I don't have a reference. If this is yours, please drop me a line.
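Computing a linear chirp is straightforward: the instantaneous frequency sweeps from f0 to f1 over the pulse duration, so the phase is quadratic in time. A Python sketch of the idea (the frequencies, duration, and sample rate here are illustrative, not the values used on the Launchpad):

```python
# Sketch of a linear chirp: frequency sweeps from f0 to f1 over the
# pulse. Parameter values are illustrative assumptions.
import math

def make_chirp(f0=4000.0, f1=8000.0, duration=0.005, sample_rate=64000):
    n = int(duration * sample_rate)
    samples = []
    for i in range(n):
        t = i / sample_rate
        # instantaneous phase of a linear sweep from f0 to f1
        phase = 2 * math.pi * (f0 * t + (f1 - f0) * t * t / (2 * duration))
        samples.append(math.sin(phase))
    return samples
```

On the Launchpad the same table is precomputed in C and clocked out to the piezo driver.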

The return echo is detected by a small amplified microphone from Adafruit. I like this module because it has an integrated level shift. Rather than swinging from -V to +V, the output is shifted to 0 to +3.3V so that it can be connected to an ADC. It's very convenient.

A couple 3D printed parts hold it all to the board just to keep it pointed in the right direction.

The chirp is sent, and the audio immediately starts recording to catch the echo. The same correlation function as used in the previous article is used to pull the echoes out of the recorded audio. The intensities of the correlation function are sent through the debug port to the PC so that it can be plotted.
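The correlation step itself is conceptually simple: slide the transmitted chirp along the recording, multiply-and-sum at each offset, and the peaks mark the echo delays. A pure-Python sketch of that idea (the real implementation runs in C over the ADC buffer, and this naive O(n*m) loop is what makes the processing slow, as noted below):

```python
# Sketch of the echo-detection step: cross-correlate the recording
# with the transmitted chirp; peaks in the output mark echo delays.
def cross_correlate(recording, chirp):
    """Correlation intensity at each candidate sample delay."""
    n = len(recording) - len(chirp) + 1
    out = []
    for lag in range(n):
        acc = 0.0
        for i, c in enumerate(chirp):
            acc += c * recording[lag + i]
        out.append(acc)
    return out

def strongest_echo(recording, chirp):
    """Sample delay of the strongest correlation peak."""
    corr = cross_correlate(recording, chirp)
    return max(range(len(corr)), key=lambda k: abs(corr[k]))
```

The sample delay of a peak, times the speed of sound and divided by two (out and back), gives the range to the target.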

I need to work on optimizing the echo detection code - currently it works on the audio from each pulse for 4 seconds or so. Also, the power output of the audio transducer is very low, so range is pretty limited. It has an effective range of 3 to 9 feet. Closer than 3 feet, the echo is hard to pick out of the noise produced when the pulse is sent.

As in the original experiments with the speaker/mic, the results are plotted with a simple Python program set up similarly to a fishfinder display. The results of a test run are shown below. Source for the Python display is modified from code from the previous article.

Source code for the Launchpad is given below.

Next steps are to work on getting transducers working under water and increasing transmit power. I've made a simple hydrophone to test transducers with - update coming soon.

About Me

One guy's wanderings through science and technology, just for the fun of it. Currently focused on astronomy and hobby robotics, but likely to wander into photography, DIY drones, CNC and 3D printing, or whatever seems interesting at the time.