Update 4/16/13

The Senior Design Project open house is on April 24th. We will be presenting in the third-floor lounge of ENS, so we thought it would be cool to have the robot autonomously navigating around people during the event. Here is an updated diagram of the nodes, topics, and services that currently run as we test navigation:

Navigating around people and other objects is trivial if the sonar array is working properly. However, we believe it would be more impressive to do this task exclusively with vision processing. To navigate around people using vision, we'll have to take a different approach than the one we've been using: rather than characterizing the HSV thresholds of obstacles, we will characterize the floor, and treat everything that does not look like the floor as an obstacle. We are lucky to have 24-hour access to the place where our booth will be, so we will be able to easily test and develop in the coming days.
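The floor-characterization idea can be sketched in a few lines. This is not our actual pipeline, just a minimal illustration assuming frames have already been converted to HSV (e.g. with OpenCV's `cvtColor`); the bounds here are placeholders, and the real values would come from sampling frames of the actual lounge floor.

```python
import numpy as np

# Hypothetical HSV bounds for the lounge floor (placeholders, not calibrated).
FLOOR_LOW = np.array([0, 0, 60])
FLOOR_HIGH = np.array([180, 60, 200])

def obstacle_mask(hsv_frame, low=FLOOR_LOW, high=FLOOR_HIGH):
    """Return a boolean mask that is True wherever a pixel does NOT
    look like floor -- i.e., everything non-floor is an obstacle."""
    floor = np.all((hsv_frame >= low) & (hsv_frame <= high), axis=-1)
    return ~floor
```

The key design point is that we only ever model one class (the floor), so people, chairs, and bags are all caught by the same inverted test without being characterized individually.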

From recent testing we have discovered an issue with the PSoC's connection to the computer: occasionally the PSoC disconnects when the remote kill switch is hit. We have also observed the computer's power supply turning off when it is being powered from the wall and the kill switch is hit. This problem could stem from the fact that everything in the system shares the same ground, and a spike could occur when power is cut to the motors (even though our motor controllers should protect against this). Our current solution is to decouple the PSoC from the computer by piping data over Ethernet instead of USB. If we do that and then power the monitor and computer from a separate battery, the computer and the components it powers will be completely decoupled from the motors and the batteries that power them.
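One way the Ethernet link could look on the computer side is a simple UDP datagram per command, which has no shared electrical path back to the PSoC the way USB does. This is only a sketch under assumptions: the address, port, and two-wheel-speed packet format are all made up for illustration, not our actual protocol.

```python
import socket
import struct

# Hypothetical address of the PSoC's Ethernet interface (placeholder).
PSOC_ADDR = ("127.0.0.1", 9750)

def send_drive_command(left, right, sock=None):
    """Pack two signed 16-bit wheel speeds (little-endian) and send
    them to the PSoC as a single UDP datagram."""
    own_sock = sock is None
    if own_sock:
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(struct.pack("<hh", left, right), PSOC_ADDR)
    finally:
        if own_sock:
            sock.close()
```

UDP is a reasonable fit here because drive commands are sent continuously, so a dropped datagram is simply superseded by the next one.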

In other news, we have made a few purchases. Since the VN-200 supports neither OmniSTAR nor WAAS, we bought a GlobalSat BU-353 GPS receiver, which has WAAS support. We also bought an HDMI capture device, which we expect will allow the Hero3 to stream images. In addition, we've been trying to acquire duplicates of as many parts as possible, so that no single component of our robot can fail on the day of the competition without us being able to replace it.

Chris H. has started working on using PTAM to give our EKF another source of odometry. There appear to be issues with finding features, and it may become a CPU hog. However, it may also prove to be a good source of error correction in case we slip on grass and our magnetometer gets screwed over by IGVC's power generators.
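The reason a second odometry source helps the EKF is the standard Kalman measurement update: each new reading pulls the state estimate toward it in proportion to relative confidence, so a noisy PTAM fix can still correct accumulated wheel-slip drift. A scalar version of that update (not our actual multi-state EKF, just the one-dimensional idea) looks like this:

```python
def fuse(estimate, variance, measurement, meas_variance):
    """Scalar Kalman measurement update: blend a new reading into the
    current estimate, weighted by how uncertain each source is."""
    k = variance / (variance + meas_variance)   # Kalman gain in [0, 1]
    new_estimate = estimate + k * (measurement - estimate)
    new_variance = (1.0 - k) * variance         # fusing always shrinks uncertainty
    return new_estimate, new_variance
```

When wheel odometry is slipping its variance grows, the gain approaches 1, and the filter leans on the PTAM measurement; when PTAM loses features the opposite happens.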