Smarter cameras may help bring autonomy to space robots

Space robots, including the Curiosity rover currently roaming Mars, have been great at following orders. With a daily set of instructions transmitted to the Red Planet, the rover can maneuver across the terrain and photograph what it finds. However, the roughly 20-minute communication delay to the planet is one reason scientists are looking for ways to put more decision-making power in the hands of the rover itself. A team at NASA's Jet Propulsion Laboratory is developing a camera system called TextureCam to do just that. They've developed an algorithm that lets the rover analyze 3D images and decide whether an object in front of it deserves further investigation.

TextureCam uses an algorithm designed by NASA scientists to analyze the photographs it takes, looking for unusual colors and textures.

TextureCam is trained to look for unusual colors and textures in its surroundings, and the 3D photos it takes are processed on a dedicated processor, separate from the rover's main one. When an interesting feature or rock is identified, the rover can be directed to investigate further or send a photo back to Earth for analysis.
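To give a feel for the idea, here is a toy sketch of anomaly-style texture flagging. This is not TextureCam's algorithm (the article does not describe its internals, and the real system is a trained classifier running on dedicated hardware); it is an illustrative stand-in that scores image patches by local brightness variance and flags patches that deviate strongly from the scene average. The function names and threshold are invented for this example.

```python
# Hypothetical illustration only: score image patches by how "unusual"
# their texture is relative to the rest of the scene, flagging outliers
# for closer inspection -- a crude stand-in for TextureCam's classifier.

def patch_variance(patch):
    """Brightness variance of one patch (a list of pixel values)."""
    mean = sum(patch) / len(patch)
    return sum((p - mean) ** 2 for p in patch) / len(patch)

def flag_interesting(patches, threshold=2.0):
    """Return indices of patches whose variance is far above average.

    `patches` is a list of pixel-value lists; a patch is flagged when
    its variance exceeds `threshold` times the mean patch variance.
    """
    variances = [patch_variance(p) for p in patches]
    avg = sum(variances) / len(variances)
    return [i for i, v in enumerate(variances) if v > threshold * avg]

# Toy "terrain": five bland patches plus one high-contrast patch
# (think of a sharp rock edge against smooth sand).
terrain = [[10, 11, 10, 11]] * 5 + [[0, 40, 5, 45]]
print(flag_interesting(terrain))  # -> [5], the high-contrast patch
```

A real system would use far richer texture features and a trained model, but the shape of the decision is the same: compute a score onboard, and only spend bandwidth or science time on the patches that stand out.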

While the 20-minute transmission time between Earth and Mars doesn't seem too bad, JPL's Kiri Wagstaff points out, "it works less and less well the further you get from the Earth. If you want to get ambitious and go to Europa and asteroids and comets, you need more and more autonomy to even make that feasible."

If it proves successful, this kind of technology could find its way to the Mars 2020 rover or other robots exploring more distant planets.