On March 4, as it has for more than six years, NASA’s Mars Exploration Rover Opportunity surveyed a bleak Martian landscape. The robot could see more than 50 rocks but, for the first time in those six years, decided on its own to take a closer look at one of them.

This was no act of robot rebellion, but the exercise of scientific autonomy made possible by new software uploaded to the rover this winter. “We were really happy with the result,” said David Thompson, a Carnegie Mellon graduate and a researcher at NASA’s Jet Propulsion Laboratory who was part of the software development team. The tan, football-sized rock that Opportunity chose to zoom in on was just the sort of feature an Earth-based scientist would have selected.

For Thompson, who earned his PhD in robotics at Carnegie Mellon in 2008, the experiment was gratifying. It also was reminiscent of time he spent in the deserts of Chile and California with a solar-powered scientific rover called Zoë. Like Opportunity, Zoë was developed to make scientific decisions on its own. In field experiments directed by David Wettergreen, associate research professor in the Robotics Institute, Zoë autonomously found signs of life in Chile’s Atacama Desert in 2004 and 2005 and drew its own geological maps of an area near Amboy Crater in California’s Mojave Desert in 2007.

The Amboy Crater experiment, the subject of Thompson’s doctoral thesis, demonstrated capabilities similar to what is now incorporated into Opportunity’s new software. That system, Autonomous Exploration for Gathering Increased Science (AEGIS), is based on more than a decade of work at JPL, he noted, but the insights he gained at Amboy Crater informed his work on it.

“The experience left me with better intuition for navigating in a desert environment and particularly for the way that changing light can affect sensors,” Thompson explained. Joining JPL in 2008 after receiving his PhD proved fortuitous as he soon found himself working on the AEGIS project.

“It was a very seamless transition from what I was doing at CMU,” he added.

The unexpected longevity of Opportunity and its twin, Spirit, has enabled NASA to incorporate a number of advances in robotic autonomy that have been made since they landed on Mars. In 2007, for instance, navigation software developed by Carnegie Mellon and JPL was uploaded to the rovers, giving them the ability to maneuver around obstacles autonomously.

By allowing Opportunity to make some decisions about observations, AEGIS will enable the rover to gather more data than previously possible. In the past, a rover would drive to a designated spot, take images with its wide-angle navigation camera and transmit the images to Earth; ground operators would then check for targets of interest to examine on a later day. Time and data-volume constraints sometimes forced the team to move the rover on before a target had been identified, or to skip targets that didn’t have a high priority.

Now, Opportunity can use the software at stopping points along a single day’s drive, or at the end of the day’s drive. Using criteria established by ground operators, the rover can identify and examine targets of interest that would otherwise be missed.

The rock observed on March 4 appears to be one that was tossed onto the surface when an impact dug a nearby crater. Opportunity selected it because it best met the criteria set by ground operators: it was large and dark.
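The selection process described above can be sketched in code. This is purely an illustrative model, not actual AEGIS software (which has not been detailed here); the function name, data fields, and threshold are all hypothetical, standing in for whatever criteria ground operators actually upload:

```python
# Illustrative sketch only -- NOT actual AEGIS code. Models "pick the
# rock that best matches operator-set criteria such as large and dark."

def select_target(rocks, min_area=50.0):
    """Return the detection that best fits 'large and dark', or None.

    rocks: list of dicts with hypothetical keys 'area' (pixel count)
    and 'brightness' (0.0 = darkest, 1.0 = brightest).
    """
    # Discard detections below the operator-set minimum size.
    candidates = [r for r in rocks if r["area"] >= min_area]
    if not candidates:
        return None
    # Larger area and lower brightness both raise the score.
    return max(candidates, key=lambda r: r["area"] * (1.0 - r["brightness"]))

rocks = [
    {"name": "A", "area": 40.0, "brightness": 0.2},   # too small
    {"name": "B", "area": 120.0, "brightness": 0.7},  # large but bright
    {"name": "C", "area": 90.0, "brightness": 0.1},   # large and dark
]
best = select_target(rocks)  # picks rock "C"
```

The key design point this toy mirrors is that the scoring criteria live in parameters the ground team controls, so scientists, not the software alone, decide what counts as interesting.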

“It found exactly the target we would want it to find,” said Tara Estlin, a senior member of JPL’s Artificial Intelligence Group who led the development of AEGIS. “This checkout went just as we had planned, thanks to many people’s work, but it’s still amazing to see Opportunity performing a new autonomous activity after more than six years on Mars.”

Thompson said it had always been his aspiration to work with planetary robots, and that he was excited to be part of the intensive AEGIS effort. “I think I was pretty well prepared by my studies to hit the ground running when I came here,” he said.