Friday, April 17, 2009

University of Houston on lunar surface navigation

If NASA is going to successfully establish a permanent human presence on the moon, it must be able to accurately track and direct its crew members and exploration vehicles, and the space agency has charged a University of Houston research team with helping it do just that.

"If they just stay in close proximity to where they land, they can't get a lot of information. They have to travel long distances," explained Heidar Malki, the associate dean of research at UH's College of Technology. "Now, what happens if they travel and get lost? On Earth, we have GPS, global positioning system, and we can pinpoint those things. But on the moon, we don't have that capability."

Malki's group is helping to create a navigation system that will allow explorers to contact their lunar base, find their way back to it and communicate with each other.

"Navigation systems today need consistent and reliable measurements to generate a good solution," said Steve Provence, a UH graduate who, as a NASA employee, is working with the team.

The first step is to identify which sensors best suit the environments and tasks at hand, Malki said. Then it's a matter of keeping the lines of communication clear and giving near-perfect directions.

Provence said there are more than a dozen lunar-surface navigation sensors under development in industry, at universities and at the Johnson Space Center, and it is unclear which will work the best. He's generating mathematical models with data from the most promising candidates.

It's up to the UH team, during the two-year project, to take the modeled sensor output and develop an algorithm that will help determine the lunar navigation accuracy NASA can expect to achieve.

On the moon, Provence said, explorers will need multiple sensors, including ones that work like cell-phone towers, odometers on the wheels of the vehicles, and even satellites like those used for GPS.

"We are trying to stitch together the measurements from each," he said. "The first big issue is timing. Each sensor will give us updates at different intervals. Some will report once per second, like GPS, and others may be faster or slower. This may not sound hard, but combining all these measurements into a good reliable solution is tough."

Even more challenging, perhaps, is keeping astronauts informed if some sensors become unavailable to them. If they travel into craters or over the horizon, for instance, their view of navigation towers will be obscured, and they will lose contact with those sensors, Provence said.

Malki emphasized that the navigation system has to be more robust than the norm, adapting to changes in sensor input while still providing good information.

The UH team will classify various sensors with respect to range, accuracy, reliability, power consumption, cost and computational-processing requirements. Using the selected sensors, they will develop estimation algorithms to provide accurate lunar tracking and navigation.
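A trade study like the one described above might start with a simple record of each candidate's attributes. The field values below are placeholders, not specifications of any real candidate sensor.

```python
from dataclasses import dataclass

@dataclass
class SensorCandidate:
    """One row of a hypothetical sensor trade study."""
    name: str
    range_km: float
    accuracy_m: float
    reliability: float   # 0..1 expected availability
    power_w: float
    cost_relative: float
    compute_load: float  # relative processing requirement

candidates = [
    SensorCandidate("beacon-tower", 10.0, 5.0, 0.90, 20.0, 1.0, 0.3),
    SensorCandidate("wheel-odometer", 0.0, 50.0, 0.99, 2.0, 0.2, 0.1),
]

# Example query: shortlist sensors accurate to better than 10 meters.
shortlist = [c.name for c in candidates if c.accuracy_m < 10.0]
```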

"If a sensor fails, you might still get data, but it will be unreliable," said Karolos Grigoriadis, a professor of mechanical engineering at the Cullen College of Engineering and a co-investigator on the project. "Your algorithm should be able to recognize the bad data and disregard it."

The UH team has expertise in various estimation approaches. One such approach uses "neural networks," computational models made up of artificial neurons. The networks themselves are adaptable – some would say intelligent – in that their operations can change if the information they receive indicates they should.

"Neural networks try to mimic biological systems," said Malki. "The way that we learn things is by examples – trial and error. Neural networks work basically the same way. By presenting examples, we try to teach the networks to learn."

Once they do, they are tested with new data to see whether they can produce similar results, said Malki, who has done neural network research for NASA before. During a fellowship at the agency's Ames Research Center, he worked on a project aimed at controlling vibration on Blackhawk helicopters.
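The learning-by-examples idea Malki describes can be shown with the smallest possible case: a single artificial neuron trained by trial and error to mimic a logical AND. This toy is purely pedagogical and unrelated to the team's actual models.

```python
def step(x):
    """Threshold activation: the neuron fires (1) or stays quiet (0)."""
    return 1.0 if x > 0 else 0.0

# Training examples: inputs and the desired output (logical AND).
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]  # connection weights, adjusted by experience
b = 0.0         # bias
lr = 0.1        # learning rate: how big each correction is

for _ in range(20):                        # repeated presentation of examples
    for (x1, x2), target in examples:
        out = step(w[0] * x1 + w[1] * x2 + b)
        err = target - out                 # the "error" in trial and error
        w[0] += lr * err * x1              # nudge weights toward the target
        w[1] += lr * err * x2
        b += lr * err
```

After a few passes over the examples the neuron's weights settle, and it reproduces the AND pattern on all four inputs; real networks do the same thing with far more neurons and examples.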

Grigoriadis, director of the graduate program in aerospace engineering, focuses on mathematical model-based control and estimation methods that provide system regulation and tracking.

"One type of sensor alone most probably will not do the job. A challenge is to provide seamless integration of various sensors that inevitably will have different properties," he said. "The fault-tolerance of the overall navigation process – or the ability of it to work properly despite certain failures – is essential."

Ultimately, Provence said, the team has to look at what he calls "the worst-case scenario."

"Basically, if the rover drives away from the base and breaks down 20 or so miles away, the astronaut has to get out and walk back. Our best guess is that he needs to get to within about 1,800 feet of the base before he can see it. The problem is, he may have started at the broken rover with a navigation error that's pretty big, and it's only going to get worse as he moves along, seeing as how all errors grow with time," he said.