How gaming tech is making for better interplanetary exploration

Head-mounted displays and motion tracking can put you on the surface of Mars.

"My dream in this area is that, someday, when we put human boots on the surface of Mars, I want there to be millions of people in attendance for that event," Jeff Norris, Mission Operations Innovations lead at NASA's Jet Propulsion Laboratory, told Ars in a recent interview. "I want them not just sitting in their living room watching a television screen; I want them standing on Mars in their own holodecks right there beside the astronauts."

That might seem like a pretty ambitious goal, even given how much time we have until a manned mission to the red planet is likely to happen. Still, it seems much more realistic when you see the fully navigable, 3D virtual reality version of Mars that Norris and other JPL researchers have already created using funding from NASA's Game Changing Development program and some technology originally designed for more realistic gaming.

While it's pretty awe-inspiring to be able to look at the surface of another world through panoramic photos or IMAX movies, a flat image is not exactly the best way to study the surface of another planet. "If you're looking at a panoramic image on a flat monitor... you might think that something's to your right that's actually behind you, because you're looking at a rolled out picture," Norris pointed out. "Or you might zoom in on something and think 'Oh, that's significantly to the right of me.' But because you're zoomed in so far, it might just be a couple of degrees to your right, because you don't have the reference points on Mars."
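Norris' point about misjudged bearings is easy to make concrete with a little arithmetic. In the sketch below, the panorama width is an invented figure, not a real rover image dimension; it just shows how a feature that sits far to the right in a zoomed-in crop can be only a couple of degrees off your true heading.

```python
# Hypothetical numbers: a full 360-degree panorama rendered 8,000 pixels wide.
PANO_WIDTH_PX = 8000
DEG_PER_PX = 360.0 / PANO_WIDTH_PX  # 0.045 degrees of bearing per pixel

def bearing_offset(pixel_offset):
    """Angular offset (degrees) of a feature from the center of view."""
    return pixel_offset * DEG_PER_PX

# A rock 50 pixels right of center dominates a tight crop and "feels"
# far to the right, but its true bearing is barely off straight ahead.
print(bearing_offset(50))  # 2.25
```

On a flat monitor there is no bodily sense of that 2.25 degrees; in a head-mounted display, the rock simply sits where it is.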

That's part of why JPL has been looking into ways to let researchers and members of the public experience the data it's getting from Mars more naturally. In the past, the team has looked into room-sized "surround theaters" that project images from the Martian surface on the floors, walls, and ceilings, ensconcing users in a Martian environment. These installations are comfortable, but also quite expensive. They also require users to make a special trip to use them.

For a more portable home or office-based simulation, the team turned to gaming technologies to make navigating more intuitive and effective. "We're always watching the market for new devices that are potentially useful for work," Norris said. "We like working with video game technologies because they're designed to be very accessible for people. They also tend to be very low cost."

Norris seems particularly excited about the team's current research, which combines an HD prototype of the Oculus Rift head-mounted display with Vicon's Bonita position tracking system. Users can actually walk and look around a simulation of the Martian surface, as shown in a video the team released today.

The people behind the Oculus are excited as well. “Virtual reality is immediately useful as a way to get people excited about space again, and as the hardware advances, it will become a powerful tool for real research and exploration," Oculus CEO Palmer Luckey told Ars. "In particular, virtual reality based telepresence could allow human operators to perform tasks in hostile environments without actually going there themselves. For many use scenarios, the dexterity and locomotion capabilities of a human piloted robot will surpass what a real person in a spacesuit can do. JPL has some amazing stuff in the pipeline that is going to have a big impact on mankind. It makes me very happy to see people using virtual reality and the Rift for the greater good."

Virtual Mars

JPL's simulation of the Martian surface is put together by taking a panoramic parallax image sent by a Martian rover and stitching it together onto the surface of a virtual cylinder about 50 square meters (roughly 540 square feet) in base area. Researchers then superimpose 3D point cloud data gathered by the rover on top of the scene, creating a series of meshes called "surflets" that give the impression of depth.
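The first step of that pipeline, wrapping a flat panorama around a cylinder centered on the viewer, can be sketched in a few lines. The cylinder dimensions and image sizes here are illustrative guesses, not JPL's actual values, and the real system would then displace these wall points using the rover's point-cloud depth data to form the surflet meshes.

```python
import math

# Illustrative dimensions only -- not JPL's real parameters.
CYL_RADIUS_M = 4.0   # radius of the virtual cylinder wall
CYL_HEIGHT_M = 3.0   # height of the cylinder wall

def panorama_to_cylinder(u, v, pano_w, pano_h):
    """Map a panorama pixel (u, v) onto the wall of the virtual cylinder.

    u in [0, pano_w) wraps 360 degrees around the viewer;
    v in [0, pano_h] runs from the bottom of the wall to the top.
    """
    theta = 2.0 * math.pi * u / pano_w   # azimuth around the viewer
    x = CYL_RADIUS_M * math.cos(theta)
    y = CYL_RADIUS_M * math.sin(theta)
    z = CYL_HEIGHT_M * v / pano_h        # height along the wall
    return (x, y, z)
```

Every pixel ends up at a fixed distance from the viewer, which is exactly why the depth data has to be layered on afterward: a bare cylinder looks right only from its center.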

The result, as you see in the video, is a fully explorable, accurate 3D slice of the Martian surface sent to Earth from tens of millions of miles away. "When you consider all of the data that Curiosity has acquired, plus Opportunity and Spirit, we have hundreds of these sites," JPL Human Interfaces Software Engineer Victor Luo told Ars.

Being inside a 3D virtual Mars is undoubtedly cool, but it also has important applications as a research tool, Norris said. Using the Oculus Rift, researchers can study the data sent by Curiosity just as they would view the world around them—by moving their heads and walking around. Norris said that letting researchers engage with these environments through natural human abilities is key to making them more effective explorers of a strange world.

"Think about how amazing of an ability every human has to extract vast amounts of information from the environment just by being present in it," Norris said. "Then you consider that geologists and other scientists build on those abilities with specialized training and field work and things like that. But then we ask them to go and explore the planet by peering at flat images on ordinary computer monitors and so on. We have some evidence that this makes our natural abilities work against us rather than for us. This project is about correcting that."

Adding the Vicon motion tracker to the setup gives researchers another ability they don't easily have with flat pictures of the Martian surface: the ability to walk around and examine bits that interest them up close. "If you're looking at pictures taken by the robot, you're kind of frozen where the robot is," Norris said. "We want people to be able to get up and move around in that space."

More than just locomotion, though, Norris said the motion tracker adds verisimilitude via the "little motions" humans make when looking around, such as shifting a shoulder or leaning the neck to one side. This kind of parallax motion is a key way that humans and other animals assess depth and distance in their environment, and it allows for a much more accurate impression of the user's virtual Martian surroundings.
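The geometry behind those "little motions" is simple triangulation: a small sideways head shift changes the apparent bearing of a nearby object more than a distant one, and the brain reads that angular shift as depth. A rough sketch, with invented example numbers:

```python
import math

def depth_from_parallax(head_shift_m, parallax_deg):
    """Estimate distance to an object from the angular shift it makes
    when the viewer's head moves sideways by head_shift_m meters."""
    return head_shift_m / math.tan(math.radians(parallax_deg))

# A 10 cm lean that shifts a rock's apparent bearing by 1 degree
# places that rock roughly 5.7 meters away.
print(round(depth_from_parallax(0.10, 1.0), 1))  # 5.7
```

Without position tracking, the head can lean but the image doesn't shift, so this depth cue is absent; the Vicon system restores it.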

This might not look that impressive on your flat computer screen, but just wait until you can experience it on an HD head-mounted display.

This is all useful for researchers on the ground trying to form scientific hypotheses about the Martian landscape, of course. But Norris sees this kind of technology being used by the operators navigating Martian rovers, letting them "be more informed and better aware of the environment of the robot, what we call the morphology of the environment—what's it shaped like, where are the rocks, what orientation are they, how are they arranged. When you're driving a robot around on another planet, it's important to know what the environment right around it is..." he said.

The future

As impressive as the working prototype JPL has up and running is, it's very much a first draft of the kind of virtual reality environments the lab hopes to create. In the future, JPL researchers are looking to integrate other data to make the virtual Martian environment more detailed, including data from lower-resolution orbital photos and true-color rover "Mastcams" that generate much higher spatial resolution when zoomed in. The model may even eventually integrate data from wide-field, close-mounted "hazard cameras" that were designed to monitor the area immediately surrounding the rovers' treads and robotic arm for safety.

Researchers are also looking to increase immersion by adding Martian audio to the simulation, despite a lack of direct sound measurement on the planet itself. "We don't have a microphone on Mars, so we don't know exactly what it sounds like, but we can model it," Norris said. "Curiosity has a weather station. We know how hard the wind is blowing, we know the atmospheric pressure, we could render the audio in a very convincing way... It should sound like a high whistling."

As for the user interface, the researchers are looking into adding some kind of accurate hand-tracking to let users reach out and interact with the virtual Martian soil. "The interaction language inside of a virtual reality environment is kind of non-defined," Luo said. "Using a joystick really doesn't make that much sense when you're on Mars. How would you navigate in that environment? ... Is there some other kind of hardware that could be attached or installed to help with that?"

Norris said the JPL team has experimented with hand-tracking solutions ranging from the Leap Motion and Kinect to the Razer Hydra and accelerometer-based gloves, but no option has really provided the perfect combination of accurate tracking, mobility, and light encumbrance on the hands. Still, he's hopeful that the game industry might help him out on this score in the near future. "One of the things we love about working in this area is that there are a lot of companies that are attacking these problems for a lot of reasons—video games, entertainment—but we'll benefit as well," he said.

While JPL is still a long way off from releasing its virtual Mars in a form the public can enjoy, Norris said that "making it possible for the public to join us on the surface of these distant worlds in a way that feels much more real than looking at a picture on a screen we think is of great importance and really exciting."

And there's always that eventual goal of letting the world experience the first steps on Martian soil in a holodeck-like environment, of course. "I think those kinds of events, those kinds of explorations in the future, need to be experienced by our whole civilization, and these technologies are how we can make that happen," Norris said. "It's how we can be a part of that in a way that's so much more real than anything that's ever been available to us before."

Kyle Orland
Kyle is the Senior Gaming Editor at Ars Technica, specializing in video game hardware and software. He has journalism and computer science degrees from the University of Maryland. He is based in the Washington, DC area. Email: kyle.orland@arstechnica.com // Twitter: @KyleOrl

Promoted Comments

CASIS is a program that supports the ISS, and it currently has a contest to decide an object/project to send to the ISS. I submitted an idea of a robot compatible with an Oculus Rift so that schools and students could borrow time controlling the robot and see what it's like to be inside the ISS (albeit with a latency that is inherently built into a system like this). I imagine the project being a stepping stone to fully immersive robotic "space travel". The distance to Mars is so much further that you'd have to have a way to "look" 360° with no reliance on camera direction, so instantly responsive 3D is pretty much out. Cool stuff though.


You can vote on the project here under "robot specs" if you're so inclined, which helps decide which project they choose.

I could see them using something like this on an actual trip to Mars. Assuming it is possible to build a reasonably large space station that we could put into orbit around Mars, they could send down several rovers and control them remotely w/o having a 20-min time delay. (It would require much smaller rockets to send small samples back to orbit than an entire habitat.) Could possibly use multi-use rockets for "day trips" if they are able to process the soil to refuel.

As excited as I am to use the Rift for gaming (and general immersive environments), it's these kinds of inventive and wide-reaching applications that really excite me.

If VR catches on this time (and I see no reason why it shouldn't), who knows the kinds of things we'll be using it for in 10 years.

We could go to the other space, underwater.

Quote:

But Norris sees this kind of technology being used by the operators navigating Martian rovers, letting them "be more informed and better aware of the environment of the robot, what we call the morphology of the environment—what's it shaped like, where are the rocks, what orientation are they, how are they arranged. When you're driving a robot around on another planet, it's important to know what the environment right around it is..." he said.

Watched an interesting doc on the Soviet lunar rovers the other day. They had guys anticipating the maneuvers of the rovers by watching a (very low res) TV monitor via satellite linkup. Some serious time delays involved. WAAY ahead of the curve. Some of their scientists came to the US and helped JPL set up for the Martian robot rovers.

This approach seems to offer a convincing compromise between manned space exploration and robotics - send the robots, but let me control them as an avatar. Of course the extreme communication latency will need to be optimized, and I'm sure top men are working on that issue already. Quantum entanglement could be a good place to start.

I know this article is a few months old, but I just came across it. What I'm wondering is why no microphone, with all the other instruments that are onboard? It seems that a microphone would have been trivial to add.