4 things I learned designing an AR app for NASA

In the not-so-distant future, apps will move away from flat screens and will take over our surrounding space. As a designer, I find that so exciting—it’s why I love working on the user experience of augmented reality (AR) apps.

This past November, I was lucky enough to collaborate with the Ops Lab at NASA’s Jet Propulsion Laboratory (JPL) through New York University. My team helped design a file visualization interface for their AR app, ProtoSpace.

Mechanical engineers at NASA use ProtoSpace to view a realistic model of the rovers they’re building. As of now, engineers have to look at CAD files to view different parts within this model.

A user journey my team made to illustrate navigation through the app.

My task was to think of how engineers could navigate this file hierarchy with the HoloLens without referring back to their computers. While designing a solution, I found that I had to approach the problem from a different angle.

Since there are no established UX best practices for AR yet, I’d like to share my own personal approach to UX in AR apps.

1. Think about how we interact with physical objects around us, and use that as inspiration

How do we notice a sign when we walk down the street? How does a particular object catch our attention? It’s useful to think about this and to learn from real-life situations when designing an interface in AR.

Since we’ll integrate our design solution within our space, we want it to feel as natural as possible. While I was working on the ProtoSpace project, I encountered an interesting problem. I had to figure out how the engineers could search for a particular file (like a specific rover part) within a hierarchy through the HoloLens if the file size was extremely large.

I got great advice on how to approach this problem: Let’s say you’re at a subway station and you see that the train approaching the station is packed. How were you able to tell that it was packed?

This prompted me to think about how we instinctively know that we’re dealing with a large quantity, whether it’s people or objects. Within a confined environment, do we notice the lack of space first? Or do we see a large number of one thing and only then notice the lack of space? Or, most likely, both at the same time?

On a different note, how would I search for a particular person within a crowd once I realized that there were too many people to conduct a “linear search”?

Picture the nightmare scenario of going to a concert with over 500 people and having to look through every single one of these people to find your friend.

To make things easier, I could pick out a set of characteristics that I know about this person to narrow down the search. For example, I know what my friend looks like and I might also know what they’re wearing. Now that I’ve thought about a real-life version of my design problem and the logical steps I’d take to solve it, I can then think about how this solution could be visualized through design.

Pretty simple, right?
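The crowd analogy translates naturally into code. Here’s a minimal sketch of that “narrow by known traits” idea, assuming a hypothetical `Part` record and a `narrow_search` helper I made up for illustration—this is not ProtoSpace’s actual data model:

```python
from dataclasses import dataclass

@dataclass
class Part:
    """A hypothetical entry in a CAD file hierarchy."""
    name: str
    subsystem: str
    material: str

def narrow_search(parts, **known_traits):
    """Keep only parts matching every trait the user already knows,
    instead of scanning every entry one by one (a linear search)."""
    return [
        part for part in parts
        if all(getattr(part, trait) == value
               for trait, value in known_traits.items())
    ]

catalog = [
    Part("wheel_hub", "mobility", "titanium"),
    Part("mast_camera", "science", "aluminum"),
    Part("wheel_spoke", "mobility", "aluminum"),
]

# Knowing two traits shrinks the candidate set before any visual scan.
matches = narrow_search(catalog, subsystem="mobility", material="aluminum")
```

Just as you’d scan a concert crowd for your friend’s jacket rather than checking every face, each known trait cuts the search space before the user has to look at anything.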

Testing out immersive environments using the HTC Vive.

2. Don’t try to apply 2D UX best practices to 3D

This one’s pretty obvious, but it still needs to be said because it’s so important.

Looking at a flat-screen television and interacting with a 3D object in your office or living room are two completely different experiences.

So why would you use the same design principles for both?

3. The most effective initial prototypes for AR apps are always physical

This goes with my previous point. If you’re going to create something you’ll be seeing within a physical space, then it makes more sense to prototype accordingly.

It’s not a great idea to use a typical screen-based prototype to present your initial designs. You won’t get accurate user testing results, and you also won’t get a good grasp of how your design will work and feel.

A better low-fidelity alternative: create a cardboard model of your designs and take pictures of the different stages of user interaction. For extra points, you could even create an interactive prototype that can be tested using Google Cardboard.

Physical prototype our team created for file hierarchy visualization.

4. UI elements don’t have to be in 3D

Not everything in your AR app has to have a 3D shape. For example, when you’re driving on a highway the signs are almost always flat rectangles or squares—you rarely encounter cube signs. With AR apps, it’s totally fine to display some elements in 2D.

Olivia Cabello is currently pursuing her Master’s degree in Integrated Digital Media at NYU Tandon. She specializes in user experience design and her passion is to explore the future of UX in new media.