From the mouse and touchscreen to hand-tracking platforms like the Leap Motion Controller, the design of UI elements like the humble button is shaped by the hardware and how we use it. Here’s a quick guide to designing buttons and other UI elements for VR, based on our Unity Widgets.

Everything Should Be Reactive

In VR, every interactive object should respond to any casual movement. Users don’t always know what to expect, and this helps to build a mental model of how the virtual world works, and what each action achieves. Design your buttons to have clear dormant, passive, transitional, and active states.

Dormant. It’s important that your UI doesn’t feel obstructive or clutter the scenery. The Arm HUD menu system is designed to disappear when your palm is facing away. In this state, it’s like a giant digital wristwatch that covers your forearm.

Passive. Flip your arm around, and three buttons appear. They’re distinctly colored and have text-based OFF cues, so it’s instantly apparent to the user that they’re not just part of the scenery.

Transitional. When the user touches the button, it responds by being pushed down slightly in Z-space. Even if the button isn’t fully pushed and activated, the user will feel like they’re actually touching it.

Active. Once the button is pushed far enough to hit an invisible anchor plane, it will be activated. At this point, it changes color, the OFF text changes to ON, and it rebounds to its original position. This reveals a menu of new options.
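In Unity this behavior would live on a C# component, but the state logic itself is simple. Here’s a language-neutral sketch in Python of the four states above; the class name, method names, and the 1 cm travel-to-anchor distance are all illustrative, not from our actual Widgets code:

```python
from enum import Enum

class ButtonState(Enum):
    DORMANT = "dormant"            # hidden (palm facing away)
    PASSIVE = "passive"            # visible, untouched
    TRANSITIONAL = "transitional"  # partially depressed
    ACTIVE = "active"              # pushed past the anchor plane

class ReactiveButton:
    """Sketch of the depress-to-activate behavior described above."""

    def __init__(self, travel_to_anchor=0.01):
        # Distance (meters) the button must travel in Z to reach the
        # invisible anchor plane; 0.01 m is an illustrative value.
        self.travel_to_anchor = travel_to_anchor
        self.depth = 0.0
        self.state = ButtonState.PASSIVE
        self.label = "OFF"

    def update(self, finger_depth, palm_facing_user=True):
        # Dormant: the whole menu disappears when the palm faces away.
        if not palm_facing_user:
            self.state = ButtonState.DORMANT
            return
        self.depth = max(0.0, finger_depth)
        if self.depth >= self.travel_to_anchor:
            # Activated: change the label and rebound to rest position.
            self.state = ButtonState.ACTIVE
            self.label = "ON"
            self.depth = 0.0
        elif self.depth > 0.0:
            # Touched but not yet activated: visibly pushed down.
            self.state = ButtonState.TRANSITIONAL
        else:
            self.state = ButtonState.PASSIVE
```

The key design point is that the transitional state costs nothing to implement but does most of the work: the button visibly yields before it commits, so a half-press still feels like contact.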

Interactive Targeting

Button size is a major usability issue on any platform. Touchscreen developers already know to match touch target sizes to the average finger size.

For Leap Motion and VR, we recommend that a single finger target should be no smaller than 20 mm in real-world size. This ensures the user can accurately hit the target without accidentally triggering targets next to it.
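Since VR scenes are built in world units (meters in Unity), the 20 mm guideline is easy to check at design time. A minimal sketch (the function name is our own, for illustration):

```python
MIN_TARGET_MM = 20.0  # recommended minimum single-finger target size

def meets_minimum_size(width_m: float, height_m: float) -> bool:
    """Check a target's real-world dimensions (in meters)
    against the 20 mm single-finger guideline."""
    return min(width_m, height_m) * 1000.0 >= MIN_TARGET_MM
```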

To help with button interactions, you can also space buttons apart, momentarily lock nearby buttons when a button is pressed, make the hand semi-transparent when it approaches a button, or make only one part of the hand able to interact with the UI (such as the index finger).
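One of those aids, momentarily locking nearby buttons while one is pressed, might be sketched like this (the function, the dictionary shape, and the 5 cm lock radius are assumptions for illustration, not Widgets API):

```python
import math

def interactable_buttons(buttons, pressed=None, lock_radius=0.05):
    """Return the buttons currently allowed to respond to touch.

    While one button is pressed, any neighbor within lock_radius
    (meters; illustrative value) is momentarily locked, so a stray
    finger can't trigger it. Each button is a dict with a "pos"
    key holding an (x, y, z) world position.
    """
    if pressed is None:
        return list(buttons)
    return [b for b in buttons
            if b is pressed
            or math.dist(b["pos"], pressed["pos"]) > lock_radius]
```

The same filtering idea extends to the other aids: restricting interaction to the index fingertip is just a filter on which hand points you test against the buttons at all.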

Example: Weightless

Weightless uses very large buttons that allow you to customize how your space station looks and sounds. These buttons are designed to be pressed by your entire hand. Since the core game mechanic of Weightless involves flying and sorting objects, this doesn’t feel clunky.

Button Positions

Interactive elements within your scene should typically rest in the “Goldilocks zone” between desk height and eye level.

Desk height. Be careful about putting interactive elements at desk height or below. Because there are often numerous infrared-reflective objects at that height, this can cause poor tracking. (For example, light-colored or glossy desks will overexpose the controller’s cameras.)

Above eye level. Interactive objects that are above eye level in a scene can cause neck strain and “gorilla arm.” Users may also occlude the objects with their own hand when they try to use them.
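A quick placement check for the “Goldilocks zone” described above might look like this; the default desk and eye heights are illustrative values in meters, not measured constants:

```python
def in_goldilocks_zone(y_m, desk_height=0.75, eye_height=1.6):
    """True if a UI element's height above the floor (meters) sits
    in the comfortable band between desk height and eye level.
    Default heights are illustrative, not calibrated values."""
    return desk_height < y_m < eye_height
```

In practice you would derive the eye height from the HMD’s tracked position rather than hard-coding it.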