Are custom-built drones retrofitted with touchable surfaces the future of VR haptic feedback?

In a recent scientific publication entitled VRHapticDrones: Providing Haptics in Virtual Reality through Quadcopters, researchers from LMU Munich, TU Darmstadt, Wellesley College, the University of Duisburg-Essen, and the University of Regensburg offer a solution to body-free VR haptic feedback in the form of a programmable quadcopter system.

Serving as levitating haptic feedback proxies, these specially designed drones are outfitted with unique touchable surfaces and track a user's position to provide real-time haptic feedback for virtual objects. These can include anything from stationary objects, such as coffee cups and chairs, to mobile objects, such as wildlife and various projectiles.

Utilizing the Motive OptiTrack motion capture system combined with 12 Flex 3 cameras covering a total space of 4 m × 4 m × 3 m, VRHapticDrones is able to track the HMD, multiple quadcopters, and a user's defined body parts, streaming 100 Hz samples to the VRHapticDrones backend with millimeter accuracy. Leap Motion tracking sensors mounted to the front of an Oculus Rift headset track the user's hands, allowing them to interact naturally with the floating tactile drones.
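To illustrate the kind of data flow described above, here is a minimal, hypothetical sketch of a backend ingesting 100 Hz pose samples for several tracked rigid bodies (the `PoseSample` and `PoseTracker` names are illustrative assumptions, not part of the published system):

```python
from dataclasses import dataclass

@dataclass
class PoseSample:
    body_id: str            # e.g. "hmd", "drone_1", "hand_right"
    t: float                # timestamp in seconds
    pos: tuple              # (x, y, z) position in meters

class PoseTracker:
    """Keeps the latest pose per tracked rigid body, mimicking a
    100 Hz motion-capture feed arriving at a backend."""
    def __init__(self):
        self.latest = {}

    def ingest(self, sample: PoseSample):
        # Keep only the newest sample per body; drop out-of-order ones.
        current = self.latest.get(sample.body_id)
        if current is None or sample.t >= current.t:
            self.latest[sample.body_id] = sample

# Simulated burst of samples spaced 10 ms apart (i.e. 100 Hz)
tracker = PoseTracker()
for i in range(3):
    tracker.ingest(PoseSample("hmd", i * 0.01, (0.0, 1.6, 0.1 * i)))
tracker.ingest(PoseSample("drone_1", 0.02, (1.0, 1.2, 0.5)))

print(tracker.latest["hmd"].pos)       # newest HMD position
print(sorted(tracker.latest.keys()))   # all tracked bodies
```

The key design point, common to mocap pipelines, is that the backend only ever acts on the most recent pose per body rather than queuing stale samples.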

The devices themselves are Parrot Rolling Spider brand quadcopters, each retrofitted with various touchable surfaces composed of tulle-textile covers.

The goal of the team was to deliver three types of VR haptic feedback: passive, active, and positioning haptic proxies. To help demonstrate the capabilities of the trackable quadcopters, the researchers developed a multifaceted underwater VR experience that showcases all three forms of feedback. The experience opens in a sparsely lit underwater environment inhabited solely by a glowing orb. After exploring the environment, players can reach out and feel the resistance of the sphere, provided by an encased quadcopter floating passively in the exact position of its VR counterpart.
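Holding a drone at the position of its virtual counterpart is, at its simplest, a closed-loop control problem. The sketch below is a hypothetical one-axis-per-component proportional controller (not the paper's actual flight controller), nudging the drone toward the virtual sphere's position each control tick:

```python
def p_step(drone_pos, target_pos, gain=0.4):
    """One proportional-control step: move each axis a fraction
    of the remaining distance toward the target position."""
    return tuple(d + gain * (t - d) for d, t in zip(drone_pos, target_pos))

# Drone starts away from the virtual sphere's position (meters)
pos = (0.0, 1.0, 0.0)
target = (1.0, 1.5, 0.5)   # where the glowing orb sits in VR space

for _ in range(20):        # 20 control ticks
    pos = p_step(pos, target)

print(pos)  # converges very close to the target
```

A real controller would also handle drag, drone dynamics, and the pushes a user applies when touching the proxy, but the convergence idea is the same.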

The player is then introduced to the first instance of active VR haptic feedback: a curious shark that swims directly towards the user when they enter its vicinity. As the creature makes contact with the user in VR, a quadcopter mounted with a physical surface shaped like the nose of a shark floats towards the user and makes physical contact at the exact same time, amplifying the overall impact of the interaction.

The final form of haptic feedback, the positioning haptic proxy, combines both active and passive exploration to create lightweight "tokens" with which the user can directly interact. In this demonstration, the token was represented as a small worm dangling on the string of a fishhook. Because the worm is just another encased drone floating in midair, players are able to grab the object and keep it as an in-game trophy.

In terms of use cases, the researchers are confident the VRHapticDrones system could benefit numerous industries.

“In a game, players could be asked to open a door (passive feedback). When the door swings open arrows are shot at the player from behind the door (active feedback),” the team states in their publication. “To stop getting shot by arrows the player must pull out a spring from a device to disarm the arrow trap.”

“We also envision application scenarios in the construction and design domains. Car designers can benefit from VRHapticDrones during the design process while potential customers can virtually touch (passive feedback) their new car before ordering.”

While an impressive solution to the problem of VR haptic feedback, the VRHapticDrones system has its fair share of limitations. The current platform, for instance, still requires reflective markers to be attached to certain parts of the user in order to provide full body tracking. The team has also limited their experiments to one quadcopter at a time in order to regulate the "complexity of [their] flight control component," although the Bluetooth stack integrated into the system should support up to five drones at once.

For more information on the project, check out the research team's extensive publication available here.