Collision Detection and Response

Virtual reality technologies can now generate realistic images of virtual scenes. The next step towards increasing the level of realism is to enable intuitive manipulation of objects in a virtual scene while feeding haptic collision information back to the human user.

The telepresence and virtual reality group of the German Aerospace Center (DLR) researches haptic rendering algorithms that enable such real-time virtual reality manipulation scenarios. The user can manipulate complex virtual objects via our bimanual haptic interface; whenever objects collide with each other, the user perceives the corresponding collision forces. An intuitive use case is shown in the following car part assembly video.

In comparison to visual rendering, which requires update rates of at least 30 Hz for smooth visual feedback, haptic signals must be updated at a challenging rate of 1000 Hz to obtain stable and realistic haptic feedback. We use an algorithm based on two data structures: voxelized distance fields (voxmaps) and point-sphere hierarchies (pointshells). Our work is inspired by the haptic rendering approach introduced by the Voxmap-PointShell (VPS) algorithm, which allows for collision feedback even with objects consisting of several million triangles.
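To make the idea concrete, the following is a minimal, hypothetical sketch of the core voxmap-pointshell query: one object is voxelized into a signed distance field, the other is sampled as surface points with normals, and each penetrating point contributes a penalty force proportional to its penetration depth. The function name, the grid layout, and the simple penalty model are illustrative assumptions, not the DLR implementation.

```python
import numpy as np

def collision_force(voxmap, origin, voxel_size, points, normals, stiffness):
    """Sum penalty forces over all pointshell points that penetrate the voxmap.

    Assumptions (illustrative, not the actual DLR/VPS code):
      voxmap     : 3D array of signed distances, negative = inside the object
      origin     : world coordinates of voxel (0, 0, 0)
      voxel_size : edge length of one cubic voxel
      points     : (N, 3) pointshell points in world coordinates
      normals    : (N, 3) unit normals of the pointshell points
      stiffness  : penalty stiffness k, force = k * penetration * normal
    """
    total_force = np.zeros(3)
    # Map each point to the index of its containing voxel.
    idx = np.floor((points - origin) / voxel_size).astype(int)
    for i, (ix, iy, iz) in enumerate(idx):
        if not (0 <= ix < voxmap.shape[0]
                and 0 <= iy < voxmap.shape[1]
                and 0 <= iz < voxmap.shape[2]):
            continue  # point lies outside the voxelized region: no contact
        d = voxmap[ix, iy, iz]
        if d < 0:
            # Penetrating point: penalty force along the point normal,
            # with magnitude proportional to the penetration depth |d|.
            total_force += stiffness * (-d) * normals[i]
    return total_force
```

Because each point needs only one constant-time voxel lookup, the per-frame cost scales with the number of pointshell points rather than with triangle count, which is what makes 1000 Hz update rates feasible for very complex geometry.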

Using virtual reality scenarios with such haptic rendering technologies, it is possible, for instance,

to check in early stages of product design whether different parts can be assembled optimally,

to integrate the knowledge of the manufacturers who build the final product into the product engineering steps, or

to train mechanics, preparing them for future complex assembly tasks with fragile objects.