Publications

Using Visual Cues of Contact to Improve Interactive Manipulation of Virtual Objects in Industrial Assembly/Maintenance Simulations

Jean Sreng, Anatole Lécuyer, Christine Mégard and Claude Andriot

In IEEE Transactions on Visualization and Computer Graphics, 12(5):1013-1020, 2006

This paper describes a set of visual cues of contact designed to improve the interactive manipulation of virtual objects in industrial assembly/maintenance simulations. These visual cues display proximity, contact, and force information between virtual objects when the user manipulates a part inside a digital mock-up. The set of visual cues includes the display of glyphs (arrow, disk, or sphere) when the manipulated object is close to or in contact with another part of the virtual environment. Light sources can also be added at the contact points. A filtering technique is proposed to decrease the number of glyphs displayed at the same time. Various effects, such as changes in color, changes in size, and shape deformation, can be applied to the glyphs as a function of the proximity of other objects or the amplitude of the contact forces. A preliminary evaluation was conducted to gather the subjective preferences of a group of participants during the simulation of an automotive assembly operation. The collected questionnaires showed that participants globally appreciated our visual cues of contact. Changes in color were preferred for displaying distance and proximity information. Changes in size and deformation effects were preferred for the perception of contact forces between the parts. Finally, light sources were selected to focus the user's attention on the contact areas.
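As a minimal sketch of the idea (not the paper's implementation; the threshold and color ramp below are illustrative assumptions), a glyph's color and size can be interpolated from the distance between the manipulated part and its nearest neighbor:

```python
def proximity_glyph(distance, d_max=0.05):
    """Map an object-to-object distance to illustrative glyph attributes.

    distance: gap between the manipulated part and the nearest obstacle (m)
    d_max: distance at which the glyph first appears (hypothetical threshold)
    """
    if distance >= d_max:
        return None  # no glyph while the objects are far apart
    t = 1.0 - distance / d_max   # 0 at d_max, 1 at contact
    color = (t, 1.0 - t, 0.0)    # green (far) -> red (touching)
    size = 0.005 + 0.015 * t     # glyph grows as contact nears
    return {"color": color, "size": size}
```

A force-dependent variant would compute `t` from the contact force amplitude instead of the distance.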

In Proceedings of the ACM symposium on Virtual reality software and technology, 165-173, 2007

This paper describes a general event-based approach to improve the multimodal rendering of 6DOF (degrees of freedom) contact between objects in interactive virtual object simulations. The contact events represent the different steps of two objects colliding with each other: (1) the state of free motion, (2) the impact event at the moment of collision, (3) the friction state during the contact, and (4) the detachment event at the end of the contact. These events are used to improve the classical feedback by superimposing specific rendering techniques based on them. First, we propose a general method to generate these events based only on the objects' positions given by the simulation. Second, we describe a set of different types of multimodal feedback associated with the different events, which we implemented in a complex virtual simulation dedicated to virtual assembly. For instance, we propose a visual rendering of impact, friction, and detachment based on particle effects. We use the impact event to improve the 6DOF haptic rendering by superimposing a high-frequency force pattern on the classical force feedback. We also implemented a realistic audio rendering that plays impact and friction sounds on the corresponding events. All these first implementations can easily be extended with other event-based effects in various rigid-body simulations thanks to our modular approach.
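The four contact states can be sketched as a small state machine. This is a hedged illustration, not the paper's algorithm: it derives events from a sequence of inter-object distances (which would themselves be computed from the objects' positions), using a hypothetical contact threshold `eps`:

```python
def contact_events(distances, eps=1e-4):
    """Yield (step, event) pairs from successive inter-object distances.

    Events follow the four steps of a collision: free motion, impact,
    friction (sustained contact), and detachment.
    """
    in_contact = False
    for i, d in enumerate(distances):
        touching = d <= eps
        if touching and not in_contact:
            yield i, "impact"      # contact begins
        elif touching and in_contact:
            yield i, "friction"    # contact persists
        elif not touching and in_contact:
            yield i, "detachment"  # contact ends
        else:
            yield i, "free"        # free motion
        in_contact = touching
```

Each emitted event could then trigger the corresponding rendering effect (particles, transient force pattern, impact or friction sound).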

Using vibration patterns to provide impact position information in haptic manipulation of virtual objects

Jean Sreng, Anatole Lécuyer and Claude Andriot

In LNCS(5024), Proceedings of EuroHaptics, 589-598, 2008

While the standard closed-loop haptic control used in haptic simulation of rigid bodies is limited to low-frequency force restitution, event-based (or open-loop) haptics can provide a realistic feeling of impact by superimposing a high-frequency transient force pattern. This high-frequency transient can provide the user with rich information about the contact, such as the material properties of the object. Similarly, an impact at different locations on an object produces different vibration patterns that can be used to determine the impact location. This paper investigates the use of such high-frequency vibration patterns to provide impact position information on a simulated long rod held at one end. We propose different vibration pattern models to convey the position information: a realistic model based on a numerical simulation of a beam, and three simplified empirical models based on exponentially decaying sinusoids. A preliminary evaluation was conducted with 15 participants. Taken together, our results showed that users are able to efficiently associate vibration information with impact position.
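The empirical models can be illustrated by a decaying sinusoid whose amplitude and frequency depend on the impact position along the rod. The mapping below is one plausible choice for illustration, not the paper's fitted parameters:

```python
import math

def vibration_pattern(t, position, f0=200.0, decay=30.0):
    """Exponentially decaying sinusoid encoding impact position.

    t: time since impact (s)
    position: impact position along the rod, 0 (held end) to 1 (free end)
    f0, decay: base frequency (Hz) and decay rate (1/s); illustrative values

    As an assumed mapping, impacts far from the hand lower the frequency
    and raise the amplitude of the transient.
    """
    amplitude = 0.5 + 0.5 * position
    frequency = f0 * (1.0 - 0.5 * position)
    return amplitude * math.exp(-decay * t) * math.sin(2 * math.pi * frequency * t)
```

Sampling this function at the haptic update rate and adding it to the closed-loop force yields the superimposed transient described above.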

In this paper we introduce a "Spatialized Haptic Rendering" technique to enhance 6DOF haptic manipulation of virtual objects with impact position information conveyed by vibrations. This rendering technique exploits our perceptual ability to determine the contact position from the vibrations generated by an impact. In particular, the different vibrations generated by a beam are used to convey the impact position information. We present two experiments conducted to tune and evaluate our spatialized haptic rendering technique. The first experiment investigates the vibration parameters (amplitudes/frequencies) needed to enable an efficient discrimination of the force patterns used for spatialized haptic rendering. The second experiment is an evaluation of spatialized haptic rendering during 6DOF manipulation. Taken together, the results suggest that spatialized haptic rendering can be used to improve the haptic perception of impact position in complex 6DOF interactions.

Ph.D. Thesis

I conducted my Ph.D. studies in computer science with the Bunraku team at INRIA Rennes and at the CEA (French Atomic Energy Commission), under the supervision of Dr. Anatole Lécuyer and Dr. Claude Andriot and under the direction of Prof. Bruno Arnaldi. I publicly defended my Ph.D. thesis on December 9, 2008, in Rennes, France.

Title: Contribution to the study of visual, auditory and haptic rendering of information of contact in virtual environments

Abstract: Virtual reality technologies are increasingly used in numerous domains. In industrial applications, for instance, virtual prototyping enables engineers to interactively test whether one part can be assembled into another. In such simulations, the contact between virtual objects is an essential notion, as it tightly governs the movement of objects, constraining their trajectories with respect to their direct surroundings. Contact helps the user understand the interaction between geometries, notably through interactive manipulation.
Providing contact information in virtual environments raises many challenges, for instance due to the growing complexity of simulated scenes. In this thesis, we investigate multimodal (visual, auditory and haptic) rendering techniques focused on contact information in virtual environments.
First, we propose an integrated approach for the multimodal rendering of contact information in 6DOF manipulation. We present a generic formulation independent of the underlying simulation. Then, we introduce a multimodal rendering architecture delivering visual, auditory, tactile and kinesthetic feedback of contact.
Next, we further investigate the specific issue raised by objects with complex shapes. The multiple contacts generated by the interactive manipulation of such objects are difficult to perceive. We propose to address this issue by providing the position information associated with each contact.
We first present and evaluate a visual rendering technique based on glyphs and light effects to provide contact information in situations of multiple contacts between objects.
Then, we introduce a haptic rendering technique based on high-frequency vibrations to convey the impact position information. A 1DOF haptic case study based on a vibrating beam is first presented and evaluated. We then generalize this approach by presenting a spatialized haptic rendering technique for 6DOF manipulation. Two experiments are presented to optimize the rendering parameters of spatialized haptic rendering and to provide a subjective evaluation.


This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.