
Objective

During mechanical interaction with our environment, we derive a perceptual experience comparable to that which results from acoustic and optic stimulation. Progress has been made towards discovering the mechanisms subserving the conscious experience of interacting with mechanical objects. This progress is due in part to the availability of new instruments that can tightly control mechanical stimulation of both the ascending, i.e. sensory, and descending, i.e. motor, pathways. The program describes the design of new mechanical stimulation delivery equipment capable of finely segregating haptic cues at different length scales and different time scales, so that controlled stimuli may be delivered with the ease and accuracy that is today possible when studying vision or audition. The purpose of this equipment is to disentangle and recombine the individual cues used by the brain to recover the attributes of an object, leading to the identification of the computations that must be performed to achieve a perceptual outcome. In vision and audition, much is known of the nature of the peripheral and central computations, but in touch, for lack of proper equipment, little is known. From this knowledge, I aim to develop a theory of haptic perception which rests on the observation that these computations are distributed in the physics of mechanical contact, in the biomechanics of the hand, including the skin, the musculoskeletal organization, and innervation, and in central neural processes. This research program is rich in applications, ranging from improved diagnosis of pathologies, to rehabilitation devices, to haptic interfaces now found in consumer products and virtual reality systems.

Touch is our most direct access to the world around us. The haptic function, or touch, is what lets us walk and manipulate objects swiftly. It is also what keeps us safe. The ERC PATCH project developed a computational theory of haptic perception that is grounded in the physics of mechanical interactions. It aims to explain how we transform the mechanics of touch into the conscious sensations of the shape and substance of the objects we manipulate or walk on. Before the research undertaken in the PATCH project, explanations of how we feel the properties of objects, such as the substance they are made of or their texture, emphasized the analysis of the responsive properties of the various mechanoreceptors that are embedded in our skin and other tissues. The research conclusively demonstrated that haptic perception is essentially the result of a computational process taking place in the brain, drawing its input from the combined properties of the skin sensors, the biomechanics of the skin, and the interaction of the skin with the environment. Several brain regions and nuclei have been identified as the putative seats of these computations, which go far beyond the traditional view that perception arises from a classification process taking place in the primary somatosensory areas. The project has resulted in two industrial outcomes, in the form of a startup company and a proof-of-concept prototype developed under the aegis of an attendant ERC Proof-of-Concept program. It also resulted in a direct societal outcome in the area of accessibility, through the development of a tactile communication device for use by the Deafblind.