Common haptic devices are designed to provide kinaesthetic and/or cutaneous discriminative inputs to users by modulating physical parameters. In addition to this discriminative role, however, haptic stimuli have been shown to also convey affective inputs to the brain. Nevertheless, such affective properties of touch are often disregarded in the design (and consequent validation) of haptic displays. In this paper we present preliminary experimental evidence on how the emotional feelings intrinsically elicited while interacting with tactile displays can be assessed. We propose a methodology based on a bidimensional model of elicited emotions, evaluated by means of simple psychometric tests and statistical inference. Specifically, affective dimensions are expressed in terms of arousal and valence, each quantified through a simple one-question psychometric test, whereas statistical inference is based on rank-based non-parametric tests. In this work we consider two types of haptic systems: (i) a softness display, the FYD-2, designed to convey purely discriminative softness haptic stimuli, and (ii) a system designed to convey affective caress-like stimuli on the user's forearm (by regulating the velocity and strength of the "caress"). Gender differences were also considered. For both devices, the affective component clearly depends on the stimuli and is gender-related. Finally, we discuss how these outcomes might be profitably used to guide the design and usage of haptic devices, so as to take the emotional component into account and thus improve system performance.
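The rank-based analysis described above can be sketched as follows. This is a minimal illustration, not the paper's actual analysis pipeline: the valence ratings are hypothetical, and a Mann-Whitney U test (here implemented from scratch with a normal approximation) compares the affect elicited by two stimulus conditions.

```python
# Sketch of a rank-based non-parametric comparison of valence ratings
# between two stimulus conditions. All data and names are illustrative.
import math

def average_ranks(values):
    """Assign 1-based average ranks to values, resolving ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg_rank
        i = j + 1
    return r

def mann_whitney_u(x, y):
    """Two-sided rank-sum test; returns (U, approximate p-value)."""
    n1, n2 = len(x), len(y)
    r = average_ranks(list(x) + list(y))
    u = sum(r[:n1]) - n1 * (n1 + 1) / 2          # U statistic for sample x
    z = (u - n1 * n2 / 2) / math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return u, p

valence_gentle = [7, 8, 6, 7, 9, 8, 7, 6]        # e.g., slow, light "caress"
valence_strong = [4, 3, 5, 4, 2, 3, 5, 4]        # e.g., fast, strong stimulus
u, p = mann_whitney_u(valence_gentle, valence_strong)
print(f"U = {u}, p = {p:.4f}")                   # small p: stimulus-dependent valence
```

In practice one would use a library routine (e.g., `scipy.stats.mannwhitneyu`, which also offers exact p-values for small samples); the hand-rolled version above only shows the mechanics of ranking and the normal approximation.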

How the human brain controls hand movements to carry out different tasks is still debated. The concept of synergy has been proposed to indicate functional modules that may simplify the control of hand postures by simultaneously recruiting sets of muscles and joints. However, whether and to what extent synergic hand postures are encoded as such at a cortical level remains unknown. Here, we combined kinematic, electromyographic, and brain activity measures obtained by functional magnetic resonance imaging while subjects performed a variety of movements towards virtual objects. Hand postural information, encoded through kinematic synergies, was represented in cortical areas devoted to hand motor control and successfully discriminated individual grasping movements, significantly outperforming alternative somatotopic or muscle-based models. Importantly, hand postural synergies were predicted by neural activation patterns within primary motor cortex. These findings support a novel cortical organization for hand movement control and open potential applications for brain-computer interfaces and neuroprostheses.

This work was partially supported by the European Research Council under the ERC Advanced Grant no. 291166 SoftHands (A Theory of Soft Synergies for a New Generation of Artificial Hands) and under the grant agreement no. 601165 Wearhap (Wearable Haptics for Humans and Robots), within the FP7/2007-2013 program: Cognitive Systems and Robotics.

In this chapter we describe a softness display based on the contact area spread rate (CASR) paradigm. This device uses a stretchable fabric as a substrate that can be touched by users, while the contact area is directly measured via an optical system. By varying the stretching state of the fabric, different stiffness values can be conveyed to users. We describe a first technological implementation of the display and compare its performance in rendering various levels of stiffness with that exhibited by a pneumatic CASR-based device. Psychophysical experiments are reported and discussed. Afterwards, we present a new technological implementation of the fabric-based display, with reduced dimensions and faster actuation, which enables rapid changes in the fabric stretching state. Such changes are necessary to properly track the typical force/area curves of real materials. The system mimics force/area curves obtained from real objects with a high degree of reliability, and elicits overall discriminable levels of softness.

In this paper a novel and efficient computational implementation of a Spiking Neuron-Astrocyte Network (SNAN) is reported. Neurons are modeled according to the Izhikevich formulation, and the neuron-astrocyte interactions are treated as tripartite synapses, modeled with the previously proposed nonlinear transistor-like model. Concerning the learning rules, the original spike-timing dependent plasticity (STDP) is used for the neural part of the SNAN, whereas an ad-hoc rule is proposed for the astrocyte part. SNAN performance is compared with that of a standard spiking neural network (SNN) and evaluated using the polychronization concept, i.e., the number of co-existing groups that spontaneously generate patterns of polychronous activity. The astrocyte-neuron ratio is set to the biologically inspired value of 1.5. The proposed SNAN shows a higher number of polychronous groups than the SNN, remarkably maintained for the whole duration of the simulation (24 hours).
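The neural part of such a network rests on the Izhikevich formulation, which can be sketched in a few lines. This is a single regular-spiking neuron driven by a constant current; the astrocyte coupling and the learning rules of the paper are not reproduced here, and the drive current and simulation settings are illustrative.

```python
# Minimal Izhikevich neuron (regular-spiking parameters a, b, c, d):
#   v' = 0.04 v^2 + 5 v + 140 - u + I,   u' = a (b v - u)
#   on spike (v >= 30 mV): v <- c, u <- u + d
def izhikevich(I=10.0, t_max=1000.0, dt=0.5, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Simulate one neuron for t_max ms; return spike times in ms."""
    v, u = c, b * c                      # start at the resting state
    spikes = []
    for step in range(int(t_max / dt)):
        v += dt * (0.04 * v * v + 5 * v + 140 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                    # spike: reset v, bump recovery u
            spikes.append(step * dt)
            v, u = c, u + d
    return spikes

spike_times = izhikevich()
print(f"{len(spike_times)} spikes in 1 s of simulated time")
```

Swapping the four parameters a, b, c, d reproduces the other canonical firing regimes (fast-spiking, chattering, bursting) of the model.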

Dynamic stimuli in the visual and tactile sensory modalities share fundamental psychophysical features that can be explained by similar computational models. In vision, information about relative motion between objects and the observer is mainly processed through optic flow, a 2D field of velocities associated with the variation of brightness patterns in the image plane. It provides important cues for region and boundary segmentation, shape recovery, and so on. For instance, radial patterns of optic flow are often used to estimate the time before contact with an approaching object. We put forward the hypothesis that a similar behavior is present in the tactile domain, in which a paradigm analogous to optic flow might exist. Moreover, optic flow is also invoked to explain several visual illusions, including the well-known "barber-pole" effect and the Ouchi illusion.

In this work we investigate the possibility of mimicking haptic perception by using rheological materials. An analysis of the rheological behaviour of some "smart fluids", such as Electro-Rheological Fluids (ERFs) and Magneto-Rheological Fluids (MRFs), is provided in order to design new haptic interfaces capable of reproducing the shape and compliance of virtual objects. Some theoretical design considerations are discussed and supported by magnetic simulations implemented by means of a numerical code. Several prototypes were designed and realized through a progressive enhancement of performance, up to a final 3D immersive device. Furthermore, to assess performance, a set of psychophysical tests was carried out, and experimental results in terms of softness and shape recognition are reported.

In this paper, we describe a biomimetic-fabric-based sensing glove that can be used to monitor hand posture and gesture. Our device is made of a distributed sensor network of piezoresistive conductive elastomers integrated into an elastic fabric. This solution does not affect natural movement and hand gestures, and can be worn for a long time with no discomfort. The glove could be fruitfully employed in behavioral and functional studies with functional MRI (fMRI) during specific tactile or motor tasks. To assess MR compatibility of the system, a statistical test on phantoms is introduced. This test can also be used for testing the compatibility of mechatronic devices designed to produce different stimuli inside the MR environment. We propose a statistical test to evaluate changes in SNR and time-domain standard deviations between image sequences acquired under different experimental conditions. fMRI experiments on subjects wearing the glove are reported. The reproducibility of fMRI results obtained with and without the glove was estimated. A good similarity between the activated regions was found in the two conditions.
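The SNR-comparison idea can be sketched as follows. This is only an illustration under assumed conventions, not the paper's actual test: per-image SNR is estimated as mean signal in a region of interest over the standard deviation of a background region, on simulated data, and a Welch t statistic compares sequences acquired with and without the device (the paper's test additionally examines time-domain standard deviations).

```python
# Hypothetical sketch: compare per-image SNR between two acquisition
# conditions. All data are simulated; thresholds are rules of thumb.
import random
import statistics

def snr(signal_roi, background_roi):
    """Per-image SNR: mean ROI signal over background standard deviation."""
    return statistics.mean(signal_roi) / statistics.pstdev(background_roi)

def welch_t(a, b):
    """Welch t statistic for two independent samples."""
    va, vb = statistics.variance(a), statistics.variance(b)
    return (statistics.mean(a) - statistics.mean(b)) / (va / len(a) + vb / len(b)) ** 0.5

random.seed(0)  # deterministic simulated data

def simulated_snrs(n_images=20):
    values = []
    for _ in range(n_images):
        signal = [100 + random.gauss(0, 1) for _ in range(50)]   # ROI voxels
        background = [random.gauss(0, 2) for _ in range(50)]     # noise voxels
        values.append(snr(signal, background))
    return values

snr_without_device = simulated_snrs()   # baseline sequence
snr_with_device = simulated_snrs()      # sequence acquired with the device present
t = welch_t(snr_without_device, snr_with_device)
# A |t| well below ~2 (rough 5% two-sided threshold at ~38 d.o.f.) would
# suggest no detectable SNR degradation introduced by the device.
print(f"Welch t = {t:.2f}")
```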

Interaction with the external world requires the ability to perceive dynamic changes in complex sensory input and to react promptly. Here we show that the perception of dynamic stimuli in the visual and tactile sensory modalities shares fundamental psychophysical aspects that can be explained by similar computational models. In vision, optic flow provides information on relative motion between the individual and the perceived scene. For instance, radial patterns of optic flow are used to estimate the time before contact with an approaching object. Similarly, in the tactile modality, radial patterns of stimuli provide information on the softness of probed objects. Optic flow is also invoked to explain several visual illusions, including the well-known "barber-pole" effect. Here we introduce a computational model of tactile flow, which is intimately related to existing models for its visual counterpart. The model accounts for psychophysical aspects of dynamic tactile perception and predicts illusory phenomena in the tactile domain analogous to the barber-pole effect. When subjects touched translating pads with differently oriented gratings, they perceived a direction of motion that was significantly biased towards the orientation of the gratings. These findings therefore indicate that visual and tactile flow share similarities at the psychophysical and computational levels and may serve similar perceptual goals. The results of this analysis have an impact on the engineering of better haptic and multimodal interfaces for human-computer interaction.
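The computational link between the two modalities can be summarized by the standard flow-constraint equation (notation ours; in the tactile case the conserved quantity is assumed, by hypothesis, to be a scalar measure of skin deformation such as pressure):

```latex
% Flow constraint: I is image brightness in vision or, by hypothesis,
% a conserved scalar measure of skin deformation in touch;
% (u, v) is the flow field and subscripts denote partial derivatives.
I_x\,u + I_y\,v + I_t = 0
```

Since this single equation constrains only the flow component along the local gradient of I, the tangential component is left undetermined (the aperture problem), which is why an oriented grating biases the perceived direction of motion towards its own orientation, as in the barber-pole effect.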

We investigated whether the visual hMT+ cortex plays a role in supramodal representation of sensory flow, not mediated by visual mental imagery. We used functional magnetic resonance imaging to measure neural activity in sighted and congenitally blind individuals during passive perception of optic and tactile flows. Visual motion-responsive cortex, including hMT+, was identified in the lateral occipital and inferior temporal cortices of the sighted subjects by response to optic flow. Tactile flow perception in sighted subjects activated the more anterior part of these cortical regions but deactivated the more posterior part. By contrast, perception of tactile flow in blind subjects activated the full extent, including the more posterior part. These results demonstrate that activation of hMT+ and surrounding cortex by tactile flow is not mediated by visual mental imagery and that the functional organization of hMT+ can develop to subserve tactile flow perception in the absence of any visual experience. Moreover, visual experience leads to a segregation of the motion-responsive occipitotemporal cortex into an anterior subregion involved in the representation of both optic and tactile flows and a posterior subregion that processes optic flow only.

This paper deals with the design and implementation of innovative haptic interfaces based on magnetorheological fluids (MRFs). This pioneering research work consists of developing 2D and quasi-3D MRF-based devices capable of suitably energizing the fluid with a magnetic field in order to build figures that can be directly squeezed by hand. These devices are able to create a distribution of magnetic field over time and space, inducing the fluid to assume the desired shape and compliance. We implemented different prototypes whose synthesis and design phase, described here in detail, was supported by preliminary simulations performed with software based on a 3D finite element code. In this way, both the magnetic field and the shear stress profiles inside the fluid can be carefully predicted. Finally, the performance of these devices was evaluated and assessed.

This paper deals with a new configuration for a haptic system able to simultaneously replicate independent force/displacement and force/area behaviors of a given material. Since force/area information is a relevant additional haptic cue for improving softness discrimination, this system extends the range of materials whose rheology can be carefully mimicked. Moreover, according to Hertz theory, two objects with different curvature radii but the same force/displacement behavior can respond with different contact areas to the same applied force. These behaviors can be effectively replicated by the integrated haptic system proposed here, which enables independent control of force/displacement and force/area. The system comprises a commercial device (the Delta Haptic Device) serially coupled with a Contact Area Spread Rate (CASR) device. Two specimens of one material and two of another, all with different curvature radii, were identified and modeled in terms of force/area and force/displacement behavior. These behaviors were successfully tracked by the proposed integrated haptic system.
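The Hertz-theory observation above can be made concrete with a small numerical sketch (illustrative material constants, not data from the paper). For a sphere of radius R pressed into a flat surface by depth d, Hertz theory gives F = (4/3) E* √R d^{3/2} and contact area A = π R d, so two specimens with equal E*√R share the same force/displacement curve yet produce different contact areas at the same force.

```python
# Hertzian sphere-on-flat contact: same F(d) curve, different contact area.
# E_star is the effective contact modulus (Pa); values are illustrative.
import math

def hertz(E_star, R, d):
    """Return (force in N, contact area in m^2) at indentation depth d."""
    F = (4.0 / 3.0) * E_star * math.sqrt(R) * d ** 1.5
    A = math.pi * R * d
    return F, A

d = 1e-3                                     # 1 mm indentation
F1, A1 = hertz(E_star=2.0e5, R=0.010, d=d)   # stiffer, small-radius specimen
F2, A2 = hertz(E_star=1.0e5, R=0.040, d=d)   # softer, large-radius specimen
# E*·sqrt(R) is equal for the two specimens, hence identical F(d) ...
print(f"F1 = {F1:.3f} N, F2 = {F2:.3f} N")
# ... but the contact areas differ by a factor of 4:
print(f"A1 = {A1 * 1e6:.1f} mm^2, A2 = {A2 * 1e6:.1f} mm^2")
```

This is exactly why a display rendering only force/displacement cannot distinguish the two specimens, while one that also controls force/area can.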

Functional brain exploration methodologies, such as functional magnetic resonance imaging (fMRI), are critical tools to study perceptual and cognitive processes. In order to develop complex and well-controlled fMRI paradigms, researchers are interested in using active interfaces with electrically powered actuators and sensors. Due to the particularity of the MR environment, safety and compatibility criteria have to be strictly followed in order to avoid risks to the subject under test, to the operators, or to the environment, as well as to avoid artifacts in the images. This paper describes the design of an fMRI-compatible mechatronic interface based on MR compatibility tests of materials and actuators. In particular, a new statistical test examines the mean and variations of activity as a time series. The device, with two degrees of freedom allowing one translation with position feedback along a horizontal axis and one rotation about a vertical axis linked to the translation, was realized to investigate the brain mechanisms of dynamic tactile perception tasks. It can be used to move and orient various objects below the finger for controlled tactile stimulation. The MR compatibility of the complete interface is shown using the same statistical test, as well as through a functional study with a human subject.

In this paper we report on a new, improved free-hand haptic interface based on magnetorheological fluids (MRFs). MRFs are smart materials that change their rheology according to an external magnetic field. The architecture proposed here results from the development and improvement of earlier prototypes. The innovative idea behind this device is to allow subjects to interact directly with an object, whose rheology is rapidly and easily changeable, freely moving their hands without rigid mechanical linkages. Advanced numerical simulations using algorithms based on finite element methods have been implemented in order to analyze and predict the spatial distribution of the magnetic field. A special focus was laid on investigating how the magnetic field profile is altered by the introduction of the hand, and possible solutions were proposed to overcome this perturbation. Finally, some preliminary psychophysical tests to assess the performance of the device are reported and discussed.

This paper is concerned with exploring the possibility of using Magneto-Rheological Fluids (MRFs) as a haptic interface. MRFs are special materials capable of changing their rheological behaviour under an external magnetic field. This property suggested using MRFs to mimic virtual objects whose compliance can be gradually modulated. Several prototype architectures have been envisaged. The general scheme of the prototypes follows the Haptic Black Box (HBB) concept, intended as a box into which the operator can poke his/her bare hand and interact with the virtual object by freely moving the hand without mechanical constraints. In this way, sensory receptors on the whole operator's hand are excited, rather than just one or a few fingertips or phalanges.

Recent developments in advanced interface technology have allowed the implementation of new haptic devices for biomedical applications. Specifically, several innovative and more effective tools that allow people to interact by touch with virtual objects have been developed. Besides applications such as gaming, entertainment, and virtual reality, an important and promising field of application is surgical simulation. Novice surgeons can practice their first incisions without actually cutting anyone. Simulation for surgical training has been a major focus of research activity in recent years. Simulating an organ is not easy, because an organ is more complicated to model than a common physical object, e.g., a ball. In this chapter we report several examples of haptic interfaces and introduce new technologies for their implementation.

In this paper we report on the results of a set of tests in which a group of subjects was asked to trace a straight line with the forefinger while actively scanning over a textured surface. A pattern of randomly distributed raised dots and a diagonally striped pattern were used in order to investigate the occurrence of misleading percepts arising from the aperture problem of tactile flow during active exploration. The results are compared with findings from a previous experiment based on passive exploration.

In this paper we report on the results of a psychophysical experiment in which the optical illusion of Ouchi is reproduced in the tactile domain. In vision, when the eyes scan over a textured grid consisting of two rectangular checkerboard patterns oriented in orthogonal directions, the inset pattern appears to move relative to the surrounding grid. A simplified 3D version of this pattern was realized, and a group of subjects was asked to touch it while it was vibrating. The outcomes of this experiment are discussed in terms of tactile flow and the related aperture problem.

In this paper we describe the design of an innovative immersive Haptic Black Box (HBB) based on Magneto-Rheological Fluids (MRFs). By exploiting results from an accurate analysis of a previously operating haptic display, a new device has been developed that excites the MRF with improved performance in terms of magnetic field intensity and spatial resolution. Due to the core structure and feeding conditions, only a 3D numerical analysis taking into account the material non-linearity provides an accurate prediction of the excitation field and, consequently, of the rheological behavior of the fluid. The results of the present paper will be used in subsequent work describing the realization of the prototype and the results of several psychophysical tests on the excited MRF in terms of softness and/or shape reconstruction.

In this paper we investigate the role of perceptual flow in the tactile domain, which appears to be a primary source of information about shape, motion, and softness. We report on a set of psychophysical experiments on how humans integrate incoherent tactile flow stimuli. Two experiments are reported, in which discordant stimuli are conveyed to the subject through two different fingertips, or through two different families of mechanoreceptors in the same finger. Results from the first experiment show that, under these conditions, the tactile and optic perceptual systems behave in a very similar way. In the second experiment, the interaction between tactile flow and friction generates an illusory phenomenon peculiar to the tactile system.

The V5/MT complex responds selectively to the perception of optic flow (Morrone et al., Nature Neurosci, 2001). Since similarities exist between visual and tactile perception, we hypothesized that tactile flow might also rely on V5/MT response. We and others have recently shown that visual extrastriate cortical areas respond during both visual and tactile recognition of objects, indicating that these regions are organized in a supramodal fashion. In this study, we measured the neural response evoked during visual and tactile perception of coherently moving dot patterns to test the hypothesis that V5/MT may be supramodally organized and may also respond to tactile stimulation.

We report results of a pilot study using functional magnetic resonance imaging aimed at determining the neural correlates of tactile flow. We hypothesized that brain response to tactile flow would involve the same cortical areas (V5/MT) that respond to optic flow. Our results showed that V5/MT cortex indeed is activated by tactile flow perception. These findings are consistent with a supramodal organization of brain regions involved in optic and tactile flow processing.

In this paper we explore the possibility of using magnetorheological (MR) fluids in haptic interfaces, exploiting their property of changing rheological behaviour under a tunable external magnetic field. In particular, we propose two different prototypes of haptic display, one for pinch grasp and one for whole-hand immersive exploration. We briefly report on the design of these devices, describe a few psychophysical experiments to assess their performance, and report on the experimental results. This investigation is rather encouraging and provides reliable cues as to how MR-fluid-based devices can be designed for haptic display applications.

In this paper we report on a set of experiments involving perceptual illusions elicited by dynamic tactile stimulation of the fingertips. These misperceptions are akin to some well-studied optical illusions, which have been explained in terms of the mechanisms of optic flow perception. We hypothesize that a similar perceptual mechanism exists for tactile flow, which is related to how humans perceive relative motion and pressure between the fingertips and objects in contact. We present a computational model of tactile flow, and discuss how it relates to accepted models of the neurophysiology of touch. A particularly interesting phenomenon observed under some experimental circumstances, consisting of an incoherent tactile perception generating what we call a tactile vertigo, can be explained in terms of this model. The proposed tactile flow model also explains other phenomena observed in the past (namely, the Contact Area Spread Rate effect), and is important for designing simpler, more effective devices for artificial haptic sensing and displays.

In this paper we propose an innovative prototype of a haptic display for whole-hand immersive exploration. We envision a new concept of haptic display, the Haptic Black Box (HBB), which can be imagined as a box into which the operator can poke his/her bare hand and interact with the virtual object by freely moving the hand without mechanical constraints. In this way, sensory receptors on the whole operator's hand are excited, rather than restricting stimulation to just one or a few fingertips or phalanges. To progress towards such a challenging goal, magnetorheological (MR) fluids represent a very interesting technology. These fluids are composed of micron-sized, magnetizable particles immersed in a synthetic oil. Exposure to an external magnetic field induces a change in the rheological behaviour of the fluid, turning it into a near-solid in a few milliseconds. By removing the magnetic field, the fluid quickly returns to its liquid state. We briefly report on the design of this device, describe psychophysical experiments to assess its performance in softness and shape exploration, and report on the experimental results.

In this paper we present an innovative application of magnetorheological (MR) fluids to haptic interfaces. These materials consist of a suspension of micron-sized, magnetizable particles in a synthetic oil. Exposure to an external magnetic field induces a change in the rheological behaviour of the fluid, turning it into a near-solid in a few milliseconds. Just as quickly, the fluid returns to its liquid state once the field is removed. MR fluids are already on the market, used in devices such as valves, brakes, clutches, and dampers. In this paper we investigate the possibility of using MR fluids to mimic the compliance, damping, and creep (in other words, the rheology) of materials in order to realize a haptic display, and we propose two different implementations. Here we only outline the first scheme, whose experimental results have been reported in our previous work, and describe the second one. For the latter, we set up a psychophysical protocol in which a group of volunteers was asked to interact with the suitably excited MR fluid, and qualitative results are discussed.

Many applications in teleoperation and virtual reality call for the implementation of effective means of displaying to the human operator information on the softness and other mechanical properties of objects being touched. The ability of humans to detect the softness of different objects by tactual exploration is intimately related to both kinesthetic and cutaneous perception, and haptic displays should be designed so as to address this multimodal perceptual channel. Unfortunately, accurate detection and replication of cutaneous information in all its details appears to be a formidable task for current technology, causing most of today's haptic displays to merely address the kinesthetic part of haptic information. In this paper we investigate the possibility of surrogating detailed tactile information for softness discrimination with information on the rate of spread of the contact area between the finger and the specimen as the contact force increases. Devices implementing such a perceptual channel are described, and a practical application to a minimally invasive surgery tool is presented. Psychophysical test results are reported, validating the effectiveness and practicality of the proposed approach.
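One way to use the contact-area channel in practice is to characterize a specimen by its force/area curve and have the display track the fitted curve. The sketch below (hypothetical measurements, not data from the paper) fits a power law A = c·F^m by least squares in log-log space; for Hertz-like contact the exponent comes out near 2/3.

```python
# Fit a power law A = c * F**m to hypothetical (force, contact-area)
# measurements via linear least squares on the log-transformed data.
import math

forces = [0.5, 1.0, 2.0, 4.0, 8.0]          # N   (illustrative measurements)
areas = [20.0, 32.0, 50.0, 80.0, 127.0]     # mm^2

xs = [math.log(f) for f in forces]
ys = [math.log(a) for a in areas]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
# Slope of the log-log regression line is the power-law exponent m;
# the intercept gives the prefactor c.
m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
c = math.exp(my - m * mx)
print(f"A = {c:.1f} * F^{m:.2f}")           # exponent near 2/3 for Hertz-like contact
```

The display controller can then command, for each measured force, the contact area predicted by the fitted curve, so that the rendered specimen spreads under the finger like the real one.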

Detection of softness by tactile exploration in humans is based on both kinesthetic and cutaneous perception, and haptic displays should be designed so as to address this multimodal perceptual channel. Unfortunately, accurate detection and replication of cutaneous information in all its details is difficult and costly. In this paper we discuss a simplified model of haptic detection of softness, whereby only information on the rate of spread of the contact area between the finger and the specimen as the contact force increases is transmitted. We provide a thorough set of psychophysical tests to support the feasibility, in at least some contexts, of a reduced-complexity display of haptic features.

Many applications in teleoperation and virtual reality call for the implementation of effective means of displaying to the human operator information on the softness and other mechanical properties of objects being touched. The ability of humans to detect softness of different objects by tactual exploration is intimately related to both kinesthetic and cutaneous perception, and haptic displays should be designed so as to address such multimodal perceptual channel. Unfortunately, accurate detection and replication of cutaneous information in all its details appears to be a formidable task for current technology, causing most of today's haptic displays to merely address the kinesthetic part of haptic information. In this paper we investigate the possibility of surrogating detailed tactile information for softness discrimination, with information on the rate of spread of the contact area between the finger and the specimen. Devices for implementing this new perceptual channel are described, and some preliminary psychophysical test results are reported, validating the effectiveness and practicality of the proposed approach.