The eight articles in this special section focus on haptic human-robot interaction.

The history of engineering has many examples of machines that solve human problems in a distinctly nonhuman way, and much has been learned from these examples. However, human-robot interaction is a field where it seems that imitating and understanding biological systems may be of particular interest.

This special issue has its origins in a tournament announced at the Seventh Annual Computational Motor Control Workshop, held in June 2011 at Ben-Gurion University, to compare algorithms for handshake generation and classification. The tournament realized the idea of a Turing-like handshake test (reviewed in the paper by Avraham et al.), which suggested that our ability to simulate human-like motor interaction could be measured by how well that interaction deceives a human observer into believing it was with a real human. The idea of the tournament was for algorithms to compete in their ability to produce handshakes similar to those of humans. Human judges were asked to distinguish between human and robotic handshakes, and artificial agents were asked to make similar judgments.

Our special issue kicks off with “Toward Perceiving Robots as Humans: Three Handshake Models Face the Turing-Like Handshake Test” by Guy Avraham, Ilana Nisky, Hugo L. Fernandes, Daniel E. Acuna, Konrad P. Kording, Gerald E. Loeb, and Amir Karniel, which presents three handshake algorithms, their performance, and the implications for the current state of the art in simulating human-likeness. Beyond the specific details of each handshake model, the handshake framework raises some general questions about interaction via touch, including the gap between perception and action, co-adaptation, safety, intuitiveness, and human-likeness. Addressing these questions will be critical for future robotic systems that are intended to interact directly with humans, assist them in performing physical tasks, enhance motor training and rehabilitation, and even interact socially, as when shaking hands or dancing.

Seven other manuscripts were accepted for this special issue. They target haptic shared control, guidance, and negotiation using kinesthetic and tactile feedback in human-human, human-agent, and human-robot interaction, with applications to motor learning, rehabilitation, social interaction, and human-computer interaction.

Dane Powell and Marcia O'Malley investigated and compared different haptic shared-control guidance paradigms for motor learning in “The Task-Dependent Efficacy of Shared-Control Haptic Guidance Paradigms.” They introduce a taxonomy spanning assistance/resistance, confounding of task and assistance forces, and adjustment of assistance. They propose a novel shared-control proxy algorithm that supports stable implementation of a wide range of guidance strategies covered by the taxonomy. Four different guidance paradigms are implemented and used to train subjects in two dynamic tasks. Results of a user study confirm the “guidance hypothesis,” showing that challenge is essential for motor learning, but also highlighting the strong task-dependency of shared-control guidance techniques.

Samuel McAmis and Kyle Reed employed a bimanual haptic shared-control guidance scheme that separated task and guidance forces in “Simultaneous Perception of Forces and Motions Using Bimanual Interactions.” They explored how humans use guidance information applied to one hand to imitate paths with the other hand, which simultaneously experiences task-related forces. In this context, they examined which trajectories can be effectively transferred from guidance of one hand to movements of the other, using different reference frames, guiding stiffnesses, and delays of the haptic task force. The authors found that subjects who explored a rod with one hand while that hand was being guided by passive movement of the other could perceive its orientation just as well as subjects engaged in active determination of the rod's angle. Such findings are important for understanding the exchange of information between the cerebral hemispheres and for the design of rehabilitation robots used to encourage cortical plasticity and relearning in patients who have suffered unilateral damage such as from stroke.

Using a specially designed wrist-robot device, Lorenzo Masia, Valentina Squeri, Etienne Burdet, Giulio Sandini, and Pietro Morasso investigated the role of haptic feedback in influencing coordination strategies among multiple degrees of freedom in a simple dynamic motor learning task in “Wrist Coordination in a Kinematically Redundant Stabilization Task.” Subjects were asked to stabilize a one-degree-of-freedom inverted pendulum using two degrees of freedom of their arm posture (wrist and elbow). The authors found that subjects select the degrees of freedom depending upon the task's dynamical properties and upon the haptic feedback that is made available to them. This finding illustrates the importance of adequately designed haptic feedback and its potential to influence coordination strategies such as those targeted by sensorimotor rehabilitation treatments.

Haptic feedback has most often involved signals intended for the proprioceptors of the operator, i.e., force and position. Such information is relatively easy to obtain from transducers in a slave robot and to present via motors in the master controller. Dexterous manipulation of objects, however, depends heavily on tactile feedback, as is well known to anyone whose fingers have become numb from the cold. The rapidly evolving technology of tactors—haptic display devices targeting the operator's cutaneous receptors—is featured in two papers in this issue.

In “Evaluation of Tactile Feedback Methods for Wrist Rotation Guidance,” Andrew Stanley and Katherine Kuchenbecker present the design of five different tactile devices for the wrist combined with two types of drive algorithms. They investigated different combinations of devices and algorithms in terms of their effectiveness for tasks requiring directional response, position targeting, and trajectory following. Their results show that the optimal combination of actuator and drive algorithm is highly task-specific but must generally include adequate cues for both movement direction and magnitude. In “A High Performance Tactile Feedback Display and Its Integration in Teleoperation,” Ioannis Sarakoglou, Nadia Garcia-Hernandez, Nikos Tsagarakis, and Darwin Caldwell present a new and enhanced version of a fingertip tactor array, a concept first introduced as the Optacon reading aid for the blind in the 1960s (“Optacon,” Wikipedia). They demonstrate the use of vibratory tactile feedback to enhance performance of an edge-tracking task using a telerobot.

Telerobots physically separate the master controller that is operated by a human from the slave robot that interacts with the environment. Hongbo Wang and Kazuhiro Kosuge studied a more intimate interaction in which a robotic female dancer was guided by its direct interaction with a human dance partner in “Control of a Robot Dancer for Enhancing Haptic Human-Robot Interaction in Waltz.” Their approach is based on modeling each dance partner as an inverted pendulum and then predicting and minimizing the interaction forces by the robot taking the appropriate follower step. Results show the advantage of the more human-like pendulum model over classical admittance controllers and the importance of considering the whole physical coupled human-robot system for an accurate prediction of the current user state.

Finally, in “Supporting Negotiation Behavior with Haptics-Enabled Human-Computer Interfaces,” S. Ozgur Oguz, Ayse Kucukyilmaz, Tevfik Metin Sezgin, and Cagatay Basdogan investigate the ability of haptic feedback to support negotiation in human-computer interaction tasks, using tools from game theory. They developed a game in which a human and a computer cooperate in performing a task and demonstrated that subjects were more successful in differentiating the preprogrammed computer negotiation behavior when haptic cues were displayed in addition to visual cues. This study suggests that it is useful to introduce haptics into human-computer interaction tasks, particularly those that permit multiple strategies that must be negotiated between the participants.

A central theme in all of the papers in this special issue is the bidirectional interaction between sensory information and motor behavior. As David Katz pointed out almost a century ago (The World of Touch, 1925), the somatosensory system differs from the other exteroceptive senses in that the information it provides is inextricably connected to the movements made to obtain it. This leads to a circularity that complicates experimental studies and their interpretation: each movement is continuously modified by sensory feedback, but the meaning of the sensory information depends on the movement. Robots provide both the need and a means to tease this problem apart because they are themselves capable of at least some human-like complexity. When humans interact with robots, they must use more of their human wiles to succeed. When robots interact with humans, their algorithms should reflect at least some of those wiles.

We wish to thank the editorial and administrative staff of the journal and, most importantly, the numerous anonymous reviewers who participated in the effort of screening the submissions for this special issue. Submissions in which any of the guest editors was an author were reviewed separately by anonymous associate editor(s) selected by the editor-in-chief. The peer-review system contributed significantly to the quality of the papers; each was revised at least once according to the reviewers' suggestions. Unfortunately, some fine submissions could not be revised before our special issue deadline; we look forward to their appearance in future publications. We hope that the articles presented here will advance research on haptics, sensorimotor neuroscience, and human-robot interactions.

Amir Karniel

Angelika Peer

Opher Donchin

Ferdinando A. Mussa-Ivaldi

Gerald E. Loeb

Guest Editors

A. Karniel and O. Donchin are with the Department of Biomedical Engineering, Ben-Gurion University of the Negev, Beer Sheva, Israel.

For information on obtaining reprints of this article, please send e-mail to: toh@computer.org.

Amir Karniel received the BSc (cum laude), MSc, and PhD degrees in 1993, 1996, and 2000, respectively, all in electrical engineering, from the Technion-Israel Institute of Technology, Haifa, Israel. He was a postdoctoral fellow in the Department of Physiology, Northwestern University Medical School, and at the Robotics Lab of the Rehabilitation Institute of Chicago. Currently, he is an associate professor in the Department of Biomedical Engineering at Ben-Gurion University of the Negev, where he serves as the head of the Computational Motor Control Laboratory and the organizer of the annual International Computational Motor Control Workshop. He received the E.I. Jury Award for excellent students in the area of systems theory, and the Wolf Scholarship Award for excellent research students. In the last few years, his studies have been funded by awards from the Israel Science Foundation, the Binational US-Israel Science Foundation, and the US-AID Middle East Research Collaboration. He is on the editorial boards of the IEEE Transactions on Systems, Man, and Cybernetics Part A, and Frontiers in Neuroscience. His research interests include human-machine interfaces, haptics, brain theory, motor control, and motor learning. He is a senior member of the IEEE.

Angelika Peer received the engineering degree in Electrical Engineering and Information Technology in 2004 and the Doctor of Engineering degree in Electrical Engineering in 2008 from the Technische Universität München. She is currently a senior researcher and lecturer at the Institute of Automatic Control Engineering of the Technische Universität München, a TUM-IAS Junior Research Fellow of the Institute for Advanced Study, and leader of a research group in the field of haptic interaction. From 2004 to 2008, she was a research assistant at the Institute of Automatic Control Engineering of the same university. Her research interests include telepresence and teleaction systems, brain and body computer interface controlled robots, physical/haptic human-robot interaction, and computational models of human haptic perception and human motor control. She is a member of the IEEE.

Opher Donchin received the BSc degree in mathematics from MIT in 1988, and the PhD degree in computational neuroscience from the Hebrew University in 1999. During his PhD studies, he was supported by a grant from the Clore Foundation. After completing his PhD, he was awarded a Distinguished Postdoc position in the Department of Biomedical Engineering of Johns Hopkins University with Prof. Reza Shadmehr. He is currently an associate professor in the Department of Biomedical Engineering at Ben-Gurion University of the Negev, and has a concurrent appointment as a Lecturer at Erasmus Medical College. His studies are currently funded by awards from the Israel-Lower Saxony Foundation, and he is a cocoordinator for an Initial Training Network under the auspices of the Minerva Foundation. Dr. Donchin's research interests include the physiology of the cerebellum and the role played by the cerebellum in motor adaptation.

Ferdinando A. Mussa-Ivaldi received the Laurea degree in physics from the University of Torino in 1978 and the PhD degree in biomedical engineering from the Politecnico di Milano in 1987. He is a professor of physiology, physical medicine and rehabilitation, and biomedical engineering at Northwestern University. He is director of the Robotics Laboratory at the Rehabilitation Institute of Chicago. His areas of interest and expertise include robotics, neurobiology of the sensory-motor system, and computational neuroscience. Among Dr. Mussa-Ivaldi's achievements are: the first measurement of human arm multijoint impedance, the development of a technique for investigating the mechanisms of motor learning through the application of deterministic force fields, the discovery of a family of integrable generalized inverses for redundant kinematic chains, the discovery of functional modules within the spinal cord that generate a discrete family of force fields, the development of a theoretical framework for the representation, generation, and learning of arm movements, the development of the first neurorobotic system in which a neural preparation in vitro—the brainstem of a lamprey—controls the behavior of a mobile robot through a closed-loop interaction, and the development of body-machine interface techniques to promote reorganization of upper body motion for the control of assistive devices.

Gerald E. Loeb (M'98) received the BA (1969) and MD (1972) degrees from the Johns Hopkins University and did one year of surgical residency at the University of Arizona before joining the Laboratory of Neural Control at the National Institutes of Health (1973-1988). He was a professor of physiology and biomedical engineering at Queen's University in Kingston, Canada (1988-1999) and is now a professor of biomedical engineering and director of the Medical Device Development Facility at the University of Southern California (http://mddf.usc.edu). Dr. Loeb was one of the original developers of the cochlear implant to restore hearing to the deaf and was chief scientist for Advanced Bionics Corp. (1994-1999), manufacturer of the Clarion cochlear implant. He is a fellow of the American Institute of Medical and Biological Engineers, a senior member of the IEEE, holder of 52 issued US patents, and author of more than 200 scientific papers (available from http://bme.usc.edu/gloeb). Most of Dr. Loeb's current research is directed toward sensorimotor control of paralyzed and prosthetic limbs. His research team developed BION injectable neuromuscular stimulators and has been conducting several pilot clinical trials. They are developing and commercializing a biomimetic tactile sensor for robotic and prosthetic hands through a start-up company for which Dr. Loeb is Chief Executive Officer (www.SynTouchLLC.com). His lab at USC is developing computer models of musculoskeletal mechanics and of the interneuronal circuitry of the spinal cord, which facilitates control and learning of voluntary motor behaviors by the brain. These projects build on Dr. Loeb's long-standing basic research into the properties and natural activities of muscles, motoneurons, proprioceptors, and spinal reflexes.