Excerpt

The idea that the brain processes sensory inputs in parallel modality-specific streams has given way to the concept of a “metamodal” brain with a multisensory task-based organization (Pascual-Leone and Hamilton 2001). For example, recent research shows that many cerebral cortical regions previously considered to be specialized for processing various aspects of visual input are also activated during analogous tactile or haptic tasks (reviewed by Sathian and Lacey 2007). In this article, which concentrates on shape processing in humans, we review the current state of knowledge about the mental representation of object form in vision and touch. We begin by describing the cortical regions showing multisensory responses to object form. Next, we consider the extent to which the underlying representation of object form is explained by cross-modal visual imagery or multisensory convergence. We then review recent work on the view-dependence of visuo-haptic shape representations and the resulting model of a multisensory, view-independent representation. Finally, we discuss a recently presented conceptual framework of visuo-haptic shape processing as a basis for future investigations.