Abstract

We present a computational model that, during a babbling phase, learns a coupling between motor parameters and their sensory consequences in vocal production. Based on this coupling, preferred motor parameters and prototypically perceived sounds develop concurrently. Exposure to an ambient language modifies perception to coincide with the sounds of that language. The model develops motor mirror neurons that are active when an external sound is perceived. An extension to visual mirror neurons for oral gestures is suggested. (C) 2003 Elsevier Inc. All rights reserved.
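The core idea can be sketched computationally. The following is a minimal illustrative toy, not the paper's actual architecture: it assumes a hypothetical forward mapping (`vocal_tract`) standing in for an articulatory synthesizer, random motor babbling, and a simple Hebbian association matrix as the motor-auditory coupling. Given an external sound, the learned coupling then activates associated motor parameters, a mirror-neuron-like response.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical forward map: motor parameters -> auditory features.
# In a full model this would be an articulatory synthesizer.
W_true = rng.normal(size=(3, 2))

def vocal_tract(motor):
    """Sensory (auditory) consequence of a motor command."""
    return np.tanh(W_true @ motor)

# Babbling phase: Hebbian learning of a motor-auditory coupling.
coupling = np.zeros((3, 2))  # auditory dims x motor dims
lr = 0.01
for _ in range(5000):
    motor = rng.uniform(-1, 1, size=2)       # random babble
    sound = vocal_tract(motor)               # perceived consequence
    coupling += lr * np.outer(sound, motor)  # Hebbian association

def mirror_motor(sound):
    """Mirror response: motor activation evoked by a heard sound."""
    return coupling.T @ sound
```

After babbling, feeding the model a sound it could itself produce yields a motor activation aligned with the motor parameters that would produce that sound, which is the mirror-neuron property described in the abstract.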