Inspired by neuroscience and developmental studies, we have proposed a computational model of emotional development based on multimodal perceptual information. A synesthetic mechanism enables a robot first to detect invariant features across multiple modalities and then to extract emotions, beginning with pleasure/displeasure and extending to the six basic emotions (e.g., happiness, surprise, and anger), by abstracting these features with a probabilistic neural network. Comparative experiments verified our hypothesis that, among the multiple modalities, tactile information drives emotional development. These results yield new insights into the neural mechanisms of emotional development and promising ideas for the design of computational models for emotion recognition.
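As a rough illustration of the abstraction step, a probabilistic neural network can be sketched as a Parzen-window classifier: each emotion class is scored by averaging Gaussian kernels centered on that class's training feature vectors. The feature values, labels, and kernel width below are hypothetical placeholders, not data from our experiments.

```python
import numpy as np

def pnn_classify(X_train, y_train, x, sigma=0.5):
    """Probabilistic neural network (Parzen-window) classifier.

    Scores each class by the mean of Gaussian kernels centered on
    that class's training samples, then returns the best class.
    """
    scores = {}
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]
        d2 = np.sum((Xc - x) ** 2, axis=1)          # squared distances to class samples
        scores[c] = np.mean(np.exp(-d2 / (2 * sigma ** 2)))
    return max(scores, key=scores.get)

# Toy multimodal feature vectors (e.g., [tactile, auditory]) with
# hypothetical labels: 0 = "pleasure", 1 = "displeasure".
X = np.array([[0.9, 0.8], [1.0, 0.7], [0.1, 0.2], [0.0, 0.3]])
y = np.array([0, 0, 1, 1])
print(pnn_classify(X, y, np.array([0.95, 0.75])))  # query near the "pleasure" cluster
```

In a full model, the same scheme would be applied hierarchically: first separating pleasure from displeasure, then refining toward finer-grained basic-emotion categories.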