Abstract

The human visual system adapts to the changing statistics of its environment. For example, the light-from-above prior, an assumption that aids the interpretation of ambiguous shading information, can be modified by haptic (touch) feedback. Here we investigate the mechanisms that drive this adaptive learning. In particular, we ask whether visual information can be as effective as haptics in driving visual recalibration and whether increased information (feedback from multiple modalities) induces faster learning.

The intermittency of conflicting information, or feedback, appears critical for learning: it causes an initial, erroneous percept to be corrected. Contrary to previous proposals, we found no particular advantage for cross-modal feedback. Instead, we suggest that an ‘oops’ factor drives efficient learning; recalibration is prioritised when a mismatch exists between sequential representations of an object property. This ‘oops’ factor appears important both across and within sensory modalities, suggesting a general principle for perceptual learning and recalibration.