Neural pattern similarity and visual perception

Abstract

This thesis addresses the question of whether, and under what conditions, different people see the same visual stimuli differently. It is an experimental contribution to the basic understanding of visual perception, especially face perception, and its neural correlates, with an emphasis on comparing patterns of neural activity driven by visual stimuli across trials and across individuals. We make extensive use of functional magnetic resonance imaging (fMRI); all inferences about neural activity are made via this intermediary. The thesis is organized into two parts:

In Part I, we investigate the nature of face familiarity and distinctiveness at the perceptual and neural levels. We first ask how the faces of people personally familiar to a viewer appear different than they would to an unfamiliar viewer. The main result is that they appear more distinctive, i.e., more dissimilar to and distinguishable from other faces, and increasingly so at higher levels of familiarity. Having established this connection between face familiarity and distinctiveness, we next ask what is different about the perception of such faces, compared with indistinct and unfamiliar faces, at the level of brain activation. We find that familiar and distinctive faces are represented more consistently: whereas indistinct faces evoke noticeably different patterns of activity with each new presentation, these faces evoke highly similar patterns. Combined with the observation that consistency can enhance memory encoding (a result reported by Xue et al. [102]), this suggests a cyclic process for the learning of unfamiliar faces, in which consistent representation and newly formed memories mutually feed back on each other.
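The consistency measure described above can be sketched as the mean pairwise correlation between the voxel patterns evoked by repeated presentations of one stimulus. The following is a minimal numpy illustration with synthetic data, not the thesis's actual analysis pipeline; the function name and noise levels are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def pattern_consistency(trials):
    """Mean pairwise Pearson correlation between the voxel patterns
    (rows) evoked by repeated presentations of one stimulus."""
    r = np.corrcoef(trials)               # trials: (n_presentations, n_voxels)
    iu = np.triu_indices_from(r, k=1)     # upper triangle: distinct trial pairs
    return r[iu].mean()

# Toy data: a "distinctive" face re-evokes nearly the same pattern on each
# trial, while an "indistinct" face drifts more from trial to trial.
base = rng.standard_normal(200)                        # 200 voxels
distinctive = base + 0.2 * rng.standard_normal((5, 200))
indistinct = base + 1.5 * rng.standard_normal((5, 200))

assert pattern_consistency(distinctive) > pattern_consistency(indistinct)
```

Under this toy model, higher consistency for the distinctive stimulus falls directly out of its lower trial-to-trial noise, which mirrors the qualitative contrast reported above.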

Whereas in Part I we focus on individual differences in neural activity, principally by experimentally manipulating stimulus familiarity, in Part II we shift our focus to similarities across individuals and extend our investigation beyond faces to the perception of visual objects in general and of moving images. We begin with an experiment on the perception of static images drawn from 44 object categories. We find that the distances between these categories, induced from activity in cortical visual object areas, correlate highly between subjects, and also with distances inferred from a behavioral clustering task, and that this correlation remains significant even among subsets of closely related categories. We also show that one subject's brain activity can be accurately modeled using another's, which allows us to predict which image a subject is viewing from his/her brain activity. Then, in a separate experiment on the perception of dynamic (video) stimuli, we find evidence that when subjects watch videos with sound, visual attention is at times diffused and transferred to audition: activity in visual areas becomes less temporally correlated across subjects than in the muted case, in which the patterns of neural activity correlate across subjects at an average of 78% of the level found within a single subject across repeated viewings.
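The cross-subject comparison of category distances can be sketched as follows: build one representational distance matrix (RDM) per subject from that subject's mean category patterns, then correlate the matrices' upper triangles. This is a minimal numpy sketch with synthetic data, assuming correlation distance and Pearson comparison of RDMs; all names and parameters are hypothetical, not the thesis's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

def category_rdm(patterns):
    """Representational distance matrix: 1 - correlation between the
    mean voxel patterns of each category (rows of `patterns`)."""
    return 1.0 - np.corrcoef(patterns)

def rdm_similarity(rdm_a, rdm_b):
    """Pearson correlation of the upper triangles of two RDMs."""
    iu = np.triu_indices_from(rdm_a, k=1)
    return np.corrcoef(rdm_a[iu], rdm_b[iu])[0, 1]

# Toy data: 44 categories share a common representational geometry,
# seen by each subject through subject-specific noise.
shared = rng.standard_normal((44, 300))          # 44 categories x 300 voxels
subj_a = shared + 0.3 * rng.standard_normal((44, 300))
subj_b = shared + 0.3 * rng.standard_normal((44, 300))

sim = rdm_similarity(category_rdm(subj_a), category_rdm(subj_b))
```

Note that comparing RDMs, rather than raw voxel patterns, sidesteps the fact that voxels are not anatomically aligned across subjects: each subject's geometry is summarized in a common category-by-category space before the cross-subject correlation is taken.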

The findings reported in this thesis thus offer quantitative lower bounds on how similarly different individuals neurally experience visual stimuli, and an explanation for how they perceptually and neurally diverge when familiarity with a (face) stimulus varies, suggesting a possible mechanism for the encoding of new visual objects into memory. We conclude with a discussion of some of the questions raised by this work and directions for future research.