The researchers demonstrated how words can affect perception. A particular brain wave that occurs about a tenth of a second after a visual image appears was enhanced by a matching word but not by a matching natural sound, and the word sped up identification of the image while the natural sound did not. For example, a picture of a dog, the spoken word “dog,” and a dog’s bark would form one such set.

They believe this is because a word refers to a general category, whereas a natural sound is a specific example from that category. Symbols such as words are the only way to indicate categories. “Language allows us this uniquely human way of thinking in generalities. This ability to transcend the specifics and think about the general may be critically important to logic, mathematics, science, and even complex social interactions.”

Here is the abstract: “People use language to shape each other’s behavior in highly flexible ways. Effects of language are often assumed to be “high-level” in that, whereas language clearly influences reasoning, decision making, and memory, it does not influence low-level visual processes. Here, we test the prediction that words are able to provide top-down guidance at the very earliest stages of visual processing by acting as powerful categorical cues. We investigated whether visual processing of images of familiar animals and artifacts was enhanced after hearing their name (e.g., “dog”) compared with hearing an equally familiar and unambiguous nonverbal sound (e.g., a dog bark) in 14 English monolingual speakers. Because the relationship between words and their referents is categorical, we expected words to deploy more effective categorical templates, allowing for more rapid visual recognition. By recording EEGs, we were able to determine whether this label advantage stemmed from changes to early visual processing or later semantic decision processes. The results showed that hearing a word affected early visual processes and that this modulation was specific to the named category. An analysis of ERPs showed that the P1 was larger when people were cued by labels compared with equally informative nonverbal cues—an enhancement occurring within 100 ms of image onset, which also predicted behavioral responses occurring almost 500 ms later. Hearing labels modulated the P1 such that it distinguished between target and nontarget images, showing that words rapidly guide early visual processing.”