Guessing beats knowing in visual test

When it came to a visual recognition test, people's guesses were more accurate …

Your brain can recognize and process things that you aren’t consciously retrieving from your memory. For example, you don’t need to intentionally think about how to get dressed in the morning, nor do you need to consciously go through all the steps necessary to drive from home to work. You automatically do those things because they are routine procedures that are a part of your implicit memory. Implicit memory can influence your actions without your awareness of any mental activation, whereas explicit memory involves active and conscious memory retrieval.

While scientists generally agree that routine tasks are part of implicit memory, they have less confidence about visual recognition. Many work under the premise that recognition is the exclusive domain of explicit memory. While it's true that you must consciously dig through your memories to figure out if you have seen something before, it might be possible for your subconscious to help you identify things without your knowledge—a gut instinct or lucky guess that helped you on an exam could have originated in your implicit memory. With those things in mind, neuroscientists Joel Voss and Ken Paller set out to determine if implicit memory can direct explicit, visual recognition and, if so, how.

Their research, published in the current issue of Nature Neuroscience, involved asking volunteers to study kaleidoscope images either with their full attention or while their attention was divided by a distracting task. Voss and Paller reasoned that distraction would interfere with explicit memory storage.

Following this encoding stage, participants had to pick out the images they had previously seen from a set of similar-looking images. After each selection, they reported whether they were guessing or consciously knew the answer. Throughout the experiment, Voss and Paller recorded brain potentials from 68 electrodes evenly distributed across each participant’s head.

Interestingly, regardless of attentiveness during the image studying process, participants were significantly more accurate with their guesses than with their confident answers. In fact, distracted participants were more accurate with their guesses (81.4 percent accuracy) than the undisrupted participants (72.1 percent accuracy). Voss and Paller saw these behavioral results as an indication that memory “retrieval processes operative during guess decisions were distinct from those responsible for recognition with retrieval awareness.”

The brain potentials recorded by the electrodes also showed dramatic differences between guesses and reportedly known answers. Conscious recognition of an image corresponded to increases in two potentials that are indicative of explicit memory, located around the parietal cortex and medial temporal lobe. Correct guesses did not elicit changes in these potentials; instead, they caused rapid potential changes in the occipital and left frontal lobes. This indicates that correct guesses arose from implicit memory and were unrelated to explicit memory.

Thus, how we recognize what we see is far more complicated than a single process of conscious memory recall—without our knowledge, our brain can recognize and sway our responses to images. Voss and Paller propose reevaluating the popular assumption that recognition is solely based on explicit memory, and they recommend taking implicit memory into account in future studies on recognition.

Yun Xie is a contributing science writer at Ars, where she covers the latest advancements in science and technology. She currently works in scientific communications, policy, and review.