To navigate successfully from place to place, many animals, including humans, rely on a “cognitive map” that represents the locations of familiar places and the spatial relationships between them (e.g., the bookstore is 2 blocks east of the parking garage). In order to use this map, you also have to know where you are currently located and what direction you are currently facing, a process called self-localization. Like a “You Are Here” sign on a physical map, these self-localization signals are important for determining where your destination is relative to your current location and planning a route to get there. What is the neural basis of these location and direction signals?

Previous work in rodents and humans provides different predictions about where in the brain these signals might be found. Invasive studies in rodents, which allow researchers to measure the activity of individual neurons, have revealed specialized cells in the medial temporal lobe that appear to code for the animal’s current location (“place cells”; O’Keefe & Dostrovsky, 1971) and the animal’s current direction (“head direction cells”; Taube et al., 1990). However, data from studies of human navigation indicate that a region in medial parietal cortex known as the “retrosplenial complex” (RSC) is also particularly important. In fMRI studies, RSC is strongly activated when people navigate through a virtual environment (Maguire, 1998) and when they make decisions that require long-term spatial knowledge, such as whether a certain place is on the East or West side of campus (Epstein et al., 2007). Patients whose RSC has been damaged by a stroke can recognize their surroundings, but cannot use this information to orient themselves on their cognitive map (Takahashi et al., 1997). For example, such a patient could see the building in front of them and know that it is the post office, but could not use that information to decide whether to turn left or right to get home.

In this experiment, we wanted to determine which regions of the human brain carry information about one’s current location and current direction. Ideally, one would measure brain activity while people are standing at different places in the real world. Since that is not yet possible with fMRI, we showed students pictures of their campus and asked them to imagine being there (Vass & Epstein, 2013).

We went around to 8 intersections on campus and took photographs depicting the view facing North, South, East, and West (Figure 1). We then showed these photographs to students one at a time while they were in the fMRI scanner and asked them to tell us which direction the camera was facing when it took the picture (i.e., North, South, East, or West). Critically, in order to solve this problem, they also had to determine the location depicted in the picture. In other words, every time they saw a picture, they had to think about the real-world location and direction implied by it.

Figure 1. Schematic of the locations used in the experiment. A) Locations consisted of 8 intersections on the Penn campus. B) At each intersection, we took many photographs of views facing North, East, South, and West.

We then tested whether specific brain regions contained information about the location or facing direction implied by the photographs. To do so, we measured the activity patterns in these brain regions evoked by the different photographs. We hypothesized that if a region contains information about a particular spatial quantity, such as location, the evoked activity patterns for two stimuli that share that quantity (e.g., same location) should be more similar than the evoked activity patterns for two stimuli that do not share that quantity (e.g., different locations; Figure 2).

Figure 2. Example of the location-coding logic. If a region codes for location, the activity pattern evoked by view 1N should be more similar to the pattern for 1E than to the pattern for 2E, because 1N and 1E come from the same location.

To demonstrate coding of location, we wanted to find a brain region that could generalize across the different views at a given intersection and recognize them as coming from the same place. Thus, we tested whether there was a consistent pattern for each intersection that is activated whenever a person sees a picture of it, regardless of which direction the picture is facing. To demonstrate coding of direction, we asked whether each brain region exhibited a consistent pattern for a particular direction (e.g., North) across the different intersections. In other words, we tested whether there is a consistent “North” pattern that is activated whenever you face North, regardless of which intersection you are at.
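The logic of this pattern-similarity test can be illustrated with a short sketch (this code is not from the study; the data are simulated, and the variable names are our own). Each view is summarized as a vector of voxel responses, and a region "codes location" if patterns from the same intersection are more correlated than patterns from different intersections, regardless of facing direction:

```python
import numpy as np

rng = np.random.default_rng(0)
n_locations, n_directions, n_voxels = 8, 4, 100

# Simulated evoked patterns: one voxel vector per (location, direction) view.
# We simulate a region that codes location: all views from the same
# intersection share a common pattern component, plus independent noise.
location_codes = rng.normal(size=(n_locations, n_voxels))
patterns = np.array([
    [loc_code + rng.normal(scale=1.0, size=n_voxels)
     for _ in range(n_directions)]
    for loc_code in location_codes
])  # shape: (locations, directions, voxels)

def corr(a, b):
    """Pearson correlation between two voxel patterns."""
    return np.corrcoef(a, b)[0, 1]

# Compare same-location pairs (different directions) against
# different-location pairs.
same_loc, diff_loc = [], []
for li in range(n_locations):
    for lj in range(n_locations):
        for di in range(n_directions):
            for dj in range(n_directions):
                if li == lj and di == dj:
                    continue  # skip comparing a view with itself
                r = corr(patterns[li, di], patterns[lj, dj])
                (same_loc if li == lj else diff_loc).append(r)

print(f"same-location mean r:      {np.mean(same_loc):.3f}")
print(f"different-location mean r: {np.mean(diff_loc):.3f}")
```

In this simulated region, same-location similarity exceeds different-location similarity, which is the signature of location coding; the direction test follows the same recipe with the roles of location and direction swapped. (In the actual study, the difference between the two similarity measures would be assessed statistically across participants.)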

We first looked for location and direction coding in the regions predicted by the prior human and rodent literature. These included RSC and six regions in the medial temporal lobe: anterior and posterior hippocampus, presubiculum, entorhinal cortex, perirhinal cortex, and parahippocampal cortex. We found that RSC coded for location in its activity patterns, but showed only a trend toward direction coding. Of the medial temporal lobe regions, only presubiculum exhibited sensitivity to spatial quantities, coding for both location and direction. This result is consistent with reports from rodent studies, which indicate that presubiculum contains a mixture of cells that carry information about direction and location, including head direction cells (Boccara et al., 2010; Cacucci et al., 2004).

To look for any brain regions we may have missed in our targeted analyses, we tested for location and direction coding everywhere in the brain using a “searchlight” technique, in which we measured activity patterns in small spherical regions that tile the entire brain (Kriegeskorte et al., 2006). These exploratory tests confirmed the effects from the previous analyses and identified new areas exhibiting spatial coding that were not predicted by the prior literature. This analysis showed that location could be decoded not only in RSC, but also in a swath of cortex along the parietal-occipital sulcus (POS) that extended into the precuneus (Figure 4). The analysis for direction coding confirmed that this information was present in presubiculum, and also showed that nearby territory in the anterior calcarine sulcus contained direction signals.
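The mechanics of a searchlight can be sketched as follows (again a simplified toy, not the study's actual pipeline): a small sphere is centered on every voxel in turn, the pattern statistic of interest is computed from just the voxels inside that sphere, and the result is written back to the center voxel, producing a whole-brain map of where the statistic is strong:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "brain": a small 3D voxel grid, with two pattern maps to compare
# within each sphere (e.g., responses to two views of the same location).
shape = (10, 10, 10)
map_a = rng.normal(size=shape)
map_b = map_a + rng.normal(scale=0.5, size=shape)  # noisy copy of map_a

radius = 2  # sphere radius in voxels

# Precompute the voxel offsets that fall inside the sphere.
offsets = [(dx, dy, dz)
           for dx in range(-radius, radius + 1)
           for dy in range(-radius, radius + 1)
           for dz in range(-radius, radius + 1)
           if dx * dx + dy * dy + dz * dz <= radius * radius]

def sphere_values(vol, center):
    """Collect voxel values within the sphere around `center`, clipped to the volume."""
    vals = []
    for dx, dy, dz in offsets:
        x, y, z = center[0] + dx, center[1] + dy, center[2] + dz
        if 0 <= x < shape[0] and 0 <= y < shape[1] and 0 <= z < shape[2]:
            vals.append(vol[x, y, z])
    return np.array(vals)

# Slide the sphere across every voxel; here the per-sphere statistic is
# the correlation between the two maps, stored at the sphere's center.
result = np.zeros(shape)
for x in range(shape[0]):
    for y in range(shape[1]):
        for z in range(shape[2]):
            a = sphere_values(map_a, (x, y, z))
            b = sphere_values(map_b, (x, y, z))
            result[x, y, z] = np.corrcoef(a, b)[0, 1]

print(f"mean searchlight correlation: {result.mean():.3f}")
```

In the real analysis, the per-sphere statistic would be the location or direction pattern-similarity measure rather than a simple correlation between two maps, and spheres showing reliable effects across participants are what appear as the colored regions in Figure 4.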

Figure 4. A whole-brain analysis for location coding revealed that this information was represented in the parietal-occipital sulcus (POS), overlapping with RSC. A whole-brain analysis for direction coding revealed coding in anterior calcarine sulcus, overlapping with posterior presubiculum.

In summary, we demonstrated neural coding of spatial quantities necessary for navigation within a large-scale, real-world environment. We found that these self-localization signals are represented in distributed activity patterns in specific brain regions, with medial temporal (presubiculum) and medial parietal (RSC, POS) regions coding location, and medial temporal (presubiculum) regions coding direction. These results link the animal and human navigation literatures, and suggest that these location and direction representations may be critical for our ability to navigate through our environment.

♦ Kriegeskorte, N., Goebel, R., & Bandettini, P. (2006). Information-based functional brain mapping. Proceedings of the National Academy of Sciences of the United States of America, 103, 3863-3868.