Abstract

Minimising uncertainty in the brain's representation of the environment is essential for accurate decision making, but the underlying neural mechanisms remain elusive. Here we show, first, analytically, that polarising spatial cues produce an anisotropy in the spatial information available along movement trajectories. Second, we simulate a population of entorhinal grid cells in an environment with anisotropic spatial information and show that self-location is decoded most accurately when grid patterns are aligned with the axis of greatest information. Third, we expose human participants to a polarised virtual reality (VR) environment and confirm the predicted asymmetry in spatial navigation performance. Finally, using fMRI of virtually navigating humans, we observe that, as predicted, the orientation of entorhinal grid-like hexadirectional activity is aligned with the environmental axis of greatest spatial information. In sum, we demonstrate a crucial role of the entorhinal grid system in reducing uncertainty in the neural representation of self-location. These findings shed new light on the adaptive spatial computations that underlie entorhinal grid representations in the service of optimal decision making and wayfinding.