A car that approaches you through the fog is not only less visible; it also appears to move more slowly than it really does. This is an unfortunate real-life example of the laboratory observation that lowering contrast reduces perceived speed. A common hypothesis about the neural representation of perceived speed is that perceived speed corresponds to a vector-average readout of speed-tuned cells in the middle temporal area (MT). We exploited the lawful relationship between contrast and perceived speed as a tool to test this hypothesis.

We used circular patches of random dots as test stimuli. Each patch was positioned in a cell's receptive field, and 100 dots moved in the cell's preferred direction for 0.5 s. On randomly interleaved trials, the dots moved at 1, 2, 4, 8, 16, 32, or 64 deg/s, and the Michelson point contrast between any individual dot and the 5 cd/m² background was 5%, 10%, 20%, or 70%.

We recorded from 96 speed-tuned cells in two monkeys that fixated a central dot during stimulus presentation. The vast majority of cells had a reduced firing rate for low-contrast stimuli. Interestingly, we found that, for a subset of cells, lowering the contrast at low speeds led to an increase in firing rate. Lowering the contrast, however, did not just change the overall firing rate. Most speed tuning curves became narrower and, surprisingly, their peaks (the preferred speeds) shifted to lower speeds.

In the vector-average labeled-line model, each cell's label is its preferred speed, fixed at high contrast. When tuning peaks shift to lower speeds at low contrast, a stimulus of a given speed most strongly drives cells labeled with higher speeds, so the readout predicts an increase in perceived speed, clearly in contradiction with behavioral reports of perceptual slowing. Hence, our data show that a vector-average readout of MT cell responses cannot explain the full range of perceptual speed reports.
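This prediction can be sketched numerically. The toy model below assumes log-Gaussian speed tuning and a vector-average (labeled-line) readout; the cell count, tuning width, and size of the contrast-induced peak shift are all illustrative choices, not fitted values from the recordings.

```python
import numpy as np

def vector_average_speed(stimulus_speed, peak_shift=0.0, n_cells=40):
    """Vector-average (labeled-line) readout of a toy MT population.

    Each model cell has a log-Gaussian speed tuning curve. Its 'label' is
    its high-contrast preferred speed; `peak_shift` (in log2 units, an
    illustrative parameter) moves the tuning peaks toward lower speeds,
    mimicking the low-contrast shift, while the labels stay fixed.
    """
    # Labels: preferred speeds spanning 1-64 deg/s, log-spaced
    labels = np.logspace(0, 6, n_cells, base=2)
    sigma = 1.0  # tuning width in log2 units (illustrative)
    # At low contrast the tuning peak sits below the cell's fixed label
    peaks = labels * 2.0 ** (-peak_shift)
    responses = np.exp(-(np.log2(stimulus_speed) - np.log2(peaks)) ** 2
                       / (2 * sigma ** 2))
    # Vector average: response-weighted mean of the fixed labels
    return np.sum(responses * labels) / np.sum(responses)

high = vector_average_speed(8.0, peak_shift=0.0)  # peaks aligned with labels
low = vector_average_speed(8.0, peak_shift=0.5)   # peaks shifted to lower speeds
# With peaks shifted down, the same stimulus weights higher-labeled cells
# more strongly, so the readout rises: the model predicts a FASTER percept
# at low contrast, opposite to the behavioral slowing.
```

Because the labels are fixed while the peaks move down, the response profile over labels is centered above the stimulus speed, which is exactly why this readout predicts speeding up rather than the observed slowing.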