Quantitative features of stimuli may be ordered along a magnitude continuum, or line. Magnitude refers to parameters of different types of stimulus properties. For instance, the frequency of a sound is a sensory and continuous stimulus property, whereas the number of items in a set is an abstract and discrete property. In addition, within a stimulus property, magnitudes need to be processed not only in one modality but across multiple modalities. In the sensory domain, for example, magnitude applies both to the frequency of auditory sounds and to tactile vibrations. Similarly, both the number of visual items and the number of acoustic events constitute numerical quantity, or numerosity. To support goal-directed behavior and executive functions across time, magnitudes need to be held in working memory, the ability to briefly retain and manipulate information in mind. How different types of magnitudes across multiple modalities are represented in working memory by single neurons has only recently been explored in primates. These studies show that neurons in the frontal lobe can encode the same magnitude type across sensory modalities. However, while multimodal sensory magnitude in relative comparison tasks is represented by monotonically increasing or decreasing response functions ("summation code"), multimodal numerical quantity in absolute matching tasks is encoded by neurons tuned to preferred numerosities ("labeled-line code"). These findings indicate that there is most likely not a single cross-modal working-memory code for magnitudes, but rather a flexible code that depends on the stimulus dimension as well as on the task requirements.
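The difference between the two coding schemes can be illustrated with a toy model (all parameter values here are hypothetical and not fitted to any recorded data): a summation code maps magnitude onto a monotonically increasing or decreasing firing rate, whereas a labeled-line code gives each neuron a bell-shaped tuning curve centered on a preferred numerosity.

```python
import math

def summation_rate(magnitude, slope=2.0, baseline=5.0):
    """Monotonic ("summation") code: firing rate grows linearly with magnitude.
    A complementary population would use a negative slope (decreasing function)."""
    return baseline + slope * magnitude

def labeled_line_rate(numerosity, preferred, width=1.0, peak=30.0):
    """Tuned ("labeled-line") code: firing rate peaks at the neuron's
    preferred numerosity and falls off in a Gaussian fashion around it."""
    return peak * math.exp(-((numerosity - preferred) ** 2) / (2 * width ** 2))

# A summation-coding neuron responds ever more strongly as frequency rises ...
rates_summation = [summation_rate(f) for f in (8, 16, 24, 32)]
# ... whereas a tuned neuron fires most to its preferred numerosity (here, 4).
rates_tuned = [labeled_line_rate(n, preferred=4) for n in (1, 2, 4, 8)]
```

Reading out the two codes also differs: a monotonic code supports relative ("higher/lower") comparisons directly from rate differences, while a labeled-line code supports absolute matching by identifying which tuned neurons are most active.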

Figure 1: Cross-modal representation of flutter frequency in pre-SMA. (A) Delayed flutter discrimination task. The monkey is required to compare the frequencies of two stimuli (first sample, then test) presented sequentially, separated by a delay period. In the cross-modal condition, vibrotactile (top) or auditory flutter sample frequencies (bottom) are compared to auditory or vibrotactile test frequencies, respectively. (B) Time course of a PFC neuron responding monotonically to vibrotactile flutter frequencies during the sample and delay periods. Colors correspond to frequencies. (Permission has been obtained from the copyright holder for the reproduction of this image from Romo and Salinas, 2003.) (C) Monotonically increasing (neuron #1) and decreasing (neuron #2) response functions during the memory delay of two pre-SMA example neurons to both vibrotactile (blue) and auditory (red) flutter frequencies. (From Vergara et al., 2016.)

Vergara et al. (2016) examined how frontal lobe neurons maintain the frequency of tactile and acoustic stimuli in working memory using a delayed discrimination task (Figure 1A). Monkeys were trained to compare stimulus frequencies across the tactile and auditory modalities. In the standard task design (Romo and Salinas, 2003), a sample stimulus vibrating at frequencies between 8 and 32 Hz was presented to the monkey's fingertip. The monkey had to remember the sample frequency during the following delay and judge whether a second, test stimulus was higher or lower in frequency than the first. In another fraction of trials, the monkey performed this task with acoustic-flutter sample stimuli: sound pulses separated by silence were played, and the monkey had to judge the frequency of these acoustic flutter stimuli (Lemus et al., 2010). For instance, five sound pulses per 500 ms corresponded to an acoustic flutter frequency of 10 Hz. In the crucial cross-modal trials, the monkeys had to compare the frequency of a vibrotactile stimulus with that of an acoustic flutter stimulus over a delay, and vice versa.
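The conversion from discrete acoustic pulses to a flutter frequency, and the required judgment in the task, amount to simple arithmetic. A minimal sketch (function names and values are illustrative, chosen to match the 8–32 Hz range and the 500 ms example above):

```python
def flutter_frequency_hz(n_pulses, window_ms=500):
    """Evenly spaced pulses within a time window, converted to a rate in Hz."""
    return n_pulses / (window_ms / 1000.0)

def correct_response(sample_hz, test_hz):
    """Required judgment in the discrimination task: is the test stimulus
    higher or lower in frequency than the remembered sample?"""
    return "higher" if test_hz > sample_hz else "lower"

# Five sound pulses in a 500 ms window correspond to 10 Hz acoustic flutter;
# comparing that remembered sample to a 20 Hz vibrotactile test yields "higher".
sample = flutter_frequency_hz(5)
judgment = correct_response(sample, 20.0)
```

Because frequency is defined the same way for tactile vibrations and pulse trains, the same comparison rule applies in the cross-modal trials regardless of which modality carries the sample and which carries the test.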
