We performed three series of experiments. In each experiment, we first trained animals to perform a specific cue-guided two-alternative forced-choice task and then examined mPFC neural activity during the task. All behavioral tasks were conducted in a training box (Fig. 1a). In Task 1 (details below), animals were required to make a choice based on whether the auditory stimulus, or the auditory component of a multisensory cue, was a lower-frequency (3 kHz) or higher-frequency (10 kHz) pure tone. Task 2 required animals to discriminate along two criteria: the cue modality and, if multisensory, the frequency of the auditory component. In Task 3, animals were not required to discern stimuli at all and could make a free choice. These tasks allowed us to investigate how mPFC multisensory perceptual decision strategies changed with the demands of the task.
The effect of an uninformative visual cue on mPFC neurons' multisensory perceptual decision
A total of 9 rats were trained to perform Task 1 (Fig. 1a). A trial was initiated when a rat poked its nose into the central port of a line of three ports on one wall of the training box (Fig. 1a). After a waiting period of 500-700 ms, a cue, randomly chosen from a group of 4 cues (3 kHz pure tone, A3k; 10 kHz pure tone, A10k; 3 kHz pure tone + flash of light, VA3k; 10 kHz pure tone + flash of light, VA10k), was presented in front of the central port. Based on the auditory cue, the rat was required to choose a port (left or right) to obtain a water reward within 3 s. If the stimulus was A10k or VA10k, the rat had to move to the left port to harvest the water reward (Fig. 1a); any other cue indicated that the animal should move to the right port for a reward. Rats readily learned this cue-guided two-alternative-choice task. Once an animal performed the task correctly >75% of the time in five consecutive sessions, it was deemed well trained and could then undergo implantation and later electrophysiological recording.
Once animals were well trained, average behavioral performance stabilized at 84±2.9% (Fig. 1b). There was no difference in behavioral performance between auditory and multisensory cued trials (Fig. 1c). Nevertheless, the presence of the visual cue sped up cue discrimination. The reaction time, defined as the interval between cue onset and the moment the animal withdrew its nose from the infrared beam monitoring point in the central port (Fig. 1a), was compared between auditory and multisensory trials (Fig. 1d&e). Rats responded more quickly in multisensory trials, with a mean reaction time of 224±14 ms across animals, significantly shorter than the 256±17 ms in auditory trials (t(8)=-15.947, p<0.00001, paired t-test).
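The reaction-time comparison above reduces to a paired t-test over per-animal mean reaction times. A minimal sketch in Python (the numbers below are illustrative placeholders, not the actual data):

```python
import numpy as np
from scipy import stats

# Hypothetical per-animal mean reaction times (ms), one entry per rat (n = 9).
rt_auditory = np.array([250.0, 262.0, 248.0, 270.0, 255.0, 260.0, 245.0, 268.0, 246.0])
rt_multisensory = np.array([220.0, 228.0, 215.0, 240.0, 224.0, 230.0, 212.0, 235.0, 214.0])

# Paired t-test across animals, as in the text (multisensory vs. auditory);
# a negative t indicates shorter reaction times in multisensory trials.
t_stat, p_val = stats.ttest_rel(rt_multisensory, rt_auditory)
print(f"t({len(rt_auditory) - 1}) = {t_stat:.3f}, p = {p_val:.2e}")
```

With each animal contributing one paired observation, the degrees of freedom are n-1 = 8, matching the t(8) reported above.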
We used tetrode recordings to characterize the task-related activity of individual neurons in the left mPFC while well-trained rats performed Task 1 (Fig. 2a). On average, animals performed 266±53 trials in a daily session. A total of 654 neurons were recorded (65±14 neurons per animal), and their responses were examined. 313 of them showed cue-categorization signals within 500 ms after cue onset (firing rate in three consecutive bins >= spontaneous firing rate, Mann-Whitney Rank Sum Test, p<0.05), and all further analysis focused on these neurons. In the examples shown in Fig. 2b-d, cue-categorization signals discriminated well between auditory pure tones (low vs. high) and between sensory modalities (multisensory vs. auditory). For instance, as shown in Fig. 2b, the response in A3k trials was higher than in A10k trials, and the firing rate in VA3k trials was higher than in A3k trials. Nearly 34% (107/313) of the neurons examined showed both cue-categorization signals and behavioral choice signals (coding movement direction). These two signals could be easily separated because behavioral choice signals occurred much later than cue-categorization signals (typically later than 600 ms after cue onset) (Fig. 3a&b). Unlike cue-categorization signals, behavioral choice signals usually showed no difference between multisensory and auditory trials (Fig. 3a&b).
We used ROC analysis to generate an index of auditory choice preference that measures how strongly a neuron's cue-categorization signal in A3k trials diverged from that in A10k trials. An index of multisensory choice preference was defined in the same way. As shown in the exemplar cases (Fig. 2b&c and Fig. 3a&b), nearly half of the neurons examined (55%, 171/313; preferring A3k: N = 111; preferring A10k: N = 60) exhibited an auditory choice preference (permutation test, p<0.05). However, more neurons (71%, 222/313) showed a multisensory choice preference (Fig. 4a); notably, a sizeable minority of neurons (23%, 72/313) showed a perceptual choice preference only between the two multisensory conditions (see the example in Fig. 2d). This result indicated that the visual cue, albeit uninformative, was able to facilitate mPFC neurons' auditory choice capability. Auditory and multisensory choice preferences were fairly consistent: if a neuron preferred A10k, it usually preferred VA10k (Fig. 4a).
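One common way to build such an ROC-based preference index is to compute the area under the ROC curve from trial-wise firing rates and rescale it to [-1, 1]; the sketch below follows that convention, which may differ in detail from the paper's exact definition.

```python
import numpy as np

def choice_preference(rates_a, rates_b):
    """ROC-based preference index between two cue conditions (e.g. A3k
    vs. A10k). Computes the probability that a randomly drawn response
    from condition A exceeds one from condition B (the ROC area, with
    ties counted as 0.5), rescaled to [-1, 1]: positive values prefer A,
    negative values prefer B. (A common construction; the paper's exact
    formula is an assumption.)"""
    rates_a = np.asarray(rates_a, float)
    rates_b = np.asarray(rates_b, float)
    diff = rates_a[:, None] - rates_b[None, :]       # all trial pairs
    auc = np.mean(diff > 0) + 0.5 * np.mean(diff == 0)
    return 2.0 * auc - 1.0

print(choice_preference([5, 6, 7], [1, 2, 3]))  # complete preference for A: 1.0
print(choice_preference([1, 2], [1, 2]))        # no preference: 0.0
```

Significance of a nonzero index would then be assessed by a permutation test, as stated in the text.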
We next examined the influence of the visual cue on auditory choice signals. We found that in 49% (155/313) of cases, the simultaneous presentation of a visual stimulus significantly modulated the response in one or both auditory conditions (permutation test, p<0.05, Fig. 4b). Cross-modal enhancement was the dominant processing strategy here: for most neurons (87%, 135/155), the response in VA3k and/or VA10k trials was significantly higher than that in the corresponding auditory trials (A3k: 79%, 76/96; A10k: 88%, 61/69). Consequently, across the population (n=313), the mean response in multisensory trials was slightly larger than that in corresponding auditory trials regardless of the frequency of the auditory component (Fig. 4c-d).
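A generic version of the permutation test used throughout these comparisons (shuffling condition labels over trials and recomputing the difference of means) can be sketched as follows; the shuffling scheme and permutation count are assumptions, not the authors' exact procedure.

```python
import numpy as np

def permutation_pvalue(x, y, n_perm=2000, seed=0):
    """Two-sided permutation test on the difference of mean firing rates
    between two trial sets (e.g. VA3k vs. A3k trials). Labels are
    shuffled across the pooled trials; the p-value is the fraction of
    shuffles whose absolute mean difference is at least as large as the
    observed one (with the +1 correction so p is never exactly zero)."""
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x, float), np.asarray(y, float)
    observed = abs(x.mean() - y.mean())
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if abs(pooled[:len(x)].mean() - pooled[len(x):].mean()) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)
```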
When we further investigated the neurons whose auditory choice signals were facilitated by a visual cue, we were surprised to find that in nearly all cases (98%, 133/135), the addition of a visual stimulus facilitated the response in only one auditory condition (p<0.05, permutation test, Fig. 5a&b). We used a modulation index (MI) to quantify the effect of cross-modal interaction. In the 76 neurons showing cross-modal enhancement in VA3k trials, the mean MI in the VA3k condition was 0.18±0.09, but the mean MI in the VA10k condition was near zero (-0.05±0.14, p<0.0001, Wilcoxon Signed Rank Test, Fig. 5a). The same held for neurons showing cross-modal enhancement in VA10k trials (n=61, mean MI: 0.20±0.11 in the VA10k condition vs. -0.03±0.12 in the VA3k condition, p<0.0001, Wilcoxon Signed Rank Test, Fig. 5b). Furthermore, we found that the visual cue usually enhanced only the preferred auditory choice signal (see examples in Fig. 2b&c), regardless of whether the preferred cue was A3k (51/54) or A10k (27/33). Such biased enhancement further strengthened the neurons' choice selectivity (Fig. 5c&d).
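A modulation index of this kind is often computed as a contrast ratio between the multisensory and corresponding auditory responses; the sketch below uses that contrast-style definition, which is an assumption since the paper's exact MI formula is not restated here.

```python
def modulation_index(multi_rate, audio_rate):
    """Contrast-style modulation index, (VA - A) / (VA + A): positive for
    cross-modal enhancement, negative for suppression, zero when the
    visual cue has no effect. (One common definition, assumed here.)

    multi_rate: mean firing rate in multisensory (VA) trials.
    audio_rate: mean firing rate in the corresponding auditory (A) trials.
    """
    return (multi_rate - audio_rate) / (multi_rate + audio_rate)

print(modulation_index(12.0, 8.0))  # enhancement: 0.2
print(modulation_index(8.0, 12.0))  # suppression: -0.2
```

Under this definition the index is bounded in (-1, 1) for positive rates, so the near-zero mean MIs in the non-enhanced condition indicate essentially unchanged responses.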
The influence of information congruence/incongruence between visual and auditory cues on mPFC neurons' cross-modal interaction
In behavioral Task 1, animals made their behavioral choice based on the auditory cue alone. We then wondered how mPFC neurons would change their integrative strategy if the behavioral choice depended on both auditory and visual cues. To examine this, we trained 7 rats to perform a new behavioral task (Task 2). The only difference from Task 1 was that an individual visual stimulus (V) was introduced into the stimulus pool as an informative cue. If the presented stimulus was A10k, VA10k, or V, animals had to go to the left port to obtain the reward (Fig. 6a); otherwise, they had to move to the right port to be rewarded. Animals needed about two months of training on this task to surpass 75% correct performance for five consecutive sessions. Although there was no difference in behavioral performance between the two auditory-alone conditions (A3k vs. A10k: 83.7% vs. 85.7%, t(6)=0.888, p=0.41, paired t-test, Fig. 6b), performance differed between the two multisensory conditions: it was higher when the two cues carried congruent information and lower when they indicated mismatched directions (VA3k vs. VA10k: 77.1% vs. 91.1%, t(6)=5.214, p=0.002, paired t-test, Fig. 6b). The mean reaction time in multisensory trials across animals was still significantly shorter than that in the corresponding auditory trials, regardless of whether the auditory cue was A3k or A10k (A10k vs. VA10k: 263±92 ms vs. 232±79 ms, t(6)=4.585, p=0.004, paired t-test; A3k vs. VA3k: 256±73 ms vs. 234±75 ms, t(6)=3.614, p=0.01, paired t-test; Fig. 6c). There was no difference in reaction times between the two multisensory conditions (t(6)=0.0512, p=0.961, paired t-test).
We examined the responses of 456 mPFC neurons recorded during the performance of Task 2. 54% (247/456) of these neurons showed cue-categorization signals (see examples in Fig. 6d-f). The results showed that the introduction of an informative visual stimulus into the cue pool significantly affected mPFC neurons' CMI strategy (Fig. 7a&b), in a manner dependent on information content. Compared with Task 1, a far lower proportion of neurons showed cross-modal enhancement in VA3k trials (10%, 24/247 in Task 2 vs. 24%, 76/313 in Task 1; X2 = 19.96, p<0.00001; Fig. 7b), indicating that information mismatch disrupted cross-modal enhancement. However, the proportion in information-congruent VA10k trials was similar to that observed in Task 1 (22%, 55/247 in Task 2 vs. 19%, 61/313 in Task 1; X2 = 0.65, p=0.42). As shown in Fig. 6d-f, in each case only the response in the VA10k condition was significantly enhanced. Mean responses across the populations tested (n=247) are shown in Fig. 7c&d. Of the neurons (n=55) showing cross-modal enhancement in the information-congruent VA10k trials, 20 (36%) favored A10k (see the example in Fig. 6d) and 28 (51%) showed no overt auditory choice preference (see the example in Fig. 6e). In several cases, like the neuron shown in Fig. 6f, the visual stimulus appeared to reverse selectivity: in auditory trials the neuron preferred A3k, but in multisensory trials it favored VA10k.
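The across-task comparison of proportions is a chi-square test on a 2x2 contingency table. A minimal sketch using the counts reported above (without Yates' continuity correction, which matches the X2 value in the text):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Contingency table: neurons with vs. without cross-modal enhancement
# in VA3k trials, for Task 2 (24/247) and Task 1 (76/313).
table = np.array([[24, 247 - 24],
                  [76, 313 - 76]])
chi2, p, dof, _ = chi2_contingency(table, correction=False)
print(f"X2({dof}) = {chi2:.2f}, p = {p:.2e}")
```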
The mean MI in the information-incongruent VA3k condition across the population (n=247) was nearly zero (0.01±0.16), significantly lower than the 0.07±0.15 in the congruent VA10k condition (p<0.00001, Mann-Whitney Rank Sum Test; see the comparison of an individual case in Fig. 7e). One might also expect an information match to induce a stronger cross-modal enhancement effect; this was not the case, however. Comparing all neurons exhibiting cross-modal enhancement in the VA10k condition in Task 1 and Task 2, we found no difference between them (mean MI: 0.21±0.18 in Task 2 vs. 0.20±0.11 in Task 1, p=0.489, Mann-Whitney Rank Sum Test). In summary, these results indicate that the activities of mPFC neurons reflected the context of the task while maintaining their discriminative ability, ostensibly aiding successful task completion.
Cross-modal interaction in a choice-free task
Tasks 1&2 required animals to discriminate sensory cues. The next intriguing question was how mPFC neurons would treat different combinations of sensory cues, and CMI, when cue discrimination was not required. To investigate this, we trained another group of rats (n=9) to perform a choice-free task (Task 3). In this task, animals received a water reward at either the left or the right port regardless of which stimulus was presented, rendering cue discrimination irrelevant. We carefully examined 184 mPFC neurons recorded during the performance of Task 3. For consistency with the earlier analyses, a neuron's response in A3k_right_choice trials was compared with its response in A10k_left_choice trials, and the multisensory comparison was made in the same way. Unlike neurons recorded in Tasks 1&2, the majority of mPFC neurons examined in Task 3 failed to show an auditory choice preference (74%, 137/184) and, correspondingly, a multisensory choice preference (73%, 135/184). Fig. 8a shows such an example. Population distributions of choice selectivity are illustrated in Fig. 8c. This was also the case when comparing responses between conditions with the same movement direction (right direction: no auditory choice selectivity, 75%, 138/184; no multisensory choice selectivity, 72%, 132/184; left direction: no auditory choice selectivity, 77%, 142/184; no multisensory choice selectivity, 71%, 130/184).
For the majority of neurons (72%, 132/184), the response in multisensory trials was very similar to the corresponding response in auditory or visual trials (p>0.05, permutation test; see the example in Fig. 8a&b and the populations in Fig. 8d). Among the neurons whose responses in auditory trials were influenced by the visual stimulus (28%, 52/184), inhibitory and facilitatory effects occurred about equally often (facilitated: 24; inhibited: 23; facilitated & inhibited: 5; see Fig. 8d).
We carefully examined the neurons exhibiting auditory choice selectivity (n=49) to see whether the visual cue could specifically facilitate the preferred response, as we had observed in the cue-discrimination tasks. In the vast majority of cases (44/49), however, it did not. Fig. 8b shows such an example, in which the neuron favored A3k over A10k but neither response was heightened by the visual stimulus. The mean MIs for the two conditions were similar (preferred vs. non-preferred: -0.03±0.27 vs. -0.04±0.27; p=0.441, Mann-Whitney Rank Sum Test). Taken together with the results above, this reveals that the differential neural activities in mPFC likely reflect the context of the given task: when stimulus discrimination is not required, neuronal activity exhibits no selectivity, whereas when demanded by an appropriate task, mPFC neurons are quite capable of sensory discrimination.