*Connectivity and Functional Specialization in the Brain*

connections [1–3] (macaque; ferret) ([4, 5], for reviews). Hence, it is of primary interest to unravel how heteromodal inputs interplaying with the dominant modality in primary sensory areas may contribute to improving perception. In addition, a major issue is to determine how crossmodal plasticity subserves functional compensation and behavioral recovery after the loss or impairment of a sensory organ, or following cortical damage. This review chapter has a double focus: firstly, on the interplay between primary sensory cortices in the normal condition, and secondly, on the crossmodal plasticity operating in primary and higher-order cortical areas following sensory loss.

**2. Subcortical and intracortical connectivity between primary sensory areas**

There is anatomical evidence that crossmodal inputs to primary cortical areas can be conveyed through thalamo-cortical or cortico-cortical projection fibers. There is, however, only scarce anatomical evidence for heteromodal convergence from auditory, visual and somatosensory thalamic nuclei onto A1, V1 and S1 [6] (gerbil). By contrast, the cortico-cortical connections underpinning plurimodal interplay between these cortical areas are well documented. Tract-tracing methods have revealed the existence of visual-somatosensory projections from V2 to areas 1/3b in S1, and somatosensory projections from S2 to A1 [2] (marmoset). Direct cortico-cortical connections between A1, V1 and S1 have been identified [1, 7] (macaque; cat). It has been shown that V1 projects mainly to S1, but receives a moderate amount of projections from A1 and S1, while A1 sends more projections to V1 than to S1, but receives sparse projections from these two areas [8, 9] (mouse). These findings indicate that the connectivity network between A1, S1 and V1 is asymmetric. Overall, both thalamo-cortical and cortico-cortical connections may contribute to the occurrence of the short-latency responses to heteromodal inputs reported in these areas [10–17] (monkey; human).

In the model of hierarchical organization of cortical connectivity, it is generally assumed that feedforward connections convey sensory information to higher-order areas, whereas feedback connections modulate neural activity in lower-level cortical areas [18, 19] (macaque; cat). This model is somewhat challenged by retrograde tracing studies investigating the microcircuitry of reciprocal connections between primary cortical areas. These studies have shown that A1 and S1 project in a feedback manner to V1, while V1-to-A1 projections are of the feedforward type and V1-to-S1 projections are mostly lateral [8, 9] (mouse). In addition, A1 and S1 are linked by reciprocal feedback projections [6] (gerbil). Hence, the available evidence suggests that the primary sensory cortices do not all occupy the same level in the neural network. Furthermore, based on the labeling of reciprocal connections between V1 and S1 and the characterization of the size and laminar density of axonal swellings, it was concluded that S1 receives a stronger driver input from V1 and that S1 inputs to V1 have a predominantly modulatory influence [9] (mouse). Regarding the projections from the auditory cortex to V1, both types of input have been identified with, however, a clear dominance of small-caliber axons bearing modulator boutons [8] (mouse).

Somatosensory-auditory interactions have been found at a low level of multisensory integration. Cutaneous responses were recorded in the caudo-medial auditory cortex, with a feedforward laminar activation profile: the initial excitatory response was located in layer 4 and was followed by responses in the extragranular laminae (layers 2, 3, 5 and 6), in contrast with feedback and lateral activation profiles, which begin in the extragranular laminae [20] (macaque).

**3. Neurophysiological mechanisms of multimodal integration in primary sensory areas**

Concurrent stimuli of sensory organs coactivate primary cortical areas and generate reciprocal influences contributing to the process of multimodal integration. It is noticeable that the bulk of studies on multisensory integration in early cortical areas have focused on the interplay between the visual and auditory cortices.

#### **3.1 Visual-auditory interactions**

Activation of A1 neurons by noise bursts was found to induce GABAergic inhibition of supragranular pyramidal cells in V1, via cortico-cortical connections, leading to reduced synaptic and spike activity upon bimodal stimulation [10] (mouse). Furthermore, this acoustic stimulation decreased behavioral responses to a dim flash, likely through GABAergic inhibition in V1, as the effect was prevented by acute blockade of GABAA and GABAB receptors. The authors concluded that salient auditory stimuli degrade potentially distracting sensory processing in the visual cortex. This finding was corroborated by an *in vitro* electrophysiological study showing that layer 1 and layer 2/3 inhibitory neurons in V1 receive direct excitatory inputs from A1 [22] (mouse). Along the same lines, intrinsic signal imaging, used to record visuotopic maps in V1 and tonotopic maps in A1 simultaneously, showed that strong activation of A1 suppresses visually evoked responses in V1 [5] (mouse). As a result, under bimodal stimulation the global effect of auditory inputs to V1 was a weaker neuronal firing averaged across all visual orientations. Nevertheless, the orientation selectivity of V1 excitatory neurons in layer 2/3 was found to be sharpened by concurrent sound signals or by optogenetic activation of A1-to-V1 projections [22] (mouse). Indeed, auditory signals increased the neuronal responses at the preferred visual orientation and decreased responses at the orthogonal orientation, with a stronger impact at lower visual contrast. Tracing data showed that axons from layer 5 of A1 mainly terminated in the superficial layers of V1 and activated layer 1 inhibitory neurons. The sharpening effect was very likely mediated by a combination of inhibitory and disinhibitory circuits: layer 1 neurons in V1, excited by sound, presumably suppressed layer 2/3 pyramidal cell responses, but also inhibited other inhibitory neurons in layer 2/3, thereby globally contributing to an increased firing rate of the pyramidal cells at their preferred orientation.

A two-photon calcium imaging study showed that when visual and auditory stimulus features are temporally congruent, neurons in V1 exhibit a balanced pattern of response enhancement and suppression compared with unimodal visual stimuli. Temporally incongruent tones or white-noise bursts in paired audiovisual stimuli mainly produce suppressive responses across the neuronal population, in particular when the visual stimulus contrast is high [23] (mouse).

Neuronal mechanisms of visual–auditory integration appear to depend upon the behavioral context. A study investigating the modulation of V1 neurons by auditory stimuli showed no difference in the latency or strength of visual responses in monkeys trained to maintain passive central fixation while a visual–auditory stimulus was presented in the periphery [24] (monkey). By contrast, a significant reduction in latency was observed when the animal was required to orient its gaze toward the visual–auditory stimulus. This finding suggests that projections from the auditory cortex to V1 contribute to reducing the response time of head orientation during a foveation movement toward a peripheral sound source.
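The push-pull effect of sound on V1 orientation tuning (enhancement at the preferred orientation, suppression at the orthogonal one) can be sketched with a toy gain model. This is a schematic illustration only: the tuning curve, the cosine gain function and all parameter values are arbitrary assumptions, not measurements from the studies cited above.

```python
import math

def visual_response(theta_deg, pref_deg=0.0, r_max=10.0, baseline=2.0, sigma=30.0):
    """Toy visual-only tuning curve: Gaussian of orientation difference."""
    d = (theta_deg - pref_deg + 90.0) % 180.0 - 90.0  # wrap difference to [-90, 90)
    return baseline + r_max * math.exp(-(d / sigma) ** 2 / 2.0)

def audiovisual_response(theta_deg, pref_deg=0.0, mod_depth=0.3):
    """Sound-driven push-pull gain: boost at the preferred orientation,
    suppression at the orthogonal one (assumed cosine modulation)."""
    gain = 1.0 + mod_depth * math.cos(2.0 * math.radians(theta_deg - pref_deg))
    return visual_response(theta_deg, pref_deg) * gain

def osi(resp):
    """Orientation selectivity index from preferred (0 deg) and orthogonal (90 deg) responses."""
    r_pref, r_orth = resp(0.0), resp(90.0)
    return (r_pref - r_orth) / (r_pref + r_orth)

osi_v = osi(visual_response)
osi_av = osi(audiovisual_response)
```

With these assumed parameters the audiovisual selectivity index exceeds the visual-only one, mimicking the reported sharpening: the mean firing can drop even while the preferred/orthogonal contrast grows.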

There is also convincing evidence that vision impacts neuronal coding in the primary auditory cortex. Neurons in A1 that are sensitive to both visual and auditory stimulation convey more information about stimuli in their spike trains than neurons sensitive to auditory or visual stimuli presented alone [3] (ferret). An intriguing study revealed that adding congruent visual signals to auditory ones enhanced weak auditory responses, had no effect on intermediate responses and suppressed strong responses [25] (macaque). In this study, measurement of the amount of information contained in visual and auditory responses showed that bimodal stimuli yielded more information, carried by both firing rates and spike timing, than unimodal ones, but that the suppressed responses carried more information than the enhanced ones. This information gain was due to a reduced variability of the suppressed responses, whereas the variability of the enhanced responses was increased. The authors proposed that enhanced, but less reliable, responses may be involved in detecting rare or faint sensory events, while suppressed, more reliable, responses may be used to represent detailed characteristics of the sensory environment. Interestingly, a recent study suggests that in layers 5 and 6 of the auditory cortex, a primary locus of visual–auditory convergence, visual signals convey the presence and timing of a salient stimulus rather than its specific features, i.e., responses in the auditory cortex, unlike those in the visual cortex, are not tuned to the orientation of visual gratings [26] (mouse).

#### **3.2 Somatosensory-auditory interactions**

Besides the notion that crossmodal interactions are reflected by changes in firing rates, the synchronization of neural signals has been proposed as a key mechanism for multisensory integration in distributed networks [27]. In this regard, it is relevant to mention a study exploring the influence of somatosensory inputs on the activity of A1 neurons using laminar current source density and multiunit recordings. The findings show that somatosensory inputs elicited by median nerve stimulation amplify the neuronal responses evoked by auditory inputs during a high-excitability phase of ongoing local neuronal oscillations and suppress those occurring during a low-excitability phase in the supragranular layers [28] (macaque). Further analysis indicated that this effect was mainly due to a somatosensory-induced phase resetting of auditory oscillations to an optimal excitability phase enhancing the ensemble response of temporally coherent auditory inputs.
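The oscillatory mechanism described above can be made concrete with a minimal model in which the gain of the local circuit varies with the phase of an ongoing oscillation, and a somatosensory event resets that phase to its high-excitability point. The oscillation frequency, the cosine gain function and the input timing below are illustrative modeling assumptions, not parameters taken from [28].

```python
import math

OSC_FREQ_HZ = 8.0  # assumed frequency of the ongoing local oscillation

def excitability(t, phase_offset):
    """Momentary gain of the circuit: maximal at the oscillation peak, minimal at the trough."""
    phase = 2.0 * math.pi * OSC_FREQ_HZ * t + phase_offset
    return 1.0 + math.cos(phase)

def ensemble_response(input_times, phase_offset):
    """Summed response to a train of auditory inputs, each weighted by the momentary excitability."""
    return sum(excitability(t, phase_offset) for t in input_times)

# Temporally coherent auditory inputs, one per oscillation cycle:
period = 1.0 / OSC_FREQ_HZ
inputs = [k * period for k in range(5)]

random_phase = 2.5  # arbitrary ongoing phase when no somatosensory event occurs
reset_phase = 0.0   # somatosensory input assumed to reset the phase to the excitability peak

r_no_reset = ensemble_response(inputs, random_phase)
r_reset = ensemble_response(inputs, reset_phase)
```

Resetting the phase aligns the coherent auditory inputs with the excitability peaks, so the summed ensemble response is maximal; without the reset, the same inputs arrive at an arbitrary, generally less favorable, phase.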

Neurons in the posterior region of A1 display cutaneous receptive fields specifically located on the head and neck, which are in spatial register with the auditory receptive fields [29] (macaque). This result supports the view that the posterior auditory cortex may be a site for spatial-movement processing, analogous to the "where pathway" of the parietal stream in the visual system [30, 31] (macaque). An fMRI study documented a supra-additive integration of touch and auditory stimulation in a cortical region posterior and lateral to A1 [32] (macaque). This integration process was more prominent for temporally coincident bimodal stimuli and for less effective stimuli, in conformity with the principle of "inverse effectiveness".
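Inverse effectiveness is commonly quantified with the multisensory enhancement index of Meredith and Stein, which expresses the bimodal response as a percentage change relative to the best unimodal response. The firing-rate values below are invented purely for illustration: the same absolute bimodal gain yields a much larger proportional enhancement for a weak stimulus than for a strong one.

```python
def enhancement_index(bimodal, best_unimodal):
    """Multisensory enhancement index (percent): 100 * (CM - SM_max) / SM_max."""
    return 100.0 * (bimodal - best_unimodal) / best_unimodal

# Hypothetical responses (spikes/s), with a fixed absolute bimodal gain of 3 spikes/s:
weak_uni, strong_uni = 2.0, 30.0
weak_bi, strong_bi = weak_uni + 3.0, strong_uni + 3.0

mei_weak = enhancement_index(weak_bi, weak_uni)        # 150% enhancement
mei_strong = enhancement_index(strong_bi, strong_uni)  # 10% enhancement
```

The weaker the unimodal response, the larger the relative multisensory gain, which is the quantitative signature of inverse effectiveness referred to above.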

#### **3.3 Visual-somatosensory interactions**

It is worth reporting an fMRI study aiming to compare cortical activation in response to matching versus non-matching visual–haptic texture information in a task that did not require cognitive evaluation of roughness [33] (human). The results show an increased BOLD response in V1 when a dot pattern was presented in both visual and haptic conditions, all the more so when visual information matched haptic texture information. In addition, parametric BOLD signal variations with varying texture characteristics were recorded in both the primary visual and somatosensory cortices. This study confirms that haptic information can modulate visual information processing at an early stage. A hierarchical feedback of top-down influences from higher sensory areas onto early sensory cortices could account for the observed BOLD effects. However, according to the authors this is unlikely, as only matching visual–haptic texture information induced a parametric modulation of the BOLD response in the contralateral somatosensory cortex. An alternative, more plausible, interpretation of the crossmodal texture effects would be direct or indirect cortico-cortical connections between primary areas. This explanation is compatible with haptic texture information flowing from S1 to V1 [34] (human).

Interestingly, a shrinkage of cutaneous receptive fields in areas 3b and 1 has been recorded when tactile and visual stimulations were concomitant, both during physical touch perception and during touch observation [35] (human). This sharpening of coactivated receptive fields, which reflects a suppressive interaction between tactile and visual cues, presumably occurring through a GABAergic modulation of intracortical inhibition in S1, is expected to improve tactile acuity (for reviews, see [36, 37]). The functional relevance of this finding was highlighted by a study reporting that viewing the hand increased the suppression of the P50 evoked potential elicited by simultaneous electrical stimulation of adjacent fingers and enhanced tactile acuity in a grating orientation discrimination task [38] (human). Furthermore, a recent ultra-high-resolution fMRI study provided evidence for a spatially specific visual convergence onto S1: neurons within the somatotopically organized cutaneous representation of the fingers in areas 3b and 1 were activated when the subject observed their own fingers being touched, or the fingers of another person receiving similar tactile stimulation [39]. The visually driven map was topographically and temporally precise and was found to be in register with the cutaneous map. Further investigations of the neuron characteristics within areas 3b and 1 are required to determine whether or not these areas may contribute to distinguishing perceived touch from observed touch.

#### **3.4 Vestibular-somatosensory interactions**

The vestibular cortex differs from other sensory cortices in that vestibular signals are distributed over an extensive network of cortical regions [40–42]. A whole-brain electrophysiological investigation using galvanic vestibular stimulation and fMRI mapping described the cortical projections of vestibular inputs to functionally diverse cortical regions that included S1 [43] (rat). In addition, a more recent investigation revealed that optogenetic stimulation of medial vestibular nucleus neurons elicited bilateral fMRI activations in the sensorimotor cortices and their thalamic nuclei [44] (rat). Nevertheless, which region of S1 receives vestibular inputs and how the bimodal interplay occurs has not yet been investigated. In a recent study in rat, we reasoned that the vestibulo-somatosensory convergence in S1 could occur in the cortical zones of the paw representations that would be

*Interplay between Primary Cortical Areas and Crossmodal Plasticity*
*DOI: http://dx.doi.org/10.5772/intechopen.95450*
