fixation, while a visual–auditory stimulus was presented in the periphery [24] (monkey). By contrast, a significant reduction in latency was observed when the animal was required to orient its gaze toward the visual–auditory stimulus. This finding suggests that projections from the auditory cortex to V1 contribute to reducing the response time of head orientation during a foveation movement toward a peripheral sound source.

There is also convincing evidence that vision impacts neuronal coding in the primary auditory cortex. Neurons in A1 sensitive to visual stimulation convey more information about stimuli in their spike trains than neurons sensitive to either auditory or visual stimuli presented alone [3] (ferret). An intriguing study revealed that adding congruent visual signals to auditory ones enhanced weak auditory responses, had no effect on intermediate responses and suppressed strong responses [25] (macaque). In this study, measurement of the information contained in visual and auditory responses showed that bimodal stimuli yielded more information in both firing rates and spike timing than unimodal ones, but that the suppressed responses carried more information than the enhanced ones. This information gain was due to a reduced variability of the suppressed responses, whereas the variability of the enhanced responses was increased. The authors proposed that enhanced but less reliable responses may be involved in detecting rare or faint sensory events, while suppressed, more reliable responses may be used to represent detailed characteristics of the sensory environment. Interestingly, a recent study suggests that in layers 5 and 6 of the auditory cortex, a primary locus of visual–auditory convergence, visual signals convey the presence and timing of a salient stimulus rather than specifics about that stimulus, i.e. unlike visual cortex responses, auditory cortex responses are not orientation-tuned to visual gratings [26] (mouse).

**3.2 Somatosensory-auditory interactions**

Besides the notion that crossmodal interactions are reflected by changes in firing rates, the synchronization of neural signals has been proposed as a key mechanism for multisensory integration in distributed networks [27]. In this regard, it is relevant to mention a study exploring the influence of somatosensory inputs on the activity of A1 neurons using laminar current source density and multiunit recordings. The findings show that, in the supragranular layers, somatosensory inputs elicited by median nerve stimulation amplify the neuronal responses evoked by auditory inputs arriving during a high-excitability phase of ongoing local neuronal oscillations, and suppress those arriving during a low-excitability phase [28] (macaque). Further analysis indicated that this effect was mainly due to a somatosensory-induced phase resetting of auditory oscillations to an optimal excitability phase, enhancing the ensemble response to temporally coherent auditory inputs.
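The phase-resetting mechanism can be sketched with a toy model (not the model of [28]; the oscillation frequency, gain function and all timings below are invented for illustration). Ongoing oscillatory phase gates response gain, and a somatosensory input restarts the oscillation at the high-excitability phase, so that a subsequent auditory input arrives at the optimal phase:

```python
# Toy phase-resetting sketch: oscillatory phase gates the gain of an
# auditory response; a somatosensory input resets the phase so that a
# later auditory input lands on the high-excitability phase.
# All parameters are invented for clarity (not from [28]).
import numpy as np

F_OSC = 8.0          # assumed ongoing oscillation frequency (Hz)
DT = 0.001           # time step (s)
T = np.arange(0.0, 1.0, DT)

def response_gain(phase):
    """Excitability as a function of oscillatory phase:
    maximal at phase 0, minimal at phase pi."""
    return 0.5 * (1.0 + np.cos(phase))

def run_trial(reset_time, auditory_time, init_phase=np.pi):
    """Gain seen by an auditory input at `auditory_time`, with an
    optional somatosensory phase reset at `reset_time`."""
    phase = init_phase + 2.0 * np.pi * F_OSC * T
    if reset_time is not None:
        # Phase resetting: the oscillation restarts from the
        # high-excitability (zero) phase at the somatosensory input.
        mask = T >= reset_time
        phase[mask] = 2.0 * np.pi * F_OSC * (T[mask] - reset_time)
    return response_gain(phase[int(auditory_time / DT)])

# An auditory input one full cycle (0.125 s) after the reset arrives at
# the high-excitability phase; without the reset, its gain depends on
# the arbitrary initial phase of the ongoing oscillation.
gain_no_reset = run_trial(reset_time=None, auditory_time=0.5)
gain_reset = run_trial(reset_time=0.375, auditory_time=0.5)
print(gain_no_reset, gain_reset)
```

With these numbers the unreset trial happens to hit the low-excitability phase, while the reset trial yields near-maximal gain, which is the essence of the amplification/suppression pattern described above.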

Neurons in the posterior region of A1 display cutaneous receptive fields specifically located on the head and neck, in spatial register with the auditory receptive fields [29] (macaque). This result supports the view that the posterior auditory cortex may be a site for spatial-movement processing, analogous to the "where pathway" of the parietal stream of the visual system [30, 31] (macaque). An fMRI study documented supra-additive integration of tactile and auditory stimulation in a cortical region posterior and lateral to A1 [32] (macaque). This integration process was more prominent for temporally coincident bimodal stimuli and for less effective stimuli, in conformity with the principle of "inverse effectiveness".

**3.3 Visual-somatosensory interactions**
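Inverse effectiveness can be illustrated with the standard multisensory enhancement index, computed here on invented response values (these numbers are not data from [32]): the proportional gain of a supra-additive bimodal response is larger when the unimodal responses are weak.

```python
# Toy arithmetic illustrating "inverse effectiveness" with the commonly
# used multisensory enhancement index. Response values are invented.
def enhancement(bimodal, best_unimodal):
    """Percent enhancement of the bimodal response over the most
    effective unimodal response."""
    return 100.0 * (bimodal - best_unimodal) / best_unimodal

# Weak stimuli: supra-additive combination (4 + 3 -> 14 spikes/s).
weak = enhancement(bimodal=14.0, best_unimodal=4.0)
# Strong stimuli: roughly additive combination (40 + 30 -> 52 spikes/s).
strong = enhancement(bimodal=52.0, best_unimodal=40.0)
print(weak, strong)  # ~250 % enhancement vs ~30 %
```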

It is also worth reporting an fMRI study comparing cortical activation in response to matching versus non-matching visual–haptic texture information in a task that did not require cognitive evaluation of roughness [33] (human). The results show an increased BOLD response in V1 when a dot pattern was presented in both the visual and haptic conditions, and this increase was larger when the visual information matched the haptic texture. In addition, parametric BOLD signal variations with varying texture characteristics were recorded in both the primary visual and somatosensory cortices. This study confirms that haptic information can modulate visual information processing at an early stage. Hierarchical feedback of top-down influences from higher sensory areas onto early sensory cortices could account for the observed BOLD effects. However, the authors considered this unlikely, as only matching visual–haptic texture information induced a parametric modulation of the BOLD response in the contralateral somatosensory cortex. An alternative, more plausible, interpretation of these crossmodal texture effects is direct or indirect cortico-cortical connections between primary areas. This explanation is compatible with haptic texture information flowing from S1 to V1 [34] (human).
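The logic of a parametric-modulation analysis of this kind can be sketched with synthetic numbers (this is a generic least-squares illustration, not the authors' analysis pipeline): trial-wise BOLD amplitude is regressed on the texture parameter, and a reliable nonzero slope indicates parametric modulation.

```python
# Sketch of a parametric-modulation regression with invented values:
# BOLD amplitude per trial is fitted against texture roughness.
import numpy as np

roughness = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # parametric regressor

# Hypothetical matching visual-haptic trials: amplitude scales with roughness.
bold_matching = 0.8 * roughness + 2.0
# Hypothetical non-matching trials: no systematic relation to roughness.
bold_nonmatching = np.full(5, 2.0)

def parametric_slope(amplitudes, regressor):
    """Least-squares slope of amplitude against the regressor
    (design matrix = [regressor, intercept])."""
    X = np.column_stack([regressor, np.ones_like(regressor)])
    beta, *_ = np.linalg.lstsq(X, amplitudes, rcond=None)
    return beta[0]

print(parametric_slope(bold_matching, roughness))     # ~0.8
print(parametric_slope(bold_nonmatching, roughness))  # ~0.0
```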

Interestingly, a shrinkage of cutaneous receptive fields in areas 3b and 1 has been recorded when tactile and visual stimulations were concomitant, both during physical touch perception and during touch observation [35] (human). This sharpening of coactivated receptive fields, which reflects a suppressive interaction between tactile and visual cues presumably mediated by GABAergic modulation of intracortical inhibition in S1, is expected to improve tactile acuity (for reviews, see [36, 37]). The functional relevance of this finding was highlighted by a study reporting that viewing the hand increased the suppression of the P50 evoked potential elicited by simultaneous electrical stimulation of adjacent fingers, and enhanced tactile acuity in a grating orientation discrimination task [38] (human). Furthermore, a recent ultra-high-resolution fMRI study provided evidence for a spatially specific visual convergence onto S1. Neurons within the somatotopically organized cutaneous representation of the fingers in areas 3b and 1 were activated when the subject observed his own fingers being touched, or another person's fingers receiving similar tactile stimulation [39]. The visually driven map was topographically and temporally precise and in register with the cutaneous map. Further investigations of neuronal characteristics within areas 3b and 1 are required to determine whether these areas contribute to distinguishing perceived touch from observed touch.
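Why receptive-field sharpening should improve acuity can be made concrete with a simple signal-detection sketch (not a model from [35]; the Gaussian tuning, widths and noise level are invented): for a fixed separation between two skin sites, narrower receptive fields make the responses of neighbouring neurons more distinguishable.

```python
# Toy sketch: discriminability of two nearby touch locations by two
# Gaussian receptive fields scales inversely with RF width, so RF
# shrinkage should improve tactile acuity. Parameters are invented.
import numpy as np

def discriminability(separation_mm, rf_width_mm, noise_sd=0.05):
    """d'-like index: difference between the responses of an RF centred
    on the touched site and an RF centred `separation_mm` away,
    divided by the response noise."""
    r_near = 1.0  # Gaussian RF evaluated at its own centre
    r_far = np.exp(-separation_mm ** 2 / (2.0 * rf_width_mm ** 2))
    return (r_near - r_far) / noise_sd

broad = discriminability(separation_mm=2.0, rf_width_mm=4.0)
sharp = discriminability(separation_mm=2.0, rf_width_mm=1.5)
print(broad, sharp)  # the sharpened RF yields the larger index
```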
