by Shinsuke Shimojo and Ladan Shams
Historically, perception has been viewed as a modular function, with the different sensory modalities operating independently of each other. Recent behavioral and brain imaging studies challenge this view by suggesting that cross-modal interactions are the rule and not the exception in perception, and that cortical pathways previously thought to be sensory-specific are modulated by signals from other modalities.
Cross-modal integration is performed on a vast scale in the brain and contributes significantly to adaptive behavior in daily life. Very little is known about how this integration is achieved or about its underlying neural mechanisms, however, because the overwhelming majority of studies on perception have focused on a single sensory modality. Studying perception in one modality in isolation would be justifiable if the different modalities processed sensory inputs independently of each other, as separate ‘modules’. But are sensory modalities really separate modules? A variety of evidence counters this notion of modularity. In this review, we summarize the evidence for vigorous interaction among sensory modalities.
Plasticity across sensory modalities
Both animal and human studies suggest that sensory modalities in early stages of development are not as inherently distinct and independent as was once thought. For example, in a study of cross-modal plasticity, Sur et al. removed the superior colliculus of both the ferret and the hamster on the day of birth by direct ablation. They also deprived the medial geniculate nucleus or the ventrobasal nucleus of its normal sensory input by sectioning the major input pathways. The retina then invaded these thalamic nuclei, which under ordinary circumstances relay auditory and somatosensory signals to the auditory and somatosensory cortices, respectively. They found that visual responses (i.e. responses triggered by light stimulation of the retina) were elicited from neurons in the auditory or the somatosensory cortex.