Cerebral Plasticity: New Perspectives

Leo M. Chalupa, Nicoletta Berardi, Matteo Caleo, Lucia Galli-Resta, and Tommaso Pizzorusso

Print publication date: 2011

Print ISBN-13: 9780262015233

Published to MIT Press Scholarship Online: August 2013

DOI: 10.7551/mitpress/9780262015233.001.0001


The Developmental Process of Acquiring Multisensory Integration Capabilities


Chapter:
(p.179) 15 The Developmental Process of Acquiring Multisensory Integration Capabilities
Source:
Cerebral Plasticity
Author(s):

Barry E. Stein

Thomas Perrault Jr.

Terrence R. Stanford

Benjamin A. Rowland

Publisher:
The MIT Press
DOI:10.7551/mitpress/9780262015233.003.0015

Abstract and Keywords

This chapter uses a single model to explore multisensory integration: the multisensory neuron in the superior colliculus (SC). It examines the model structure’s general utility, normal operational principles, and organization before turning to the maturation of multisensory integration. It suggests that the neonatal brain must incorporate principles that guide the integration of inputs from different senses based on experience. The chapter supports the idea that the cortex is the portal through which early experience with cross-modal cues gains access to multisensory SC neurons, and it reveals that the SC multisensory circuit is far more flexible than previously appreciated.

Keywords:   superior colliculus, multisensory neuron, neonatal brain, cortex, multisensory integration

Lamberto Maffei has inspired many researchers to examine how the brain develops and crafts its sensory capabilities to adapt to environmental demands. Understanding the inherent plasticity of the developing brain has seemed an obsession with Professor Maffei, on par with his need to swim out of sight into the open ocean. Both endeavors evoke strong reactions from his family and friends.

Professor Maffei has been among the most creative and prolific of researchers in the area of brain plasticity. Although he and most other researchers have concentrated on unisensory systems to understand these remarkable processes of development and plasticity, the ideas they have generated and the insights they have provided are also proving to be extremely helpful in understanding multisensory processes. Somehow the neural mechanisms that process sensory information, regardless of whether they are confined to a single sensory modality or are engaged in integrating information across the senses, must become tuned to the physical properties and statistical likelihoods of the particular environment in which they will function. Thus, the evolutionary strategy has been to ensure that the underlying neural architecture has the flexibility to use sensory experience, most of which must be acquired postnatally, to determine its operational principles. Although we know far less about the maturation of multisensory processes than unisensory processes, and what we do know points to many distinctions between them, they do share this overarching strategy.

Given that multisensory processes are present in all extant organisms that have been examined, and are likely to have predated the evolution of multicellular organisms (see Stein and Meredith, 1993, for a discussion), they, like their unisensory counterparts, have had the benefit of millions of years of selective pressure to achieve their current functional capabilities. The present discussion will use a single model to explore this process: the multisensory neuron in the superior colliculus (SC). The SC is a midbrain structure that plays a primary role in the control of orientation responses, such as gaze shifts (see Stein and Meredith, 1993; Hall and Moschovakis, 2004, for reviews), and has proven to be a highly productive and useful model for understanding multisensory integration at the level of the single neuron. However, before discussing the maturation of multisensory integration, it is useful to examine its general utility, its normal operational principles, and the organization of the model structure.

(p.180) There Are Many Advantages to Having Multiple Sensory Systems

A driving force in evolution has been to develop systems that facilitate the ability to respond rapidly and appropriately to environmental cues. This includes the creation of storage and retrieval systems that can be used to modify long-term behavior, but it also requires that a broad array of environmental cues can be used to guide that behavior. As a consequence of such evolutionary pressures, brains have access to information from multiple sensory systems, each tuned to a given source of environmental energy. As a result, incomplete information in one sense can be compensated for by additional information in another. Thus, an event that is undetectable along any single sensory dimension may become obvious if simultaneously considered across multiple dimensions. Likewise, an event that cannot be identified along a single dimension (e.g., how it looks, sounds, or feels) may have a unique signature when considered across more than one sensory dimension. Regardless of the problem to be solved, whenever the inputs from different sensory modalities are combined to produce a synthesized product, the process is called “multisensory integration” (Stein and Meredith, 1993). This process has been shown to increase the likelihood of detecting and localizing external events, aid in the identification of sensory events, and promote timely and accurate responding. By using such mechanisms, the brain can achieve levels of performance that would not be possible if the senses were used independently (e.g., Ernst and Banks, 2002; Pouget et al., 2002).

The Principles of Multisensory Integration

Operationally, multisensory integration enhances (or degrades) the salience of a signal in the brain (Stein and Meredith, 1993), and its functional consequences for perception and behavior have been described in both human and animal subjects (e.g., Busse et al., 2005; Corneil and Munoz, 1996; Frens and Van Opstal, 1995; Ghazanfar and Schroeder, 2006; Grant et al., 2000; Hughes et al., 1994; King and Palmer, 1985; Lakatos et al., 2007; Liotti et al., 1998; Marks, 2004; Newell, 2004; Recanzone, 1998; Sathian, 2000; Sathian, 2005; Sathian and Prather, 2004; Schroeder and Foxe, 2004; Shams et al., 2004; Stein et al., 1989; Talsma et al., 2006a; Talsma et al., 2006b; Wallace et al., 1996; Weisser et al., 2005; Woldorff et al., 2004; Woods and Recanzone, 2004; Zangaladze et al., 1999). Presumably, by enhancing the salience of neuronal responses to cross-modal cues that are derived from the same event, multisensory integration can also aid in the disambiguation of those events. Disambiguation can occur in response to a variety of signals but has been most clearly demonstrated in those involving human speech and animal communication (Bernstein et al., 2004; Ghazanfar et al., 2005; Massaro, 2004; Partan, 2004; Sugihara et al., 2006; Sumby and Pollack, 1954). It also significantly enhances both the speed with which an event is detected and the speed and reliability with which responses can be produced (Corneil and Munoz, 1996; Frens and Van Opstal, 1995; Hughes et al., 1994; Marks, 2004; Newell, 2004; Sathian and Prather, 2004; Shams et al., 2004; Stein et al., 1989; Stein and Meredith, 1993; Stein and Stanford, 2008; Woods and Recanzone, 2004).

(p.181) Three general operational principles of multisensory integration, derived from single-neuron studies of the cat SC, are discussed below. Two involve space and time and are referred to, respectively, as the “spatial” and “temporal” principles of multisensory integration. In the former case, cross-modal stimuli that are in spatial concordance (as if derived from the same event) enhance neuronal responses, whereas those that are in spatial discordance (as if derived from different events) either fail to be integrated or produce depressed responses. The temporal principle has the same general form, though the depression with temporally discordant stimuli has been less well documented. A third major principle is that of “inverse effectiveness.” This principle describes the observation that the proportionate effect of combining cross-modal cues is greater when the effectiveness of the component stimuli is weaker (see Stein and Meredith, 1993, and Stein et al., 2009, for a general discussion of these principles).
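As a toy illustration (not code from this chapter or its authors), the proportionate effect described by inverse effectiveness is commonly quantified in this literature with an enhancement index that compares the multisensory response (CM) with the best unisensory response (SMmax). The firing rates below are hypothetical numbers chosen only to show how the index behaves:

```python
# Sketch with hypothetical data: the standard multisensory enhancement index,
# ME = 100 * (CM - SMmax) / SMmax, where CM is the response to the combined
# cross-modal stimulus and SMmax is the best unisensory response.
def enhancement_index(cm: float, sm_max: float) -> float:
    """Percent enhancement of the multisensory response over the best
    unisensory response."""
    return 100.0 * (cm - sm_max) / sm_max

# Hypothetical responses (mean impulses per trial) from the same neuron
# to a weak and a strong stimulus pair.
weak_pair = enhancement_index(cm=6.0, sm_max=2.0)      # weak component stimuli
strong_pair = enhancement_index(cm=30.0, sm_max=24.0)  # strong component stimuli

# Inverse effectiveness: the proportionate gain is larger when the
# component stimuli are less effective on their own.
print(weak_pair, strong_pair)  # 200.0 25.0
```

Note that the absolute increment is larger for the strong pair (6 extra impulses versus 4), yet the proportionate enhancement is far greater for the weak pair, which is the pattern the inverse-effectiveness principle describes.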

Until recently, far less consideration had been given to the developmental prerequisites of multisensory integration or its inherent plasticity than to studies of its general functional organization. Newer studies have shown that its operational principles are very sensitive to experience and are largely or completely instantiated during postnatal life. Nevertheless, some multisensory processes, like cross-modal matching and the recognition of amodal stimulus properties (e.g., size, intensity, frequency), have been demonstrated in very young children, and the inference is that they may already be present to some degree at birth (e.g., see Bower, 1974; Gibson, 1966; Gibson, 1969; Gibson, 1979; Lewkowicz and Kraebel, 2004; Lickliter and Bahrick, 2004; Marks, 1978; Werner, 1973). Thus, while experiments in animals strongly suggest that “multisensory integration” is acquired postnatally, and only after considerable sensory experience (Wallace et al., 2006; Wallace et al., 1993; Wallace and Stein, 2007; Wallace and Stein, 1997), this should not be taken to mean that there is no communication among the senses at birth. This issue deserves further exploration. However, it appears that regardless of species, multisensory integration develops postnatally, and very recent studies in human subjects, like those in animal subjects, point to the gradual postnatal elaboration of this capability (see Gori et al., 2008; Neil et al., 2006; Putzar et al., 2007).

Collectively, these studies suggest that the neonatal brain must incorporate principles that will guide the integration of inputs from the different senses based on experience. Most of these studies have emphasized multisensory enhancement rather than multisensory depression, and this index of multisensory integration will be emphasized in the following discussion.

Using the Superior Colliculus as a Model of Multisensory Integration

The role of the SC in attentive and orientation behavior has been well documented (Sparks, 1986; Stein and Meredith, 1993). In general, the appearance of a visual, auditory, and/or somatosensory event will activate its topographically organized neurons so that the event is detected, localized, and reacted to with a shift of gaze. Sometimes, the entire body orients to the event, so that the animal is well positioned to respond to it. The neurons activated by these different (p.182) sensory modalities are located in its deeper aspects where outputs to motor areas of the brainstem and spinal cord originate. Many of these neurons also receive converging inputs from the different senses, rendering them multisensory (it is a primary site of sensory convergence; see Stein and Meredith, 1993; Wallace et al., 1993), and many of them are also output neurons that link multiple sensory inputs to motor behavior (Meredith et al., 1992). These factors make it an excellent model for studying the physiology of multisensory integration and its behavioral consequences. The fact that there is a considerable amount of information concerning the development of the structure, albeit mostly about its unisensory properties (see Stein, 1984), also makes it an ideal model to examine how multisensory integration develops and adapts to the particular environment in which it will function. Indeed, most of what we know about the multisensory properties of single neurons and their ontogeny comes from this model (Peck, 1987; Stein, 1984; Stein and Clamann, 1981; Stein et al., 1973a; Stein et al., 1976; Stein and Meredith, 1993; Stein et al., 1993; Wallace, 2004; but see also Barth and Brett-Green, 2004; Calvert and Lewis, 2004; Groh and Sparks, 1996a; Groh and Sparks, 1996b; Gutfreund and Knudsen, 2004; Jay and Sparks, 1987a; Jay and Sparks, 1987b; King et al., 2004; Lakatos et al., 2007; Sathian and Prather, 2004; Woods and Recanzone, 2004; Zwiers et al., 2003).

One important factor to keep in mind when thinking about SC multisensory integration is that its underlying functional circuits are complex and that the simple fact that afferents from the different senses converge on a common target neuron neither ensures that it will be able to integrate those inputs nor specifies how the inputs would be integrated if the neuron did indeed possess that capability (Jiang et al., 2006; Jiang et al., 2001; Stein and Meredith, 1993; Wallace and Stein, 1997). Experiments have shown that there are several critical factors that must be present, generally during early postnatal life, for this capability to develop, and these factors are discussed below.

Superior Colliculus Organization

Each of the three sensory representations in the multisensory SC (visual, auditory, and somatosensory) is represented in the same topographic fashion. Thus, their maps overlap one another. Neurons in the front of the structure have receptive fields in frontal space or the front portion of the body (e.g., face), those in the rear of the structure are further temporal in visual and auditory space and further back on the body, those medial in the structure have receptive fields in upper space, and those lateral in the structure have receptive fields in lower space (Meredith et al., 1991; Meredith and Stein, 1990; Middlebrooks and Knudsen, 1984; Stein and Gallagher, 1981; Stein et al., 1975; Stein et al., 1993). The overlapping sensory maps are in register with an underlying motor map from which projections to the brainstem and spinal cord translate sensory inputs into motor responses. Thus, a cue in upper space triggers responses from neurons that will produce an upward orientation movement (e.g., gaze shift). This is an elegant and simple way to match sensory input to the output signals that reach the effector organs via the brainstem and spinal cord (Grantyn and Grantyn, 1982; Groh and Sparks, 1996a; Groh and Sparks, 1996b; Guitton and Munoz, 1991; Harris, 1980; Jay and Sparks, 1987a; Jay and Sparks, 1987b; Jay and (p.183) Sparks, 1984; Munoz and Wurtz, 1993a; Munoz and Wurtz, 1993b; Peck, 1987; Sparks, 1986; Sparks and Nelson, 1987; Stein and Clamann, 1981; Wurtz and Albano, 1980; Wurtz and Goldberg, 1971).

Because the different sensory maps in the SC overlap one another, so too do the multiple receptive fields of individual multisensory neurons (King et al., 1996; Meredith et al., 1991; Meredith and Stein, 1990; Meredith et al., 1992; see also Brainard and Knudsen, 1998; Gutfreund and Knudsen, 2004; Knudsen and Brainard, 1991; Knudsen et al., 1991; Knudsen and Knudsen, 1989). This is the basis for the spatial principle of multisensory integration (Stein and Meredith, 1993), wherein the magnitude of a neuron’s response to a cross-modal stimulus depends on where the component stimuli fall within its receptive fields. As noted earlier, when they fall within its overlapping receptive fields, they generally enhance the neuron’s response above that to either stimulus independently, and often above the sum of the responses to each stimulus alone (see figure 15.1).
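The spatial principle can be caricatured in a few lines of code. The receptive field geometry and response magnitudes below are invented for illustration; they stand in for the qualitative pattern just described, not for any recorded neuron:

```python
# Toy model of the spatial principle (all numbers hypothetical): a cross-modal
# stimulus pair falling inside a neuron's overlapping receptive fields (RFs)
# yields an enhanced response, while a spatially discordant pair is depressed
# or not integrated.
def predicted_response(visual_az: float, auditory_az: float,
                       rf_center: float = 0.0, rf_radius: float = 20.0,
                       unisensory_rate: float = 10.0) -> float:
    """Predicted response (impulses/trial) given visual and auditory stimulus
    azimuths in degrees, assuming both RFs share one center and radius."""
    in_visual_rf = abs(visual_az - rf_center) <= rf_radius
    in_auditory_rf = abs(auditory_az - rf_center) <= rf_radius
    if in_visual_rf and in_auditory_rf:
        return unisensory_rate * 2.5   # spatial concordance -> enhancement
    if in_visual_rf or in_auditory_rf:
        return unisensory_rate * 0.5   # spatial discordance -> depression
    return 0.0                         # neither stimulus drives the neuron

print(predicted_response(5.0, -5.0))  # both within the RF: 25.0
print(predicted_response(5.0, 45.0))  # auditory outside:    5.0
```

The single shared receptive field is the simplifying assumption here; in the real circuit each modality has its own receptive field, and it is their overlap, established developmentally, that makes spatially concordant stimulation possible.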

Although these multisensory properties are most frequently associated with SC neurons in the cat (Meredith and Stein, 1983; Meredith and Stein, 1986a; Meredith and Stein, 1986b; Meredith and Stein, 1996; Stein et al., 1993; Wallace et al., 1993; Wallace and Stein, 1994), they have also been seen in the SC of the hamster (Meredith and Stein, 1983; see also Stein and Dixon, 1979), guinea pig (King and Palmer, 1985), and monkey (Wallace et al., 1996), as well as in the cortex of the cat (Wallace et al., 1992), rat (Barth and Brett-Green, 2004; Wallace et al., 2004), and monkey (Ghazanfar and Schroeder, 2006; Lakatos et al., 2007; Schroeder and Foxe, 2004; Schroeder and Foxe, 2002; Schroeder et al., 2001), and they have been inferred from data acquired in humans (Calvert and Lewis, 2004; Fort and Giard, 2004; de Gelder et al., 2004; Hairston et al., 2003; Laurienti et al., 2002; Lovelace et al., 2003; Macaluso and Driver, 2004; Sathian and Prather, 2004; Stein et al., 1996).

Influences from Cortex Are Essential for Multisensory Integration in the Superior Colliculus

Earlier it was noted that multisensory integration is not a simple consequence of the convergence of different sensory inputs onto the same SC neuron. The convergence is a necessary factor but not sufficient for conferring this capacity. This conclusion is based on two sets of studies, the first involving the elimination of selected corticocollicular influences, and the second evaluating the development of multisensory integration.

Studies have shown that reversible deactivation of a region of association cortex (the anterior ectosylvian sulcus, AES, and its neighboring region, the rostral lateral suprasylvian sulcus, rLS) eliminated multisensory integration in SC neurons but did not preclude their responding to multiple sensory inputs. Thus, a visual-somatosensory neuron remains visual-somatosensory after cortical deactivation but loses the ability to integrate visual and somatosensory information to produce an enhanced response (Jiang et al., 2001; Jiang et al., 2002; Stein et al., 2002; Alvarado et al., 2007a; Alvarado et al., 2007b; Alvarado et al., 2008; see also Burnett et al., 2007). AES appears to be the major cortical contributor to this capacity (Jiang et al., 2001) by virtue of providing converging unisensory inputs to the SC (Wallace et al., 1992; (p.184)


Figure 15.1 Multisensory information converges on individual neurons within the superior colliculus (SC). (a) Visual, auditory, and somatosensory maps correspond within the SC, creating topographically aligned multisensory representations of the world. (b) Individual neurons within the SC exhibit response enhancements and depressions depending on the spatial relationships of the unisensory component stimuli. (Center) A visual stimulus (white bar) within the neuron’s visual receptive field (RF) and an auditory stimulus (speaker) placed either within or outside the neuron’s RF resulted in two different multisensory responses. (Left) When visual and auditory stimuli are placed within the neuron’s corresponding receptive fields, the result is response enhancement, as depicted by the bar graphs representing mean activity of unisensory and multisensory responses. (Right) When visual and auditory stimuli are spatially discordant, response depression is observed. (c) The magnitude of multisensory enhancement depends on the temporal relationship of the component stimuli. Cross-modal stimuli were presented at different onset asynchronies; stimuli presented simultaneously elicited the greatest response. Modified from Stein and Meredith (1993).

(p.185) Fuentes-Santamaria et al., 2009); these AES-derived inputs arise from different sensory subdivisions but appear to work synergistically in conferring multisensory integrative capability onto SC neurons (Alvarado et al., 2008). Although AES and rLS can compensate for one another’s loss early in life, other brain areas appear to be uninvolved and cannot compensate for the temporary or permanent loss of AES and rLS during neonatal or adult life stages (see figure 15.2; Wilkinson et al., 1996; Jiang et al., 2006; Jiang et al., 2007).

Superior Colliculus Neurons Are Capable of Multisensory Integration Only after a Protracted Period of Postnatal Maturation

The cat is an altricial species, and its comparative immaturity at birth (its eyelids and ear canals are still closed) makes it a good subject for developmental studies. Neither its SC, nor that of its precocial laboratory counterpart, the rhesus monkey, contains neurons capable of integrating inputs from different senses at birth (see figure 15.3; Wallace and Stein, 1997, 2001). A reasonable working assumption is that because the SC develops earlier than does the cerebral cortex (see Stein et al., 1973a; Stein et al., 1973b), there is also little possibility that neurons capable of multisensory integration are present in higher brain areas at this stage of development. This is consistent with recent studies of multisensory development in association cortex (Wallace et al., 2006; Carriere et al., 2007) and is consistent with the gradual elaboration of multisensory perception in humans (Neil et al., 2006; Putzar et al., 2007).

The multisensory layers of the newborn cat’s SC are unisensory, and the only sensory-responsive neurons present are somatosensory (Stein et al., 1973a; Stein et al., 1973b). Somatosensory-responsive neurons are already evident in late prenatal stages and are presumably organized prenatally to ensure that the neonate can use perioral tactile cues to find the nipple immediately after birth (Larson and Stein, 1984). The first neurons responsive to auditory stimuli do not appear until several days later, and visual neurons appear last (Stein et al., 1973a; Kao et al., 1994). The maturation of unisensory properties is germane to the maturation of multisensory integration, as the same neurons are often involved. The unisensory information processing of multisensory neurons is very similar to that of their unisensory neighbors, and many of the immature characteristics of these neurons seem independent of modality: neonatal receptive fields are very large; the neurons respond poorly to stimuli that are highly effective in the adult; they fatigue rapidly; and they are less selective in their responses to the parameters of the stimulus (e.g., its direction of movement, velocity, size). As the animal matures, the receptive fields of SC neurons contract and their unisensory selectivities begin to become apparent, but the maturation of multisensory integration lags considerably. (p.186)


Figure 15.2 (opposite) Neonatal ablation of the anterior ectosylvian sulcus (AES) and rostral lateral suprasylvian sulcus (rLS) disrupts the development of superior colliculus (SC) multisensory enhancement. The ablated area of cortex is shown as shading on the schematic of the brain (top right in a). Characteristically, spatially coincident cross-modal stimuli failed to evoke multisensory enhancement at any stimulus intensity, and the multisensory response was less than the response predicted by summing the individual unisensory responses (b, c, and d). The multisensory response was nearly identical to the best unisensory response (visual). Taken from Jiang et al. (2001).

(p.187) Unisensory Development May Not Readily Be Generalized to Multisensory Development

The different time lines in unisensory and multisensory SC maturation partly reflect the fundamental distinctions between these processes and raise a significant caution: we must be very careful when trying to use information about the development of the former to infer something about the development of the latter. Multisensory integration is a unique process that depends on the synthesis of information from independent sensory channels (Alvarado et al., 2007a; Alvarado et al., 2007b; Rowland et al., 2007; Gingras et al., 2006). As noted above, it is possible to have functional visual and auditory systems yet lack the ability to integrate information across them. A second distinction is evident in the tendency to specialize (i.e., become specifically tuned) in one case and to generalize in the other. For example, exposure to all line orientations, or to complex forms, leads to the maturation of orientation preferences in visual cortical neurons. Different groups of neurons selectively respond to different line orientations, with all orientations (p.188) being represented among the population of neurons in visual cortex. This represents a process of unisensory specialization. However, multisensory integration appears to have a very different goal. Rather than specializing, multisensory neurons seem to generalize. They are sensitive to the statistical regularities of cross-modal stimuli and use that information to create general principles that guide their integration of information from different sensory sources. Although there is some unisensory selectivity among SC multisensory neurons, as far as we know, they do not group themselves into subpopulations exhibiting different integrative modes based on stimulus properties.

Figure 15.3 The developmental chronology of multisensory neurons. The percentage of multisensory neurons in the deep-layer sensory-responsive population is plotted as a function of postnatal age. Pie charts in the inset show the expansion of the multisensory population as development progresses.

The Developmental Chronology of Multisensory Neurons

Multisensory neurons follow a maturational time course that has the same sequence as their unisensory components, but their initial appearance is somewhat delayed (Stein et al., 1973a). Thus, somatosensory-auditory neurons are the first multisensory neurons to appear. They appear as early as 10 days of age (5 days later than the appearance of auditory neurons). It is not until three weeks later that visually responsive neurons are found in the multisensory layers (their superficial layer counterparts develop much earlier; see Stein et al., 1973a; Stein et al., 1973b; Kao et al., 1994), and visual-somatosensory and visual-auditory neurons are found at about the same time (see figure 15.3). Although these neurons are multisensory, they are not yet capable of integrating their different sensory inputs. This requires 2 to 3 additional months. It is also during this period that corticocollicular projections from association cortex are developing their influence over SC response properties (Stein and Gallagher, 1981; Stein et al., 2002; Wallace and Stein, 2000; Wallace and Stein, 1997).

The Maturation of Multisensory Integration Capabilities

The observation that neonatal multisensory neurons cannot yet integrate their different sensory inputs, and that this capacity requires substantial postnatal development, suggested that it might depend on sensory experience, specifically, experience with cross-modal stimuli. Certainly, early life is a time during which the neonatal brain could learn the statistical regularities of cross-modal stimuli that characterize its environment. It could then use this information to adapt its multisensory integration capabilities to best suit the environment in which they would be used. An underlying assumption here is that multisensory integration is plastic and can, in fact, be crafted to best accommodate the particular stimulus characteristics that are likely to be encountered. However, plasticity is generally associated more with processes in cortex (Buonomano and Merzenich, 1998) than with those in the midbrain (Wickelgren and Sterling, 1969), and, as noted earlier, SC multisensory integration is dependent on influences from association cortex, influences that develop during the same period in which SC multisensory integration develops. To explore the possibility that the circuit does adapt to the cross-modal statistics it experiences, and that this process is mostly dependent on cortex, a number of strategies were employed. The first strategy was to examine the effect of eliminating cross-modal experience during early life. If the development of SC multisensory integration were independent (p.189) of specific experiences, this elimination would not interfere with its normal expression. The easiest way to test this possibility was to use visual-nonvisual integration as a model, not only because of the high incidence of visual multisensory neurons in the normal adult but also because such experience can readily be precluded by dark rearing. The second strategy involved altering the cross-modal statistics of the animal’s physical world to see if doing so would affect its multisensory integration in predictable ways. The third strategy was to examine SC multisensory maturation in the absence of the critical cortical inputs, either because they were removed or because they were rendered inoperative.

Rearing Animals without Cross-Modal (i.e., Visual-Nonvisual) Experience

Raising animals in the dark precludes any visual-nonvisual experience and provides a way of testing whether such experience is critical for SC multisensory integration. Using this strategy, cats were raised in the dark until they were 6 months of age (Wallace et al., 2004). The rearing condition did not prevent the maturational appearance of SC sensory neurons with each of the characteristic convergence profiles: unisensory (visual, auditory, somatosensory) and multisensory (visual-auditory, visual-somatosensory, auditory-somatosensory, and trisensory). These neuronal subtypes proved to be present in roughly their normal proportions, but their receptive fields were considerably larger than normal and better approximated those found in much younger animals. Most pertinent in the present context, however, is that they were incapable of multisensory integration. Their general characteristics suggested a failure to mature, as if they had been maintained in a neonatal state. This interference with the maturation of multisensory integration might have several possible explanations, but the one that seemed most likely was that, while early visual experience is a key requirement for receptive field contraction, experience with co-occurring cues from different senses (e.g., visual and auditory) is required to later integrate information across them. This was the most attractive possibility because it could also provide a way for the brain to craft multisensory integration capabilities to best fit the particular sensory environment in which this process would operate. If this were the case, one would expect that altering early cross-modal experiences would produce predictable changes in how those cues would later be integrated.

Rearing Animals with Spatially Disparate Cross-Modal Cues

This strategy here was to examine whether the result predicted above would be induced by rearing animals in an atypical environment in which visual and auditory cues were presented simultaneously but from different locations in space. This was not an ideal circumstance to test this possibility because the stimuli had no intrinsic meaning for the animals and were linked neither to a reward nor to any other event of significance. Therefore, it was not really surprisingly to find that the majority of the SC neurons that were sampled from these animals once they were adults had properties that were very similar to those that characterize neonatal and dark-reared animals. The receptive fields of these neurons were large, and the neurons could not integrate visual-auditory inputs (Wallace and Stein, 2007). These neurons showed no evidence that the cross-modal experience had influenced their development. However, there (p.190) were many SC neurons that had properties that indicated that their maturation was guided by this cross-modal experience. Their visual and auditory receptive fields had become smaller (albeit not as small as those in normally reared animals), and their spatial register was poor, and some showed no overlap between their visual and auditory receptive fields. The absence of any receptive field overlap was highly unusual. Thus, their organization was such that simultaneous visual and auditory stimuli could fall within their respective receptive fields in a given neuron only if they were disparate in space. When such a stimulus configuration was presented, so that the visual stimulus was within its receptive field and the auditory stimulus was within its receptive field, neurons produced an enhanced response (see figure 15.4). Apparently, the spatial rule of multisensory integration was reversed: now, spatially disparate stimuli produced enhancement rather than depression (see Kadunce et al., 1997; Meredith and Stein, 1996). 
These observations are highly consistent with the idea that experience with cross-modal cues helps establish the functional features of the circuits underlying multisensory integration. Presumably, much of the impact of this experience is exerted on the cortico-SC projection because of its importance in the normal expression of multisensory integration and because its developmental time course appears to parallel that of multisensory integration (Wallace and Stein, 2000).
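The reversed spatial rule lends itself to a compact caricature. The sketch below is our illustration only, not the authors' model; the function name, receptive field ranges, and the gain and suppression constants are all invented for the example. It encodes the core finding: enhancement requires each stimulus to fall within its respective receptive field, whatever spatial relationship between the fields experience has produced.

```python
def multisensory_response(vis_deg, aud_deg, vis_rf, aud_rf,
                          unisensory=10.0, gain=1.8, suppression=0.5):
    """Toy caricature of the SC 'spatial rule' (illustrative constants).

    vis_rf / aud_rf are (lo, hi) azimuth ranges in degrees. When both
    stimuli land inside their respective receptive fields, the response
    is enhanced; when only one does, it is depressed.
    """
    in_vis = vis_rf[0] <= vis_deg <= vis_rf[1]
    in_aud = aud_rf[0] <= aud_deg <= aud_rf[1]
    if in_vis and in_aud:
        return unisensory * gain         # multisensory enhancement
    if in_vis or in_aud:
        return unisensory * suppression  # response depression
    return 0.0

# Normally reared: overlapping RFs, so coincident cues are enhanced.
normal = multisensory_response(0, 0, vis_rf=(-10, 10), aud_rf=(-10, 10))

# Disparity-reared: non-overlapping RFs, so only spatially *disparate*
# cue pairs can satisfy both RFs and be enhanced.
reared = multisensory_response(0, 30, vis_rf=(-10, 10), aud_rf=(20, 40))

assert normal == reared == 18.0
```

In the normally reared case only spatially coincident cues satisfy both receptive fields; in the disparity-reared case only cues separated by roughly the rearing disparity do, which is the reversal described above.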

Superior Colliculus Maturation after Ablation of Association Cortex

To test the assumption noted above, Jiang et al. (2006) removed AES and/or rLS during early life. As adults, SC neurons in these animals were highly atypical. The different receptive fields of an individual neuron were poorly aligned with each other, and the neurons were unable to engage in multisensory integration. This deficit in multisensory integration was also evident behaviorally (Jiang et al., 2007). Apparently, no other area of the brain could compensate for AES and rLS in this regard. However, it appeared that AES and rLS could compensate for the early loss of one another, because this deficit in multisensory integration was evident only when both areas were removed during neonatal life. In the absence of only one of these areas, SC neurons still developed aligned receptive fields and many still developed the capacity for multisensory integration, a capacity that was apparent in overt behavior as well. These results suggest that descending influences from association cortex help ensure that the multisensory processes of SC neurons accurately reflect the statistics of the cross-modal events that have been experienced by the developing brain. Presumably, it does not matter whether these statistics are those of “normal” environments or those in which the recurring relationships between visual and auditory events have been varied, as long as they are reliable. Without cortex guiding this development, SC neurons seem to default to the neonatal state, in which cross-modal cues cannot be used in concert.
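The notion that the developing circuit simply adopts whichever cross-modal statistics recur reliably can be sketched in toy form. This is a hedged illustration under our own assumption of a bare co-occurrence tally, not a model of the biological learning rule; the function name and environments are invented for the example.

```python
from collections import Counter

def learn_preferred_disparity(events):
    """Toy sketch (our illustration, not the authors' circuit model):
    tally the visual-auditory spatial offsets experienced during
    development and adopt the most reliable one as the pairing that
    will later be enhanced. `events` is a list of (vis_deg, aud_deg)."""
    offsets = Counter(round(aud - vis) for vis, aud in events)
    return offsets.most_common(1)[0][0]

# A "normal" environment: cues reliably co-occur at the same location.
normal_env = [(x, x) for x in range(-20, 21)]
# A disparity-reared environment: the auditory cue is reliably 30
# degrees away from the visual cue.
disparity_env = [(x, x + 30) for x in range(-20, 21)]

assert learn_preferred_disparity(normal_env) == 0
assert learn_preferred_disparity(disparity_env) == 30
```

Either environment yields a consistent preferred offset; the point of the sketch is that the tally is indifferent to whether that offset is "normal" (zero) or imposed by atypical rearing, so long as it is reliable.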

Superior Colliculus Development during Long-Term Cortical Deactivation

However, the ablation technique does more than eliminate descending information to the SC. It is a serious insult to the brain that induces a cascade of consequent events in distant areas.


Figure 15.4 (a) Receptive field (RF) overlap in normal and spatial-disparity reared animals. The % RF overlap in control animals (light gray) was often 91%–100%, far exceeding that seen in spatial-disparity reared animals (dark gray, often <10%). Modified from Wallace and Stein (2007). (b) Rearing with visual-auditory spatial disparity yields anomalous multisensory integration. When the visual (white bar) and auditory (gray circle) stimuli are spatially disparate and in their respective RFs (left), they produce significant (p < .05, t test) enhancement (right bottom)—a striking reversal of the normal condition, but one consistent with the animal’s abnormal multisensory experience. Modified from Wallace and Stein (2007).

It also leads to a physical and irreplaceable loss of the cortico-SC synapses, leaving these sites available for colonization by other afferents. To avoid these confounds, ongoing experiments (see Stein et al., 2008; Stein and Rowland, 2007) have been examining the consequences of long-term reversible deactivation of association cortex. This strategy deprives the developing cortex of access to sensory experience but does not remove its projections to the SC, and it allows the cortex to be reactivated later in life. The deactivation was induced by muscimol that was gradually released over many weeks during the period in which multisensory integration normally develops. The muscimol was infused into chronically implanted pledgets made from the polymer Elvax. Examination of the SC many months later revealed that there were many multisensory neurons. These neurons had developed many of their characteristic unisensory response properties, but even after cortex had been reactivated for many weeks they were still unable to integrate the information derived from cross-modal stimuli and could not enhance SC-mediated behaviors. Once again, the data support the idea that cortex is the portal through which early experiences with cross-modal cues gain access to multisensory SC neurons.

In these experiments, association cortex was unilaterally deactivated, and multisensory integration deficits were specific to the ipsilateral SC and to contralateral visual-auditory space. Multisensory responses to stimuli in ipsilateral visual-auditory space were normal and contrasted sharply with those in contralateral space. The deficits appeared to be permanent. However, after four years of normal experience, the animals were retested in a multisensory integration behavioral task and, surprisingly, seemed perfectly normal. All evidence of the prior multisensory integration deficit had disappeared. It seemed that long-term experience, even during adulthood, could compensate for the absence of experiences that are normally acquired during early life. The SC multisensory circuit appeared to be far more flexible than was previously suspected, a finding that may be of substantial relevance to human subjects with developmental deficits in multisensory integration.

Adult Plasticity

Another ongoing line of research has described how the responses of SC neurons in the adult change or adapt as a consequence of exposure to repeated presentations of cross-modal cues (Yu et al., 2009). These researchers found evidence of short-term plasticity in SC multisensory neurons consistent with the principles of spike-timing-dependent plasticity. When animals were repeatedly presented with sequentially arranged visual-auditory stimuli, the response to the first stimulus increased in magnitude and duration, and the latency of the response to the second stimulus decreased. These changes caused the responses to the sequentially arranged stimuli to appear to “merge,” presumably as a consequence of the potentiation of inputs that were previously subthreshold (see figure 15.5). Short-term plasticity of this sort, which reflects the encoding of cross-modal correlations, may provide the basic building blocks on which the “spatial” and “temporal” principles of multisensory integration are founded during long-term development and maturation.
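A minimal sketch of this "merging" dynamic, assuming a simple proportional update on each exposure; the update rule and every constant here are our assumptions for illustration, not the mechanism reported by Yu et al. (2009).

```python
def exposure_series(n_exposures, v_onset=0, a_onset=100, v_dur=40, lr=0.15):
    """Toy sketch of the merging of sequential responses (illustrative).

    Each repeated exposure lengthens the first (visual) response and
    pulls the second (auditory) response latency earlier, shrinking the
    silent gap between them. Times are in milliseconds. Returns the gap
    observed on each exposure.
    """
    gap_history = []
    v_end, a_start = v_onset + v_dur, a_onset
    for _ in range(n_exposures):
        gap = max(0.0, a_start - v_end)
        gap_history.append(gap)
        v_end += lr * gap      # first response grows in duration
        a_start -= lr * gap    # second response latency shortens
    return gap_history

gaps = exposure_series(30)
assert gaps[0] == 60   # initial silent period between the two responses
assert gaps[-1] < 1    # after repetition, the responses have merged
```

The gap decays geometrically under this rule, which captures (in caricature) how two initially distinct unisensory responses come to appear as a single fused multisensory response after a couple of minutes of repetition.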


Figure 15.5 Repeated exposure to a sequential auditory-visual stimulus resulted in rapid response changes. (Top) A schematic of visual-auditory space in which concentric circles = 10° of space. A broadband noise burst (shown as a star) and a moving bar of light (shown as a white bar) were within their respective receptive fields (auditory = gray, visual = black). The raster display below the electronic stimulus traces shows the response changes during repeated exposures to the visual stimulus (V) followed 100 milliseconds later by the auditory stimulus (A). This cross-modal stimulus first elicited two distinct unisensory responses, but after several stimulus repetitions, the responses fused into a multisensory response. This was the result of an increase in the amplitude and duration of the first response and a shortening of the latency of the second. The black arrow at the bottom of the raster shows the initial period of silence between the two unisensory responses. Response changes were induced within 2 minutes. Data were taken from ongoing experiments (see Yu et al., 2009).

Recapitulation

Here we have summarized the basic organizational principles of multisensory integration within the SC, its circuitry and experiential antecedents, and evidence for long- and short-term plasticity in these processes. The development of the unisensory systems on which these processes depend provides both a conceptual framework and an interesting foil, as there are clear differences between the development of processing capabilities within a sense and the ability to integrate information across senses. Although we understand that the principles of multisensory integration adapt to the statistics of the cross-modal environment in which the animal functions, we are still in the nascent stages of understanding exactly how the brain determines these statistics and the biological processes by which they are encoded. Future research will expand our focus by studying these adaptive processes and their consequences at different stages of life, from birth to senescence.

Acknowledgments

Portions of the work described here have been supported by National Institutes of Health grants EY016716 and NS036916 and a grant from the Wallace Foundation.

References

Bibliography references:

Alvarado JC, Stanford TR, Vaughan JW, Stein BE. 2007a. Cortex mediates multisensory but not unisensory integration in superior colliculus. J Neurosci 27: 12775–12786.

Alvarado JC, Vaughan JW, Stanford TR, Stein BE. 2007b. Multisensory versus unisensory integration: contrasting modes in the superior colliculus. J Neurophysiol 97: 3193–3205.

Alvarado JC, Rowland BA, Stanford TR, Stein BE. 2008. A neural network model of multisensory integration also accounts for unisensory integration in superior colliculus. Brain Res 25: 13–23.

Barth DS, Brett-Green B. 2004. Multisensory-evoked potentials in rat cortex. In: The Handbook of Multisensory Processes (Calvert GA, Spence C, Stein BE, eds), pp 357–370. Cambridge, MA: MIT Press.

Bernstein LE, Auer ET, Moore JK. 2004. Audiovisual speech binding: convergence or association. In: The Handbook of Multisensory Processes (Calvert GA, Spence C, Stein BE, eds), pp 203–224. Cambridge, MA: MIT Press.

Bower TGR. 1974. The evolution of sensory systems. In: Perception: Essays in Honor of James J. Gibson (MacLeod RB, Pick HL, Jr., eds), pp 141–165. Ithaca, NY: Cornell University Press.

Brainard MS, Knudsen EI. 1998. Sensitive periods for visual calibration of the auditory space map in the barn owl optic tectum. J Neurosci 18: 3929–3942.

Buonomano DV, Merzenich MM. 1998. Cortical plasticity: from synapses to maps. Annu Rev Neurosci 21: 149–186.

Burnett LR, Stein BE, Perrault TJ, Jr., Wallace MT. 2007. Excitotoxic lesions of the superior colliculus preferentially impact multisensory neurons and multisensory integration. Exp Brain Res 179: 325–338.

Busse L, Roberts KC, Crist RE, Weissman DH, Woldorff MG. 2005. The spread of attention across modalities and space in a multisensory object. Proc Natl Acad Sci USA 102: 18751–18756.

Calvert GA, Lewis JW. 2004. Hemodynamic studies of audiovisual interactions. In: The Handbook of Multisensory Processes (Calvert GA, Spence C, Stein BE, eds), pp 483–502. Cambridge, MA: MIT Press.

Carriere B, Royal DW, Perrault TJ, Jr., Morrison SP, Vaughan JW, Stein BE, Wallace MT. 2007. Visual deprivation alters the development of cortical multisensory integration. J Neurophysiol 98: 2858–2867.

Corneil BD, Munoz DP. 1996. The influence of auditory and visual distractors on human orienting gaze shifts. J Neurosci 16: 8193–8207.

de Gelder B, Vroomen J, Pourtois G. 2004. Multisensory perception of emotion, its time course and its neural basis. In: The Handbook of Multisensory Processes (Calvert GA, Spence C, Stein BE, eds), pp 581–596. Cambridge, MA: MIT Press.

Ernst MO, Banks MS. 2002. Humans integrate visual and haptic information in a statistically optimal fashion. Nature 415: 429–433.

Fort A, Giard MH. 2004. Multiple electrophysiological mechanisms of audiovisual integration in human perception. In: The Handbook of Multisensory Processes (Calvert GA, Spence C, Stein BE, eds), pp 503–514. Cambridge, MA: MIT Press.

Frens MA, Van Opstal AJ. 1995. A quantitative study of auditory-evoked saccadic eye movements in two dimensions. Exp Brain Res 107: 103–117.

Fuentes-Santamaria V, Alvarado JC, McHaffie JG, Stein BE. 2009. Axon morphologies and convergence patterns of projections from different sensory-specific cortices of the anterior ectosylvian sulcus onto multisensory neurons in the cat superior colliculus. Cereb Cortex 19: 2902–2915.

Ghazanfar AA, Maier JX, Hoffman KL, Logothetis NK. 2005. Multisensory integration of dynamic faces and voices in rhesus monkey auditory cortex. J Neurosci 25: 5004–5012.

Ghazanfar AA, Schroeder CE. 2006. Is neocortex essentially multisensory? Trends Cogn Sci 10: 278–285.

Gibson JJ. 1966. The Senses Considered as Perceptual Systems. Boston, MA: Houghton Mifflin.

Gibson JJ. 1969. Principles of Perceptual Learning and Development. Englewood Cliffs, NJ: Prentice Hall.

Gibson JJ. 1979. An Ecological Approach to Perception. Boston, MA: Houghton Mifflin.

Gingras G, Rowland BE, Stein BE. 2006. Unisensory versus multisensory integration: computational distinctions in behavior. Program No. 639.9. 2006 Neuroscience Meeting Planner. Atlanta, GA: Society for Neuroscience. Online.

Gori M, Del Viva M, Sandini G, Burr DC. 2008. Young children do not integrate visual and haptic form information. Curr Biol 18: 694–698.

Grant AC, Thiagarajah MC, Sathian K. 2000. Tactile perception in blind Braille readers: a psychophysical study of acuity and hyperacuity using gratings and dot patterns. Percept Psychophys 62: 301–312.

Grantyn A, Grantyn R. 1982. Axonal patterns and sites of termination of cat superior colliculus neurons projecting in the tecto-bulbo-spinal tract. Exp Brain Res 46: 243–256.

Groh JM, Sparks DL. 1996a. Saccades to somatosensory targets. II. Motor convergence in primate superior colliculus. J Neurophysiol 75: 428–438.

Groh JM, Sparks DL. 1996b. Saccades to somatosensory targets. III. Eye-position-dependent somatosensory activity in primate superior colliculus. J Neurophysiol 75: 439–453.

Guitton D, Munoz DP. 1991. Control of orienting gaze shifts by the tectoreticulospinal system in the head-free cat. I. Identification, localization, and effects of behavior on sensory responses. J Neurophysiol 66: 1605–1623.

Gutfreund Y, Knudsen EI. 2004. Visual instruction of the auditory space map in the midbrain. In: The Handbook of Multisensory Processes (Calvert GA, Spence C, Stein BE, eds), pp 613–624. Cambridge, MA: MIT Press.

Hairston WD, Wallace MT, Vaughan JW, Stein BE, Schirillo JA. 2003. Visual localization ability influences cross-modal bias. J Cogn Neurosci 15: 20–29.

Hall WC, Moschovakis A. 2004. The Superior Colliculus: New Approaches to Studying Sensorimotor Integration. Boca Raton: CRC Press.

Harris LR. 1980. The superior colliculus and movements of the head and eyes in cats. J Physiol 300: 367–391.

Hughes HC, Reuter-Lorenz PA, Nozawa G, Fendrich R. 1994. Visual–auditory interactions in sensorimotor processing: saccades versus manual responses. J Exp Psychol Hum Percept Perform 20: 131–153.

Jay MF, Sparks DL. 1984. Auditory receptive fields in primate superior colliculus shift with changes in eye position. Nature 309: 345–347.

Jay MF, Sparks DL. 1987a. Sensorimotor integration in the primate superior colliculus. I. Motor convergence. J Neurophysiol 57: 22–34.

Jay MF, Sparks DL. 1987b. Sensorimotor integration in the primate superior colliculus. II. Coordinates of auditory signals. J Neurophysiol 57: 35–55.

Jiang W, Jiang H, Rowland BA, Stein BE. 2007. Multisensory orientation behavior is disrupted by neonatal cortical ablation. J Neurophysiol 97: 557–562.

Jiang W, Jiang H, Stein BE. 2002. Two corticotectal areas facilitate multisensory orientation behavior. J Cogn Neurosci 14: 1240–1255.

Jiang W, Jiang H, Stein BE. 2006. Neonatal cortical ablation disrupts multisensory development in superior colliculus. J Neurophysiol 95: 1380–1396.

Jiang W, Wallace MT, Jiang H, Vaughan JW, Stein BE. 2001. Two cortical areas mediate multisensory integration in superior colliculus neurons. J Neurophysiol 85: 506–522.

Kadunce DC, Vaughan JW, Wallace MT, Benedek G, Stein BE. 1997. Mechanisms of within- and cross-modality suppression in the superior colliculus. J Neurophysiol 78: 2834–2847.

Kao CQ, McHaffie JG, Meredith MA, Stein BE. 1994. Functional development of a central visual map in cat. J Neurophysiol 72: 266–272.

King AJ, Doubell TP, Skaliora I. 2004. Epigenetic factors that align visual and auditory maps in the ferret midbrain. In: The Handbook of Multisensory Processes (Calvert GA, Spence C, Stein BE, eds), pp 613–624. Cambridge, MA: MIT Press.

King AJ, Palmer AR. 1985. Integration of visual and auditory information in bimodal neurones in the guinea-pig superior colliculus. Exp Brain Res 60: 492–500.

King AJ, Schnupp JW, Carlile S, Smith AL, Thompson ID. 1996. The development of topographically-aligned maps of visual and auditory space in the superior colliculus. Prog Brain Res 112: 335–350.

Knudsen EI, Brainard MS. 1991. Visual instruction of the neural map of auditory space in the developing optic tectum. Science 253: 85–87.

Knudsen EI, Esterly SD, du Lac S. 1991. Stretched and upside-down maps of auditory space in the optic tectum of blind-reared owls: acoustic basis and behavioral correlates. J Neurosci 11: 1727–1747.

Knudsen EI, Knudsen PF. 1989. Vision calibrates sound localization in developing barn owls. J Neurosci 9: 3306–3313.

Lakatos P, Chen CM, O’Connell MN, Mills A, Schroeder CE. 2007. Neuronal oscillations and multisensory interaction in primary auditory cortex. Neuron 53: 279–292.

Larson MA, Stein BE. 1984. The use of tactile and olfactory cues in neonatal orientation and localization of the nipple. Dev Psychobiol 17: 423–436.

Laurienti PJ, Burdette JH, Wallace MT, Yen YF, Field AS, Stein BE. 2002. Deactivation of sensory-specific cortex by cross-modal stimuli. J Cogn Neurosci 14: 420–429.

Lewkowicz DJ, Kraebel KS. 2004. The value of multisensory redundancy in the development of intersensory perception. In: The Handbook of Multisensory Processes (Calvert GA, Spence C, Stein BE, eds), pp 655–678. Cambridge, MA: MIT Press.

Lickliter R, Bahrick LE. 2004. Perceptual development and the origins of multisensory responsiveness. In: The Handbook of Multisensory Processes (Calvert GA, Spence C, Stein BE, eds), pp 643–654. Cambridge, MA: MIT Press.

Liotti M, Ryder K, Woldorff MG. 1998. Auditory attention in the congenitally blind: where, when and what gets reorganized? Neuroreport 9: 1007–1012.

Lovelace CT, Stein BE, Wallace MT. 2003. An irrelevant light enhances auditory detection in humans: a psychophysical analysis of multisensory integration in stimulus detection. Brain Res Cogn Brain Res 17: 447–453.

Macaluso E, Driver J. 2004. Neuroimaging studies of cross-modal integration for emotion. In: The Handbook of Multisensory Processes (Calvert GA, Spence C, Stein BE, eds), pp 529–548. Cambridge, MA: MIT Press.

Marks LE. 1978. The Unity of the Senses: Interrelations Among the Modalities. New York: Academic Press.

Marks LE. 2004. Cross-modal interactions in speeded classification. In: The Handbook of Multisensory Processes (Calvert GA, Spence C, Stein BE, eds), pp 85–106. Cambridge, MA: MIT Press.

Massaro DW. 2004. From multisensory integration to talking heads and language learning. In: The Handbook of Multisensory Processes (Calvert GA, Spence C, Stein BE, eds), pp 153–176. Cambridge, MA: MIT Press.

Meredith MA, Clemo HR, Stein BE. 1991. Somatotopic component of the multisensory map in the deep laminae of the cat superior colliculus. J Comp Neurol 312: 353–370.

Meredith MA, Stein BE. 1983. Interactions among converging sensory inputs in the superior colliculus. Science 221: 389–391.

Meredith MA, Stein BE. 1986a. Spatial factors determine the activity of multisensory neurons in cat superior colliculus. Brain Res 365: 350–354.

Meredith MA, Stein BE. 1986b. Visual, auditory, and somatosensory convergence on cells in superior colliculus results in multisensory integration. J Neurophysiol 56: 640–662.

Meredith MA, Stein BE. 1990. The visuotopic component of the multisensory map in the deep laminae of the cat superior colliculus. J Neurosci 10: 3727–3742.

Meredith MA, Stein BE. 1996. Spatial determinants of multisensory integration in cat superior colliculus neurons. J Neurophysiol 75: 1843–1857.

Meredith MA, Wallace MT, Stein BE. 1992. Visual, auditory and somatosensory convergence in output neurons of the cat superior colliculus: multisensory properties of the tecto-reticulo-spinal projection. Exp Brain Res 88: 181–186.

Middlebrooks JC, Knudsen EI. 1984. A neural code for auditory space in the cat’s superior colliculus. J Neurosci 4: 2621–2634.

Munoz DP, Wurtz RH. 1993a. Fixation cells in monkey superior colliculus. I. Characteristics of cell discharge. J Neurophysiol 70: 559–575.

Munoz DP, Wurtz RH. 1993b. Fixation cells in monkey superior colliculus. II. Reversible activation and deactivation. J Neurophysiol 70: 576–589.

Neil PA, Chee-Ruiter C, Scheier C, Lewkowicz DJ, Shimojo S. 2006. Development of multisensory spatial integration and perception in humans. Dev Sci 9: 454–464.

Newell FN. 2004. Cross-modal object recognition. In: The Handbook of Multisensory Processes (Calvert GA, Spence C, Stein BE, eds), pp 123–140. Cambridge, MA: MIT Press.

Partan SR. 2004. Multisensory animal communication. In: The Handbook of Multisensory Processes (Calvert GA, Spence C, Stein BE, eds), pp 225–242. Cambridge, MA: MIT Press.

Peck CK. 1987. Visual-auditory interactions in cat superior colliculus: their role in the control of gaze. Brain Res 420: 162–166.

Pouget A, Deneve S, Duhamel JR. 2002. A computational perspective on the neural basis of multisensory spatial representations. Nat Rev Neurosci 3: 741–747.

Putzar L, Goerendt I, Lange K, Rösler F, Röder B. 2007. Early visual deprivation impairs multisensory interactions in humans. Nat Neurosci 10: 1243–1245.

Recanzone GH. 1998. Rapidly induced auditory plasticity: the ventriloquism aftereffect. Proc Natl Acad Sci USA 95: 869–875.

Rowland BA, Quessy S, Stanford TR, Stein BE. 2007. Multisensory integration shortens physiological response latencies. J Neurosci 27: 5879–5884.

Sathian K. 2000. Practice makes perfect: sharper tactile perception in the blind. Neurology 54: 2203–2204.

Sathian K. 2005. Visual cortical activity during tactile perception in the sighted and the visually deprived. Dev Psychobiol 46: 279–286.

Sathian K, Prather SC, Zhang M. 2004. Visual cortical involvement in normal tactile perception. In: The Handbook of Multisensory Processes (Calvert GA, Spence C, Stein BE, eds), pp 703–710. Cambridge, MA: MIT Press.

Schroeder CE, Foxe JJ. 2002. The timing and laminar profile of converging inputs to multisensory areas of the macaque neocortex. Brain Res Cogn Brain Res 14: 187–198.

Schroeder CE, Foxe JJ. 2004. Multisensory convergence in early cortical processing. In: The Handbook of Multisensory Processes (Calvert GA, Spence C, Stein BE, eds), pp 295–310. Cambridge, MA: MIT Press.

Schroeder CE, Lindsley RW, Specht C, Marcovici A, Smiley JF, Javitt DC. 2001. Somatosensory input to auditory association cortex in the macaque monkey. J Neurophysiol 85: 1322–1327.

Shams L, Kamitani Y, Shimojo S. 2004. Modulations of visual perception by sound. In: The Handbook of Multisensory Processes (Calvert GA, Spence C, Stein BE, eds), pp 27–34. Cambridge, MA: MIT Press.

Sparks DL. 1986. Translation of sensory signals into commands for control of saccadic eye movements: role of primate superior colliculus. Physiol Rev 66: 118–171.

Sparks DL, Nelson JS. 1987. Sensory and motor maps in the mammalian superior colliculus. Trends Neurosci 10: 312–317.

Stein BE, Magalhaes-Castro B, Kruger L. 1975. Superior colliculus: visuotopic-somatotopic overlap. Science 189: 224–226.

Stein BE. 1984. Development of the superior colliculus. Ann Rev Neurosci 7: 95–125.

Stein BE, Labos E, Kruger L. 1973a. Sequence of changes in properties of neurons of superior colliculus of the kitten during maturation. J Neurophysiol 36: 667–679.

Stein BE, Labos E, Kruger L. 1973b. Determinants of response latency in neurons of superior colliculus in kittens. J Neurophysiol 36: 680–689.

Stein BE, Magalhaes-Castro B, Kruger L. 1976. Relationship between visual and tactile representations in cat superior colliculus. J Neurophysiol 39: 401–419.

Stein BE, Dixon JP. 1979. Properties of superior colliculus neurons in the golden hamster. J Comp Neurol 183: 269–284.

Stein BE, Gallagher HL. 1981. Maturation of cortical control over superior colliculus cells in cat. Brain Res 223: 429–435.

Stein BE, Clamann HP. 1981. Control of pinna movements and sensorimotor register in cat superior colliculus. Brain Behav Evol 19: 180–192.

Stein BE, Meredith MA, Huneycutt WS, McDade L. 1989. Behavioral indices of multisensory integration: orientation to visual cues is affected by auditory stimuli. J Cogn Neurosci 1: 12–24.

Stein BE, Meredith MA. 1993. The Merging of the Senses. Cambridge, MA: MIT Press.

Stein BE, Meredith MA, Wallace MT. 1993. The visually responsive neuron and beyond: multisensory integration in cat and monkey. Prog Brain Res 95: 79–90.

Stein BE, London N, Wilkinson LK, Price DD. 1996. Enhancement of perceived visual intensity by auditory stimuli: a psychophysical analysis. J Cogn Neurosci 8: 497–506.

Stein BE, Wallace MT, Stanford TR, Jiang W. 2002. Cortex governs multisensory integration in the midbrain. Neuroscientist 8: 306–314.

Stein BE, Rowland BA. 2007. The critical role of cortico-collicular interactions in the development of multisensory integration. Program No. 614.7. 2007 Neuroscience Meeting Planner. San Diego, CA: Society for Neuroscience, 2007. Online.

Stein BE, Stanford TR. 2008. Multisensory integration: current issues from the perspective of the single neuron. Nat Rev Neurosci 9: 255–266.

Stein BE, Perrault TJ, Jr., Vaughan JW, Rowland BA. 2008. Long term plasticity of multisensory neurons in the superior colliculus. Program No. 457.14, 2008 Neuroscience Meeting Planner. Washington, DC: Society for Neuroscience, 2008. Online.

Stein BE, Stanford TR, Ramachandran R, Perrault TJ, Jr., Rowland BA. 2009. Challenges in quantifying multisensory integration: alternative criteria, models, and inverse effectiveness. Exp Brain Res 198(2–3): 113–126.

Sugihara T, Diltz MD, Averbeck BB, Romanski LM. 2006. Integration of auditory and visual communication information in the primate ventrolateral prefrontal cortex. J Neurosci 26: 11138–11147.

Sumby WH, Pollack I. 1954. Visual contribution to speech intelligibility in noise. J Acoust Soc Am 26: 212–215.

Talsma D, Doty TJ, Strowd R, Woldorff MG. 2006a. Attentional capacity for processing concurrent stimuli is larger across sensory modalities than within a modality. Psychophysiology 43: 541–549.

Talsma D, Kok A, Ridderinkhof KR. 2006b. Selective attention to spatial and non-spatial visual stimuli is affected differentially by age: effects on event-related brain potentials and performance data. Int J Psychophysiol 62: 249–261.

Wallace MT, Meredith MA, Stein BE. 1992. Integration of multiple sensory modalities in cat cortex. Exp Brain Res 91: 484–488.

Wallace MT, Meredith MA, Stein BE. 1993. Converging influences from visual, auditory, and somatosensory cortices onto output neurons of the superior colliculus. J Neurophysiol 69: 1797–1809.

Wallace MT, Stein BE. 1994. Cross-modal synthesis in the midbrain depends on input from cortex. J Neurophysiol 71: 429–432.

Wallace MT, Stein BE. 1997. Development of multisensory neurons and multisensory integration in cat superior colliculus. J Neurosci 17: 2429–2444.

Wallace MT, Stein BE. 2000. Onset of cross-modal synthesis in the neonatal superior colliculus is gated by the development of cortical influences. J Neurophysiol 83: 3578–3582.

Wallace MT, Stein BE. 2001. Sensory and multisensory responses in the newborn monkey superior colliculus. J Neurosci 21: 8886–8894.

Wallace MT. 2004. The development of multisensory integration. In: The Handbook of Multisensory Processes (Calvert GA, Spence C, Stein BE, eds), pp 625–642. Cambridge, MA: MIT Press.

Wallace MT, Ramachandran R, Stein BE. 2004. A revised view of sensory cortical parcellation. Proc Natl Acad Sci USA 101: 2167–2172.

Wallace MT, Carriere BN, Perrault TJ, Jr., Vaughan JW, Stein BE. 2006. The development of cortical multisensory integration. JNeurosci 26: 11844–11849.

Wallace MT, Stein BE. 2007. Early experience determines how the senses will interact. J Neurophysiol 97: 921–926.

Wallace MT, Wilkinson LK, Stein BE. 1996. Representation and integration of multiple sensory inputs in primate superior colliculus. J Neurophysiol 76: 1246–1266.

Weisser V, Stilla R, Peltier S, Hu X, Sathian K. 2005. Short-term visual deprivation alters neural processing of tactile form. Exp Brain Res 166: 572–582.

Werner H. 1973. Comparative Psychology of Mental Development. New York: International Universities Press.

Wickelgren BG, Sterling P. 1969. Influence of visual cortex on receptive fields in the superior colliculus of the cat. J Neurophysiol 32: 1–15.

Wilkinson LK, Meredith MA, Stein BE. 1996. The role of anterior ectosylvian cortex in cross-modality orientation and approach behavior. Exp Brain Res 112: 1–10.

Woldorff MG, Hazlett CJ, Fichtenholtz HM, Weissman DH, Dale AM, Song AW. 2004. Functional parcellation of attentional control regions of the brain. J Cogn Neurosci 16: 149–165.

Woods TM, Recanzone GH. 2004. Cross-modal interactions evidenced by the ventriloquism effect in humans and monkeys. In: The Handbook of Multisensory Processes (Calvert GA, Spence C, Stein BE, eds), pp 35–48. Cambridge, MA: MIT Press.

Wurtz RH, Albano JE. 1980. Two visual systems: brain mechanisms for localization and discrimination are dissociated by tectal and cortical lesions. Ann Rev Neurosci 3: 189–226.

Wurtz RH, Goldberg ME. 1971. Superior colliculus cell responses related to eye movements in awake monkeys. Science 171: 82–84.

Yu L, Rowland BA, Stein BE. 2009. Plasticity of multisensory neurons in adult superior colliculus: effects of repeated sequential visual and auditory stimuli. Program No. 847.3. 2009 Neuroscience Meeting Planner. Chicago, IL: Society for Neuroscience, 2009. Online.

Zangaladze A, Epstein CM, Grafton ST, Sathian K. 1999. Involvement of visual cortex in tactile discrimination of orientation. Nature 401: 587–590.

Zwiers MP, Van Opstal AJ, Paige GD. 2003. Plasticity in human sound localization induced by compressed spatial vision. Nat Neurosci 6: 175–181.