Multisensory processes include the capacity to combine information from the different senses, often improving stimulus representations and behavior. The extent to which multisensory processes are an innate capacity or instead require experience with environmental stimuli remains debated. We addressed this knowledge gap by studying multisensory processes in prematurely born and full-term infants. We recorded 128-channel event-related potentials (ERPs) from a cohort of 55 full-term and 61 preterm neonates (at an equivalent gestational age) in response to auditory, somatosensory, and combined auditory-somatosensory multisensory stimuli. Data were analyzed within an electrical neuroimaging framework, involving unsupervised topographic clustering of the ERP data. Multisensory processing in full-term infants was characterized by a simple linear summation of responses to auditory and somatosensory stimuli alone, which furthermore shared common ERP topographic features. We refer to the ERP ...
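The additive test described above (comparing the multisensory response AS against the sum of the unisensory responses A + S) can be sketched as follows. This is a minimal illustration with simulated arrays; the trial counts, noise levels, and variable names are hypothetical stand-ins, not the study's actual analysis pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ERP data: trials x channels x time samples.
# The study used 128-channel recordings; sizes here are placeholders.
n_trials, n_channels, n_times = 40, 128, 200
erp_aud = rng.normal(size=(n_trials, n_channels, n_times))
erp_som = rng.normal(size=(n_trials, n_channels, n_times))
# Simulated multisensory response built as a near-linear summation.
erp_multi = erp_aud + erp_som + rng.normal(scale=0.1,
                                           size=(n_trials, n_channels, n_times))

# Additive model: compare the mean multisensory ERP with the sum of
# the mean unisensory ERPs (AS vs. A + S).
mean_sum = erp_aud.mean(axis=0) + erp_som.mean(axis=0)
mean_multi = erp_multi.mean(axis=0)

# Nonlinear interaction = deviation from linear summation, per channel
# and time point; values near zero are consistent with a linear model.
interaction = mean_multi - mean_sum
print(interaction.shape)  # (128, 200)
```

In practice such interaction terms would be assessed statistically across participants rather than eyeballed, but the contrast itself is just this subtraction.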
Cerebral cortex (New York, N.Y. : 1991), Jan 20, 2018
The perception of an acoustic rhythm is invariant to the absolute temporal intervals constituting a sound sequence. It is unknown where in the brain temporal Gestalt, the percept emerging from the relative temporal proximity between acoustic events, is encoded. Two different relative temporal patterns, each induced by three experimental conditions with different absolute temporal patterns as sensory basis, were presented to participants. A linear support vector machine classifier was trained to differentiate activation patterns in functional magnetic resonance imaging data to the two different percepts. Across the sensory constituents the classifier decoded which percept was perceived. A searchlight analysis localized activation patterns specific to the temporal Gestalt bilaterally to the temporoparietal junction, including the planum temporale and supramarginal gyrus, and unilaterally to the right inferior frontal gyrus (pars opercularis). We show that auditory areas not only process...
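The decoding approach described above (a linear support vector machine trained on activation patterns) can be sketched with scikit-learn. The data, labels, and injected signal below are simulated placeholders for illustration only, not the study's fMRI data or its exact cross-validation scheme.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Hypothetical activation patterns: samples x voxels.
# Two percepts (temporal Gestalts); labels encode the percept only.
n_samples, n_voxels = 60, 500
X = rng.normal(size=(n_samples, n_voxels))
y = np.repeat([0, 1], n_samples // 2)  # percept labels
X[y == 1, :20] += 0.8                  # injected signal, for illustration

# Linear SVM, as commonly used for MVPA-style decoding.
clf = SVC(kernel="linear", C=1.0)
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())
```

A searchlight analysis repeats this kind of classification within a small sphere of voxels centered on each brain location in turn, mapping where decoding succeeds.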
Most evidence on the neural and perceptual correlates of sensory processing derives from studies that have focused on only a single sensory modality and averaged the data from groups of participants. Although valuable, such studies ignore the substantial interindividual and intraindividual differences that are undoubtedly at play. Such variability plays an integral role in both the behavioral/perceptual realms and in the neural correlates of these processes, but substantially less is known when compared with group-averaged data. Recently, it has been shown that the presentation of stimuli from two or more sensory modalities (i.e., multisensory stimulation) not only results in the well-established performance gains but also gives rise to reductions in behavioral and neural response variability. To better understand the relationship between neural and behavioral response variability under multisensory conditions, this study investigated both behavior and brain activity in a task requi...
Several studies indicate that the outcome of nutritional and lifestyle interventions can be linked to brain 'signatures' in terms of neural reactivity to food cues. However, 'dieting' is often considered in a rather broad sense, and no study so far has investigated modulations in brain responses to food cues over an intervention specifically aiming to reduce sugar intake. We studied neural activity and liking in response to visual food cues in 14 heavy consumers of sugar-sweetened beverages before and after a 3-month period during which these beverages were replaced by artificially sweetened equivalents. Each time, participants were presented with images of solid foods differing in fat content and taste quality while high-density electroencephalography was recorded. Contrary to our hypotheses, there was no significant weight loss over the intervention period, and no changes were observed in food liking or in neural activity in regions subserving salience and reward attribution. However, n...
Space is a dimension shared by different modalities, but at what stage spatial encoding is affected by multisensory processes is unclear. Early studies observed attenuation of N1/P2 auditory evoked responses following repetition of sounds from the same location. Here, we asked whether this effect is modulated by audiovisual interactions. In two experiments, using a repetition-suppression paradigm, we presented pairs of tones in free field, where the test stimulus was a tone presented at a fixed lateral location. Experiment 1 established a neural index of auditory spatial sensitivity, by comparing the degree of attenuation of the response to test stimuli when they were preceded by an adapter sound at the same location versus 30° or 60° away. We found that the degree of attenuation at the P2 latency was inversely related to the spatial distance between the test stimulus and the adapter stimulus. In Experiment 2, the adapter stimulus was a tone presented from the same location or a more medial location than the test stimulus. The adapter stimulus was accompanied by a simultaneous flash displayed orthogonally from one of the two locations. Sound-flash incongruence reduced accuracy in a same-different location discrimination task (i.e., the ventriloquism effect) and reduced the location-specific repetition suppression at the P2 latency. Importantly, this multisensory effect included topographic modulations, indicative of changes in the relative contribution of underlying sources across conditions. Our findings suggest that the auditory response at the P2 latency is affected by spatially selective brain activity, which is affected crossmodally by visual information.
Archives of physical medicine and rehabilitation, Jan 5, 2017
To evaluate the effects of electrically assisted movement therapy (EAMT) in which patients use functional electrical stimulation, modulated by a custom device controlled through the patient's unaffected hand, to produce or assist task-specific upper limb movements, which enables them to engage in intensive goal-oriented training. Randomized, crossover, assessor-blinded, 5-week trial with follow-up at 18 weeks. Rehabilitation university hospital. Patients with chronic, severe stroke (N=11; mean age, 47.9y) more than 6 months poststroke (mean time since event, 46.3mo). Both EAMT and the control intervention (dose-matched, goal-oriented standard care) consisted of 10 sessions of 90 minutes per day, 5 sessions per week, for 2 weeks. After the first 10 sessions, group allocation was crossed over, and patients received a 1-week therapy break before receiving the new treatment. Fugl-Meyer Motor Assessment for the Upper Extremity, Wolf Motor Function Test, spasticity, and 28-item Motor ...
Every year, 15 million preterm infants are born, and most spend their first weeks in neonatal intensive care units (NICUs) [1]. Although essential for the support and survival of these infants, NICU sensory environments are dramatically different from those in which full-term infants mature and thus likely impact the development of functional brain organization [2]. Yet the integrity of sensory systems determines effective perception and behavior [3, 4]. In neonates, touch is a cornerstone of interpersonal interactions and sensory-cognitive development [5-7]. NICU treatments used to improve neurodevelopmental outcomes rely heavily on touch [8]. However, we understand little of how brain maturation at birth (i.e., prematurity) and quality of early-life experiences (e.g., supportive versus painful touch) interact to shape the development of the somatosensory system [9]. Here, we identified the spatial, temporal, and amplitude characteristics of cortical responses to light touch that d...
Diagnosing heart conditions by auscultation is an important clinical skill commonly learnt by medical students. Clinical proficiency for this skill is in decline [1], and new teaching methods are needed. Successful discrimination of heartbeat sounds is believed to benefit mainly from acoustical training [2]. From recent studies of auditory training [3,4] we hypothesized that semantic representations outside the auditory cortex contribute to diagnostic accuracy in cardiac auscultation. To test this hypothesis, we analysed auditory evoked potentials (AEPs) which were recorded from medical students while they diagnosed quadruplets of heartbeat cycles. The comparison of trials with correct (Hits) versus incorrect diagnosis (Misses) revealed a significant difference in brain activity at 280-310 ms after the onset of the second cycle within the left middle frontal gyrus (MFG) and the right prefrontal cortex. This timing and locus suggest that semantic rather than acoustic representations ...
An object's motion relative to an observer can confer ethologically meaningful information. Approaching or looming stimuli can signal threats/collisions to be avoided or prey to be confronted, whereas receding stimuli can signal successful escape or failed pursuit. Using movement detection and subjective ratings, we investigated the multisensory integration of looming and receding auditory and visual information by humans. While prior research has demonstrated a perceptual bias for unisensory and more recently multisensory looming stimuli, none has investigated whether there is integration of looming signals between modalities. Our findings reveal selective integration of multisensory looming stimuli. Performance was significantly enhanced for looming stimuli over all other multisensory conditions. Contrasts with static multisensory conditions indicate that only multisensory looming stimuli resulted in facilitation beyond that induced by the sheer presence of auditory-visual stimuli. Controlling for variation in physical energy replicated the advantage for multisensory looming stimuli. Finally, only looming stimuli exhibited a negative linear relationship between enhancement indices for detection speed and for subjective ratings. Maximal detection speed was attained when motion perception was already robust under unisensory conditions. The preferential integration of multisensory looming stimuli highlights that complex ethologically salient stimuli likely require synergistic cooperation between existing principles of multisensory integration. A new conceptualization of the neurophysiologic mechanisms mediating real-world multisensory perceptions and action is therefore supported.
We propose and demonstrate a novel method for analyzing human EEG at a single-subject and single-trial level. We focus here on the analysis of data from an auditory object recognition experiment. The analysis is based on the topographic information that can be ...
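A core quantity in topographic EEG analyses of this kind is global map dissimilarity (DISS), computed between strength-normalized scalp maps so that map shape is compared independently of response strength. The sketch below uses simulated maps and a placeholder channel count, not the paper's actual data or full method.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two hypothetical scalp maps (e.g., one per condition), one value
# per electrode. The channel count is a placeholder.
n_channels = 64
map_a = rng.normal(size=n_channels)
map_b = map_a + rng.normal(scale=0.3, size=n_channels)

def normalize(v):
    """Average-reference a map and scale it to unit global field power."""
    v = v - v.mean()                 # average reference
    gfp = np.sqrt((v ** 2).mean())   # global field power (RMS across channels)
    return v / gfp

# Global map dissimilarity: 0 = identical topographies, 2 = inverted.
diss = np.sqrt(((normalize(map_a) - normalize(map_b)) ** 2).mean())
print(diss)
```

Because the maps are normalized before subtraction, a nonzero DISS implies a difference in map configuration, which in turn suggests (at least partly) different underlying generators.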
https://doi.org/10.1162/089892900562372, Mar 13, 2006
Object recognition is achieved even in circumstances when only partial information is available to the observer. Perceptual closure processes are essential in enabling such recognitions to occur. We presented successively less fragmented images while recording high-density event-related potentials (ERPs), which permitted us to monitor brain activity during the perceptual closure processes leading up to object recognition. We reveal a bilateral ERP component (Ncl) that tracks these processes (onsets ∼230 msec, maximal at ∼290 msec). Scalp-current density mapping of the Ncl revealed bilateral occipito-temporal scalp foci, which are consistent with generators in the human ventral visual stream, and specifically the lateral-occipital or LO complex as defined by hemodynamic studies of object recognition.
Approaching or looming sounds (L-sounds) have been shown to selectively increase visual cortex excitability [Romei, V., Murray, M. M., Cappe, C., & Thut, G. Preperceptual and stimulus-selective enhancement of low-level human visual cortex excitability by sounds. Current Biology, 19, 1799–1805, 2009]. These cross-modal effects start at an early, preperceptual stage of sound processing and persist with increasing sound duration. Here, we identified individual factors contributing to cross-modal effects on visual cortex excitability and studied the persistence of effects after sound offset. To this end, we probed the impact of different L-sound velocities on phosphene perception postsound as a function of individual auditory versus visual preference/dominance using single-pulse TMS over the occipital pole. We found that the boosting of phosphene perception by L-sounds continued for several tens of milliseconds after the end of the L-sound and was temporally sensitive to different L-sound profiles (velocities). In addition, we found that this depended on an individual's preferred sensory modality (auditory vs. visual) as determined through a divided attention task (attentional preference), but not on their simple threshold detection level per sensory modality. Whereas individuals with "visual preference" showed enhanced phosphene perception irrespective of L-sound velocity, those with "auditory preference" showed differential peaks in phosphene perception whose delays after sound offset followed the different L-sound velocity profiles. These novel findings suggest that looming signals modulate visual cortex excitability beyond sound duration, possibly to support prompt identification and reaction to potentially dangerous approaching objects. The observed interindividual differences favor the idea that, unlike early effects, this late L-sound impact on visual cortex excitability is influenced by cross-modal attentional mechanisms rather than low-level sensory processes.
This talk summarizes neural bases of multisensory processes in adults alongside improvements in EEG, fMRI, and TMS signal analysis methods. First, primary and other low-level cortices are loci of behaviourally-relevant multisensory processes. Second, multisensory processes based on single-trial learning can enhance later object recognition. Together, these data underscore how multisensory research is changing long-held models of functional brain organization and perception.