Key Results

We have tested the functional role of dynamic binding for crossmodal interactions in humans using behavioral studies as well as EEG and MEG studies in combination with advanced source modelling approaches and neural interaction measures. In several studies, we used a newly developed multisensory paradigm involving visuo-tactile interactions. A first behavioral study showed that stimulus congruence resulted in improvements in visuo-tactile pattern matching. A subsequent EEG study identified task-related power changes, showing that oscillatory power in the theta-, alpha- and beta-bands is functionally relevant for multisensory pattern matching. For analysis of functional connectivity patterns related to multisensory integration, we adapted an approach for purely data-driven analysis of neuronal coupling in source space that has recently been developed within our group. We identified several clusters of interacting sources that synchronized in the beta-band. Two of these clusters suggested an involvement in crossmodal sensory processing, whereas the third cluster appeared to reflect decision-related processes. By directly relating coupling features to task performance, we were able to demonstrate that the phase of neural coherence within the observed networks predicted behavior.
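To make the notion of phase coupling between sources concrete, the following is a minimal, purely illustrative sketch of one standard phase-coupling measure, the phase-locking value (PLV), computed via the Hilbert transform. It is not the analysis pipeline used in our studies; the signals, sampling rate, and noise level are assumed for demonstration only.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """Phase-locking value between two band-limited signals.

    PLV = |mean(exp(i * (phi_x - phi_y)))|, ranging from 0
    (no consistent phase relation) to 1 (perfect phase locking).
    """
    phi_x = np.angle(hilbert(x))  # instantaneous phase of x
    phi_y = np.angle(hilbert(y))  # instantaneous phase of y
    return np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))

# Two synthetic beta-band (20 Hz) signals with a fixed phase lag
# plus additive noise (illustrative values, not recorded data).
rng = np.random.default_rng(0)
t = np.arange(0, 2, 1 / 500)  # 2 s at 500 Hz
x = np.sin(2 * np.pi * 20 * t) + 0.5 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * 20 * t - np.pi / 4) + 0.5 * rng.standard_normal(t.size)

plv = phase_locking_value(x, y)  # high for phase-locked signals
```

In practice, signals would first be band-pass filtered to the frequency band of interest, and source-space analyses additionally correct for field spread; both steps are omitted here for brevity.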

A major focus of this project was to address top-down modulation of multisensory processing. Specifically, we studied the interaction of top-down factors such as spatial attention or stored priors with bottom-up information related to the nature of the stimuli. In a number of behavioral studies we could demonstrate top-down modulation of multisensory experience by visual scene context, activation of the linguistic system, subliminal priming, and emotional aspects. This demonstrates a tight integration of sensory, cognitive and motor systems and necessitates the investigation of the whole system in an ecologically valid context. In a series of behavioral experiments we used a dual-task design to address the role of attentional processes in visual-auditory, visuo-tactile and tactile-vestibular interactions. We could demonstrate that the different modalities depend at least partly on shared attentional resources. This, however, depended on the task demands, e.g. with respect to spatial properties. This led to the hypothesis that the attentional bottleneck might be better described not as a competition between different modalities, but as a competition between different tasks. Furthermore, even in conditions with high attentional demands due to a dual task, the integration of multisensory information was optimal. Indeed, even task-irrelevant tactile information influenced visual exploratory behaviour. Furthermore, the spatial properties of this multimodal interaction emerged early on in a joint allocentric reference frame. Thus, multisensory integration appears to follow optimal Bayesian principles even at high attentional load.
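"Optimal" in the Bayesian sense above refers to reliability-weighted cue combination: each cue is weighted by its inverse variance, so the fused estimate is both pulled toward the more reliable cue and more precise than either cue alone. A minimal sketch of this standard combination rule, with purely illustrative numbers rather than values from our studies:

```python
def fuse_cues(mu1, var1, mu2, var2):
    """Maximum-likelihood fusion of two Gaussian cue estimates.

    Each cue is weighted by its reliability (inverse variance);
    the fused variance is smaller than either single-cue variance.
    """
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    mu = (w1 * mu1 + w2 * mu2) / (w1 + w2)
    var = 1.0 / (w1 + w2)
    return mu, var

# A precise visual cue and a noisier tactile cue about the same
# location (hypothetical numbers for illustration).
mu, var = fuse_cues(mu1=0.0, var1=1.0, mu2=2.0, var2=4.0)
# mu = 0.4  (pulled toward the more reliable visual cue)
# var = 0.8 (smaller than either single-cue variance)
```

Behavioral optimality is typically tested by comparing observed bimodal performance against the variance predicted by this rule from the unimodal conditions.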

We used a novel multisensory attention paradigm to investigate attention-modulated cortical oscillations over a wide range of frequencies using MEG. We found that attention led to increased high-frequency gamma-band activity and decreased lower-frequency theta-, alpha-, and beta-band activity in early sensory cortex areas. Moreover, alpha-band coherence decreased in visual cortex. Frontal cortex was found to exert attentional control through increased low-frequency phase synchronisation. Furthermore, crossmodal congruence modulated beta-band coherence in mid-cingulate and superior temporal cortex. This crossmodal matching paradigm was extended further to include simultaneous stimulation of vision, audition, and somatosensation in order to investigate the interaction between attention and crossmodal congruence. The largest crossmodal effects were seen for visual-tactile matching, intermediate effects for audio-visual matching, and the smallest effects for audio-tactile matching. We conclude that differences in crossmodal matching likely reflect characteristics of multisensory neural network architecture. In a separate line of studies we investigated interactions between visual and pain stimuli: "Don't look and it won't hurt" is commonly heard advice when receiving an injection, which implies that observing needle pricks enhances pain perception. It was previously unknown how both previous experiences and acute situational expectations related to viewing needle pricks modulate pain perception. We obtained evidence that remote painful experiences with viewing needle pricks, together with information given prior to an injection, differentially shape the impact of viewing a needle prick on pain perception. In a subsequent EEG study we investigated whether anticipatory oscillatory activity predicts the unpleasantness of pain stimuli in a manner related to visual context.
We found an anticipatory reduction of alpha-band activity in cingulate cortex and fusiform gyrus which may reflect a neural mechanism that serves to protect the body from forthcoming harm by facilitating the preparation of adequate defense responses.

Another major theme of this project was to investigate multisensory perception-action coupling. We hypothesized that multisensory experience relates to sensorimotor contingencies coupled across multiple input channels. We studied visual-auditory interactions in the context of action generation and the interaction of a newly acquired sense with the natural modalities. This was combined with neurophysiological methods, aiming to identify interaction patterns and networks underlying multisensory and sensorimotor integration. Action-perception coupling is most prominent in the study of eye movements. In this project we investigated the relative influence of stimulus-dependent properties, task-related properties and geometrical constraints. These studies demonstrated and characterised the specific influence of each of the three factors on oculomotor behavior. Furthermore, we could demonstrate the influence of salient motion features in the guidance of eye movements. We used MEG and source analysis to investigate the spectral signatures of human cortical networks engaged in active and intrinsically motivated viewing behavior. Our results suggest that neuronal population activity in the gamma frequency band in a distributed network of fronto-parietal areas reflects the intrinsically driven process of selection among competing behavioral alternatives. Furthermore, we investigated the sense of agency in a multisensory mixed-reality setup in which participants tracked their finger movements by means of a virtual hand. Our results indicate different contributions of movement- versus outcome-related sensory feedback to the sense of agency, irrespective of the sensory modality providing the feedback on the outcome.

An outstanding feature of the human brain is its adaptivity. Indeed, in this project we could demonstrate that unimodal processing in visual cortex is largely concerned not with a static representation, but with changes of the sensory signals. We expect that this relates to multisensory processing as well, which, however, is left for future investigations. Furthermore, training of recognition tasks leads to a shift of the engaged cortical areas from the frontal lobe towards more posterior regions. Simultaneously, the recognition process speeds up considerably. These results demonstrate the adaptivity of the human brain to the specific requirements of the current task. In several studies addressing the results of crossmodal developmental plasticity, we characterised neuronal population activity in visual and auditory cortex of congenitally blind humans. We recorded MEG signals during ongoing activity or during a complex cognitive task involving semantic categorisation of meaningful sounds. Our results suggest that high-frequency oscillatory activity reflects non-visual processing in the visual cortex of blind individuals. Moreover, our results provide evidence that the deprived visual cortex is functionally integrated into a larger network that serves non-visual functions. Furthermore, our data show that the visual cortex of the congenitally blind exhibits a characteristic gain in frequency-specific intrinsic neuronal interactions. Crossmodal reorganisation associated with changes in large-scale functional connectivity was also investigated in the blind in an extensive study using working memory training in the auditory and tactile modalities. In the blind, beta-band connectivity increased between brain areas involved in auditory working memory and the visual cortex, particularly the right fusiform face area, suggesting a task-specific integration of visual cortex.
Additional work in the blind also addressed effects of crossmodal developmental plasticity on remapping of reference frames for tactile processing. 


Main Conclusions
  • We have obtained neurophysiological evidence demonstrating that oscillatory brain signals in multiple frequency ranges are modulated by multisensory stimulation and are functionally relevant for implementing interactions across sensory systems in the brain. Importantly, we could also demonstrate that this involves changes in functional connectivity, mediated by phase coupling of neural oscillations.
  • Our results show that factors influencing crossmodal interactions are manifold and operate in a stimulus-driven, bottom-up fashion, as well as via top-down control. Our results extend previous findings from audio-visual studies, showing that stimulus congruence also resulted in behavioral improvements in visuo-tactile interactions. The interplay of stimulus processing and attentional control seems to be organised in a highly flexible fashion, with the integration of signals depending on both bottom-up and top-down factors, rather than occurring in an "all-or-nothing" manner.
  • Our data on sensorimotor interactions suggest that multisensory interaction and integration can be best understood within an embodied perspective, tying together sensory processing and behaviour in a unified framework. 
  • We could demonstrate top-down modulation of multisensory experience by visual scene context, activation of the linguistic system, subliminal priming, and emotional aspects. These studies included multisensory interactions relevant in clinical settings, such as the modulation of pain by visual contexts that influence pain-predictive expectations or by emotional contextual stimuli in music therapeutic scenarios. These results demonstrate a tight integration of sensory, cognitive and motor systems and the necessity of investigating the whole system in an ecologically valid context.
  • In a set of studies on sensory augmentation we could demonstrate that newly acquired senses are integrated with the classic modalities. Here, learning and plasticity play an important role. We observed sensory processing changes with extended training, which are reflected by concomitant changes in the brain areas involved.
  • Our investigations on adaptivity of multisensory processing show that developmental changes strongly shape the functional dynamics of the cortical networks. In blind participants, we could show that, due to the altered postnatal experience, functional connectivity within occipital cortex and between different cortical systems is profoundly altered. Furthermore, training of working memory in tactile and auditory domains shapes cortical dynamics and functional connectivity in the blind in ways profoundly different from sighted participants.