Multisensory Perception - Integrative Research

Our multiple sensory modalities offer great behavioural flexibility, as we can selectively combine or segregate the evidence provided by each sense. For example, visual cues from a speaker's face make us 'hear better' in noisy environments, while acoustic cues help us 'see better' on a foggy day. The proper interplay between the senses is critical for perception, and deficits in multisensory integration are linked to cognitive disorders and to sensory-motor challenges during aging. We therefore believe it is imperative to understand how our brain merges multisensory information into a coherent percept.

We use an integrative approach, combining methods from electrophysiology (single-unit recordings), functional imaging (M/EEG), psychophysics and computational modelling (data analysis, simulations). In this way we aim to bridge different levels of organization of brain function and to provide a better understanding of the neural mechanisms implementing sensory-cognitive processes.



News

eLife: Audio-visual speech integration. Giordano et al., in press.

NeuroImage: Sensory integration in decision making. Kayser et al., 2017.