Rapid invisible frequency tagging reveals nonlinear integration of auditory and visual information

Cited by: 19
Authors
Drijvers, Linda [1 ,2 ]
Jensen, Ole [3 ]
Spaak, Eelke [4 ]
Affiliations
[1] Radboud Univ Nijmegen, Donders Inst Brain Cognit & Behav, Ctr Cognit, Montessorilaan 3, Nijmegen, Netherlands
[2] Max Planck Inst Psycholinguist, Nijmegen, Netherlands
[3] Univ Birmingham, Ctr Human Brain Hlth, Sch Psychol, Birmingham, W Midlands, England
[4] Radboud Univ Nijmegen, Donders Inst Brain Cognit & Behav, Ctr Cognit Neuroimaging, Kapittelweg 29, Nijmegen, Netherlands
Funding
EU Horizon 2020
Keywords
ASSR; audiovisual integration; frequency tagging; gesture; intermodulation frequency; magnetoencephalography; multimodal integration; oscillations; speech; SSVEP; steady-state responses; selective attention; neural integration; temporal cortex; iconic gestures; language; brain; recognition
DOI
10.1002/hbm.25282
Chinese Library Classification
Q189 [Neuroscience]
Subject classification code
071006
Abstract
During communication in real-life settings, the brain integrates information from auditory and visual modalities to form a unified percept of our environment. In the current magnetoencephalography (MEG) study, we used rapid invisible frequency tagging (RIFT) to generate steady-state evoked fields and investigated the integration of audiovisual information in a semantic context. We presented participants with videos of an actress uttering action verbs (auditory; tagged at 61 Hz) accompanied by a gesture (visual; tagged at 68 Hz, using a projector with a 1,440 Hz refresh rate). Integration difficulty was manipulated by lower-order auditory factors (clear/degraded speech) and higher-order visual factors (congruent/incongruent gesture). We identified MEG spectral peaks at the individual tagging frequencies (61 and 68 Hz). We furthermore observed a peak at the intermodulation frequency of the auditory and visually tagged signals (f_visual − f_auditory = 7 Hz), specifically when lower-order integration was easiest because signal quality was optimal. This intermodulation peak is a signature of nonlinear audiovisual integration and was strongest in left inferior frontal gyrus and left temporal regions, areas known to be involved in speech-gesture integration. The enhanced power at the intermodulation frequency thus reflects the ease of lower-order audiovisual integration and demonstrates that speech-gesture information interacts in higher-order language areas. Furthermore, we provide a proof of principle for the use of RIFT to study the integration of audiovisual stimuli in relation to, for instance, semantic context.
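
The abstract's key analysis idea, that a nonlinear interaction between two frequency-tagged inputs produces power at their difference frequency, follows directly from the product-to-sum identity sin(a)sin(b) = 0.5[cos(a−b) − cos(a+b)]. The sketch below is not from the paper; the sampling rate, signal duration, and multiplicative interaction term are illustrative assumptions. It simulates the study's 61 Hz and 68 Hz tagging frequencies in Python/NumPy and shows that an intermodulation peak at 7 Hz appears only when the two drives combine nonlinearly:

import numpy as np

fs = 1200.0                    # sampling rate in Hz (illustrative assumption)
t = np.arange(0, 10, 1 / fs)   # 10 s of simulated signal
f_aud, f_vis = 61.0, 68.0      # tagging frequencies used in the study

aud = np.sin(2 * np.pi * f_aud * t)   # auditory-tagged drive
vis = np.sin(2 * np.pi * f_vis * t)   # visual-tagged drive

# Linear superposition: spectrum contains peaks at 61 and 68 Hz only.
linear = aud + vis
# Nonlinear integration, modeled here as a simple multiplicative term:
# sin(a)sin(b) = 0.5*[cos(a-b) - cos(a+b)], so the product adds power at
# f_vis - f_aud = 7 Hz and f_vis + f_aud = 129 Hz.
nonlinear = aud + vis + 0.2 * aud * vis

freqs = np.fft.rfftfreq(len(t), 1 / fs)
for name, sig in (("linear", linear), ("nonlinear", nonlinear)):
    amp = np.abs(np.fft.rfft(sig)) / len(sig)   # amplitude spectrum
    for f in (7.0, 61.0, 68.0, 129.0):
        i = np.argmin(np.abs(freqs - f))        # nearest frequency bin
        print(f"{name:9s} {f:6.1f} Hz: {amp[i]:.4f}")

Running this prints nonzero amplitude at 7 Hz (and 129 Hz) for the nonlinear mixture only. In the MEG data, power at f_visual − f_auditory plays the same role as the 7 Hz peak here, which is why its presence is taken as a signature of nonlinear audiovisual integration rather than mere co-occurrence of the two tagged responses.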
Pages: 1138-1152
Page count: 15
Related papers
50 items in total
  • [1] Using Invisible Rapid Frequency Tagging to Investigate Attention and Speech-Gesture Integration
    Jensen, Ole
    INTERNATIONAL JOURNAL OF PSYCHOPHYSIOLOGY, 2021, 168: S79-S79
  • [2] Nonlinear integration of visual and auditory motion information for human control of posture
    Kitazaki, M
    Kohyama, L
    PERCEPTION, 2005, 34: 83-83
  • [3] Optimal parameters for rapid (invisible) frequency tagging using MEG
    Minarik, Tamas
    Berger, Barbara
    Jensen, Ole
    NEUROIMAGE, 2023, 281
  • [4] Application of rapid invisible frequency tagging for brain computer interfaces
    Brickwedde, Marion
    Bezsudnova, Yulia
    Kowalczyk, Anna
    Jensen, Ole
    Zhigalov, Alexander
    JOURNAL OF NEUROSCIENCE METHODS, 2022, 382
  • [5] Sensory integration of auditory and visual information
    Dougherty, WG
    Jones, GB
    Engel, GR
    CANADIAN JOURNAL OF PSYCHOLOGY, 1971, 25(6): 476+
  • [6] EEG frequency tagging reveals the integration of dissimilar observed actions
    Formica, Silvia
    Chaiken, Anna
    Wiersema, Jan R.
    Cracco, Emiel
    CORTEX, 2024, 181: 204-215
  • [7] Integration of auditory and visual information in the recognition of realistic objects
    Suied, Clara
    Bonneel, Nicolas
    Viaud-Delmon, Isabelle
    EXPERIMENTAL BRAIN RESEARCH, 2009, 194(1): 91-102
  • [8] Integration of auditory information in the cat's visual cortex
    Fishman, MC
    Michael, CR
    VISION RESEARCH, 1973, 13(8): 1415-1419
  • [9] Automatic integration of auditory and visual information is not simultaneous in Chinese
    Huang Mingjin
    Hasko, Sandra
    Schulte-Koerne, Gerd
    Bruder, Jennifer
    NEUROSCIENCE LETTERS, 2012, 527(1): 22-27