Predictive coding of visual-auditory and motor-auditory events: An electrophysiological study

Cited by: 19
Authors
Stekelenburg, Jeroen J. [1 ]
Vroomen, Jean [1 ]
Affiliations
[1] Tilburg Univ, Dept Cognit Neuropsychol, NL-5000 LE Tilburg, Netherlands
Keywords
Predictive coding; Stimulus omission; Visual-auditory; Motor-auditory; Event-related potentials; MULTISENSORY INTERACTIONS; SPEECH; CORTEX; INTEGRATION; CONFLICT; NEGATIVITY; ACTIVATION; OMISSIONS; RESPONSES; MMN
DOI
10.1016/j.brainres.2015.01.036
Chinese Library Classification
Q189 [Neuroscience]
Discipline code
071006
Abstract
The amplitude of auditory components of the event-related potential (ERP) is attenuated when sounds are self-generated compared to externally generated sounds. This effect has been ascribed to internal forward models predicting the sensory consequences of one's own motor actions. Auditory potentials are also attenuated when a sound is accompanied by a video of anticipatory visual motion that reliably predicts the sound. Here, we investigated whether the neural underpinnings of prediction of upcoming auditory stimuli are similar for motor-auditory (MA) and visual-auditory (VA) events using a stimulus omission paradigm. In the MA condition, a finger tap triggered the sound of a handclap, whereas in the VA condition the same sound was accompanied by a video showing the handclap. In both conditions, the auditory stimulus was omitted in either 50% or 12% of the trials. These auditory omissions induced early and mid-latency ERP components (oN1 and oN2, presumably reflecting prediction and prediction error) and subsequent higher-order error evaluation processes. The oN1 and oN2 of MA and VA were alike in amplitude, topography, and neural sources, even though the predictions originate in different brain areas (motor versus visual cortex). This suggests that MA and VA predictions activate a sensory template of the sound in auditory cortex. This article is part of a Special Issue entitled SI: Prediction and Attention. (C) 2015 Elsevier B.V. All rights reserved.
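The stimulus omission protocol described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not code from the study: the function name, block size, and random seeding are assumptions; only the two omission rates (50% and 12%) and the sound/omission trial types come from the abstract.

```python
import random

def make_trial_sequence(n_trials, omission_rate, seed=0):
    """Generate a randomized trial list for one condition (MA or VA).

    Each trial is either 'sound' (the handclap sound is delivered)
    or 'omission' (the expected sound is withheld). omission_rate is
    the proportion of omission trials (0.5 or 0.12 in the paradigm).
    """
    n_omit = round(n_trials * omission_rate)
    trials = ["omission"] * n_omit + ["sound"] * (n_trials - n_omit)
    rng = random.Random(seed)  # fixed seed for a reproducible order
    rng.shuffle(trials)
    return trials

# Hypothetical rare-omission block: 100 trials, 12% omitted
block = make_trial_sequence(100, 0.12)
print(block.count("omission"))  # 12
```

ERPs time-locked to the omission trials of such a sequence would then isolate prediction-related activity (oN1, oN2), since no physical sound is presented on those trials.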
Pages: 88-96
Page count: 9