Predictive coding of visual-auditory and motor-auditory events: An electrophysiological study

Cited by: 19
Authors
Stekelenburg, Jeroen J. [1]
Vroomen, Jean [1]
Affiliations
[1] Tilburg Univ, Dept Cognit Neuropsychol, NL-5000 LE Tilburg, Netherlands
Keywords
Predictive coding; Stimulus omission; Visual-auditory; Motor-auditory; Event-related potentials; MULTISENSORY INTERACTIONS; SPEECH; CORTEX; INTEGRATION; CONFLICT; NEGATIVITY; ACTIVATION; OMISSIONS; RESPONSES; MMN;
DOI
10.1016/j.brainres.2015.01.036
CLC number (Chinese Library Classification)
Q189 [Neuroscience];
Discipline code
071006;
Abstract
The amplitude of auditory components of the event-related potential (ERP) is attenuated when sounds are self-generated compared to externally generated sounds. This effect has been ascribed to internal forward models predicting the sensory consequences of one's own motor actions. Auditory potentials are also attenuated when a sound is accompanied by a video of anticipatory visual motion that reliably predicts the sound. Here, we investigated whether the neural underpinnings of prediction of upcoming auditory stimuli are similar for motor-auditory (MA) and visual-auditory (VA) events using a stimulus omission paradigm. In the MA condition, a finger tap triggered the sound of a handclap, whereas in the VA condition the same sound was accompanied by a video showing the handclap. In both conditions, the auditory stimulus was omitted in either 50% or 12% of the trials. These auditory omissions induced early and mid-latency ERP components (oN1 and oN2, presumably reflecting prediction and prediction error) and subsequent higher-order error-evaluation processes. The oN1 and oN2 of MA and VA were alike in amplitude, topography, and neural sources, even though the predictions originate in different brain areas (motor versus visual cortex). This suggests that MA and VA predictions activate a sensory template of the sound in auditory cortex. This article is part of a Special Issue entitled SI: Prediction and Attention. (C) 2015 Elsevier B.V. All rights reserved.
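As an illustration of the omission paradigm described in the abstract, the short Python sketch below shows how one block of MA or VA trials with a fixed omission rate (12% or 50%) might be generated and shuffled. It is purely hypothetical: the 200-trial block length, the build_trial_list helper, and the trial dictionary fields are assumptions for illustration, not details taken from the study.

    import random

    def build_trial_list(condition, n_trials=200, omission_rate=0.12, seed=0):
        """Build one shuffled block of trials for the omission paradigm.

        condition     -- "MA" (finger tap triggers handclap sound) or
                         "VA" (handclap video accompanies the sound)
        omission_rate -- proportion of trials on which the sound is omitted
                         (0.12 or 0.50 in the study)
        n_trials/seed -- illustrative assumptions, not taken from the paper
        """
        n_omitted = round(n_trials * omission_rate)
        # The first n_omitted trials carry no sound; the remainder do.
        trials = [{"condition": condition, "sound": i >= n_omitted}
                  for i in range(n_trials)]
        random.Random(seed).shuffle(trials)
        return trials

    # Example: a rare-omission MA block. The omission trials are the ones whose
    # ERPs would be averaged to isolate the prediction-related oN1/oN2 components.
    ma_block = build_trial_list("MA", n_trials=200, omission_rate=0.12)
    print(sum(not t["sound"] for t in ma_block), "omission trials out of", len(ma_block))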
Pages: 88-96
Page count: 9
Related papers
50 records in total
  • [1] Electrophysiological alterations in motor-auditory predictive coding in autism spectrum disorder
    van Laarhoven, Thijs
    Stekelenburg, Jeroen J.
    Eussen, Mart L. J. M.
    Vroomen, Jean
    AUTISM RESEARCH, 2019, 12 (04) : 589 - 599
  • [2] Atypical visual-auditory predictive coding in autism spectrum disorder: Electrophysiological evidence from stimulus omissions
    van Laarhoven, Thijs
    Stekelenburg, Jeroen J.
    Eussen, Mart L. J. M.
    Vroomen, Jean
    AUTISM, 2020, 24 (07) : 1849 - 1859
  • [3] Memory for visual, auditory and visual-auditory material
    [Anonymous]
    ANNEE PSYCHOLOGIQUE, 1936, 37 : 655 - 656
  • [4] The Study of Visual-Auditory Interactions on Lower Limb Motor Imagery
    Yu, Zhongliang
    Li, Lili
    Song, Jinchun
    Lv, Hangyuan
    FRONTIERS IN NEUROSCIENCE, 2018, 12
  • [5] Temporal and identity prediction in visual-auditory events: Electrophysiological evidence from stimulus omissions
    van Laarhoven, Thijs
    Stekelenburg, Jeroen J.
    Vroomen, Jean
    BRAIN RESEARCH, 2017, 1661 : 79 - 87
  • [6] Electrophysiological correlates of predictive coding of auditory location in the perception of natural audiovisual events
    Stekelenburg, Jeroen J.
    Vroomen, Jean
    FRONTIERS IN INTEGRATIVE NEUROSCIENCE, 2012, 6
  • [7] VISUAL-AUDITORY DISTANCE CONSTANCY
    ENGEL, GR
    DOUGHERTY, WG
    NATURE, 1971, 234 (5327) : 308 - +
  • [8] VISUAL-AUDITORY DISTANCE CONSTANCY
    DAY, RH
    NATURE, 1972, 238 (5361) : 227 - &
  • [9] Visual-auditory spatial processing in auditory cortical neurons
    Bizley, Jennifer K.
    King, Andrew J.
    BRAIN RESEARCH, 2008, 1242 : 24 - 36
  • [10] Adaptation to motor-visual and motor-auditory temporal lags transfer across modalities
    Sugano, Yoshimori
    Keetels, Mirjam
    Vroomen, Jean
    EXPERIMENTAL BRAIN RESEARCH, 2010, 201 : 393 - 399