Electrophysiological evidence for a self-processing advantage during audiovisual speech integration

Cited by: 0
Authors
Avril Treille
Coriandre Vilain
Sonia Kandel
Marc Sato
Affiliations
[1] CNRS & Grenoble Université, GIPSA-lab, Département Parole & Cognition
[2] CNRS & Aix-Marseille Université, Laboratoire Parole & Langage
[3] Université Stendhal
Keywords
Self recognition; Speech perception; Audiovisual integration; EEG
Abstract
Previous electrophysiological studies have provided strong evidence for early multisensory integrative mechanisms during audiovisual speech perception. One issue these studies leave unanswered is whether hearing our own voice and seeing our own articulatory gestures facilitate speech perception, possibly through better processing and integration of sensory inputs with our own sensory-motor knowledge. The present EEG study examined the impact of self-knowledge during the perception of auditory (A), visual (V) and audiovisual (AV) speech stimuli that had previously been recorded from the participant or from a speaker the participant had never met. Audiovisual interactions were estimated by comparing the N1 and P2 auditory evoked potentials in the bimodal condition (AV) with the sum of those observed in the unimodal conditions (A + V). In line with previous EEG studies, our results revealed a decreased P2 amplitude in the AV compared to the A + V condition. Crucially, a temporal facilitation of N1 responses was observed during the visual perception of the participant's own speech movements compared to those of another speaker, and this facilitation was negatively correlated with the saliency of the visual stimuli. These results provide evidence for a temporal facilitation of the integration of auditory and visual speech signals when the visual input shows our own speech gestures.
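The additive-model comparison described in the abstract (AV vs. A + V difference waves, plus N1 peak-latency comparison) can be sketched numerically. This is only an illustrative toy example: the Gaussian waveforms below are invented, not the study's data, and the function name is hypothetical.

```python
import numpy as np

def additive_model_difference(av_erp, a_erp, v_erp):
    """Difference wave AV - (A + V); a negative deflection around P2
    would correspond to the amplitude decrease reported for bimodal speech."""
    return av_erp - (a_erp + v_erp)

# Toy ERPs: time in ms, amplitude in microvolts (1-ms sampling).
t = np.linspace(-100, 400, 501)
a = -2.0 * np.exp(-((t - 100) ** 2) / (2 * 20 ** 2))   # N1-like auditory deflection
v = -0.5 * np.exp(-((t - 150) ** 2) / (2 * 30 ** 2))   # small visual response
av = -2.2 * np.exp(-((t - 95) ** 2) / (2 * 20 ** 2))   # bimodal response, earlier peak

diff = additive_model_difference(av, a, v)

# N1 latency = time of the most negative sample.
n1_latency_av = t[np.argmin(av)]
n1_latency_sum = t[np.argmin(a + v)]
print(n1_latency_av < n1_latency_sum)  # prints True: AV N1 peaks earlier than A + V here
```

In a real analysis these arrays would be averaged epochs from matched A, V, and AV trials; the "temporal facilitation" finding corresponds to an earlier AV N1 peak than predicted by the unimodal sum.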
Pages: 2867-2876 (9 pages)