Dynamic Facial Expressions Prime the Processing of Emotional Prosody

Cited by: 20
Authors
Garrido-Vasquez, Patricia [1 ,2 ]
Pell, Marc D. [3 ]
Paulmann, Silke [4 ]
Kotz, Sonja A. [2 ,5 ]
Affiliations
[1] Justus Liebig Univ Giessen, Dept Expt Psychol & Cognit Sci, Giessen, Germany
[2] Max Planck Inst Human Cognit & Brain Sci, Dept Neuropsychol, Leipzig, Germany
[3] McGill Univ, Sch Commun Sci & Disorders, Montreal, PQ, Canada
[4] Univ Essex, Dept Psychol, Colchester, Essex, England
[5] Univ Maastricht, Dept Neuropsychol & Psychopharmacol, Maastricht, Netherlands
Source
FRONTIERS IN HUMAN NEUROSCIENCE
Funding
Canadian Institutes of Health Research;
Keywords
emotion; priming; event-related potentials; cross-modal prediction; dynamic faces; prosody; audiovisual; parahippocampal gyrus; EVENT-RELATED POTENTIALS; AUDIOVISUAL INTEGRATION; BRAIN POTENTIALS; SPEECH PROSODY; TIME-COURSE; PERCEPTION; VOICE; FACE; INFORMATION; REPRESENTATION;
DOI
10.3389/fnhum.2018.00244
Chinese Library Classification
Q189 [Neuroscience];
Discipline Code
071006;
Abstract
Evidence suggests that emotion is represented supramodally in the human brain. Emotional facial expressions, which often precede vocally expressed emotion in real life, can modulate event-related potentials (N100 and P200) during emotional prosody processing. To investigate these cross-modal emotional interactions, two lines of research have been put forward: cross-modal integration and cross-modal priming. In cross-modal integration studies, the visual and auditory channels are temporally aligned, whereas in priming studies they are presented consecutively. Here we used cross-modal emotional priming to study the interaction of dynamic visual and auditory emotional information. Specifically, we presented dynamic facial expressions (angry, happy, neutral) as primes and emotionally intoned pseudo-speech sentences (angry, happy) as targets. We were interested in how prime-target congruency would affect early auditory event-related potentials, i.e., the N100 and P200, in order to shed more light on how dynamic facial information is used in cross-modal emotional prediction. Results showed enhanced N100 amplitudes for incongruently primed compared to congruently and neutrally primed emotional prosody, while the latter two conditions did not differ significantly. However, the N100 peak latency was significantly delayed in the neutral condition compared to the other two conditions. Source reconstruction revealed that the right parahippocampal gyrus was more strongly activated in incongruent than in congruent trials in the N100 time window. No significant ERP effects were observed in the P200 range. Our results indicate that dynamic facial expressions influence vocal emotion processing at an early point in time, and that an emotional mismatch between a facial expression and the ensuing vocal emotional signal induces additional processing costs in the brain, potentially because the cross-modal emotional prediction mechanism is violated when prime and target are emotionally incongruent.
Pages: 11
Related Articles
50 items in total
  • [31] Visual prosody of newsreaders: Effects of information structure, emotional content and intended audience on facial expressions
    Swerts, Marc
    Krahmer, Emiel
    JOURNAL OF PHONETICS, 2010, 38(02): 197-206
  • [32] Prosody–face Interactions in Emotional Processing as Revealed by the Facial Affect Decision Task
    Pell, Marc D.
    JOURNAL OF NONVERBAL BEHAVIOR, 2005, 29: 193-215
  • [33] Comprehension of facial expressions and prosody in Asperger Syndrome
    Gyurjyan, G.
    Froming, W. J.
    Froming, K. B.
    CLINICAL NEUROPSYCHOLOGIST, 2005, 19(3-4): 531-532
  • [34] Unattended Emotional Prosody Affects Visual Processing of Facial Expressions in Mandarin-Speaking Chinese: A Comparison With English-Speaking Canadians
    Liu, Pan
    Rigoulot, Simon
    Jiang, Xiaoming
    Zhang, Shuyi
    Pell, Marc D.
    JOURNAL OF CROSS-CULTURAL PSYCHOLOGY, 2021, 52(03): 275-294
  • [35] Anger superiority effect: The importance of dynamic emotional facial expressions
    Ceccarini, Francesco
    Caudek, Corrado
    VISUAL COGNITION, 2013, 21(04): 498-540
  • [36] Enhanced experience of emotional arousal in response to dynamic facial expressions
    Sato, Wataru
    Yoshikawa, Sakiko
    JOURNAL OF NONVERBAL BEHAVIOR, 2007, 31(02): 119-135
  • [37] Electromyographic responses to static and dynamic avatar emotional facial expressions
    Weyers, Peter
    Muehlberger, Andreas
    Hefele, Carolin
    Pauli, Paul
    PSYCHOPHYSIOLOGY, 2006, 43(05): 450-453
  • [39] The N400 and late occipital positivity in processing dynamic facial expressions with natural emotional voice
    Mori, Kazuma
    Tanaka, Akihiro
    Kawabata, Hideaki
    Arao, Hiroshi
    NEUROREPORT, 2021, 32(10): 858-863
  • [40] Implicit and explicit processing of emotional facial expressions in Parkinson's disease
    Wagenbreth, Caroline
    Wattenberg, Lena
    Heinze, Hans-Jochen
    Zaehle, Tino
    BEHAVIOURAL BRAIN RESEARCH, 2016, 303: 182-190