Multimodal human communication - Targeting facial expressions, speech content and prosody

Cited by: 48
Authors
Regenbogen, Christina [1 ,2 ]
Schneider, Daniel A. [1 ,2 ]
Gur, Raquel E. [3 ]
Schneider, Frank [1 ,2 ,3 ]
Habel, Ute [1 ,2 ]
Kellermann, Thilo [1 ,2 ]
Affiliations
[1] Rhein Westfal TH Aachen, Sch Med, Dept Psychiat & Psychotherapy & Psychosomat, D-52074 Aachen, Germany
[2] JARA Translat Brain Med, Julich, Germany
[3] Univ Penn, Sch Med, Dept Psychiat, Philadelphia, PA 19104 USA
Keywords
SUPERIOR COLLICULUS; EMOTIONAL PROSODY; AUDIOVISUAL INTEGRATION; FUNCTIONAL-ANATOMY; SOCIAL COGNITION; NEURAL RESPONSE; VISUAL-CORTEX; BRAIN; FMRI; EMPATHY;
DOI
10.1016/j.neuroimage.2012.02.043
Chinese Library Classification (CLC)
Q189 [Neuroscience];
Discipline classification code
071006;
Abstract
Human communication is based on a dynamic exchange of information across three channels: facial expressions, prosody, and speech content. This fMRI study elucidated the impact of multimodal emotion processing and the specific contribution of each channel to behavioral empathy and its prerequisites. Ninety-six video clips of actors telling self-related stories were presented to 27 healthy participants. In two conditions, all channels uniformly conveyed either emotional or neutral information; three further conditions selectively presented two emotional channels and one neutral channel. Participants indicated the actors' emotional valence and their own while fMRI data were recorded. Activation patterns during tri-channel emotional communication reflected multimodal processing and facilitative effects for empathy. Accordingly, participants' behavioral empathy rates deteriorated significantly once one source was neutral. Emotionality expressed via only two of the three channels, however, yielded activation in a network associated with theory-of-mind processes. This suggested that participants made an effort to infer the mental states of their counterparts, and it was accompanied by a decline in behavioral empathy, driven by the participants' emotional responses. Channel-specific emotional contributions were evident in modality-specific areas. Identifying the different network nodes engaged in human interaction is a prerequisite for understanding the dynamics that underlie multimodal integration and for explaining the observed decline in empathy rates. This task may also shed light on behavioral deficits and neural changes that accompany psychiatric diseases. (C) 2012 Elsevier Inc. All rights reserved.
Pages: 2346-2356
Page count: 11
Related articles
50 items total
  • [1] The differential contribution of facial expressions, prosody, and speech content to empathy
    Regenbogen, Christina
    Schneider, Daniel A.
    Finkelmeyer, Andreas
    Kohn, Nils
    Derntl, Birgit
    Kellermann, Thilo
    Gur, Raquel E.
    Schneider, Frank
    Habel, Ute
    [J]. COGNITION & EMOTION, 2012, 26 (06) : 995 - 1014
  • [2] Induction, recording and recognition of natural emotions from facial expressions and speech prosody
    Karpouzis, Kostas
    Caridakis, George
    Cowie, Roddy
    Douglas-Cowie, Ellen
    [J]. JOURNAL ON MULTIMODAL USER INTERFACES, 2013, 7 (03) : 195 - 206
  • [3] Multimodal Emotion Recognition Based on Facial Expressions, Speech, and EEG
    Pan, Jiahui
    Fang, Weijie
    Zhang, Zhihang
    Chen, Bingzhi
    Zhang, Zheng
    Wang, Shuihua
    [J]. IEEE OPEN JOURNAL OF ENGINEERING IN MEDICINE AND BIOLOGY, 2024, 5 : 396 - 403
  • [4] Multimodal Emotion Recognition Based on Facial Expressions, Speech, and Body Gestures
    Yan, Jingjie
    Li, Peiyuan
    Du, Chengkun
    Zhu, Kang
    Zhou, Xiaoyang
    Liu, Ying
    Wei, Jinsheng
    [J]. ELECTRONICS, 2024, 13 (18)
  • [5] Unobtrusive multimodal emotion detection in adaptive interfaces: Speech and facial expressions
    Truong, Khiet P.
    van Leeuwen, David A.
    Neerincx, Mark A.
    [J]. FOUNDATIONS OF AUGMENTED COGNITION, PROCEEDINGS, 2007, 4565 : 354 - +
  • [6] Comprehension of facial expressions and prosody in Asperger Syndrome
    Gyurjyan, G
    Froming, WJ
    Froming, KB
    [J]. CLINICAL NEUROPSYCHOLOGIST, 2005, 19 (3-4) : 531 - 532
  • [7] Human Emotion Detection through Speech and Facial Expressions
    Kudiri, Krishna Mohan
    Said, Abas Md
    Nayan, M. Yunus
    [J]. 2016 3RD INTERNATIONAL CONFERENCE ON COMPUTER AND INFORMATION SCIENCES (ICCOINS), 2016, : 351 - 356
  • [8] Visual prosody of newsreaders: Effects of information structure, emotional content and intended audience on facial expressions
    Swerts, Marc
    Krahmer, Emiel
    [J]. JOURNAL OF PHONETICS, 2010, 38 (02) : 197 - 206
  • [9] Facial expressions and speech acts
    Yoshikawa, S
    Nakamura, M
    [J]. INTERNATIONAL JOURNAL OF PSYCHOLOGY, 1996, 31 (3-4) : 18429 - 18429