Multimodal Emotion-Cause Pair Extraction in Conversations

Cited by: 5
Authors
Wang, Fanfan [1 ]
Ding, Zixiang [1 ]
Xia, Rui [1 ]
Li, Zhaoyu [1 ]
Yu, Jianfei [1 ]
Affiliations
[1] Nanjing Univ Sci & Technol, Sch Comp Sci & Engn, Nanjing 210094, Jiangsu, Peoples R China
Keywords
Affective computing; emotion analysis; emotion cause extraction; emotion-cause pair extraction; multimodal learning; AGREEMENT; RELIABILITY;
DOI
10.1109/TAFFC.2022.3226559
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Conversation is an important form of human communication and conveys a wide range of emotions. It is therefore valuable to discover not only the emotions expressed in a conversation but also their causes. Conversation in its natural form is multimodal. Many studies have addressed multimodal emotion recognition in conversations, yet there is still a lack of work on multimodal emotion cause analysis. In this article, we introduce a new task named Multimodal Emotion-Cause Pair Extraction in Conversations, which aims to jointly extract emotions and their corresponding causes from conversations reflected in multiple modalities (i.e., text, audio, and video). We accordingly construct a multimodal conversational emotion cause dataset, Emotion-Cause-in-Friends, which contains 9,794 multimodal emotion-cause pairs among 13,619 utterances from the Friends sitcom. We benchmark the task by establishing two baseline systems: a heuristic approach that exploits inherent patterns in the relative positions of causes and emotions, and a deep learning approach that incorporates multimodal features for emotion-cause pair extraction; we also conduct a human performance test for comparison. Furthermore, we investigate the effect of multimodal information, explore the potential of incorporating commonsense knowledge, and perform the task under both Static and Real-time settings.
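To make the task output concrete, the following is a minimal sketch rather than the authors' system: it assumes a hypothetical Utterance record and a location-based heuristic in the spirit of the first baseline, pairing each emotion utterance with itself and the immediately preceding utterance as candidate causes.

```python
# Illustrative sketch only: a hypothetical data layout and a location-based
# heuristic baseline (the paper's exact heuristic rule is not specified here).
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Utterance:
    index: int               # position of the utterance in the conversation
    text: str
    emotion: Optional[str]   # e.g., "joy", "anger"; None for neutral utterances
    # In the multimodal setting, audio and video features would also be attached.

def heuristic_pairs(conv: List[Utterance], window: int = 1) -> List[Tuple[int, int]]:
    """Pair every emotion utterance with itself and up to `window` preceding
    utterances as candidate causes, reflecting the tendency of causes to appear
    at or just before the emotion utterance (an assumed rule for illustration)."""
    pairs = []
    for u in conv:
        if u.emotion is None:
            continue
        for cause_idx in range(max(0, u.index - window), u.index + 1):
            pairs.append((u.index, cause_idx))  # (emotion utterance, candidate cause)
    return pairs

# Toy example: the joke in utterance 0 causes the joy expressed in utterance 1.
conv = [
    Utterance(0, "I made a joke about the turkey.", None),
    Utterance(1, "That's hilarious!", "joy"),
]
print(heuristic_pairs(conv))  # [(1, 0), (1, 1)]
```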
Pages: 1832-1844
Page count: 13
相关论文
共 50 条
  • [1] ECPEC: Emotion-Cause Pair Extraction in Conversations
    Li, Wei
    Li, Yang
    Pandelea, Vlad
    Ge, Mengshi
    Zhu, Luyao
    Cambria, Erik
    [J]. IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2023, 14 (03) : 1754 - 1765
  • [2] Recurrent synchronization network for emotion-cause pair extraction
    Chen, Fang
    Shi, Ziwei
    Yang, Zhongliang
    Huang, Yongfeng
    [J]. KNOWLEDGE-BASED SYSTEMS, 2022, 238
  • [3] Emotion-cause pair extraction based on interactive attention
    Huang, Weichun
    Yang, Yixue
    Huang, Xiaohui
    Peng, Zhiying
    Xiong, Liyan
    [J]. APPLIED INTELLIGENCE, 2023, 53 (09) : 10548 - 10558
  • [4] Emotion-cause pair extraction based on interactive attention
    Weichun Huang
    Yixue Yang
    Xiaohui Huang
    Zhiying Peng
    Liyan Xiong
    [J]. Applied Intelligence, 2023, 53 : 10548 - 10558
  • [5] Modularized Mutuality Network for Emotion-Cause Pair Extraction
    Shang, Xichen
    Chen, Chuxin
    Chen, Zipeng
    Ma, Qianli
    [J]. IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2023, 31 : 539 - 549
  • [6] Recurrent synchronization network for emotion-cause pair extraction
    Chen, Fang
    Shi, Ziwei
    Yang, Zhongliang
    Huang, Yongfeng
    [J]. Knowledge-Based Systems, 2022, 238
  • [7] Emotion-Cause Pair Extraction: A New Task to Emotion Analysis in Texts
    Xia, Rui
    Ding, Zixiang
    [J]. 57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 1003 - 1012
  • [8] Conversational Emotion-Cause Pair Extraction with Guided Mixture of Experts
    Jeong, DongJin
    Bak, JinYeong
    [J]. 17TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EACL 2023, 2023, : 3288 - 3298
  • [9] An exploration of mutual information based on emotion-cause pair extraction
    Hu, Guimin
    Zhao, Yi
    Lu, Guangming
    Yin, Fanghao
    Chen, Jiashan
    [J]. KNOWLEDGE-BASED SYSTEMS, 2022, 256
  • [10] Emotion-Cause Pair Extraction with Graph Attention Neural Network
    Chen, Jiantao
    Shu, Xin
    Chen, Zhichen
    [J]. 2024 7TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND BIG DATA, ICAIBD 2024, 2024, : 518 - 522