Multimodal sentiment analysis based on multi-head attention mechanism

Times Cited: 41
Authors
Xi, Chen [1 ]
Lu, Guanming [1 ]
Yan, Jingjie [1 ]
Affiliations
[1] Nanjing Univ Posts & Telecommun, Coll Telecommun & Informat Engn, Nanjing, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
Feature extraction; Multimodal sentiment analysis; Multi-head attention mechanism; FUSION;
DOI
10.1145/3380688.3380693
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multimodal sentiment analysis remains a promising area of research with many open issues. Among them, extracting reasonable unimodal features and designing a robust multimodal sentiment analysis model are the most fundamental problems. This paper presents novel ways of extracting sentiment features from the visual, audio, and text modalities, and further uses these features to evaluate a multimodal sentiment analysis model based on the multi-head attention mechanism. The proposed model is evaluated on the Multimodal Opinion Utterances Dataset (MOUD) corpus and the CMU Multimodal Opinion-level Sentiment Intensity (CMU-MOSI) corpus. Experimental results demonstrate the effectiveness of the proposed approach: accuracy reaches 90.43% on MOUD and 82.71% on CMU-MOSI, an improvement of approximately 2 and 0.4 percentage points, respectively, over state-of-the-art models.
Pages: 34-39
Number of Pages: 6
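
The abstract above describes fusing unimodal visual, audio, and text features with a multi-head attention mechanism, but the paper's implementation is not reproduced in this record. The following is only a minimal, hypothetical sketch of that general idea in PyTorch: the class name MultimodalAttentionFusion, the feature dimensions (300 for text, 74 for audio, 35 for visual), the 128-dimensional shared space, the 4 attention heads, and the binary sentiment output are all illustrative assumptions, not values taken from the paper.

# Minimal sketch (not the authors' code) of multi-head-attention-based
# multimodal fusion, assuming each modality has already been encoded
# into a fixed-size feature vector. All dimensions are illustrative.
import torch
import torch.nn as nn


class MultimodalAttentionFusion(nn.Module):
    def __init__(self, text_dim=300, audio_dim=74, visual_dim=35,
                 model_dim=128, num_heads=4, num_classes=2):
        super().__init__()
        # Project each unimodal feature into a shared embedding space.
        self.text_proj = nn.Linear(text_dim, model_dim)
        self.audio_proj = nn.Linear(audio_dim, model_dim)
        self.visual_proj = nn.Linear(visual_dim, model_dim)
        # Multi-head attention over the three modality "tokens".
        self.attn = nn.MultiheadAttention(model_dim, num_heads, batch_first=True)
        self.classifier = nn.Linear(model_dim * 3, num_classes)

    def forward(self, text_feat, audio_feat, visual_feat):
        # Stack the projected modalities as a length-3 sequence: (B, 3, D).
        tokens = torch.stack([
            self.text_proj(text_feat),
            self.audio_proj(audio_feat),
            self.visual_proj(visual_feat),
        ], dim=1)
        # Self-attention lets each modality attend to the other two.
        fused, _ = self.attn(tokens, tokens, tokens)
        # Concatenate the attended modality representations and classify.
        return self.classifier(fused.flatten(start_dim=1))


# Usage with random stand-in features for a batch of 8 utterances.
model = MultimodalAttentionFusion()
logits = model(torch.randn(8, 300), torch.randn(8, 74), torch.randn(8, 35))
print(logits.shape)  # torch.Size([8, 2])
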
Related Papers
50 records in total
  • [21] Combining Multi-Head Attention and Sparse Multi-Head Attention Networks for Session-Based Recommendation
    Zhao, Zhiwei
    Wang, Xiaoye
    Xiao, Yingyuan
    [J]. 2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023,
  • [22] Fiber communication receiver models based on the multi-head attention mechanism
    Zang, Yubin
    Yu, Zhenming
    Xu, Kun
    Chen, Minghua
    Yang, Sigang
    Chen, Hongwei
    [J]. Chinese Optics Letters, 2023, 21 (03) : 34 - 39
  • [23] Epilepsy detection based on multi-head self-attention mechanism
    Ru, Yandong
    An, Gaoyang
    Wei, Zheng
    Chen, Hongming
    [J]. PLOS ONE, 2024, 19 (06):
  • [24] Machine Reading Comprehension Model Based on Multi-head Attention Mechanism
    Xue, Yong
    [J]. ADVANCED INTELLIGENT TECHNOLOGIES FOR INDUSTRY, 2022, 285 : 45 - 58
  • [25] A Specific Emitter Identification Approach Based on Multi-Head Attention Mechanism
    Bo, Yulian
    Zhang, Wensheng
    Yang, Tongtong
    Jiang, Mingyan
    Sun, Jian
    Wang, Cheng-Xiang
    [J]. 2023 INTERNATIONAL WIRELESS COMMUNICATIONS AND MOBILE COMPUTING, IWCMC, 2023, : 953 - 958
  • [26] Multi-Task Multi-Head Attention Memory Network for Fine-Grained Sentiment Analysis
    Dai, Zehui
    Dai, Wei
    Liu, Zhenhua
    Rao, Fengyun
    Chen, Huajie
    Zhang, Guangpeng
    Ding, Yadong
    Liu, Jiyang
    [J]. NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING (NLPCC 2019), PT I, 2019, 11838 : 609 - 620
  • [27] Sentiment Analysis Using Multi-Head Attention Capsules With Multi-Channel CNN and Bidirectional GRU
    Cheng, Yan
    Sun, Huan
    Chen, Haomai
    Li, Meng
    Cai, Yingying
    Cai, Zhuang
    Huang, Jing
    [J]. IEEE ACCESS, 2021, 9 : 60383 - 60395
  • [28] Multi-Head Attention with Diversity for Learning Grounded Multilingual Multimodal Representations
    Huang, Po-Yao
    Chang, Xiaojun
    Hauptmann, Alexander
    [J]. 2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 1461 - 1467
  • [29] Filter gate network based on multi-head attention for aspect-level sentiment classification
    Zhou, Ziyu
    Liu, Fang'ai
    [J]. NEUROCOMPUTING, 2021, 441 : 214 - 225
  • [30] Attention-Enhanced Graph Convolutional Networks for Aspect-Based Sentiment Classification with Multi-Head Attention
    Xu, Guangtao
    Liu, Peiyu
    Zhu, Zhenfang
    Liu, Jie
    Xu, Fuyong
    [J]. APPLIED SCIENCES-BASEL, 2021, 11 (08):