Multimodal sentiment analysis based on multi-head attention mechanism

Cited by: 41
Authors
Xi, Chen [1 ]
Lu, Guanming [1 ]
Yan, Jingjie [1 ]
Affiliations
[1] Nanjing Univ Posts & Telecommun, Coll Telecommun & Informat Engn, Nanjing, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
Feature extraction; Multimodal sentiment analysis; Multi-head attention mechanism; FUSION;
DOI
10.1145/3380688.3380693
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multimodal sentiment analysis remains a promising research area with many open issues. Among them, extracting reasonable unimodal features and designing a robust multimodal fusion model are the most fundamental problems. This paper presents several novel ways of extracting sentiment features from the visual, audio, and text modalities, and uses these features to validate a multimodal sentiment analysis model based on the multi-head attention mechanism. The proposed model is evaluated on the Multimodal Opinion Utterances Dataset (MOUD) and the CMU Multimodal Opinion-level Sentiment Intensity (CMU-MOSI) corpus. Experimental results demonstrate the effectiveness of the proposed approach: accuracy reaches 90.43% on MOUD and 82.71% on CMU-MOSI. Compared to state-of-the-art models, these are improvements of approximately 2 and 0.4 percentage points, respectively.
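The fusion model described in the abstract is built on the multi-head attention mechanism. As an illustrative sketch only (a generic NumPy implementation of multi-head scaled dot-product attention, not the authors' exact architecture; the weight matrices `wq`, `wk`, `wv`, `wo` and the helper names are assumptions for the example), the core operation might look like:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(q, k, v, wq, wk, wv, wo, num_heads):
    """Scaled dot-product attention computed in num_heads parallel subspaces.

    q: (seq_q, d_model); k, v: (seq_k, d_model); all weights: (d_model, d_model).
    Returns an array of shape (seq_q, d_model).
    """
    seq_q, d_model = q.shape
    d_head = d_model // num_heads

    # Project, then split the model dimension into heads: (heads, seq, d_head).
    def project(x, w):
        return (x @ w).reshape(-1, num_heads, d_head).transpose(1, 0, 2)

    qh, kh, vh = project(q, wq), project(k, wk), project(v, wv)

    # Per-head attention weights over the key positions.
    scores = qh @ kh.transpose(0, 2, 1) / np.sqrt(d_head)  # (heads, seq_q, seq_k)
    ctx = softmax(scores) @ vh                             # (heads, seq_q, d_head)

    # Concatenate heads back to d_model and apply the output projection.
    return ctx.transpose(1, 0, 2).reshape(seq_q, d_model) @ wo
```

In a multimodal setting, cross-modal fusion is typically obtained by letting the queries come from one modality while the keys and values come from another, so each head can attend to a different cross-modal interaction pattern.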
Pages: 34-39
Page count: 6