Context-Dependent Multimodal Sentiment Analysis Based on a Complex Attention Mechanism

Cited: 2
Authors
Deng, Lujuan [1 ]
Liu, Boyi [1 ]
Li, Zuhe [1 ]
Ma, Jiangtao [1 ]
Li, Hanbing [2 ]
Affiliations
[1] Zhengzhou Univ Light Ind, Sch Comp & Commun Engn, Zhengzhou 450002, Peoples R China
[2] Songshan Lab, Zhengzhou 450000, Peoples R China
Keywords
sentiment analysis; deep learning; complex attention mechanism; classification
DOI
10.3390/electronics12163516
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Multimodal sentiment analysis aims to understand people's attitudes and opinions from different forms of data. Traditional modality fusion methods for multimodal sentiment analysis concatenate or multiply the various modalities without fully exploiting contextual information or the correlations between modalities. To address this problem, this article proposes a multimodal sentiment analysis framework based on a recurrent neural network with a complex attention mechanism. First, the raw data are preprocessed and numerical feature representations are obtained through feature extraction. Next, the numerical features are fed into the recurrent neural network, and its outputs are fused across modalities by a complex attention mechanism layer. The complex attention mechanism leverages enhanced non-linearity to capture inter-modal correlations more effectively, thereby improving the performance of multimodal sentiment analysis. Finally, the fused representation is passed to a classification layer, which produces the sentiment output. This process effectively captures the semantic information and contextual relationships of the input sequence and fuses the information from the different modalities. Our model was tested on the CMU-MOSEI dataset, achieving an accuracy of 82.04%.
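To make the described pipeline concrete, the sketch below shows one plausible PyTorch reading of the abstract's architecture: per-modality recurrent encoders, a complex-valued attention fusion layer, and a classification head. The class names (ComplexAttentionFusion, MSAPipeline), feature dimensions, and the particular complex-score formulation are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only -- names, dimensions, and the complex-score
# formulation are assumptions, not the paper's released code.
import torch
import torch.nn as nn


class ComplexAttentionFusion(nn.Module):
    """Attention whose query/key projections are complex-valued; the
    magnitude of the complex dot-product score drives the softmax."""

    def __init__(self, dim: int):
        super().__init__()
        self.q_re, self.q_im = nn.Linear(dim, dim), nn.Linear(dim, dim)
        self.k_re, self.k_im = nn.Linear(dim, dim), nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_modalities, dim) -- one encoded token per modality.
        q = torch.complex(self.q_re(x), self.q_im(x))
        k = torch.complex(self.k_re(x), self.k_im(x))
        # Complex dot products add non-linearity relative to a purely
        # real attention map; the magnitude keeps the weights real.
        scores = torch.matmul(q, k.conj().transpose(-2, -1)) * self.scale
        weights = torch.softmax(scores.abs(), dim=-1)
        return torch.matmul(weights, self.v(x))


class MSAPipeline(nn.Module):
    """Per-modality GRU encoders -> complex attention fusion -> classifier."""

    def __init__(self, dims: dict, hidden: int = 128, n_classes: int = 2):
        super().__init__()
        self.encoders = nn.ModuleDict(
            {m: nn.GRU(d, hidden, batch_first=True) for m, d in dims.items()}
        )
        self.fusion = ComplexAttentionFusion(hidden)
        self.classifier = nn.Linear(hidden, n_classes)

    def forward(self, inputs: dict) -> torch.Tensor:
        # Encode each modality sequence and keep the final hidden state.
        h = [self.encoders[m](x)[1][-1] for m, x in inputs.items()]
        fused = self.fusion(torch.stack(h, dim=1))  # (batch, n_mod, hidden)
        return self.classifier(fused.mean(dim=1))   # sentiment logits


# Usage with made-up CMU-MOSEI-like feature sizes (batch=8, 30 steps):
dims = {"text": 300, "audio": 74, "vision": 35}
model = MSAPipeline(dims)
logits = model({m: torch.randn(8, 30, d) for m, d in dims.items()})
```

Taking the magnitude of the complex dot product keeps the softmax weights real while the complex projections supply the extra non-linearity the abstract attributes to the mechanism; the single GRU layer and one token per modality are simplifications for brevity.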
Pages: 13
Related Papers
50 records in total
  • [1] Gated Mechanism for Attention Based Multimodal Sentiment Analysis
    Kumar, Ayush
    Vepa, Jithendra
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020: 4477 - 4481
  • [2] Multimodal Sentiment Analysis Based on Bidirectional Mask Attention Mechanism
    Zhang, Y.
    Zhang, H.
    Liu, Y.
    Liang, K.
    Wang, Y.
    DATA ANALYSIS AND KNOWLEDGE DISCOVERY, 2023, 7 (04): 46 - 55
  • [3] Emoji multimodal microblog sentiment analysis based on mutual attention mechanism
    Lou, Yinxia
    Zhou, Junxiang
    Zhou, Jun
    Ji, Donghong
    Zhang, Qing
    SCIENTIFIC REPORTS, 2024, 14 (01)
  • [4] Multimodal sentiment analysis based on multi-head attention mechanism
    Xi, Chen
    Lu, Guanming
    Yan, Jingjie
    ICMLSC 2020: PROCEEDINGS OF THE 4TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND SOFT COMPUTING, 2020: 34 - 39
  • [5] Multimodal Sentiment Analysis Based on Attention Mechanism and Tensor Fusion Network
    Zhang, Kang
    Geng, Yushui
    Zhao, Jing
    Li, Wenxiao
    Liu, Jianxin
    2021 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2021: 1473 - 1477
  • [6] Multimodal sentiment analysis based on multiple attention
    Wang, Hongbin
    Ren, Chun
    Yu, Zhengtao
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2025, 140
  • [7] Context-Dependent Sentiment Analysis in User-Generated Videos
    Poria, Soujanya
    Cambria, Erik
    Hazarika, Devamanyu
    Majumder, Navonil
    Zadeh, Amir
    Morency, Louis-Philippe
    PROCEEDINGS OF THE 55TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2017), VOL 1, 2017: 873 - 883
  • [8] Multimodal Sentiment Analysis Based on a Cross-Modal Multihead Attention Mechanism
    Deng, Lujuan
    Liu, Boyi
    Li, Zuhe
    CMC-COMPUTERS MATERIALS & CONTINUA, 2024, 78 (01): 1157 - 1170
  • [9] A Multimodal Sentiment Analysis Approach Based on a Joint Chained Interactive Attention Mechanism
    Qiu, Keyuan
    Zhang, Yingjie
    Zhao, Jiaxu
    Zhang, Shun
    Wang, Qian
    Chen, Feng
    ELECTRONICS, 2024, 13 (10)
  • [10] Dynamic and context-dependent stock price prediction using attention modules and news sentiment
    Königstein, Nicole
    DIGITAL FINANCE, 2023, 5 (3-4): 449 - 481