Attention fusion network for multimodal sentiment analysis

Cited by: 0
Authors
Yuanyi Luo
Rui Wu
Jiafeng Liu
Xianglong Tang
Institution
[1] Harbin Institute of Technology
Keywords
Multimodal sentiment analysis; Attention mechanism; Multimodal fusion
DOI: Not available
Abstract
The main research problem in multimodal sentiment analysis is modeling inter-modality dynamics, yet most current work does not address this aspect sufficiently. In this study, we propose a multimodal attention fusion network, MSA-AFN, which considers both the relationships between modalities and the differences in their contributions to the task. Specifically, during feature extraction we model not only the relationship between audio and text but also the contribution of temporal features to the task. During multimodal fusion, a soft attention mechanism weights the feature representation of each modality according to its contribution to the task, and the weighted representations are then concatenated. We evaluate the proposed approach on CH-SIMS, a Chinese multimodal sentiment analysis dataset. Results show that our model outperforms the comparison models. Moreover, the performance of some baselines improves by 0.28% to 9.5% after adding components of our network.
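
The abstract describes a soft-attention fusion step in which each modality's representation is weighted by its estimated contribution to the task and then concatenated. The following is a minimal PyTorch sketch of that idea; the class name AttentionFusion, the two-layer scoring MLP, and all dimensions are illustrative assumptions rather than the authors' published MSA-AFN architecture.

```python
# Sketch of soft-attention multimodal fusion (assumed reading of the abstract):
# pre-encoded modality vectors are weighted by learned relevance scores and
# concatenated. Not the authors' exact MSA-AFN implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionFusion(nn.Module):
    def __init__(self, d_model: int, num_modalities: int):
        super().__init__()
        self.num_modalities = num_modalities
        # Small MLP producing one scalar relevance score per modality vector.
        self.score = nn.Sequential(
            nn.Linear(d_model, d_model),
            nn.Tanh(),
            nn.Linear(d_model, 1),
        )

    def forward(self, modality_feats: list[torch.Tensor]) -> torch.Tensor:
        # modality_feats: list of (batch, d_model) tensors, one per modality.
        assert len(modality_feats) == self.num_modalities
        stacked = torch.stack(modality_feats, dim=1)   # (batch, M, d_model)
        scores = self.score(stacked).squeeze(-1)       # (batch, M)
        weights = F.softmax(scores, dim=-1)            # soft attention over modalities
        weighted = stacked * weights.unsqueeze(-1)     # scale each modality by its weight
        return weighted.flatten(start_dim=1)           # concatenate: (batch, M * d_model)


# Usage: fuse 128-dimensional text and audio representations.
fusion = AttentionFusion(d_model=128, num_modalities=2)
text_repr, audio_repr = torch.randn(4, 128), torch.randn(4, 128)
fused = fusion([text_repr, audio_repr])                # shape: (4, 256)
```

The softmax over modalities is one plausible reading of "weighted and concatenated according to their contribution"; the paper may instead use unnormalized gates or a different scoring function.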
Pages: 8207-8217
Number of pages: 10
Related papers
50 results in total
  • [1] Attention fusion network for multimodal sentiment analysis
    Luo, Yuanyi
    Wu, Rui
    Liu, Jiafeng
    Tang, Xianglong
    MULTIMEDIA TOOLS AND APPLICATIONS, 2024, 83 (03) : 8207 - 8217
  • [2] Graph Reconstruction Attention Fusion Network for Multimodal Sentiment Analysis
    Hu, Ronglong
    Yi, Jizheng
    Chen, Lijiang
    Jin, Ze
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2025, 21 (01) : 297 - 306
  • [3] SKEAFN: Sentiment Knowledge Enhanced Attention Fusion Network for multimodal sentiment analysis
    Zhu, Chuanbo
    Chen, Min
    Zhang, Sheng
    Sun, Chao
    Liang, Han
    Liu, Yifan
    Chen, Jincai
    INFORMATION FUSION, 2023, 100
  • [4] Multimodal Sentiment Analysis Based on Attention Mechanism and Tensor Fusion Network
    Zhang, Kang
    Geng, Yushui
    Zhao, Jing
    Li, Wenxiao
    Liu, Jianxin
    2021 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2021, : 1473 - 1477
  • [5] Gated attention fusion network for multimodal sentiment classification
    Du, Yongping
    Liu, Yang
    Peng, Zhi
    Jin, Xingnan
    KNOWLEDGE-BASED SYSTEMS, 2022, 240
  • [6] A multimodal fusion network with attention mechanisms for visual-textual sentiment analysis
    Gan, Chenquan
    Fu, Xiang
    Feng, Qingdong
    Zhu, Qingyi
    Cao, Yang
    Zhu, Ye
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 242
  • [7] Bimodal Fusion Network with Multi-Head Attention for Multimodal Sentiment Analysis
    Zhang, Rui
    Xue, Chengrong
    Qi, Qingfu
    Lin, Liyuan
    Zhang, Jing
    Zhang, Lun
    APPLIED SCIENCES-BASEL, 2023, 13 (03):
  • [8] Sentiment analysis of social media comments based on multimodal attention fusion network
    Liu, Ziyu
    Yang, Tao
    Chen, Wen
    Chen, Jiangchuan
    Li, Qinru
    Zhang, Jun
    APPLIED SOFT COMPUTING, 2024, 164
  • [9] BAFN: Bi-Direction Attention Based Fusion Network for Multimodal Sentiment Analysis
    Tang, Jiajia
    Liu, Dongjun
    Jin, Xuanyu
    Peng, Yong
    Zhao, Qibin
    Ding, Yu
    Kong, Wanzeng
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2023, 33 (04) : 1966 - 1978
  • [10] TeFNA: Text-centered fusion network with crossmodal attention for multimodal sentiment analysis
    Huang, Changqin
    Zhang, Junling
    Wu, Xuemei
    Wang, Yi
    Li, Ming
    Huang, Xiaodi
    KNOWLEDGE-BASED SYSTEMS, 2023, 269