Dual-Perspective Fusion Network for Aspect-Based Multimodal Sentiment Analysis

Cited by: 1
Authors
Wang, Di [1 ]
Tian, Changning [1 ]
Liang, Xiao [1 ]
Zhao, Lin [2 ]
He, Lihuo [1 ]
Wang, Quan [1 ]
Affiliations
[1] Xidian Univ, Key Lab Smart Human Comp Interact & Wearable Techn, Xian 710071, Peoples R China
[2] Nanjing Univ Sci & Technol, Sch Comp Sci & Engn, Jiangsu Key Lab Image & Video Understanding Social, Nanjing 210094, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Sentiment analysis; Task analysis; Data mining; Semantics; Syntactics; Feature extraction; Visualization; Aspect-based sentiment analysis; multimodal sentiment analysis; graph neural network;
DOI
10.1109/TMM.2023.3321435
Chinese Library Classification (CLC)
TP [Automation and computer technology];
Discipline code
0812 ;
Abstract
Aspect-based multimodal sentiment analysis (ABMSA) is an important sentiment analysis task that analyzes aspect-specific sentiment in multimodal data, typically text paired with images. Previous works usually ignore the overall sentiment tendency when analyzing the sentiment of each aspect term. However, the overall sentiment tendency is highly correlated with aspect-specific sentiment. In addition, existing methods neglect to explore and make full use of the fine-grained multimodal information closely related to aspect terms. To address these limitations, we propose a dual-perspective fusion network (DPFN) that considers both global and local fine-grained sentiment information in multimodal data. From the global perspective, we use text-image caption pairs to obtain a global representation containing information about the overall sentiment tendencies. From the local fine-grained perspective, we construct two graph structures to explore the fine-grained information in texts and images. Finally, aspect-level sentiment polarities are obtained by analyzing the combination of global and local fine-grained sentiment information. Experimental results on two multimodal Twitter datasets show that the proposed DPFN model outperforms state-of-the-art methods.
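The dual-perspective pipeline sketched in the abstract (a global representation from text-caption pairs, local graph-based features from text and image regions, then fusion and classification) can be caricatured in a few lines. This is a minimal illustrative sketch only, not the authors' DPFN implementation: all dimensions, the mean-pooling "encoders", the one-step graph convolution, and the random classifier weights are hypothetical stand-ins for the learned components described in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for illustration only.
D = 64          # feature dimension
N_TOKENS = 12   # text tokens
N_REGIONS = 5   # image regions

def graph_conv(features, adj):
    """One graph-convolution step: average each node with its neighbors.
    `adj` is a binary adjacency matrix; self-loops are added here."""
    adj = adj + np.eye(adj.shape[0])
    deg = adj.sum(axis=1, keepdims=True)
    return (adj @ features) / deg

# Global perspective: pooled text features plus pooled image-caption features.
text_feats = rng.normal(size=(N_TOKENS, D))
caption_feats = rng.normal(size=(N_TOKENS, D))
global_repr = np.concatenate([text_feats.mean(axis=0), caption_feats.mean(axis=0)])

# Local fine-grained perspective: graph structures over tokens and image regions.
text_adj = (rng.random((N_TOKENS, N_TOKENS)) > 0.7).astype(float)
region_feats = rng.normal(size=(N_REGIONS, D))
region_adj = np.ones((N_REGIONS, N_REGIONS))
local_text = graph_conv(text_feats, text_adj).mean(axis=0)
local_image = graph_conv(region_feats, region_adj).mean(axis=0)

# Dual-perspective fusion: concatenate both views, then a linear classifier head.
fused = np.concatenate([global_repr, local_text, local_image])
W = rng.normal(size=(3, fused.shape[0]))      # 3 sentiment polarities
logits = W @ fused
polarity = ["negative", "neutral", "positive"][int(np.argmax(logits))]
print(fused.shape, polarity)
```

In the actual model the pooling, graph construction (e.g. syntactic dependencies for text), and fusion are learned end to end; the sketch only shows how the two perspectives combine into a single aspect-level prediction.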
Pages: 4028 - 4038 (11 pages)
Related Papers
50 records in total
  • [1] MSFNet: modality smoothing fusion network for multimodal aspect-based sentiment analysis
    Xiang, Yan
    Cai, Yunjia
    Guo, Junjun
    [J]. FRONTIERS IN PHYSICS, 2023, 11
  • [2] Interactive Fusion Network with Recurrent Attention for Multimodal Aspect-based Sentiment Analysis
    Wang, Jun
    Wang, Qianlong
    Wen, Zhiyuan
    Liang, Xingwei
    Xu, Ruifeng
    [J]. ARTIFICIAL INTELLIGENCE, CICAI 2022, PT III, 2022, 13606 : 298 - 309
  • [3] An adaptive dual graph convolution fusion network for aspect-based sentiment analysis
    Wang, Chunmei
    Luo, Yuan
    Meng, Chunli
    Yuan, Feiniu
    [J]. ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2024, 23 (06)
  • [4] Position Perceptive Multi-Hop Fusion Network for Multimodal Aspect-Based Sentiment Analysis
    Fan, Hao
    Chen, Junjie
    [J]. IEEE ACCESS, 2024, 12 : 90586 - 90595
  • [5] A Survey on Multimodal Aspect-Based Sentiment Analysis
    Zhao, Hua
    Yang, Manyu
    Bai, Xueyang
    Liu, Han
    [J]. IEEE ACCESS, 2024, 12 : 12039 - 12052
  • [6] Visual Enhancement Capsule Network for Aspect-based Multimodal Sentiment Analysis
    Zhang, Yifei
    Zhang, Zhiqing
    Feng, Shi
    Wang, Daling
    [J]. APPLIED SCIENCES-BASEL, 2022, 12 (23)
  • [7] Self-adaptive attention fusion for multimodal aspect-based sentiment analysis
    Wang, Ziyue
    Guo, Junjun
    [J]. MATHEMATICAL BIOSCIENCES AND ENGINEERING, 2024, 21 (01) : 1305 - 1320
  • [8] AMIFN: Aspect-guided multi-view interactions and fusion network for multimodal aspect-based sentiment analysis
    Yang, Juan
    Xu, Mengya
    Xiao, Yali
    Du, Xu
    [J]. NEUROCOMPUTING, 2024, 573
  • [9] Multi-grained fusion network with self-distillation for aspect-based multimodal sentiment analysis
    Yang, Juan
    Xiao, Yali
    Du, Xu
    [J]. KNOWLEDGE-BASED SYSTEMS, 2024, 293