Relevance-aware visual entity filter network for multimodal aspect-based sentiment analysis

Cited: 0
|
Authors
Chen, Yifan [1 ]
Xiong, Haoliang [1 ]
Li, Kuntao [1 ]
Mai, Weixing [1 ]
Xue, Yun [1 ]
Cai, Qianhua [1 ]
Li, Fenghuan [2 ]
Affiliations
[1] South China Normal Univ, Sch Elect & Informat Engn, Foshan 528225, Guangdong, Peoples R China
[2] Guangdong Univ Technol, Sch Comp Sci & Technol, Guangzhou 510006, Guangdong, Peoples R China
Keywords
Multimodal aspect-based sentiment analysis (MABSA); Relevance-aware visual entity filter; External knowledge; Image-aspect relevance; Cross-modal alignment;
DOI
10.1007/s13042-024-02342-w
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multimodal aspect-based sentiment analysis (MABSA), which aims to identify the sentiment polarity of each aspect mentioned in an image-text pair, has sparked considerable research interest in the field of multimodal analysis. Although existing approaches have shown remarkable results in incorporating external knowledge to enhance visual entity information, they still suffer from two problems: (1) image-aspect global relevance and (2) entity-aspect local alignment. To tackle these issues, we propose a Relevance-Aware Visual Entity Filter Network (REF) for MABSA. Specifically, we utilize the nouns of adjective-noun pairs (ANPs) extracted from the given image as bridges to facilitate cross-modal feature alignment. Moreover, we introduce an additional "UNRELATED" marker word and apply Contrastive Content Re-sourcing (CCR) and Contrastive Content Swapping (CCS) constraints to obtain accurate attention weights that identify image-aspect relevance and dynamically control the contribution of visual information. We further adopt the reversed attention weight distributions to selectively filter out aspect-unrelated visual entities for better entity-aspect alignment. Comprehensive experimental results demonstrate the consistent superiority of our REF model over state-of-the-art approaches on the Twitter-2015 and Twitter-2017 datasets.
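The core filtering idea described in the abstract can be illustrated with a toy sketch. This is not the paper's actual model: the embeddings, the zero-vector "UNRELATED" marker, and the mean-threshold filter below are all hypothetical placeholders, shown only to convey how a marker's attention weight can gate visual contribution and how the remaining weights can filter aspect-unrelated entities.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

# Toy inputs (hypothetical): an aspect query and two visual entity embeddings.
aspect = np.array([1.0, 0.0])
entities = np.array([[0.9, 0.1],    # entity aligned with the aspect
                     [0.0, 1.0]])   # entity unrelated to the aspect
unrelated_marker = np.zeros(2)      # stand-in for a learned "UNRELATED" embedding

# Attention over entities plus the marker: the marker competes for weight.
keys = np.vstack([entities, unrelated_marker])
weights = softmax(keys @ aspect)

# A low marker weight means the image is relevant, so visual info contributes more.
visual_gate = 1.0 - weights[-1]

# Keep only entities the aspect attends to, scaled by the relevance gate.
entity_weights = weights[:-1]
mask = entity_weights >= entity_weights.mean()
filtered = entities[mask] * visual_gate
```

Here the aspect-aligned entity survives the mask while the unrelated one is filtered out, and `visual_gate` shrinks toward zero as the "UNRELATED" marker absorbs more attention.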
Pages: 1389-1402
Page count: 14
Related papers
50 in total
  • [31] A relative position attention network for aspect-based sentiment analysis
    Wu, Chao
    Xiong, Qingyu
    Gao, Min
    Li, Qiude
    Yu, Yang
    Wang, Kaige
    KNOWLEDGE AND INFORMATION SYSTEMS, 2021, 63 (02) : 333 - 347
  • [33] AMIFN: Aspect-guided multi-view interactions and fusion network for multimodal aspect-based sentiment analysis
    Yang, Juan
    Xu, Mengya
    Xiao, Yali
    Du, Xu
    NEUROCOMPUTING, 2024, 573
  • [35] Aspect-Based Sentiment Analysis With Heterogeneous Graph Neural Network
    An, Wenbin
    Tian, Feng
    Chen, Ping
    Zheng, Qinghua
    IEEE TRANSACTIONS ON COMPUTATIONAL SOCIAL SYSTEMS, 2023, 10 (01) : 403 - 412
  • [36] Polarity enriched attention network for aspect-based sentiment analysis
    Wadawadagi, R.
    Pagi, V.
    INTERNATIONAL JOURNAL OF INFORMATION TECHNOLOGY, 2022, 14 (6) : 2767 - 2778
  • [37] Semantics perception and refinement network for aspect-based sentiment analysis
    Song, Wei
    Wen, Zijian
    Xiao, Zhiyong
    Park, Soon Cheol
    KNOWLEDGE-BASED SYSTEMS, 2021, 214
  • [38] Aspect-based sentiment analysis with gated alternate neural network
    Liu, Ning
    Shen, Bo
    KNOWLEDGE-BASED SYSTEMS, 2020, 188
  • [39] Relational Graph Attention Network for Aspect-based Sentiment Analysis
    Wang, Kai
    Shen, Weizhou
    Yang, Yunyi
    Quan, Xiaojun
    Wang, Rui
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 3229 - 3238
  • [40] Filter channel network based on contextual position weight for aspect-based sentiment classification
    Zhu, Chao
    Yi, Benshun
    Luo, Laigan
    JOURNAL OF SUPERCOMPUTING, 2024, 80 (12) : 17874 - 17894