Prompt-Oriented Fine-Tuning Dual BERT for Aspect-Based Sentiment Analysis

Cited: 0
Authors
Yin, Wen [1 ,2 ]
Xu, Yi [1 ,2 ]
Liu, Cencen [1 ,2 ]
Zheng, Dezhang [1 ,2 ]
Wang, Qi [3 ]
Liu, Chuanjie [3 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Informat & Software Engn, Chengdu 610054, Peoples R China
[2] Trusted Cloud Comp & Big Data Key Lab Sichuan Pro, Chengdu 611731, Peoples R China
[3] Chengdu Jiuzhou Elect Informat Syst Co Ltd, Chengdu, Peoples R China
Source
ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PART X | 2023, Vol. 14263
Funding
National Natural Science Foundation of China;
Keywords
ABSA; BERT; Prompt learning;
DOI
10.1007/978-3-031-44204-9_42
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Aspect-Based Sentiment Analysis (ABSA) is a fine-grained sentiment analysis task that aims to predict the sentiment polarity towards a specific aspect occurring in a given sentence. Recently, pre-trained language models such as BERT have shown great progress in this regard. However, due to the mismatch between pre-training and fine-tuning, handling informal expressions and complex sentences remains challenging and warrants further effort. To tackle this, in this paper we propose a Prompt-oriented Fine-tuning Dual BERT (PFDualBERT) model that considers complex semantic relevance and scarce data samples simultaneously. To reduce the impact of such mismatches, we design a ProBERT module influenced by the idea of prompt learning. Specifically, we design a SemBERT module to capture semantic correlations, and we refit SemBERT with aspect-based self-attention. The experimental results on three datasets certify that our PFDualBERT model outperforms state-of-the-art methods, and our further analysis substantiates that our model exhibits stable performance in low-resource environments.
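To make the dual-branch design described in the abstract more concrete, the following is a minimal sketch of a dual-BERT ABSA classifier in the spirit of PFDualBERT. The prompt template, the use of nn.MultiheadAttention as a stand-in for aspect-based self-attention, the mean-pooled fusion by concatenation, the bert-base-uncased checkpoint, and the three-way label head are all illustrative assumptions, not the paper's exact implementation.

# Sketch of a dual-BERT ABSA classifier: a prompt branch ("ProBERT") plus a
# semantic branch ("SemBERT") with aspect-conditioned attention. Assumed design.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class DualBertABSA(nn.Module):
    def __init__(self, model_name="bert-base-uncased", num_labels=3):
        super().__init__()
        self.pro_bert = BertModel.from_pretrained(model_name)   # prompt-oriented branch
        self.sem_bert = BertModel.from_pretrained(model_name)   # semantic-correlation branch
        hidden = self.pro_bert.config.hidden_size
        # Stand-in for the paper's aspect-based self-attention.
        self.aspect_attn = nn.MultiheadAttention(hidden, num_heads=8, batch_first=True)
        self.classifier = nn.Linear(hidden * 2, num_labels)

    def forward(self, prompt_inputs, sent_inputs, aspect_inputs):
        # Prompt branch: sentence wrapped in a cloze-style prompt; use the [CLS] vector.
        pro_cls = self.pro_bert(**prompt_inputs).last_hidden_state[:, 0]
        # Semantic branch: aspect tokens attend over the sentence representation.
        sent_hidden = self.sem_bert(**sent_inputs).last_hidden_state
        aspect_hidden = self.sem_bert(**aspect_inputs).last_hidden_state
        attn_out, _ = self.aspect_attn(aspect_hidden, sent_hidden, sent_hidden)
        sem_vec = attn_out.mean(dim=1)
        # Fuse the two branches and classify sentiment polarity.
        return self.classifier(torch.cat([pro_cls, sem_vec], dim=-1))

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
sentence, aspect = "The food was great but the service was slow.", "service"
prompt = f"{sentence} The sentiment of {aspect} is [MASK]."  # assumed template
model = DualBertABSA()
logits = model(
    tokenizer(prompt, return_tensors="pt"),
    tokenizer(sentence, return_tensors="pt"),
    tokenizer(aspect, return_tensors="pt"),
)
print(logits.shape)  # (1, 3): e.g. negative / neutral / positive

In this sketch the prompt branch supplies a representation aligned with the masked-language-model pre-training objective (reducing the pre-train/fine-tune mismatch), while the semantic branch contributes an aspect-conditioned view of the sentence; how the two are combined in PFDualBERT itself is described in the paper.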
Pages: 505-517
Page count: 13