GeDa: Improving training data with large language models for Aspect Sentiment Triplet Extraction

Cited: 0
Authors
Mai, Weixing [1 ]
Zhang, Zhengxuan [1 ]
Chen, Yifan [1 ]
Li, Kuntao [1 ]
Xue, Yun [1 ]
Affiliations
[1] South China Normal Univ, Sch Elect & Informat Engn, Guangdong Prov Key Lab Quantum Engn & Quantum Mat, Foshan 528225, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Aspect sentiment triplet extraction; Improving training data; Large language models; Targeted data selection;
DOI
10.1016/j.knosys.2024.112289
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Aspect Sentiment Triplet Extraction (ASTE) is a subtask of Aspect-Based Sentiment Analysis (ABSA). Recent ASTE methods have achieved promising results; however, the performance of ASTE models is constrained by both the quantity and the quality of the training data. Challenges therefore lie in collecting valuable data and selecting targeted data for diverse ASTE model architectures. To this end, we propose a novel General Data-Centric Framework (GeDa) that improves the training data for ASTE models accurately and efficiently. Specifically, two types of prompts are designed to guide large language models in synthesizing candidate samples for the ASTE task. A Characteristic-Driven Iterative Strategy is then put forward to optimize the interaction between the model and the training data: data is iteratively selected from the synthetic candidates to improve both the quantity and the quality of the training set. After multiple iterations, a targeted training set is obtained that benefits ASTE model learning. Extensive experiments show that ASTE models equipped with GeDa gain more than 5% in average F1 while adding only a small amount of training data.
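The abstract describes an iterative loop: an LLM is prompted to synthesize candidate training examples, a selection strategy keeps a small targeted batch, and the batch is added to the training set before the next iteration. The sketch below illustrates only that loop shape under stated assumptions; the prompt construction, the characteristic-driven selection criterion, and every name here (synthesize_candidates, selection_score, budget_per_iter, and the toy data) are hypothetical stand-ins, not the paper's actual method.

```python
import random
from typing import List, Tuple

# An ASTE example: a sentence plus its (aspect, opinion, sentiment) triplets.
Triplet = Tuple[str, str, str]
Example = Tuple[str, List[Triplet]]


def synthesize_candidates(train_set: List[Example], n: int) -> List[Example]:
    """Toy stand-in for prompting an LLM to produce new labeled sentences.
    The paper uses two prompt types; here we merely paraphrase-tag seed sentences."""
    seeds = random.choices(train_set, k=n)
    return [(f"{sentence} (paraphrased)", triplets) for sentence, triplets in seeds]


def selection_score(example: Example) -> float:
    """Toy stand-in for a characteristic-driven score (e.g., model uncertainty)."""
    return random.random()


def iterative_augmentation(seed_train: List[Example],
                           iterations: int = 3,
                           candidates_per_iter: int = 20,
                           budget_per_iter: int = 5) -> List[Example]:
    """Repeatedly synthesize candidates and keep only a small, targeted batch."""
    train_set = list(seed_train)
    for _ in range(iterations):
        candidates = synthesize_candidates(train_set, candidates_per_iter)
        ranked = sorted(candidates, key=selection_score, reverse=True)
        train_set.extend(ranked[:budget_per_iter])  # add the top-scoring candidates
    return train_set


if __name__ == "__main__":
    seed = [("The pasta was great but the service was slow.",
             [("pasta", "great", "positive"), ("service", "slow", "negative")])]
    augmented = iterative_augmentation(seed)
    print(len(augmented), "training examples after augmentation")
```

In the framework the abstract describes, the selection criterion would be driven by the ASTE model being trained rather than by the random score used above; the synthesize-score-select-repeat structure is the part the abstract makes explicit.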
Pages: 9
Related Papers
50 records in total
  • [21] Multiscale feature aggregation network for aspect sentiment triplet extraction
    Zhu, Linan
    Xu, Minhao
    Zhu, Zhechao
    Xu, Yifei
    Kong, Xiangjie
    Applied Intelligence, 2023, 53 : 17762 - 17777
  • [22] Document-Level Sentiment Knowledge Transfer Network for Aspect Sentiment Triplet Extraction
    Tan, Long
    Su, Zixian
    2022 IEEE 34TH INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE, ICTAI, 2022, : 377 - 382
  • [23] Neural transition model for aspect-based sentiment triplet extraction with triplet memory
    Wu, Shengqiong
    Li, Bobo
    Xie, Dongdong
    Teng, Chong
    Ji, Donghong
    NEUROCOMPUTING, 2021, 463 : 45 - 58
  • [24] Enhanced Packed Marker with Entity Information for Aspect Sentiment Triplet Extraction
    Li, You
    Zeng, Xupeng
    Zeng, Yixiao
    Lin, Yuming
    PROCEEDINGS OF THE 47TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, SIGIR 2024, 2024, : 619 - 629
  • [25] INTEGRATED KNOWLEDGE GUIDANCE AND DEPENDENCY ENHANCEMENT FOR ASPECT SENTIMENT TRIPLET EXTRACTION
    Jia, Xian
    JOURNAL OF NONLINEAR AND CONVEX ANALYSIS, 2024, 25 (06) : 1325 - 1342
  • [26] Learning Span-Level Interactions for Aspect Sentiment Triplet Extraction
    Xu, Lu
    Chia, Yew Ken
    Bing, Lidong
    59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (ACL-IJCNLP 2021), VOL 1, 2021, : 4755 - 4766
  • [27] Aspect Sentiment Triplet Extraction Based on Deep Relationship Enhancement Networks
    Peng, Jun
    Su, Baohua
    APPLIED SCIENCES-BASEL, 2024, 14 (05):
  • [28] A dual relation-encoder network for aspect sentiment triplet extraction
    Xia, Tian
    Sun, Xia
    Yang, Yidong
    Long, Yunfei
    Sutcliffe, Richard
    NEUROCOMPUTING, 2024, 597
  • [29] Improving span-based Aspect Sentiment Triplet Extraction with part-of-speech filtering and contrastive learning
    Li, Q.
    Wen, W.
    Qin, J.
    Neural Networks, 2024, 177
  • [30] Aspect-Based Sentiment Analysis of Patient Feedback Using Large Language Models
    Alkhnbashi, Omer S.
    Mohammad, Rasheed
    Hammoudeh, Mohammad
    Big Data and Cognitive Computing, 2024, 8 (12)