GeDa: Improving training data with large language models for Aspect Sentiment Triplet Extraction

Cited by: 0
Authors
Mai, Weixing [1 ]
Zhang, Zhengxuan [1 ]
Chen, Yifan [1 ]
Li, Kuntao [1 ]
Xue, Yun [1 ]
Affiliations
[1] South China Normal Univ, Sch Elect & Informat Engn, Guangdong Prov Key Lab Quantum Engn & Quantum Mat, Foshan 528225, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Aspect sentiment triplet extraction; Improving training data; Large language models; Targeted data selection;
DOI
10.1016/j.knosys.2024.112289
CLC classification number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Aspect Sentiment Triplet Extraction (ASTE) is a subtask of Aspect-based Sentiment Analysis (ABSA). Recently, ASTE methods have achieved promising results. However, the performance of ASTE models is restricted by both the quantity and the quality of training data. As such, challenges lie in collecting valuable data and selecting targeted data for diverse ASTE model architectures. To this end, we propose a novel General Data-Centric Framework (GeDa), which is capable of improving the training data for ASTE models accurately and efficiently. Specifically, two types of prompts are designed to guide large language models in synthesizing candidate data for the ASTE task. Then, the Characteristic-Driven Iterative Strategy is put forward to optimize the interaction between the model and the training data. Data is iteratively selected from the synthetic candidates, aiming to improve both the quantity and the quality of the training set. With multiple iterations, a targeted training set can be obtained to benefit ASTE model learning. Extensive experiments reveal that ASTE models with GeDa achieve an average F1 improvement of more than 5% while adding only a small amount of training data.
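The abstract does not spell out the Characteristic-Driven Iterative Strategy in detail; the following is a minimal sketch of such an iterative selection loop, assuming a hypothetical token-novelty score as the driving "characteristic" (the function names, the scoring rule, and the toy data are illustrative, not taken from the paper).

```python
def select_iteratively(training_set, candidates, score_fn, k_per_round, rounds):
    """Sketch of characteristic-driven iterative selection:
    each round, score the remaining candidates against the current
    training set and move the top-k scorers into it."""
    pool = list(candidates)
    for _ in range(rounds):
        if not pool:
            break
        # Re-score every round: a candidate's value depends on what
        # the training set already contains.
        pool.sort(key=lambda c: score_fn(c, training_set), reverse=True)
        chosen, pool = pool[:k_per_round], pool[k_per_round:]
        training_set = training_set + chosen
    return training_set

def novelty(candidate, training_set):
    """Toy stand-in characteristic: fraction of candidate tokens
    not yet seen anywhere in the training set."""
    seen = {tok for example in training_set for tok in example.split()}
    toks = candidate.split()
    return sum(t not in seen for t in toks) / max(len(toks), 1)

# Toy usage: LLM-synthesized candidates would replace this hand-written list.
seed = ["the battery life is great", "screen too dim"]
cands = ["battery drains fast", "the screen is great", "keyboard feels mushy"]
grown = select_iteratively(seed, cands, novelty, k_per_round=1, rounds=2)
# grown keeps the two seeds and adds the two most novel candidates,
# "keyboard feels mushy" first, then "battery drains fast".
```

Re-scoring after every round is the point of the iteration: once a candidate enters the training set, similar candidates lose value, which pushes selection toward coverage rather than redundancy.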
Pages: 9
Related papers
50 records in total
  • [31] Sentiment trading with large language models
    Kirtac, Kemal
    Germano, Guido
    FINANCE RESEARCH LETTERS, 2024, 62
  • [32] Leveraging hierarchical language models for aspect-based sentiment analysis on financial data
    Lengkeek, Matteo
    Knaap, Finn van der
    Frasincar, Flavius
    INFORMATION PROCESSING & MANAGEMENT, 2023, 60 (05)
  • [33] Dependency graph enhanced interactive attention network for aspect sentiment triplet extraction
    Shi, Lingling
    Han, Donghong
    Han, Jiayi
    Qiao, Baiyou
    Wu, Gang
    NEUROCOMPUTING, 2022, 507 : 315 - 324
  • [34] Bi-syntax guided transformer network for aspect sentiment triplet extraction
    Hao, Shufeng
    Zhou, Yu
    Liu, Ping
    Xu, Shuang
    NEUROCOMPUTING, 2024, 594
  • [35] Span-level bidirectional retention scheme for aspect sentiment triplet extraction
    Yang, Xuan
    Peng, Tao
    Bi, Haijia
    Han, Jiayu
    INFORMATION PROCESSING & MANAGEMENT, 2024, 61 (05)
  • [36] Dual-Encoder Attention Fusion Model for Aspect Sentiment Triplet Extraction
    Zhang, Yunqi
    Li, Songda
    Lan, Yuquan
    Zhao, Hui
    Zhao, Gang
    2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023,
  • [37] Integration of Multi-Branch GCNs Enhancing Aspect Sentiment Triplet Extraction
    Shi, Xuefeng
    Hu, Min
    Deng, Jiawen
    Ren, Fuji
    Shi, Piao
    Yang, Jiaoyun
    APPLIED SCIENCES-BASEL, 2023, 13 (07):
  • [38] Word-Pair Relation Learning Method for Aspect Sentiment Triplet Extraction
    Xia H.
    Li Q.
    Xiao Y.
    Moshi Shibie yu Rengong Zhineng/Pattern Recognition and Artificial Intelligence, 2022, 35 (03): : 262 - 270
  • [39] Affective Commonsense Knowledge Enhanced Dependency Graph for aspect sentiment triplet extraction
    Sun, Xiaowen
    Zhu, Zhenfang
    Qi, Jiangtao
    Zhao, Zhen
    Pei, Hongli
    The Journal of Supercomputing, 2024, 80 : 8614 - 8636
  • [40] IDCN: A Novel Interactive Dual Channel Network for Aspect Sentiment Triplet Extraction
    Liu, Ning
    Hu, Jie
    Yao, Shunyu
    Liu, Dan
    Yang, Mingchuan
    IEEE ACCESS, 2022, 10 : 116453 - 116466