A span-level Interactive Network for Aspect Sentiment Triplet Extraction Based on Learning Automated Concatenation of Embeddings

Cited by: 0
Authors
Lu, Heping [1 ]
Wang, Siyi [2 ]
Zhou, Yanquan [2 ]
Li, Lei [2 ]
Wang, Kai [2 ]
Affiliations
[1] China Elect Power Res Inst, Beijing, Peoples R China
[2] Beijing Univ Posts & Telecommun, Sch Artificial Intelligence, Beijing, Peoples R China
Keywords
ASTE; Word Representation; Reinforcement Learning; Information Fusion;
DOI
10.1109/ICNLP60986.2024.10692935
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The ASTE task aims to extract sentiment triplets, and word embeddings are crucial for semantic expression. Existing models typically rely on a single pre-trained model for contextual embeddings, whereas recent research indicates that better word representations can be obtained by combining several types of embeddings. In addition, most existing models do not fully exploit the hidden semantic associations between the word representations of aspect terms and opinion terms. To address these issues, the Span-ACE (Automated Concatenation of Embeddings) model is introduced. Span-ACE employs different pre-trained models for initial word embedding and dynamically selects which embeddings to concatenate using reinforcement learning. It enhances the interaction between aspect and opinion terms by fusing their embeddings with self-information and dot-product operations, and it filters out low-scoring embeddings according to predefined thresholds, resulting in improved performance. The final experimental results demonstrate that the Span-ACE model surpasses the state-of-the-art (SOTA) model on a subset of the ASTE-Data-V2-EMNLP2020 dataset for the ASTE task.
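The abstract's central idea, concatenating embeddings from several pre-trained encoders and letting a reinforcement-learned controller decide which ones to keep, can be illustrated with a minimal sketch. The code below is not the authors' implementation: the encoder dimensions, the Bernoulli selection controller, and the class name EmbeddingConcatSelector are illustrative assumptions; in ACE-style training the sampled binary mask would be rewarded by development-set F1 and the controller logits updated with a policy gradient.

import torch
import torch.nn as nn

class EmbeddingConcatSelector(nn.Module):
    """Concatenate candidate embeddings, zeroing those not picked by a sampled mask."""
    def __init__(self, num_candidates: int):
        super().__init__()
        # One selection logit per candidate embedding (hypothetical controller parameters).
        self.selection_logits = nn.Parameter(torch.zeros(num_candidates))

    def forward(self, candidate_embeddings):
        # candidate_embeddings: list of tensors shaped (batch, seq_len, dim_i)
        probs = torch.sigmoid(self.selection_logits)
        mask = torch.bernoulli(probs)  # sample which embeddings to keep this step
        masked = [emb * m for emb, m in zip(candidate_embeddings, mask)]
        return torch.cat(masked, dim=-1), mask

# Toy usage with three hypothetical encoders of different hidden sizes.
candidates = [torch.randn(2, 8, d) for d in (768, 1024, 300)]
selector = EmbeddingConcatSelector(num_candidates=len(candidates))
fused, mask = selector(candidates)
print(fused.shape, mask)  # e.g. torch.Size([2, 8, 2092]) and a 0/1 selection mask

As in ACE, unselected embeddings are masked to zero rather than dropped, so the concatenated representation keeps a fixed dimensionality while the controller explores different embedding subsets.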
Pages: 157-161
Number of Pages: 5
Related Papers
50 records in total
  • [41] SCL-SKG: software knowledge extraction with span-level contrastive learning
    Mingjing Tang
    Shu Zhang
    Ming Zheng
    Zifei Ma
    Wei Gao
    Artificial Intelligence Review, 2023, 56 : 2383 - 2406
  • [42] Learning Word Embeddings for Aspect-Based Sentiment Analysis
    Duc-Hong Pham
    Anh-Cuong Le
    Thi-Kim-Chung Le
    COMPUTATIONAL LINGUISTICS, PACLING 2017, 2018, 781 : 28 - 40
  • [43] Simple Approach for Aspect Sentiment Triplet Extraction Using Span-Based Segment Tagging and Dual Extractors
    Li, Dongxu
    Yang, Zhihao
    Lan, Yuquan
    Zhang, Yunqi
    Zhao, Hui
    Zhao, Gang
    PROCEEDINGS OF THE 46TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, SIGIR 2023, 2023, : 2374 - 2378
  • [44] A Relation-Aware Span-Level Transformer Network for Joint Entity and Relation Extraction
    Li, Zhe
    Fu, Luoyi
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [45] Neural transition model for aspect-based sentiment triplet extraction with triplet memory
    Wu, Shengqiong
    Li, Bobo
    Xie, Dongdong
    Teng, Chong
    Ji, Donghong
    NEUROCOMPUTING, 2021, 463 : 45 - 58
  • [46] DRHGNN: a dynamic residual hypergraph neural network for aspect sentiment triplet extraction
    Guo, Peng
    Yu, Zihao
    Li, Chao
    Sun, Jun
    APPLIED INTELLIGENCE, 2025, 55 (07)
  • [47] Bi-syntax guided transformer network for aspect sentiment triplet extraction
    Hao, Shufeng
    Zhou, Yu
    Liu, Ping
    Xu, Shuang
    NEUROCOMPUTING, 2024, 594
  • [48] Deep Interactive Memory Network for Aspect-Level Sentiment Analysis
    Sun, Chengai
    Lv, Liangyu
    Tian, Gang
    Liu, Tailu
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2021, 20 (01)
  • [49] Capsule Network with Interactive Attention for Aspect-Level Sentiment Classification
    Du, Chunning
    Sun, Haifeng
    Wang, Jingyu
    Qi, Qi
    Liao, Jianxin
    Xu, Tong
    Liu, Ming
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 5489 - 5498
  • [50] Aspect Sentiment Triplet Extraction Based on Deep Relationship Enhancement Networks
    Peng, Jun
    Su, Baohua
    APPLIED SCIENCES-BASEL, 2024, 14 (05):