Boundary-Aware Abstractive Summarization with Entity-Augmented Attention for Enhancing Faithfulness

Cited by: 0
Authors
Li, Jiuyi [1 ]
Liu, Junpeng [1 ]
Ma, Jianjun [1 ]
Yang, Wei [1 ]
Huang, Degen [1 ]
Affiliations
[1] Dalian Univ Technol, Dalian, Liaoning, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Abstractive text summarization; factual consistency; entity-augmented;
DOI
10.1145/3641278
CLC Classification Code
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
With the successful application of deep learning, document summarization systems can produce increasingly readable results. However, abstractive summarization still suffers from unfaithful outputs and factual errors, especially in named entities. Current approaches tend to employ external knowledge to improve model performance while neglecting the boundary information and semantics of entities. In this article, we propose an entity-augmented method (EAM) that encourages the model to make full use of entity boundary information and to pay more attention to critical entities. Experimental results on three Chinese and English summarization datasets show that our method outperforms several strong baselines and achieves state-of-the-art performance on the CLTS dataset. Our method also improves the faithfulness of the generated summaries and generalizes well across different pre-trained language models. Moreover, we propose a method for evaluating the integrity of generated entities. In addition, we adapt the data augmentation method of the FactCC model to account for grammatical differences between Chinese and English, and train a new evaluation model for factual consistency evaluation of Chinese summarization.
Pages: 18
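
The entity-augmented attention described in the abstract can be pictured as biasing the decoder's attention distribution toward source tokens that lie inside named-entity spans, so that critical entities receive more weight. The sketch below illustrates that general idea only; the function name entity_augmented_attention, the additive bias term, and the entity_mask input format are illustrative assumptions, not the paper's actual EAM implementation.

import torch
import torch.nn.functional as F

def entity_augmented_attention(query, keys, values, entity_mask, bias=1.0):
    # query: (d,) decoder state; keys/values: (src_len, d) encoder states;
    # entity_mask: (src_len,) with 1.0 at tokens inside entity spans, else 0.0.
    # Shapes and the additive-bias formulation are illustrative assumptions.
    d = query.size(-1)
    scores = keys @ query / d ** 0.5       # scaled dot-product attention logits
    scores = scores + bias * entity_mask   # boost logits at entity positions
    weights = F.softmax(scores, dim=-1)    # normalized attention over source tokens
    return weights @ values                # entity-aware context vector, shape (d,)

# Toy usage: 5 source tokens, where tokens 2-3 belong to a named-entity span.
q = torch.randn(8)
K = torch.randn(5, 8)
mask = torch.tensor([0.0, 0.0, 1.0, 1.0, 0.0])
context = entity_augmented_attention(q, K, K, mask)

Because the bias is applied additively to the logits, the rest of the attention mechanism is left unchanged, which is one way such a scheme could be plugged into different pre-trained language models.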