Extractive Elementary Discourse Units for Improving Abstractive Summarization

Times Cited: 0
Authors
Xiong, Ye [1 ]
Racharak, Teeradaj [1 ]
Minh Le Nguyen [1 ]
Affiliations
[1] Japan Adv Inst Sci & Technol, Nomi, Ishikawa, Japan
Keywords
Abstractive summarization; text generation; two-stage summarization
DOI
10.1145/3477495.3531916
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Abstractive summarization focuses on generating concise and fluent text from an original document while preserving its intent, including new words that do not appear in the original document. Recent studies point out that rewriting extractive summaries, which use the sentence as the textual unit, helps improve performance and yields a more concise and comprehensible output summary. However, a single document sentence normally cannot supply sufficient information. In this paper, we adopt the elementary discourse unit (EDU) as the textual unit for content selection. To utilize EDUs for generating a high-quality summary, we propose a novel summarization model in which an EDU selector first chooses salient content, and a generator then rewrites the selected EDUs into the final summary. To determine the relevance of each EDU to the entire document, we apply group tag embeddings, which establish connections between summary sentences and relevant EDUs, so that our generator not only focuses on the selected EDUs but also ingests the entire original document. Extensive experiments on the CNN/Daily Mail dataset demonstrate the effectiveness of our model.
Pages: 2675-2679
Page count: 5
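The abstract describes a two-stage select-then-rewrite pipeline. Below is a minimal PyTorch sketch of one reading of that architecture: a selector that scores pooled EDU representations for salience, and a group tag embedding that marks tokens belonging to selected EDUs so a generator can attend to the whole document while being biased toward the selection. All class names, shapes, and the binary tagging scheme are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch of a select-then-rewrite pipeline with group tag
# embeddings (illustrative; names and shapes are assumptions).
import torch
import torch.nn as nn


class EDUSelector(nn.Module):
    """Scores each EDU for salience; the top-k EDUs go to the generator."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.scorer = nn.Linear(hidden_size, 1)

    def forward(self, edu_reprs: torch.Tensor) -> torch.Tensor:
        # edu_reprs: (num_edus, hidden) pooled EDU representations
        return self.scorer(edu_reprs).squeeze(-1)  # (num_edus,) salience logits


class GroupTagEmbedding(nn.Module):
    """Adds a tag embedding to each token marking whether its EDU was
    selected, so the generator sees the full document but can distinguish
    selected content (our reading of the paper's group tag embedding)."""

    def __init__(self, hidden_size: int, num_tags: int = 2):
        super().__init__()
        self.tag_emb = nn.Embedding(num_tags, hidden_size)

    def forward(self, token_reprs: torch.Tensor, tag_ids: torch.Tensor) -> torch.Tensor:
        # token_reprs: (seq_len, hidden); tag_ids: (seq_len,), 1 = in a selected EDU
        return token_reprs + self.tag_emb(tag_ids)


# Toy usage with random features standing in for encoder outputs.
hidden = 8
selector = EDUSelector(hidden)
tagger = GroupTagEmbedding(hidden)

edu_reprs = torch.randn(5, hidden)           # 5 EDUs in the document
scores = selector(edu_reprs)
selected = torch.topk(scores, k=2).indices   # keep the 2 most salient EDUs
print("selected EDUs:", selected.tolist())

token_reprs = torch.randn(12, hidden)        # 12 document tokens
edu_of_token = torch.tensor([0, 0, 1, 1, 1, 2, 2, 3, 3, 3, 4, 4])
tag_ids = torch.isin(edu_of_token, selected).long()
tagged = tagger(token_reprs, tag_ids)        # input to the abstractive generator
```

In this sketch the generator is left abstract; tagging tokens rather than truncating the input is what lets the model rewrite the selected EDUs while still conditioning on the entire source document.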