Abstractive Summarization with the Aid of Extractive Summarization

Cited by: 2
Authors
Chen, Yangbin [1]
Ma, Yun [1]
Mao, Xudong [1]
Li, Qing [1]
Affiliation
[1] City University of Hong Kong, Hong Kong, People's Republic of China
Keywords
Abstractive document summarization; Sequence-to-sequence; Joint learning
DOI
10.1007/978-3-319-96890-2_1
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Currently, abstractive and extractive methods are the two main approaches to automatic document summarization. To fully exploit the relatedness and advantages of both approaches, we propose in this paper a general framework for abstractive summarization which incorporates extractive summarization as an auxiliary task. In particular, our framework is composed of a shared hierarchical document encoder, an attention-based decoder for abstractive summarization, and an extractor for sentence-level extractive summarization. Learning these two tasks jointly with the shared encoder allows us to better capture the semantics of the document. Moreover, we constrain the attention learned in the abstractive task by the salience estimated in the extractive task to strengthen their consistency. Experiments on the CNN/DailyMail dataset demonstrate that both the auxiliary task and the attention constraint contribute significantly to improved performance, and that our model is comparable to state-of-the-art abstractive models.
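
The abstract describes a multi-task architecture: a shared hierarchical encoder feeds both an attention-based abstractive decoder and a sentence-level extractor, and the decoder's attention is constrained by the extractor's salience estimates. The PyTorch sketch below is one illustrative reading of that design, not the authors' released code; the hyperparameters, the mean-pooling of word states, the single-layer extractor head, and the MSE form of the attention-salience consistency term are all assumptions.

```python
# Illustrative sketch only (see the assumptions noted above): a shared
# hierarchical encoder, an abstractive decoder with attention, an extractive
# sentence scorer, and a joint loss with an attention-salience consistency term.
import torch
import torch.nn as nn
import torch.nn.functional as F


class JointSummarizer(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # Shared hierarchical encoder: word-level BiGRU, then sentence-level BiGRU.
        self.word_rnn = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.sent_rnn = nn.GRU(2 * hid_dim, hid_dim, batch_first=True, bidirectional=True)
        # Extractive head: one salience logit per sentence.
        self.extractor = nn.Linear(2 * hid_dim, 1)
        # Abstractive decoder: GRU cell with attention over sentence states.
        self.init_h = nn.Linear(2 * hid_dim, hid_dim)
        self.dec_cell = nn.GRUCell(emb_dim + 2 * hid_dim, hid_dim)
        self.attn = nn.Linear(hid_dim + 2 * hid_dim, 1)
        self.out = nn.Linear(hid_dim, vocab_size)

    def encode(self, docs):
        # docs: (batch, n_sents, n_words) token ids, 0 = padding.
        b, s, w = docs.size()
        word_states, _ = self.word_rnn(self.embed(docs.view(b * s, w)))
        sent_emb = word_states.mean(dim=1).view(b, s, -1)     # pooled sentence vectors
        sent_states, _ = self.sent_rnn(sent_emb)               # (b, s, 2*hid)
        return sent_states

    def forward(self, docs, summary, ext_labels):
        sent_states = self.encode(docs)                        # shared representation
        salience_logits = self.extractor(sent_states).squeeze(-1)   # (b, s)

        b, s, _ = sent_states.size()
        h = torch.tanh(self.init_h(sent_states.mean(dim=1)))   # decoder init state
        step_logits, step_attn = [], []
        for t in range(summary.size(1) - 1):                   # teacher forcing
            emb_t = self.embed(summary[:, t])                  # (b, emb)
            query = h.unsqueeze(1).expand(-1, s, -1)
            scores = self.attn(torch.cat([query, sent_states], dim=-1)).squeeze(-1)
            alpha = F.softmax(scores, dim=-1)                  # sentence-level attention
            context = torch.bmm(alpha.unsqueeze(1), sent_states).squeeze(1)
            h = self.dec_cell(torch.cat([emb_t, context], dim=-1), h)
            step_logits.append(self.out(h))
            step_attn.append(alpha)

        logits = torch.stack(step_logits, dim=1)               # (b, T-1, vocab)
        attn_mass = torch.stack(step_attn, dim=1).mean(dim=1)  # avg attention per sentence

        # Joint objective: abstractive NLL + extractive BCE + consistency between
        # the decoder's attention distribution and the normalized salience scores.
        abs_loss = F.cross_entropy(logits.reshape(-1, logits.size(-1)),
                                   summary[:, 1:].reshape(-1), ignore_index=0)
        ext_loss = F.binary_cross_entropy_with_logits(salience_logits, ext_labels)
        salience = torch.sigmoid(salience_logits)
        consistency = F.mse_loss(attn_mass, salience / (salience.sum(-1, keepdim=True) + 1e-8))
        return abs_loss + ext_loss + 0.1 * consistency         # 0.1 weight is an assumption
```

A forward pass of this sketch takes a padded tensor of token ids shaped (batch, sentences, words), the teacher-forced target summary, and binary per-sentence extraction labels, and returns the combined training loss, so the shared encoder receives gradients from both tasks.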
Pages: 3-15
Number of Pages: 13