A Cascade Approach to Neural Abstractive Summarization with Content Selection and Fusion

Cited by: 0
Authors
Lebanoff, Logan [1 ]
Dernoncourt, Franck [2 ]
Kim, Doo Soon [2 ]
Chang, Walter [2 ]
Liu, Fei [1 ]
Institutions
[1] Univ Cent Florida, Dept Comp Sci, Orlando, FL 32816 USA
[2] Adobe Res, San Jose, CA USA
Funding
U.S. National Science Foundation;
Keywords
DOI
None available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We present an empirical study in favor of a cascade architecture for neural text summarization. Summarization practices vary widely, but few domains other than news summarization provide enough training data to meet the requirements of end-to-end neural abstractive systems, which perform content selection and surface realization jointly to generate abstracts. Such systems also pose a challenge for summarization evaluation, as they force content selection to be evaluated together with text generation, yet evaluation of the latter remains an unsolved problem. In this paper, we present empirical results showing that a cascaded pipeline, which separately identifies important content pieces and stitches them together into a coherent text, performs comparably to or better than end-to-end systems, while also allowing for flexible content selection. Finally, we discuss how a cascaded pipeline can be exploited in neural text summarization and shed light on important directions for future research.
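The two-stage cascade described in the abstract (separate content selection and fusion) can be sketched as follows. This is a toy illustration only: the term-frequency scorer and string-joining "fusion" below are hypothetical stand-ins for the neural components the paper studies, and all function names are invented for this sketch.

```python
# Toy sketch of a cascade summarizer: stage 1 selects salient sentences,
# stage 2 stitches them together. Not the paper's neural implementation.
from collections import Counter

def tokenize(sentence):
    """Lowercase, punctuation-stripped tokens."""
    return [w.strip(".,;:!?").lower() for w in sentence.split()]

def select_content(sentences, k=2):
    """Stage 1 (content selection): rank sentences by average corpus-level
    term frequency and keep the top-k, preserving document order."""
    tf = Counter(w for s in sentences for w in tokenize(s))

    def salience(s):
        words = tokenize(s)
        return sum(tf[w] for w in words) / max(len(words), 1)

    ranked = sorted(range(len(sentences)),
                    key=lambda i: salience(sentences[i]), reverse=True)
    return [sentences[i] for i in sorted(ranked[:k])]

def fuse(selected):
    """Stage 2 (fusion): stitch the selected pieces into one text.
    A neural fusion model would rewrite them; here we simply join."""
    return " ".join(selected)

def cascade_summarize(sentences, k=2):
    """Run the two stages as a cascade: select, then fuse."""
    return fuse(select_content(sentences, k))
```

The point of the cascade is that each stage can be evaluated and swapped independently, e.g. replacing `select_content` with a learned extractor without touching the fusion stage.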
Pages: 529-535
Page count: 7
Related Papers
50 records in total
  • [1] Attend to Medical Ontologies: Content Selection for Clinical Abstractive Summarization
    Sotudeh, Sajad
    Goharian, Nazli
    Filice, Ross W.
    [J]. 58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 1899 - 1905
  • [2] A Few Good Sentences: Content Selection for Abstractive Text Summarization
    Srivastava, Vivek
    Bhat, Savita
    Pedanekar, Niranjan
    [J]. MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: RESEARCH TRACK, ECML PKDD 2023, PT IV, 2023, 14172 : 124 - 141
  • [3] Abstractive Summarization by Neural Attention Model with Document Content Memory
    Choi, Yunseok
    Kim, Dahae
    Lee, Jee-Hyong
    [J]. PROCEEDINGS OF THE 2018 CONFERENCE ON RESEARCH IN ADAPTIVE AND CONVERGENT SYSTEMS (RACS 2018), 2018, : 11 - 16
  • [4] Attention Head Masking for Inference Time Content Selection in Abstractive Summarization
    Cao, Shuyang
    Wang, Lu
    [J]. 2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021, : 5008 - 5016
  • [5] Improving Neural Abstractive Document Summarization with Explicit Information Selection Modeling
    Li, Wei
    Xiao, Xinyan
    Lyu, Yajuan
    Wang, Yuanzhuo
    [J]. 2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018, : 1787 - 1796
  • [6] An approach to Abstractive Text Summarization
    Huong Thanh Le
    Tien Manh Le
    [J]. 2013 INTERNATIONAL CONFERENCE OF SOFT COMPUTING AND PATTERN RECOGNITION (SOCPAR), 2013, : 371 - 376
  • [7] Neural Abstractive Summarization with Structural Attention
    Chowdhury, Tanya
    Kumar, Sachin
    Chakraborty, Tanmoy
    [J]. PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, : 3716 - 3722
  • [8] Neural sentence fusion for diversity driven abstractive multi-document summarization
    Fuad, Tanvir Ahmed
    Nayeem, Mir Tafseer
    Mahmud, Asif
    Chali, Yllias
    [J]. COMPUTER SPEECH AND LANGUAGE, 2019, 58 : 216 - 230
  • [9] FCSF-TABS: two-stage abstractive summarization with fact-aware reinforced content selection and fusion
    Zhang, Mengli
    Zhou, Gang
    Yu, Wanting
    Liu, Wenfen
    Huang, Ningbo
    Yu, Ze
    [J]. NEURAL COMPUTING & APPLICATIONS, 2022, 34 (13): : 10547 - 10560