Abstractive Summarization Improved by WordNet-Based Extractive Sentences

Cited by: 3
Authors
Xie, Niantao [1 ]
Li, Sujian [1 ]
Ren, Huiling [2 ]
Zhai, Qibin [3 ]
Affiliations
[1] Peking Univ, MOE Key Lab Computat Linguist, Beijing, Peoples R China
[2] Chinese Acad Med Sci, Inst Med Informat, Beijing, Peoples R China
[3] Peking Univ, Sch Software & Microelectron, MOE Informat Secur Lab, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Abstractive summarization; Seq2seq model; Dual attention; Extractive summarization; WordNet;
DOI
10.1007/978-3-319-99495-6_34
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recently, seq2seq abstractive summarization models have achieved good results on the CNN/Daily Mail dataset. Nevertheless, improving abstractive methods with extractive methods remains a promising research direction, since extractive methods can exploit various efficient features to select the important sentences of a text. In this paper, to improve the semantic relevance of abstractive summaries, we adopt a WordNet-based sentence ranking algorithm to extract the sentences that are most semantically relevant to the text. We then design a dual-attention seq2seq framework that generates summaries while taking the extracted information into account. In addition, we combine the pointer-generator and coverage mechanisms to address the out-of-vocabulary (OOV) and word-repetition problems of abstractive models. Experiments on the CNN/Daily Mail dataset show that our models achieve performance competitive with state-of-the-art ROUGE scores. Human evaluations also show that the summaries generated by our models have high semantic relevance to the original text.
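As a rough illustration of the extractive step described in the abstract, the Python sketch below ranks sentences by their WordNet-based semantic relevance to the rest of the document, using NLTK. The scoring scheme and all function names are illustrative assumptions, not the authors' actual ranking algorithm.

    # Illustrative WordNet-based sentence ranking (an assumed scheme, not
    # the paper's exact algorithm). Requires nltk plus its 'punkt' and
    # 'wordnet' data packages: nltk.download('punkt'), nltk.download('wordnet').
    from itertools import product
    from nltk.corpus import wordnet as wn
    from nltk.tokenize import sent_tokenize, word_tokenize

    def word_similarity(w1, w2):
        # Best WordNet path similarity over all synset pairs;
        # 0.0 if the words share no comparable synsets.
        scores = [s1.path_similarity(s2) or 0.0
                  for s1, s2 in product(wn.synsets(w1), wn.synsets(w2))]
        return max(scores, default=0.0)

    def sentence_relevance(sentence, context_words):
        # Average, over the sentence's words, of each word's best
        # match against the context vocabulary.
        words = word_tokenize(sentence)
        if not words:
            return 0.0
        return sum(max((word_similarity(w, c) for c in context_words),
                       default=0.0)
                   for w in words) / len(words)

    def rank_sentences(text, top_k=3):
        # Score each sentence against the rest of the document
        # (excluding itself, so shared words do not trivially score
        # 1.0) and return the top_k most relevant sentences.
        sentences = sent_tokenize(text)
        scored = []
        for i, sent in enumerate(sentences):
            rest = {w for j, s in enumerate(sentences) if j != i
                    for w in word_tokenize(s)}
            scored.append((sentence_relevance(sent, rest), sent))
        scored.sort(key=lambda pair: pair[0], reverse=True)
        return [sent for _, sent in scored[:top_k]]

In the pipeline the abstract describes, sentences extracted in this fashion would then be supplied to the second attention channel of the dual-attention seq2seq model as additional input.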
Pages: 404-415
Page count: 12