Dual Encoding for Abstractive Text Summarization

Cited by: 38
Authors
Yao, Kaichun [1 ]
Zhang, Libo [2 ]
Du, Dawei [1 ]
Luo, Tiejian [1 ]
Tao, Lili [3 ]
Wu, Yanjun [2 ]
Affiliations
[1] Univ Chinese Acad Sci, Sch Comp Control Engn, Beijing 100049, Peoples R China
[2] Chinese Acad Sci, State Key Lab Comp Sci, Inst Software, Beijing 100190, Peoples R China
[3] Univ West England, Dept Engn Design & Math, Bristol BS16 1QY, Avon, England
Funding
National Natural Science Foundation of China
Keywords
Decoding; Encoding; Task analysis; Semantics; Recurrent neural networks; Computational modeling; Abstractive text summarization; dual encoding; primary encoder; recurrent neural network (RNN); secondary encoder;
DOI
10.1109/TCYB.2018.2876317
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Recurrent neural network-based sequence-to-sequence attentional models have proven effective in abstractive text summarization. In this paper, we model abstractive text summarization using a dual encoding model. Different from previous works that use only a single encoder, the proposed method employs a dual encoder comprising a primary and a secondary encoder. Specifically, the primary encoder performs coarse encoding in the regular way, while the secondary encoder models the importance of words and produces a finer encoding based on the raw input text and the previously generated summary text. The two levels of encoding are combined and fed into the decoder to generate more diverse summaries and to reduce repetition in long sequence generation. Experimental results on two challenging datasets (CNN/DailyMail and DUC 2004) demonstrate that our dual encoding model performs favorably against existing methods.
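The two-pass flow described in the abstract can be sketched in a toy form. This is a minimal NumPy illustration under assumed dimensions and update rules, not the paper's actual architecture: a primary RNN pass produces coarse states, a secondary pass reweights the input embeddings by learned importance scores (the paper additionally conditions this pass on the partially generated summary) and re-encodes, and the two encodings are concatenated for the decoder.

```python
# Toy sketch of the dual-encoding idea (hypothetical dimensions and parameters;
# the paper's exact architecture and conditioning differ).
import numpy as np

rng = np.random.default_rng(0)
d_emb, d_hid, seq_len = 8, 16, 5

def rnn_encode(xs, Wx, Wh):
    """Simple tanh RNN; returns all hidden states, shape (seq_len, d_hid)."""
    h = np.zeros(d_hid)
    hs = []
    for x in xs:
        h = np.tanh(Wx @ x + Wh @ h)
        hs.append(h)
    return np.stack(hs)

# Toy parameters, randomly initialized for the sketch.
Wx = rng.normal(size=(d_hid, d_emb)) * 0.1
Wh = rng.normal(size=(d_hid, d_hid)) * 0.1
w_imp = rng.normal(size=d_hid) * 0.1            # importance scorer

embeddings = rng.normal(size=(seq_len, d_emb))  # embedded raw input text

# 1) Primary encoder: coarse pass over the raw embeddings.
primary = rnn_encode(embeddings, Wx, Wh)

# 2) Secondary encoder: score word importance from the primary states
#    (the paper also conditions on the previously generated summary),
#    gate the embeddings, and encode again for a finer representation.
gates = 1.0 / (1.0 + np.exp(-(primary @ w_imp)))   # sigmoid importance gates
secondary = rnn_encode(gates[:, None] * embeddings, Wx, Wh)

# 3) Combine the two encodings; the decoder would attend over this.
combined = np.concatenate([primary, secondary], axis=1)
print(combined.shape)  # (5, 32)
```

The concatenation here is only one plausible way to combine the two levels of encoding; the point of the sketch is that the second pass sees an importance-reweighted view of the same input.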
Pages: 985-996
Page count: 12