Dual Encoding for Abstractive Text Summarization

Cited by: 39
Authors
Yao, Kaichun [1 ]
Zhang, Libo [2 ]
Du, Dawei [1 ]
Luo, Tiejian [1 ]
Tao, Lili [3 ]
Wu, Yanjun [2 ]
Affiliations
[1] Univ Chinese Acad Sci, Sch Comp Control Engn, Beijing 100049, Peoples R China
[2] Chinese Acad Sci, State Key Lab Comp Sci, Inst Software, Beijing 100190, Peoples R China
[3] Univ West England, Dept Engn Design & Math, Bristol BS16 1QY, Avon, England
Funding
National Natural Science Foundation of China;
Keywords
Decoding; Encoding; Task analysis; Semantics; Recurrent neural networks; Computational modeling; Abstractive text summarization; dual encoding; primary encoder; recurrent neural network (RNN); secondary encoder;
DOI
10.1109/TCYB.2018.2876317
CLC classification
TP [automation technology, computer technology];
Discipline code
0812 ;
Abstract
Recurrent neural network-based sequence-to-sequence attentional models have proven effective in abstractive text summarization. In this paper, we model abstractive text summarization using a dual encoding model. Unlike previous works that use only a single encoder, the proposed method employs a dual encoder comprising a primary and a secondary encoder. Specifically, the primary encoder conducts coarse encoding in a regular way, while the secondary encoder models the importance of words and generates a finer encoding based on the raw input text and the previously generated output summary. The two-level encodings are combined and fed into the decoder to generate more diverse summaries and reduce repetition in long sequence generation. The experimental results on two challenging datasets (i.e., CNN/DailyMail and DUC 2004) demonstrate that our dual encoding model performs favorably against existing methods.
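The dual encoding idea described in the abstract can be illustrated with a minimal sketch: run a primary RNN pass over the input embeddings, score each word's importance from the primary states, re-encode the importance-weighted embeddings with a secondary RNN, and concatenate the two encodings for the decoder. All names, dimensions, and the importance-scoring rule below are illustrative assumptions, not the paper's actual architecture; the real model's secondary encoder also conditions on previously generated summary text, which this toy omits.

```python
import numpy as np

rng = np.random.default_rng(0)

def simple_rnn(inputs, W, U, b):
    """Elman-style RNN; returns the hidden state at every time step."""
    h = np.zeros(W.shape[0])
    states = []
    for x in inputs:
        h = np.tanh(W @ h + U @ x + b)
        states.append(h)
    return np.stack(states)

# Toy dimensions: 5 input words, 8-dim embeddings, 16-dim hidden states.
d_emb, d_hid, T = 8, 16, 5
emb = rng.normal(size=(T, d_emb))

# Primary encoder: one coarse pass over the raw embeddings.
Wp = 0.1 * rng.normal(size=(d_hid, d_hid))
Up = 0.1 * rng.normal(size=(d_hid, d_emb))
primary = simple_rnn(emb, Wp, Up, np.zeros(d_hid))

# Secondary encoder: re-weight each word by an importance score before
# re-encoding. The score here (softmaxed dot product with the mean
# primary state) is a hypothetical stand-in for the paper's mechanism.
scores = primary @ primary.mean(axis=0)
alpha = np.exp(scores - scores.max())
alpha /= alpha.sum()
weighted = emb * alpha[:, None]

Ws = 0.1 * rng.normal(size=(d_hid, d_hid))
Us = 0.1 * rng.normal(size=(d_hid, d_emb))
secondary = simple_rnn(weighted, Ws, Us, np.zeros(d_hid))

# Combine the coarse and fine encodings per time step for the decoder.
combined = np.concatenate([primary, secondary], axis=1)
print(combined.shape)  # one (2 * d_hid)-dim vector per input word
```

In the full model the decoder would attend over `combined` at each generation step; exposing both the coarse and importance-weighted views is what the paper credits with reducing repetition on long outputs.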
Pages: 985 - 996 (12 pages)
Related papers
50 in total
  • [41] An abstractive text summarization using deep learning in Assamese
    Goutom P.J.
    Baruah N.
    Sonowal P.
    [J]. International Journal of Information Technology, 2023, 15 (5) : 2365 - 2372
  • [42] Abstractive Text Summarization for the Urdu Language: Data and Methods
    Awais, Muhammad
    Nawab, Rao Muhammad Adeel
    [J]. IEEE ACCESS, 2024, 12 : 61198 - 61210
  • [43] Reinforced Abstractive Text Summarization With Semantic Added Reward
    Jang, Heewon
    Kim, Wooju
    [J]. IEEE ACCESS, 2021, 9 : 103804 - 103810
  • [44] Graph-based abstractive biomedical text summarization
    Givchi, Azadeh
    Ramezani, Reza
    Baraani-Dastjerdi, Ahmad
    [J]. JOURNAL OF BIOMEDICAL INFORMATICS, 2022, 132
  • [45] Statistical and analytical study of guided abstractive text summarization
    Kallimani, Jagadish S.
    Srinivasa, K. G.
    Reddy, B. Eswara
    [J]. CURRENT SCIENCE, 2016, 110 (01): : 69 - 72
  • [46] Deep reinforcement and transfer learning for abstractive text summarization: A review
    Alomari, Ayham
    Idris, Norisma
    Sabri, Aznul Qalid Md
    Alsmadi, Izzat
    [J]. COMPUTER SPEECH AND LANGUAGE, 2022, 71
  • [47] Domain-Aware Abstractive Text Summarization for Medical Documents
    Gigioli, Paul
    Sagar, Nikhita
    Voyles, Joseph
    Rao, Anand
    [J]. PROCEEDINGS 2018 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE (BIBM), 2018, : 1155 - 1162
  • [48] A Comprehensive Survey of Abstractive Text Summarization Based on Deep Learning
    Zhang, Mengli
    Zhou, Gang
    Yu, Wanting
    Huang, Ningbo
    Liu, Wenfen
    [J]. Computational Intelligence and Neuroscience, 2022, 2022
  • [49] Abstractive text summarization using deep learning with a new Turkish summarization benchmark dataset
    Ertam, Fatih
    Aydin, Galip
    [J]. CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2022, 34 (09):
  • [50] A Multi-Task Learning Framework for Abstractive Text Summarization
    Lu, Yao
    Liu, Linqing
    Jiang, Zhile
    Yang, Min
    Goebel, Randy
    [J]. THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 9987 - 9988